Transcript for July 24, 2021 Henry Rich

Thanks to Rodrigo Girão Serrão and Rich Park for producing this transcript.

00:00:00 [Henry Rich]

Working on the J engine is the most noble calling that anyone could have, I would say. I’d say this is, this is a real value to the world that somebody has to sit down and write the C code for it. And I’m glad to be the one to do it.

00:00:14 [Music Theme]

00:00:26 [Conor Hoekstra]

Welcome to another episode of Array Cast. I'm your host Conor. We're going to quickly go around and do brief introductions, we'll get to a short announcement and then we're going to hop into a special episode today, as we have our first guest. So I'll kick it over to Bob, then to Stephen, and then to Adám.

00:00:43 [Bob Therriault]

I'm Bob Therriault. I'm a J enthusiast. I can't claim to be a professional J programmer, but I have been doing it for 19, almost 20 years.

00:00:53 [Stephen Taylor]

Stephen Taylor. I'm an APL and q programmer and these days I'm also the KX librarian.

00:01:02 [Adám Brudzewsky]

I'm Adám Brudzewsky. I work full time for Dyalog, doing APL which I have been doing all my life, and professionally for 7, 8 years.

00:01:11 [CH]

And I'm your host, Conor Hoekstra. As mentioned before, as you know if you've been listening up to this point, I am not an APLer or J-er, or k-er or q-er. I write C++ professionally day to day, and I work at NVIDIA, but I am a massive APL fan. If you follow me on any of my social medias, my C++ followers are probably a little bit irritated at this point 'cause I don't tweet much C++, I just tweet APL and J stuff. So yeah, this is going to be an awesome episode. We've got a short announcement and then we'll hop into our first interview with our guest, so I guess I'll kick this over to Bob.

00:01:42 [BT]

Yeah, one of the drawbacks to having a language that's 50 years old is you start losing the pioneers that developed it, and unfortunately we've lost another one. Ian Sharp, who was the head of I.P. Sharp Associates (IPSA), passed away, and there have been some really good remembrances put up - one by Roger [Hui] and one by Bob Bernecky. We'll put both of those in our show notes - the links to them, anyway.

But I'm just going to read an excerpt from Lib Gibson, who worked with Ian, about the fact that IPSA was quite a remarkable company, headed by a remarkable man. This is Lib Gibson in 2014.

"And then there's Ian, the heart of the company. A twenty-first-century company way back in the last century, relatively flat and widely dispersed, electronics, and camaraderie - and Ian. The company was blind on race, creed, colour, nationality, sexual orientation, eccentricity and gender. I left with the suspicion that discrimination against women was a myth. Yeah, right." Lib Gibson wrote that.

And Bob Bernecky also posted an interview that Whitney Smith had done in 1984 with Ian, so we can hear Ian's Words about the culture of the company.

[Whitney Smith]

"It's an old cliché about a company feeling like a family. But I do get that feeling around here; people are very social and relaxed. Do you have anything to say about that? I mean, has this been deliberate, or has this just come about?"

[Ian Sharp]

"I don't think that there's anything deliberate about it. You just adopt a certain style and it becomes a company style, and you can't really change it. Where it comes from, who knows?"

[WS]

"Okay, I was just talking with Margaret Riley upstairs, and we were talking about - I asked her, why are there so many women in high positions in this company? Because in most companies this size there aren't a lot of women who have positions of such responsibility."

[IS]

"Well, if I know Margaret right, she'd probably turn that around and say, why are there so many men in high positions in the company?"

[WS]

"I'll pass that on to her. Can you explain that?"

[IS]

"That's like asking, why are there so many people in high places in this company? We don't openly differentiate between males and females. We're looking for people with particular talents, particular ideas, and some of them are female, so I suppose by the law of averages the female share is about 50%."

00:04:54 [BT]

So in his own words, that was the kind of culture that he established. And one of us on this panel had the privilege of working with Ian. So I'll take it over to you, Stephen.

00:05:04 [ST]

I did, indeed. Well, in my mid-twenties, I had the unbelievable good fortune to join his company, and as Lib said, it was a flat hierarchy; you could talk to anyone. I was twenty-five years old. I didn't know that the world didn't work like this. The company had its own international telecommunications network at that time; we were thinly distributed around the world. We ran the place on email, with, I dunno, a company culture of civility and honour and tolerance that I've never seen matched anywhere else.

When I madly left to join a startup after 9 years, Ian just said to me: "well, if it doesn't work out, come back."

I've never been in a culture like that, and I've been saying for years that it ruined me for a life in the real world. I heard a story from Ken Iverson about Ian. He spent a lot more time with him than I did. They had been on a flight - I think it might even have been when Ken was in Australia - and as they were landing, Ken looks over at Ian filling out a landing card. He noticed that under occupation, he just put down programmer. And somehow that chimed with me. And I've been writing programmer as my occupation for the last 30 years.

00:06:40 [CH]

Yeah, it's kind of refreshing to hear that a company, you know, back in the 70s was able to have a culture like that. Whereas now, fast forward 50 years, and it seems like there's been - I don't want to say regression, 'cause it sounds like that was a sort of unicorn company; I don't think most companies were like that back in the 70s - but it's something that hopefully all corporations and startups and small companies can aspire to, to have that kind of culture in the future, 'cause it sounds like an amazing place to work. I obviously did not have the privilege of working there, 'cause I wasn't born yet, I think. Right when I was being born, IPSA was being sold to Reuters, so I don't think they hire babies. But if I had had the opportunity, it sounds like it would have been an absolutely amazing place to work.

00:07:29 [ST]

Well, Connor it's an amazing thing to tell young people today that my email handle SJT is older than the internet.

00:07:38 [CH]

Yeah, there are probably not a lot of people in the world who can say that. But yeah, with that we will hop into an episode that I'm super excited about. So our guest today is Henry Rich. I'm going to let Henry introduce himself, but before I do, what I will say is I primarily know Henry's name due to my sort of digging around in the J source GitHub repository, which is hosted on the Jsoftware account because J is open source - it's under a GPL license, and you can go look at all the code. And if you look at the contributors, the number one contributor since 2017, when I believe this was posted on GitHub, is Henry H. Rich, who has over half a million additions in those last four years and has, I think, more than twice as many commits as the next person. Eric Iverson, I believe the founder of Jsoftware, only has like 221 commits, so you've got like 7x more commits than Eric. But that, I think, is just the four years on GitHub. The point being, clearly Henry's done a ton of work on the J source code. And yeah, I'll kick it to you, Henry, so feel free to start wherever you want in your sort of career timeline, 'cause I know you've worn more hats than just programmer in your history.

00:09:07 [HR]

That is true. I'll start at the beginning. I've lived my life in Raleigh, North Carolina, and my programming started, I think it was 1966, when I would have been 13. The town - 'cause Raleigh is not much more than a town - had set up a time-sharing account for the public school system, remarkably forward-thinking, and we had a terminal in the junior high school, and I discovered BASIC. I realized that writing programs was all I ever wanted to do, and I've done it ever since. Which meant that when I graduated from college, IBM had fortunately moved into Raleigh; that's why I went to work for IBM. The programming of those days was assembly language and a language based on PL/I but much lower level, not too different from C. But one summer I was detailed to work for the modelling department for the project that I was on, and they were using APL. And that was just eye-opening. I realized, this is the way to write code; forget about all that other stuff. I wasn't able to write APL for the next 20 years, though. I joined a start-up and ended up designing graphics hardware, eventually for flight simulation, but it's maybe 20 years later, in the mid-nineties, that I got to a position where I could decide what language we could use, and I remembered APL. So, let's go back and see if we can get some APL. We were a startup, not making too much money, so I'm looking around, and I discovered Iverson Software, which had something that seemed like APL. So I telephoned Iverson Software; Jean Iverson answered the phone. She put me on to Ken Iverson, who explained to me: APL's pretty good, but this new language J is the modern version, and he really liked it better. So I got J, and with it I was immediately productive. I was able to write the entire model of the attack system for the flight simulator in a couple of pages of J. We all know the stories about the productivity of the language.

Then my first child was born, and I discovered that engineering at the 80-hour-week rate that I was accustomed to was just incompatible with being a good father. So I quit programming for a while. The last program I wrote was a program to trade stocks. This was in J, about 50,000 lines of J, so a big program. And that paid the bills in the period when I was not working productively for a living.

I homeschooled my daughter; she decided she wanted to go back to high school, so I followed her into high school, first as a substitute teacher, and then they needed a computer science teacher, and I ended up teaching computer science and math and Latin. For the computer classes, I taught J, and this was mostly to raw beginning programmers - people who had never seen a programming language before - and they picked it up just like ducks taking to water. The only problem I had was with the few students who had done a little bit of programming in some other language, maybe C or BASIC. You see, they had got used to loops, but if you haven't got used to loops, you can write J naturally. And I believe if it could be introduced to students at an earlier stage, we would have a lot more array programmers than we do now. So I'm very happy with what you're doing here, trying to popularize the idea of programming in this different style. It's a totally different way to think.

Anyway, after teaching for several years - about 10 years, actually - in high school, I got the bug to write code again. I had been writing code all along, but I thought, maybe I should try to go back to being a professional programmer. But I'd been away for a long time, and you feel like your skills erode pretty quickly in this business. Then two important things happened to me. First, I decided to write the program which is now called dissect, and I recommend this to anybody who's thinking about array programming, because what dissect does is give a visual two-dimensional display of a single line of code, a single sentence in J. If you think about it, you could never do that in a non-array language. If you give me a C statement, a visual representation of that statement is silly, 'cause it just replaces a with a + b. But in an array language, a single statement can do enough work that it's worthwhile to have a display of it, and that's what dissect does. It was the hardest program I ever wrote. It took three years, but the important thing to me was that when I finished it, I decided, yeah, I can still write code. And along the way, the other momentous thing that happened is that as I was using dissect, trying to integrate it into my classroom teaching, I discovered a bug in the J engine that made one of the demos fail, and I wanted to fix it. It turned out there was nobody around to fix it, because Roger [Hui] had left Jsoftware. So I said, well, all right, I'll fix it myself. And that was my first exposure to the J source. I looked at it: very interesting, I can probably do something with this. And that was, what, 4 or 5 years ago. In the interim I have quit teaching school, and I've done some consulting jobs to pay the light bill, but mostly what I've concentrated on, in all my spare time, is working on the J engine to make it better. And that brings me up to today; that's what I'm still doing.

00:16:10 [CH]

All right. So now I have like a thousand questions, but we've also got three other panelists, so maybe I'll ask one question, and then whoever's got questions next, we'll cycle around. But my first question of the many that I have: you mentioned that, you know, you started with BASIC back in 1966, which I believe was the year that you caught the programming bug.

00:17:10 [HR]

I should say, it was BASIC and ALGOL.

00:18:05 [CH]

And then fast forward a few years, you stumbled onto APL, and sort of a light bulb went off. And so, with our sort of rotating panelists, when we had them on - and we had a whole episode on this - we ask, you know, what is it about the array languages that you love, that made that light bulb go off? So can you speak to having seen other languages and then seeing APL - what was it about it that spoke to you, that made you think, this is the way we should be programming?

00:19:03 [HR]

The incredible terseness and expressiveness of the language. You write a line of code, and you just know this would have taken a page if I had to write it in C. You can get so much done, so much faster. I think you still have to write a lot of commentary to explain what you've done, but you can express yourself so quickly; there's just nothing to compare it with. I took a job - the first job after I left school was working on a database in C++ - and it was like a mental downshifting of gears, from high to second. Everything just is more code and takes longer. I can get used to it, and it works, but it's just a slower way to develop.

00:18:13 [CH]

Okay, so I'm going to break my rule - I said I'd let someone else have a question, but I've got to follow up. Having taught J to high schoolers, you said that basically, if they haven't seen anything else, you never had a problem. You sort of mentioned productivity and expressivity that's unmatched by other languages. But did you ever have students in high school who were sort of raising their eyebrows? 'Cause it's one of the main things that, you know, developers these days say, whether they're coming from Python or Java or insert popular language that they have heard of: they say, oh, it just seems sort of unreadable. Whereas amongst array programmers - and you just spoke to that - it's the purity of expressivity that you get from these languages, which is the exact opposite of not being able to read something. Did your students ever say anything like that?

00:19:04 [HR]

No, they never even thought about it. I stressed commentary in my classes. I've seen some of my students over the years, and I always ask them, are you still writing comments? 'Cause that seems to be a lost art, from what I'm seeing in the consulting I've been doing.

The hard part of programming, as we all know, is getting your thoughts in order. You figure out what you want to do, and then you write a program to do it. And no matter what language you use, if you don't remember what your thoughts were, you're not going to be able to understand the program very well. So you should write down what it is that prompted you to do what you did, in the form of commentary, and then you will understand the program. It's certainly easy to write J code that nobody can understand, but it's also possible to write J code that people will understand.

00:20:10 [CH]

Alright, Bob, I see that you... go ahead.

00:20:15 [BT]

Well, I guess there's two things. Henry mentioned dissect, and it is an amazing tool, and there's two parts to it. One is the amazing visual display that it provides. How to describe it? It is sort of node-based: each step along the way breaks into a node, with connectors between them, so you can actually trace the execution of the line that you've got, and breaking it into that second dimension allows you to see how things develop. And then on top of it, he's developed a visual language, just with the icons and the cross-hatching and things like that: you can immediately tell other things about what's going on in the program. And that really is essential to taking a J program and breaking it into something where you go, okay, I understand that part is a fill - those zeroes aren't calculated, they're a fill because we have to make a rectangular array. Dissect shows those kinds of things visually. But he also mentioned the comments, and if you download any of the add-ons in J, you can actually look at their source code. All of it is there; I can open it up in J and look at those files. Dissect is a large program, but most of it is really clear comments about how things have been done. It's probably more commenting than a lot of people would put in - certainly more than I would put into my J programming. In terms of maintenance and those kinds of things, I'm just in awe of how clean that program is and how you can follow it, just from the actual code itself. And I guess there's a question here somewhere: Henry, how did you develop that approach? Is it literate programming, or is it different from that?

00:22:15 [HR]

Well, it goes back to the first difficult algorithm that I developed; it would have been sometime in the early seventies. As I wrote it, I said, you know, this is hard enough, I ought to try to prove mathematically that this program works right. So I interspersed, with the lines of code, enough commentary to prove that the program worked right. Except the program didn't work right - which I discovered in writing the commentary. And that was really an experience of a lifetime, 'cause, you know, to program you have to think; you might as well write out what you're thinking, and then, if possible, you'll write enough to know that your program works. And what I found in dissect - more than once - I've gone in to make a change, and I find the place, okay, here's where I'm going to make the change, and then I'll read a comment that says: by the way, don't try to make this change, because if you do, it's going to break something else somewhere. And then, yes, thank goodness. That's good commentary.

00:23:20 [CH]

Alright, should we kick it to - 'cause Stephen and Adám, if you don't have questions, I'm just going to keep going. Stephen... go ahead.

00:23:30 [ST]

One of the most attractive things about J is the tacit programming, which completely took my breath away when I first saw it. And for those of you listening who've not come across it, an example of tacit programming is, say, the function to calculate the statistical mean. The average. To get the statistical mean of a list of numbers you sum them and then you divide it by the count. Sum in J, as in APL is plus slash. Divide is, is it the percentage symbol in J? And the count is the pound, or hash - octothorp, thank you - Yes. So, you expect to see those in the definition of your function and if you were writing in Dyalog APL with direct definition there'd be a couple of curly - lambda-like little braces around it and some tokens for the arguments. But in J the function for average is simply plus slash divide hash (+/%#) with parens around it. And it's like all the rest of that stuff, you don't need! I've admired tacit programming. I've looked and thought I want to be able to do that and then when I tried it in J I found, yes, sometimes I can do that.
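[To make Stephen's example concrete, here is the fork he describes as it would run in a J session. The indented lines are input, the others output; this is an illustrative sketch added for readers, not part of the conversation.]

```j
   avg =: +/ % #        NB. fork: sum divided by count
   avg 1 2 3 4          NB. runs as (+/ 1 2 3 4) % (# 1 2 3 4)
2.5
   (+/ % #) 6 10 14     NB. the same verb used anonymously
10
```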

I wonder, Henry, what your experience has been with tacit programming and in particular with teaching it to kids and whether, as some people have said, it's actually a step too far in making code terse.

00:25:18 [HR]

I think they're right about it being a step too far, but, that said, it's beautiful and it's very efficient. It's magnificent, and Ken just created a language - a grammar that has no punctuation; it's just parentheses and the order of symbols, and that's a language. I don't know if you're aware, there used to be a much richer version of the language that was taken away 15 years ago. It was possible to do tacit programming with conjunctions, modified verbs... much more complicated stuff. The problem was there were only, I think, maybe five people in the world that really understood it. It never let me down - I could do anything I wanted to do - but it's just too much to try to teach beginners. When I teach J, I don't teach tacit programming. Well, you know, that's not totally true, because there's not a sharp distinction between explicit programming and tacit programming. If I write the function to take an average, like you say, plus slash percent hash (+/ % #), and put it in parentheses, that produces a verb. I can give that a name, and I could call that a tacit program. But equally, I can just take that sequence of symbols and stick it in a sentence without a name, and now I have those three verbs tied together in that way in the middle of a sentence, or maybe as part of a larger thing that doesn't have any names to it. Well, is that tacit programming or not? Well, it's some of each. I'd say it's a little bit of tacit programming embedded in an explicit sentence. So at that level it's very valuable, because you don't have to keep referring to the same names. So, you know, if you've got three or four functions that you're going to apply to two names, you can use the tacit language to describe that without actually having to create a tacit function.

So when I was teaching, I taught explicit programming to begin with. The biggest drawback to tacit programming is that it's very hard to maintain, in my opinion. Now, there are people who do it all the time, but I went tacit-happy in the first program I wrote when I was learning J. My first J program, I did it all tacit, and not only was it very hard to write, it was just so hard to fix. And I realized then, let's not go too far here. So I use tacit forms for things that are very well described - almost mathematical functions, or functions that have a very well-defined spec that is not going to change. You can make them tacit; that's okay. But otherwise, just use explicit forms. But do know what a hook and a fork are - for those people who don't know J, those are different ways of combining two or three verbs into a single function quickly - because that's very expressive. But don't feel like you're somehow letting yourself down by not using tacit programming. The rest of the language is rich enough that you're doing fine if you just write explicit code.
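[A minimal sketch of the hook and fork Henry mentions, and of a tacit phrase embedded in an explicit sentence, as they would appear in a J session; added for illustration.]

```j
   (+ %) 2               NB. monadic hook: y + (% y), i.e. 2 + 1r2
2.5
   3 (+ * -) 5           NB. dyadic fork: (3+5) * (3-5)
_16
   nums =: 3 1 4 1 5
   'mean is ' , ": (+/ % #) nums   NB. tacit phrase used inline, never named
mean is 2.8
```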

00:29:13 [AB]

Wasn't it very awkward, though? J only gained a neat - in my opinion, at least - direct form for explicit definitions very recently. For all these years it had always felt clunky to me to write explicit code in J, and that really pushed people into writing tacit code.

00:29:35 [HR]

Oh, uh, well, you do have a point about the form of writing... you just have to get over that. You just have to say name equal colon verb define, followed by the body of the verb, followed by a right parenthesis. What's different now is that instead of verb define, you can use a double brace, and instead of the trailing parenthesis, you can use a double brace. That doesn't strike me as being so repellent; it certainly shouldn't change your programming style. Although I do like the direct definition form - it's much cleaner, and it allows you to write functions in one line.
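[The two explicit forms Henry contrasts look roughly like this - the classic `verb define` body terminated by a right parenthesis, and the newer `{{ }}` direct definition. A sketch; consult current J documentation for the exact syntax in your release.]

```j
mean =: verb define       NB. classic explicit definition: body on its own lines
(+/ y) % # y
)

mean2 =: {{ (+/ y) % # y }}   NB. direct definition: the same verb in one line
```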

00:30:21 [CH]

One thing, too, I should mention just for our non-array listeners that aren't familiar with tacit: it's known as something else in other programming language communities. It's the same thing as point-free programming, where, very confusingly, point does not refer to the dot character; points refers to arguments. So really you can translate point-free to argument-free. So in the case where we were just describing average, you can do the tacit version in APL now, but at one point you couldn't, so if you had a dfn you'd be mentioning alpha and omega - or, I guess, actually just alpha, 'cause it's a unary operation at the end of the day - sorry, omega; yeah, omega's for the unary. But in the tacit version you don't have to mention anything. So in J the equivalent of alpha and omega are x and y, and you don't need to mention those.

So I think, Henry, when you were talking about tacit, it can refer to defining tacit functions, but also you can use a "tacit" expression in the midst of a line. It's sort of a mix of both, 'cause at the end of the day, if you're mentioning arguments at the macro level, it's sort of a hybrid. But in the midst of it, if there's a tacit expression, that can be very expressive. Adám, I think you were going to mention something.

00:31:45 [AB]

I think we can make things a bit more broad. Yes, at some point Iverson, inspired by traditional mathematical notation for function composition, came up with this system of handling especially two-argument (dyadic/infix) functions in a composed way like this, but there's more to tacit programming in APL-like languages than just that. As soon as you have an adverb - or operator, or conjunction - that takes a function, or even an array, and derives a new function/verb from that, then you're doing tacit programming. So, in a sense, it has always been with the APL family, and the thing in common for all these languages is that plus slash does a sum. That's the same for J and for APL. And what's happening there is that you are composing pieces together. It's not two separate things: the plus and the slash combine into a single entity. So even - when Henry was mentioning that you should be aware of these things - even when you do the most explicit of all programming in these languages, you're still actually going to be using tacit programming without thinking about it.
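[Adám's point - that even `+/` is already tacit composition - can be seen by swapping different verbs under the same adverb. An illustrative J session:]

```j
   +/ 1 2 3 4       NB. the slash adverb turns plus into sum
10
   */ 1 2 3 4       NB. the same adverb derives product from times
24
   >./ 3 1 4 1 5    NB. ...and maximum from the larger-of verb
5
```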

00:32:58 [HR]

Yes, that's a very good point. And if there are people out there who are thinking about array programming: you just wouldn't believe how effective these modifiers are. You take plus, you take slash, you put 'em together, you've got a new function. And there are, what, about a dozen of those things that can be used in combination with each other, and it's just amazing how much of what you want to do can be done just like that, with the composition of a few operators and verbs.

00:33:35 [CH]

Yeah, I think just in our last episode, or maybe it was two episodes ago, I remarked about that - that C++ has a very powerful, rich suite of algorithms in its algorithm libraries. But the power that you get in what can also be called - we call them operators; some people have said that really we should just call them functions, 'cause they're just like higher-order functions in functional programming: functions that can take other functions as arguments. The power that you get - the way Ken and Roger designed the language - in my opinion it's unmatched by anything I've experienced in other programming languages. Like the key operator in APL - I'm not sure if that comes from J - it does sort of your group-by functionality. When I discovered that I was like, oh wow, this is like group-by but completely unconstrained; you can do absolutely anything that you could possibly want with it.
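[The group-by behaviour Conor describes is key: `⌸` in Dyalog APL, and the `/.` adverb in J. A small illustrative J session, where values are grouped by matching keys:]

```j
   k =: 1 2 1 2 1          NB. group labels
   v =: 10 20 30 40 50     NB. values
   k +//. v                NB. sum of v within each group of k
90 60
   k #/. v                 NB. size of each group
3 2
```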

Bob do you have a question that you want to stay on this topic before I go in a new tangent?

00:34:41 [BT]

Actually, I think I was probably gonna go off on a new tangent, and that tangent is that recently you've been consulting with the Monument group, and one of the things that they seem to have done is parallelize the arrays. I believe the work you were doing with Monument was along the lines of starting to take the arguments, working within the array, breaking it down, and running it in various different threads. And I'm really interested in how you did that, how you approached that - and I believe you used J as the language for that.

00:34:55 [HR]

I was a consultant with Monument for a while, and was a small shareholder of the company. Monument chose J for all the right reasons - the productivity of code development. The product is an artificial intelligence suite directed at businesses who want to do that sort of thing. The code's written in J, and Monument was interested in making better use of a modern multi-core computer, given that the J language is a one-processor system. We spent some time on that. One thing we tried that didn't work very well was parallelizing primitives. The parallel system there is called Jx, and I believe Monument is offering it for public evaluation now, if you want to find it. We used OpenMP as the basic model of parallelism, and we tried parallelizing the individual primitives. So if you want to add up an array of 100,000 items, you can split it up and have each core add up 10,000 of them. It turned out - it worked - but it didn't speed things up much, because it's not a very cache-friendly implementation. Generally, I create an array, and the array is going to be on some processor. If I try to split the operation into 10 threads, nine of those threads are going to have to go fetch the data from the processor that has it. It's going to be in that processor's cache, or back in the L3 cache, which is slow. The threads that don't have the data in a cache close to the CPU are going to be slow, and they're going to have to synchronize at the end of the operation.

It turned out that parallelism at the primitive level was not effective. So we tried parallelism at a higher level, where you take a whole verb - something that does a significant amount of computation and maybe runs a couple of hundred milliseconds - and let that run in a separate thread. So we have implemented that, and that's what MAI is announcing, and it works pretty well. It doesn't use OpenMP; it uses a futures model, where there's a modifier that says: take this verb and run it in a thread. The result of the thread is a future, which is defined as a box whose contents show up when you look at them. If you look at them before they're ready, you block. When the thread finishes, it fills in the future. And so the advantage of this is that the thread itself will run cache-friendly: it will be implemented as individual J primitives, and it'll be local to its core, and it doesn't have to synchronize immediately - it can synchronize later, when the data is needed by whatever thread spawned it.

So it's a much more efficient way to execute, and it's also effective for code development purposes, in that you can say, let me try this as a separate thread and see how it works, and you don't have to go to the trouble of creating a whole new J instance and figuring out how to share data back and forth. You can give it a try and see how it works.
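[For readers who want to try this style: a futures model along these lines later appeared in public J releases (9.4 and later) as the `t.` task conjunction and `T.` thread management. The spelling in the Jx system discussed here may have differed; this is a sketch, not the exact Monument API.]

```j
   0 T. ''                   NB. add a worker thread to the pool
   f =: {{ +/ *: i. y }}     NB. some work: sum of the first y squares
   p =: f t. '' 1e6          NB. run f in a thread; p is a boxed future (a "pyx")
   > p                       NB. opening the box blocks until the result is ready
```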

00:39:30 [BT]

Sounds to me like you've got Heisenberg's cats working for you.

00:39:49 [AB]

Not exactly, as it's not the observation of the data that causes it to be computed. It will be computed even if you don't look at it. But I'm a little bit confused. You mentioned very briefly that the result is a box - a future - but is that really a box? Because I saw some examples of it, for example using the rank conjunction or, in APL terms, the rank operator. So we have this function that's being applied to subarrays of a larger array. The idea is you have a lot of data, too much data to compute sequentially, so we're splitting it up into chunks. Say we have a three-dimensional structure and I want all the subarrays that are two-dimensional - all the layers - to be computed separately. So what I would do is use this new conjunction you've made with my function, and then that derived verb I apply at rank 2. I apply it on the subarrays that have rank two, the ones that are two-dimensional - that's the layers - and then we get back something. But the nature of application at a rank is that you do not know how many dimensions the final result will have. Say I apply something to the layers of a three-dimensional array, and the result for each layer, which is a two-dimensional thing, is a single number - for example, I sum all the numbers that are there, flatten it or something. Then the outer shape has one dimension and the inner shape has no dimensions, so we get a one-dimensional list at the end. How do you deal with not knowing what the structure will be?

00:41:27 [HR]

Well, that's why the future is a box. You have to have an explicit unboxing operation to join the individual layer results together.

00:41:46 [AB]

So, does that mean that my function applied with this parallel conjunction and my function applied without the parallel conjunction gives two different results? One gives a boxed result and one gives a non-boxed result?

00:42:00 [HR]

That's right. I mean, it gives the same result that the function would have given to begin with, it just gives it in a box. The reason for that is that you want to be able to pass this box around. I would like to be able to take that box and pass it into another function without having to figure out what the result is. The box may live for a long time before somebody actually looks at it. When they look at it, that's when you block if the value's not there. It's not a special data type, but it is a box, so that its contents are shielded from the rest of the world, because inspecting the contents of the box is an important operation.
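
Mapping a verb over cells, with each result boxed as a future and then joined by explicit unboxing, might look like this Python sketch. The row-wise split stands in for J's rank-2 application; this is an illustration, not the actual implementation:

```python
from concurrent.futures import ThreadPoolExecutor

# Apply a verb (here, sum) to each 'cell' - each row - in its own thread.
# Each result comes back boxed as a future; assembling the layers' results
# into the final array is the explicit unboxing step Henry describes.
rows = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]

with ThreadPoolExecutor() as pool:
    boxed = [pool.submit(sum, row) for row in rows]   # a list of boxes
    result = [box.result() for box in boxed]          # explicit unboxing

print(result)  # [6, 15, 24]
```

Because the boxed and unboxed versions differ, the caller decides when the layers are joined, which is exactly the point Adám raises about the parallel and non-parallel forms giving differently shaped results.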

00:43:00 [BT]

Yeah, I guess just to clarify for people: a box is just like an atom. It's a single scalar value. The contents could be any shape inside the box, but the external view of it is just a box. You're not going to get any conflicts with it; it's just going to be treated like any scalar.

00:43:18 [HR]

Yeah, in C terms it's like a pointer to void. It refers to some data. The box itself doesn't take much space but it's a pointer to data somewhere.

00:43:35 [AB]

It's just really interesting for me because at Dyalog we've had this experimental feature which looks very much like this parallel thing you're describing and we bumped into exactly this problem that we can't allow the contents of a future to spill out into its surroundings. It has to be enclosed. Otherwise, we will need to finish computation now.

00:43:55 [HR]

Yeah, exactly. So it's born in a box, and looking at it is an explicit operation that the creator has to perform - the system doesn't worry about that.

00:44:10 [AB]

Then I'd like to ask Stephen Taylor if something like this has been considered for k. k doesn't have this problem, because it doesn't have this type of nesting, like boxes or enclosures.

00:44:26 [ST]

I wish I were smart enough to answer that, but Adám, I'm not close enough to the implementation. I can say that we mostly work with relatively simple data structures. We don't even insist in q that arrays be rectangular - just lists of lists of lists. In some ways q is said to be a blend of APL and Lisp.

00:44:50 [AB]

That would seem to make things like parallelism actually simpler.

00:44:54 [ST]

Well, that may be. We've had parallelization of the primitives since version 4, and we get useful gains in speed from that.

00:45:08 [CH]

Speaking of useful gains in speed, Henry: you talked about the two attempts - the first one didn't end up being successful because of cache coherency, and in the second one you went up a level. Was that one a lot more successful in terms of the speed-up?

00:45:21 [HR]

Yes, that's effective. The cores don't interfere with each other. There are always problems when you try to split a job, but if you put four cores on a problem it only takes about a third of the time. I should back up: the one case where primitive-level parallelism was extremely effective was matrix multiplication. That is the classic parallelizable algorithm - it's a lot of computation that can be split into disjoint parts, and that worked very well.

00:46:04 [AB]

But even that isn't a primitive operation. That's also a conjunction.

00:46:12 [HR]

Well, that's the thing: there's a lot of stuff in the interpreter that is not a primitive in the language but is implemented as a primitive. There's a primitive for real matrix multiplication and for complex matrix multiplication, and there are hundreds of combinations of primitives that have special code behind them.

00:46:36 [CH]

Speaking of the interpreter, this was sort of the tangent that I was going to ask about. I believe dissect, the tool that you worked on for three years, sounded like it was written in J, if I'm not mistaken. But all of the work - or I shouldn't say all of it, but I would assume a lot of, if not the majority of, the work - that you've done contributing to the J source code since your first bug fix four, five years ago has been in the language the interpreter's written in, which is C, or what I like to call sort of a macro-DSL version of C. So I'm not sure if you're familiar, but at the beginning of the year, which is 2020 [sic] - depending on when you're listening to this, if it's in the future - I started trying to port the J source code to C++20. Not for any particular reason other than to learn about the implementation; I think Ken's a genius, and studying the implementation's probably a good way to get inside his brain. And very quickly I was like, yeah, sure, how hard can this be? I think the first live stream that I did - 'cause I was live streaming it on YouTube - ended up being, I can't remember, eight or ten hours or something just to get the thing working, and very quickly I realized that this was not C code per se - like, there were ten thousand macros defined in the code. So I guess my question is: what was your experience when you had that first "I'll go fix a bug"? Did you think nothing of the code base? Or were you overwhelmed as well?

And you clearly have managed to ramp up and been able to make major changes, so could you speak to that and what your experience was? If you have advice for people that are trying to look into the source code, what are the tips and tricks to get to your level?

00:48:25 [HR]

Well, you'll have an easier time than I did, because there was not any commentary when I started. However, the huge advantage is that the guy who wrote it - Roger Hui - is a master, and everything is done in the right way. I've tried to maintain code written by non-masters, and then a lot of the work is: why did they do it this way? There's none of that in the J engine. It's done this way because this is the right way to do it; if you think hard enough, you'll agree that this is the right way. That still doesn't make it easy to actually figure out the details. So what I did to begin with was, for every module that I wanted to work on, I first went in and filled in commentary for it, to where I could understand it, and then I worked on it. That was a fair amount of work, but feasible because, again, there's not a false step anywhere. Still, just being confronted with that mass of code with no commentary was daunting. I think anybody going in now will find an easier task, because I've commented everything that I've gotten into, and that amounts to about a third of the interpreter now, I think. But yeah, the macros - I think you can get used to them; that was not the hardest part of it for me. It was just trying to figure out what each module was doing.

00:50:12 [CH]

And what was your experience with - 'cause, you know, going from J or APL and stepping down to C, that is essentially what you're doing when you're working on the interpreter. Was it painful having to write C code knowing that, you know, at the other end of this is what you want to be writing in, or...

00:50:29 [HR]

Not really - I mean, working on the J engine is the most noble calling that anyone could have, I'd say. I think this is a real value to the world. Someone has to sit down and write the C code for it, and I'm glad to be the one to do it. There was a lot of work to be done, mostly just in modernizing the code. Roger started it in the mid-90s; I think about the time I was calling Ken Iverson, Roger had just put out a few releases of J. This code goes back a long way. Nobody could have figured out then what an out-of-order CPU was going to look like, so it's coded to a computing model that's basically a Motorola 68000: fetch instruction, fetch operand, execute. To get performance on a modern out-of-order CPU, the main thing you have to think about is cache friendliness. There's also minimizing mispredicted branches, and finally there's the issue of wide instructions: when you get down to the primitives, you need to implement them as much as possible with instructions that do multiple operations at a time. So a lot of the work has just been a modernization effort along those lines.

00:52:10 [BT]

One of the things that I've noticed, Henry, is that you've moved some of the primitives out of C and back into J - some of the calculus primitives - and you've also kind of workshopped a few. I think you're doing that with fold, where before you put it into C you're actually using J to create, I suppose, the structure... the design of it.

00:52:30 [HR]

Yes, exactly. I wasn't sure. You know, these primitives - I don't know how Ken did it, maybe he had some trial and error - but designing this language is really hard. I mean, I have designed plenty of languages, as I'm sure everybody has, and it's just hard to get it right. You can make a language that does the job you had in mind, but to get a language that does the job you had in mind and is extensible - that requires a master, which Ken was, and I don't think I'm at that level. So I wanted to write fold in a way that we could experiment with it, and indeed we've changed it a couple of times. With the calculus it was a little different. I think Roger himself decided that it should not have been done in C to begin with. I believe that at the time it was written there was some thought of doing symbolic mathematics like Mathematica, and it just turned out that C is not the right language to be doing any of this - it should be done in a higher-level language. J is a fine language for it, so I just put that back where Roger decided it should be.

00:53:48 [CH]

Do you want to talk about... When I hear fold, I think: oh, that's another name for reduce, depending on the programming language. Do you want to talk about what fold is? Or maybe that is it, but I'm getting the sense it's something different.

00:53:58 [HR]

Well, the purpose in J is to pick up those things that are not well enough served by simple reduce. One is early termination: I've got an iteration and I want to stop early. Another is an operation that needs to pass a large amount of state and keep some data from each iteration. In J without fold you have reduce and scan, and then there's a reverse scan which is slightly more efficient, but let's not worry about that. Reduce takes a verb and applies it between all the items to produce a single result. That's great. Scan takes a verb and executes it between the items, but it also remembers the result of each execution. So if you're talking about addition, it's a running sum rather than just the sum of the array. That's good if that's what you want, but sometimes you want a running operation where you need some state passed from one execution to the next. I might have a thousand-element vector, and maybe there's a table or something that I want to update as I do the execution. The verb with scan would require that that intermediate state be part of the result. So if I have a vector of 1000 items and I'm going to iterate 1000 times, I would have a million items of intermediate state that is not really needed in the final result; it would just be there because of the way scan is defined, see what I'm saying? Fold splits the iteration into a part that passes state to the next iteration and does not become the result, and a part that produces a little bit of result. You can have state vectors that are part of the iteration but not part of the result, with a huge saving in memory requirements.
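
The separation Henry describes - state threaded through the iterations but excluded from the result - can be sketched in Python. The function `fold` here is only an illustration of the idea, not J's actual Fold definition:

```python
def fold(verb, init_state, items):
    """Sketch of the idea behind J's Fold (not its real definition).

    `verb` takes (state, item) and returns (new_state, emitted_value).
    The state is threaded from one iteration to the next but never
    becomes part of the result; only the emitted values do.
    """
    state, out = init_state, []
    for item in items:
        state, emitted = verb(state, item)
        out.append(emitted)
    return out

# Example: a running mean. The carried state is (count, total); the
# result holds only the means, not the bookkeeping.
def step(state, x):
    count, total = state[0] + 1, state[1] + x
    return (count, total), total / count

print(fold(step, (0, 0), [1, 2, 3, 4]))  # [1.0, 1.5, 2.0, 2.5]
```

With scan, the `(count, total)` pair would have to appear in every item of the result; here it lives only inside the loop, which is the memory saving Henry points to.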

00:56:40 [CH]

So yeah, my guess is that that state piece for the fold isn't being copied every single time - it's being updated but not copied. Whereas if you tried to do that with a reduce, each iteration would require copying that state, which is going to end up being super expensive.

00:56:57 [HR]

Well, it's not reduce, it's scan that's the issue, because the only thing that gets fed into the next execution of the verb in a scan is what would be the result of the previous execution, and that may be much more than you need to remember. But yeah, overall, that's what we're saving. So we've implemented fold, and to my knowledge it's never been used. I almost had an application for it, but then I found another way to do it. I feel like I've had times when I would want to use it; I just don't have any now.

00:57:42 [CH]

And just to be clear: does fold give the intermediate results, like scan, or does it only have a final result, like reduce?

00:57:50 [HR]

It has options. There are six variants. You can scan forward or backward, and you can have multiple results or one result.

00:57:53 [CH]

OK, I see. And I can definitely tell you that I've had this use case, especially for a scan: the thing is, you want to carry extra state that's required for the next element, but you don't actually need that state in your result. But in a functional language, that algorithm doesn't really exist. So what you end up doing is some sort of scan where you've got a tuple of elements: the first element of the tuple is the extra state you need, and the second element is your actual result. And then, once you've done that and you've got your array of tuples, you just do some sort of map where you say, hey, ignore the first part of the tuple - that was only the state we needed to do the scan. So yeah, that has definitely come up before.
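
Conor's workaround - scanning with a (state, result) pair and then mapping the state away - can be sketched with Python's `itertools.accumulate`; the running mean is an illustrative choice of computation:

```python
from itertools import accumulate

# Scan with a (state, result) pair. The state is (count, total); the
# result is the running mean. A final pass strips the state back out.
def step(acc, x):
    count, total = acc[0][0] + 1, acc[0][1] + x
    return ((count, total), total / count)

pairs = accumulate([1, 2, 3, 4], func=step, initial=((0, 0), None))
means = [r for (_, r) in pairs if r is not None]  # map away the state
print(means)  # [1.0, 1.5, 2.0, 2.5]
```

Every intermediate `(count, total)` pair is materialized in the scan before being thrown away, which is exactly the waste that fold's carried state avoids.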

00:58:41 [HR]

Yeah, you've got it. That is exactly what fold is written for.

00:58:45 [CH]

Yeah, Adám, I know you've been trying to ask a question, yeah?

00:58:47 [AB]

Well, not ask a question; it's just, I think, to clarify something for the few listeners that had no clue what it is we're talking about. I think we should just say: scan has been with APL almost forever, and reduce has been there forever, and we need to clarify that these are six new primitives added to J relatively recently - I think that wasn't clear at all. They're clearly trying to address a shortcoming, in generalizing the original APL reduce and scan, for those odd cases. And when you need them you really do need them, but most of the time you don't; you can just do with a normal reduction and scan.

00:59:40 [HR]

Yes. If anybody ever comes up with a case where they need it and they complain about the speed, I'll implement it in C, but until then we're going to keep it as it is. The implementation is actually in J, so we can.

00:59:51 [CH]

It's interesting that this actually is a thing you've done, because in C++20 I believe they modified the ISO standard on the language such that the accumulator in your reduction algorithm uses move semantics. So previously, if the state that you're accumulating in your reduction is something extremely expensive - like you're trying to get the unique elements of something and you decide to do that with a hash set - it's going to be very expensive, 'cause as that hash set grows, you're going to be copying it every time. But now that we have move semantics - which, without explaining it, basically says, hey, don't copy this, just move it - the copies are avoided and it's way faster. So this same issue that's being tackled with fold in J is something that's also quasi being tackled in the most modern C++ version at the current time, which is very, very fascinating to me.
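
The cost difference can be illustrated in Python, though the C++20 change itself concerns move semantics in the standard accumulation algorithms; this is only an analogy, contrasting a copying accumulator with a reused one:

```python
from functools import reduce

data = [1, 2, 1, 3, 2, 4]

# A purely functional reduce that copies the accumulated set each step:
# quadratic work as the set grows - the cost that move semantics avoids.
copied = reduce(lambda seen, x: seen | {x}, data, set())

# Reusing (in effect, 'moving') the same accumulator object instead:
# set.add returns None, so the tuple trick hands the same set onward.
moved = reduce(lambda seen, x: (seen.add(x), seen)[1], data, set())

print(sorted(copied), sorted(moved))  # [1, 2, 3, 4] [1, 2, 3, 4]
```

Both produce the same unique elements; only the amount of copying differs, which is the point of the C++20 change Conor mentions.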

01:00:54 [AB]

I'd also like to ask a thing about this fold. It's something I want to ask you about as maybe the foremost modern-day implementer and user of J: the symbols, the glyphs. Because, you know, the APL languages mostly use one symbol for each thing - with some exceptions in all of them, I think, where they use multiple - and J has this thing where you can add a dot or a colon after a symbol to change it, and that also works for alphanumerics. And they've slowly started adding multiples of these. So, as we mentioned, there are six folds, and they are the capital letter F followed by every length-one and length-two combination of dot and colon, right? J's first version came out one year before Unicode 1.0. And as you mentioned, Roger - I don't know if he abandoned ship; I mean, he still does things in J occasionally - but he also mentioned to me that he's teaching his own child APL rather than J, which I was a bit surprised about. So, what do you say about the character set - is it still the right choice with Unicode?

01:02:29 [HR]

Oh, I don't know. That's going to be a matter of taste. I think if you ask anybody who's not a J user, they'll say: I look at all that J and I can't understand it. See, me, I'm not an APL user; I look at APL and say, I don't know what all those symbols are. But that's silly - whatever it is, you can learn to read the Cyrillic alphabet if you want to. You just learn whatever alphabet you need for the language you're going to use. But we did face this issue recently: when we introduced direct definition, we had the issue of what to use for delimiters for the functions. There is a Unicode brace that is different from the standard ASCII brace, and we could have used that, but we decided not to - I think mostly because Eric was adamant that we not deviate from ASCII. I think attempts to go beyond ASCII have led to a lot of problems over the years, and I don't think they're solved yet. So rather than go to a Unicode delimiter, we used a double left brace and a double right brace, even though that was an incompatible change to the word-formation rules of the language. That was better than trying to use a non-ASCII character.

01:04:11 [ST]

I'd like to pick up on what you were saying about the alphabets a moment ago - that we learn the alphabets that we need. I see myself as having been drawn to the Iversonian languages in the first place because I wanted to write and think in a more abstract way, and not waste my scarce mental resources shuddering along with all of what you were referring to earlier as punctuation. And as part of that - not just in picking the language, but within the language - I've always tried to use the most abstract forms that I can, out of a belief that this is kind of a self-improvement project: I want to improve my ability to express myself, to write things in a more abstract way, to see patterns more deeply. So you can tell that I've bought Ken's story that the better notation would help you think better. And if I could risk attempting to channel Ken here: his standard response to criticisms that an APL expression was hard to read was to challenge that, and insist on the distinction between what was hard and what was just unfamiliar. The argument leads into Whitehead's much-referred-to point that we need a notation which will raise our thinking up so that we can stop sweating the small stuff. It's an argument, a discussion, I've had with Whitney several times, and I was very interested in what you were saying earlier about maybe five people in the J world who could understand the full rich set of some of the earlier forms for tacit programming. Now, if you stood by Ken's arguments, you'd say, well, we should leave that stuff in there and wait for the rest of the world to catch up and reach that level of abstraction. But as I said, both Whitney and I have suspected that there's some kind of glass ceiling beyond which the abstraction just becomes too hard and gets in your way. I wondered if you have any thoughts on that.

01:06:42 [HR]

Well, I was very sad when the language was simplified, but it did get rid of a lot of code and simplified the parse table, which makes execution a little faster. Tacit programming is just harder to get a handle on than explicit, because with explicit you've got alpha and omega in APL, or x and y in J, and you know where the operands are. Whereas with tacit programming you'd have to memorize, I think it was, about 25 different forms of tacit function. That was simply more than most people were going to do. I think they would have been happy to memorize 25 things, except it just didn't occur very often. Now, what we're talking about there is a language for creating operators, not a language for creating verbs or combining verbs. With a hook or a fork, as you said, you've got verbs F, G, and H, and you've got the alpha and omega they operate on, and it's pretty easy: I'm going to do F, and then H, and then G. That's something you do a lot in a programming environment, and you say this is good because I don't have to repeat X and Y. But the corresponding case at the higher level is when I have a function that I want to execute more than once - I want some structure that allows me to do this verb on some operation and then later on do that same verb on something different. That's much more rare. I only used it, you know, a dozen times, perhaps. I don't think it's a good use of the average programmer's time to learn that abstraction for something that's not going to be used all that much. The thing about it was, it was so beautiful, and it always did anything I needed it to do. I wrote a chapter in my book on it, but sadly it was just too hard for it to make sense for most people to learn it.

01:09:24 [ST]

I get it, if I'm following you: it's a kind of trade-off between the cognitive load on one part and the cognitive load on another.

01:09:32 [HR]

Yeah, that's a better way of saying what I spent two minutes saying. That's right.

01:09:37 [BT]

And a plug for your book, because, well, it's pretty much my Bible when I need to figure out what's going on in J. Of course there's NuVoc, and there's the traditional dictionary - and I think you had a lot to do with NuVoc as well - and those are the areas I go to find a reference. But if I really want to understand something, I go to J for C Programmers, because it's an array language sort of interpreted into a very simple procedural way, so you can kind of figure it out. You work from very simple principles, but then you take it to the next level. Actually, I think it was in the chapter you wrote about operators, and maybe the parsing engine, that you said there are very few people who actually try to write operators tacitly, and that's probably why it wasn't used very much - you take the general population, they just don't move into it. But if you start to write operators, that's when you really miss the absence of it. I came to J just after those were taken out, so I never experienced them. But now when I go in to write an operator, it's like: ah, I know that was there once. I can see where it would fit; I just can't use it anymore.

01:10:49 [HR]

Yeah, but in fairness, you can only use the tacit language for so much - a lot of verbs are just harder to write in tacit form, and for a complicated operator you really need to write it explicitly anyway. So the tacit language works on a few things, and we really shouldn't spend any more time bemoaning it, because it's not coming back.

01:11:17 [AB]

Hold on, there's something else, called Jx, which is some extension to J? Again, I don't know a whole lot of J, but from the little bit that I'm trying to gather here, there used to be some tacit form in it to define adverbs and so on. Is that it?

01:11:39 [HR]

Oh, that was a Pepe Quintana project. Yes, I haven't heard anything about that for many years. I think it's several releases behind at best.

01:11:54 [AB]

But was that basically something similar to trains - verb phrases that allowed you to define adverbs...?

01:12:04 [HR]

Pepe Quintana is a programmer who writes everything in tacit form. It's amazing to me what he can do in tacit programming. So when the language diverged - when that was taken out - I believe he kept it in his version of J, and so he used it. He was doing financial operations using J, and for a while he kept up with it, but when I came in and we started pushing the language forward again, I think it hasn't kept up with that.

01:12:56 [BT]

I think, from what I remember from the message boards, he used to refer to some of the tricks he used to get around it as "wicked", and I think it was he and Dan Bron who were coming up with these wicked things. And I think in a recent version they finally caved in and realized they couldn't do the wicked stuff and keep everything running at the same time in the new version.

01:13:18 [HR]

Well, yeah, they found a backdoor that allowed you to write a verb that produced a verb result. That's not supposed to be possible - verbs are supposed to produce noun results - but if you use this backdoor, then you can make some other stuff work. But it's contrary to the language, and while I never actually went in with the intention of plugging the hole, advances in the implementation have made that backdoor no longer effective, I think.

01:13:55 [CH]

Alright, so I think we burned past the hour mark a while ago, which I'm not upset about at all. I think we're going to have to have you back on, Henry, 'cause I still have, I think, like 994 questions left out of the 1000. It's been awesome having you on. I guess maybe I'll finish with - and this can maybe be like a teaser, 'cause I think we could probably have a whole episode just talking about your having taught J and your experience there, because it's a topic that comes up time and time again, not with respect to just J but all the array languages. I know we were talking about this before the recording started: you've had one student that's gone on to be rather prolific, Marshall Lochbaum. He ended up, I believe, working at Dyalog for a period of time and then has written two of his own programming languages. The more well known, at least in the array community, is spelled BQN, but I believe it's pronounced "bacon".

01:14:53 [AB]

You're not supposed to pronounce it like that unless you're making a pun.

01:14:56 [CH]

Oh, is the technical pronunciation B-Q-N? Is that what we're supposed to say?

01:15:01 [AB]

I guess so yeah.

01:15:03 [CH]

Oh, alright, well, maybe we'll have Marshall on and he can set us straight. But - not to focus on Marshall in particular - have you had tons of students that have, you know, absolutely fallen in love with J and then gone on...? Or maybe as a teacher it's hard to track which students do or do not use it in the future, after being in your computer science course. Do you have any, you know, empirical data or anecdotal stories about that?

01:15:30 [HR]

A couple of anecdotes. Well, Marshall is one of a kind. I'm reminded of a quote for him: that he was not born of woman, but my God issued him directly to mathematics. I've never met anybody with that level of mathematical ability; yes, he picked up J right away. I had a couple of students who used J in the introductory course - which is not a programming course; it's mostly spreadsheets, but a little bit of J - and were so taken by it that they became professional programmers. I had one student who learned J and was obviously prodigious, so I tried to find something for him to do. The city of Raleigh had a - or, no, the state of North Carolina had a program that they were using to approve water projects. So you want to build a dam that's going to have so much outflow into a river, and they had a program for that. Unfortunately, all they had was the object code and an old listing in Pascal, so they couldn't make any changes to it, and the fellow who wrote it, of course, was long departed. The student took that Pascal program and translated it into J in a semester, and the state started using it for their water projects. It produced exactly the same results as the previous program, but it was maintainable. So yeah, we've had students pick it up quite nicely.

01:17:32 [CH]

That's awesome to hear. Yeah, hopefully over time more high schools will follow. I know right now in the UK, Simon Peyton Jones is in the midst of standardizing, like, the CS curriculum for all of the UK, and I imagine that's going to be happening in every country at some point in time, and hopefully there's going to be a space for different paradigms, and it won't just be Python everywhere - there'll be some J and some APL. Because, yeah, it sounds like it's a great stepping stone into this world, as I'm sure we would all agree, but we're probably biased.

01:18:10 [HR]

Yes, we are. I called my J class "Programming for Scientists and Engineers", the motivation being that when you're writing code for somebody else for a living, you're going to be slinging Java or Python or whatever, but when you're writing code for yourself, that's when you really want the language that has the highest productivity. That's when you want to be writing in an array language. I think it would be great if the students, even if they know they're not going to be writing array languages much, at least had some idea of how programming looks when you do it that way.

01:18:50 [CH]

Yeah, it's a powerful tool to have in your tool belt, even if you only need to use it every once in a while. All right, with that, I think we've got a couple of announcements at the end. Adám, I'll kick it to you for those, and if there are other folks with announcements too, we can make those now.

01:19:08 [AB]

Sure, a couple of things; well, only two. There's only one week left of the annual APL competition, so if you want to have a go at that and haven't started yet, it's probably too late to participate in phase two, but you could still win a prize in phase one. And there's the APL Campfire event I mentioned before, where we had a chance to ask questions of the big people in APL history. The next such event is on the 1st of August at 6:00 UTC; links will be in the show notes.

01:19:55 [CH]

And Stephen.

01:19:55 [ST]

Yeah, Henry was remembering being back in Raleigh, North Carolina, getting a time-sharing service account, and what that would have been like back in the day. I had a time-sharing account back in the 70s: you get to use a keyboard, there's software working at the other end, and all the data is loaded up. Well, progress never stands still. You can get this now in the world of kdb: if you go to kx.com/academy, you can get a kdb session all loaded up with the New York City taxi data, and there's nothing to download, nothing to install. Just go play and explore at kx.com/academy.

01:20:44 [CH]

That's awesome.

01:20:44 [AB]

All right, that reminds me that we just released the source code for TryAPL.org. So if you're interested in seeing how we've implemented the whitelisting and execution of incoming requests, you can see exactly how the API, which is public, is handled on the back end. You can go and have a look at that. So the barriers to entry for trying out the array languages keep getting lower.

01:21:13 [CH]

So we've got a contest, a campfire, kdb+ in the cloud, and an open-sourced TryAPL. Tons of stuff to check out, and all the links for that, I think, will be in the show notes. Once again, thank you so much for coming on, Henry. This has been a blast, hearing your story and your experience with J. Hopefully, if listeners were bored by all of our gushing about the languages up till now, they've heard your story and thought, "Oh, OK, so it's not just these folks." Henry clearly fell in love with it, so maybe some others will be inspired to go check out J as well. And hopefully we'll be able to get you back on in the future, 'cause like I said, I've still got a ton more questions, and I'm sure we could do this again several times and not run out of things to talk about.

01:21:59 [HR]

Thank you.

01:22:00 [CH]

Awesome, all right, thanks for coming on, and I guess we'll say, as we always do: Happy Array Programming.

01:22:06 [Music theme]