Transcript

Thanks to Rodrigo Girão Serrão for providing the transcript.
[ ] reference numbers refer to Show Notes

00:00:00 [Josh David]

Don't be afraid to apply it to build something because that's what I've seen.

I've seen a lot of small groups, people just take an idea and just run with it. And APL really holds your hand the whole way when you need to build something amazing.

00:00:23 [Conor Hoekstra]

Welcome to another episode of Array Cast.

00:00:26 [CH]

I'm your host Conor, and today with me I have my 3 panelists, Adám, Stephen and Bob. We're going to go around and do brief introductions and then we have a couple announcements and after that we will hop into an interview with our first time guest.

00:00:41 [Bob Therriault]

And I'll start things off.

00:00:42 [BT]

I'm Bob Therriault. I am a J programmer. I'm a J enthusiast, I guess, 'cause I'm actually not employed by J, but I am working on the J wiki, and we may have some announcements coming up on that, because we're making some progress and it's wonderful.

00:00:55 [Stephen Taylor]

I'm Stephen Taylor.

I'm an APLer from way back and currently the Kx librarian.

00:01:01 [Adám Brudzewsky]

I'm Adám Brudzewsky, full-time APL programmer at Dyalog Ltd.

00:01:06 [CH]

And as mentioned before, my name is Conor. I'm a professional C++ developer, but an array language enthusiast and combinator enthusiast at large, as many of our regular listeners will know. So we will throw it to Adám for a couple announcements, and then it'll come back to me for one more.

00:01:21 [AB]

OK, so first off, the APL Seeds second instalment. [1] Uh, which is a kind of conference, a meeting, a get-together day, and it's going to be on Tuesday the 29th of March. And it is geared for people who are interested in array programming, APL in particular, to get some overview, and meet some people that do APL, and get some introduction to it. So that's going to be really exciting, looking forward to that. And furthermore, I have started a weekly event [2] in the APL Orchard [3], so APL has a chat room as part of Stack Exchange. And there, every week on Friday, we take a problem from a past APL problem solving competition. And people submit their answers to that, and then we discuss it a bit and see how we can optimize it, and variations on it. And it's kind of lively, and then a follow-up video based on people's submissions is published in the week after that, until the next one. So it's a really nice thing to join. You can find the previous chat logs, and of course the videos are there, and links to all of this, of course APL Seeds as well, in the show notes.

00:02:41 [CH]

Awesome, and I think before I mention my announcement, it's worth noting about the "Apple Seeds" or APL Seeds conference, uh, this is the second edition of it, so the first one was last year in 2021, so I think we can throw in the show notes links to all of the recorded talks [4]. I gave a short talk there. I think Rodrigo, our guest from two episodes ago, also gave a talk there, and so yeah, if you're not able to attend the conference in person, all of the talks will be recorded, and I think most of them are about 20 to 30 minutes long, so they're not like 90 minutes. They're easily digestible at sort of a lunch, if you still take those during the pandemic times. And also too, I'm not sure, maybe I missed this, but it's worth noting again that it's, uh, at least I think, I mean, Adám will correct me, it's a completely free conference.

00:03:28 [AB]

Correct, yeah, it's free, but you do need to register [5], and the link to the registration will be either in the show notes or on the site that we link to.

00:03:38 [CH]

OK yeah, so this is worth noting, 'cause a lot of even single-day or two-day conferences these days, they'll either have a small fee or sometimes a non-small fee. But yeah, as this is targeted at beginners potentially: if you're a listener to this podcast, you already are familiar with the languages, but if you know folks that are, you know, getting started with computer science, it's potentially worth, you know, sharing that, like, hey, have you been interested in this new kind of paradigm? Because it's geared towards beginners and people that might be looking to get into the language, so I think it's a great sort of resource for people that are maybe new-paradigm curious, but have never sort of dipped their toes in yet. With all that being said, my short announcement is that I was on the APL Farm discord [6], which is a collection of all the different array languages. And recently I just happened to stumble across an announcement update of a site called BQN Pad [7], which is basically like a tryapl.org but for BQN. And it is super awesome. It's got syntax highlighting, which the current BQN JavaScript REPL, that is hosted on sort of the documentation site that Marshall Lochbaum has done, they don't have syntax highlighting, so it's sort of just a nicer look. But also too, a really cool feature that they have is they have like a preview result of the expression that you're currently building up, which I don't think exists on any of the other sort of online REPLs. So while you're currently typing, typically the way you work in a REPL is you hit enter and then you get the result, and then you build it up slowly, but in BQN Pad you can just type the expression and every single time you type a character it sort of re-evaluates it, so sometimes it'll give you little errors while you're adding, you know, one or two symbols that you need to add to sort of get to the next state of your expression. But anyways, super neat. I believe it's being developed by, I might get this wrong, so I apologize, Andrey Popp, so kudos to that individual for, you know, working on this, and I think, you know, this is similar to the announcement that Bob made, you know, a couple weeks ago, that J is working on getting something like this up and running, so we'll have announcements about that in the future when that's ready to go out. With all of that out of the way, it is now time to introduce, as I mentioned before, a first-time guest, Josh David. This is going to be a pretty exciting conversation with Josh. Josh was exposed to Dyalog APL during an internship with The Carlisle Group [8], which I believe is headed by Paul Mansour, he can correct me if I'm wrong, who is, I think, pretty well known in the APL/array language community. He continued with APL and was a grand prize winner in the 2016 APL Problem Solving Competition, which I believe means that there should be, if you attended the conference, there should or might be a video of sort of going through how he won the contest. And he's a recent graduate from the University of Scranton with a degree in computer science, and he now primarily serves as an APL consultant to North American clients, and works on APL tools. So I will say welcome to the podcast, Josh, and maybe if you want to start off, confirm or deny whether there is a talk that exists of you walking through how you won that contest, and if you want, feel free to, you know, jump back to, you know, how you started in computer science / how you ended up in the APL/array language world.

00:06:51 [JD]

Yeah, thanks Conor, there is a talk that's on YouTube of my presentation for the problem solving contest, so we could put a link for that somewhere. But yeah, so thanks for inviting me on. It's really exciting. It's really exciting to see you and Bob and the other panelists put in the effort to produce these podcasts. It's kind of like a new revolution of information, you know. We had books and now podcasts are just exploding. Really interesting, and happy to be a part of one. So my name is, yeah, Josh David, 24 years old, and I write APL code for a living. I was born and raised in Scranton, PA, so, I don't know, you might have some listeners who are familiar with the sitcom "The Office". That's the city where it's based, and yeah, it's a real place. So there's actually quite a bit of history in that city with APL: the last APL conference, well, it wasn't exclusively an APL conference, it was all array languages, at the millennium, in 1999, was held in Scranton, PA [9]. There were a couple of J presentations there as well, and Ken Iverson himself was at this conference. I was only two years old at the time, so I didn't attend it; not all of us have a track record of showing up to conferences at the age of one. So yeah, this conference was held at the University of Scranton, and one of the organizers was Doctor Stephen Mansour. He taught statistics there, and actually his brother, Paul Mansour, was my neighbour growing up, so that was how I got introduced to APL at a young age. He was actually my neighbour and he's well known in the APL community. He has his own small company that has, you know, less than a handful of APL developers, and managed to create industry-standard software for financial managers. That was my first introduction, and I got to do some internships with him over my summer breaks in high school, and that was my introduction to APL. And I always knew I wanted to get into programming at a young age, because I really liked building things and seeing a tangible result, you know, from Legos as a kid too. So it just gives you this freedom to kind of do what you want. You have an idea in your head, and then I want to express that to the computer and, you know, get an executable or something that has that, the full stack. And so then I continued looking into other programming languages as well. In high school I attended a computer club where, in the introduction class, I remember the first one, we were learning about Java. It was probably going over a Hello World program or something, and then I remember seeing the words "public static void". And my first question to the instructor was, well, what is that? Why do we need it? What does it mean? And the response I got was "Oh well, just ignore that. We just need that all the time. You don't have to pay attention to it," and that kind of didn't sit right with me. And, you know, having experienced APL before this, uhm, that kind of left a bad taste in my mouth, but I continued to explore it and then went on into college to get my bachelor's in computer science, and do some internships with other companies that, you know, don't use APL, so just traditional Java-like languages and so on. And then yeah, as you mentioned, I participated in the problem solving competitions and got second place [10] in 2015, and then the next year I got the grand prize in the general computing category [11] [12], so then I really got more involved with the APL community at the conferences.
And then by the time I graduated, my plan was, you know, take the traditional CS approach of picking up the Cracking the Coding Interview book and practicing some LeetCode problems. And I was headed that way, and then I decided to reach out to the APL community again, see if there were any openings, and sure enough, they were starting a consulting business for North American clients, and I decided to hop in on that, and that's what I've been doing since I graduated, a little less than three years ago. And from then I've been programming and writing APL in production environments as my 9:00 to 5:00 job.

00:11:43 [CH]

Wow, so that's pretty crazy, that you just had a neighbour that was sort of a prolific individual, and that's... I feel like, you know, what percentage of people had a parent like Adám did, or a neighbour like Josh did, or just like the proximity to someone who was sort of a... And Stephen, do you also have, you were raising your hand, an origin story that you, uh, you know, your basement suite or something?

00:12:11 [ST]

OK, well, I actually got a question for Josh here. I've got huge respect for Paul Mansour as an application developer, and I'm wondering, it's too general a question to ask, what did Paul teach you? It's like, well, probably the answer seems like everything. But I'm guessing that in teaching you APL, he probably insisted on certain ways of writing, and may have made a point of it, 'cause he's a very thoughtful, principled developer, and I wonder if you could share any of what you consider to be the main lessons that Paul taught you.

00:12:52 [JD]

Yeah, that's a good question. And actually you're right on with that assessment there; he's very big on coding styles, which is probably one of the biggest things I've learned there, and he actually has recommended ways to write APL programs in these production environments. It's on his, there's a DADO GitHub repository and the wiki, there's a link on that, and there's a lot of principles taken from the Clean Code book, which is famous in programming circles [13]. So that was one thing, because traditionally APLers, and I see this a lot too, the traditional APLer is not a programmer, so they don't have a lot of... it's kind of the double-edged sword of APL. One of the strengths of it is that you have the subject matter expert, the domain expert, you know, someone who's not a programmer, maybe a chemist or, you know, an actuary, writing code. So they're not familiar with some programming principles, and then you can, you know, fall into technical debt of having harder to maintain code. So having these principles in APL, it's somewhat unorthodox, because APL lets you do whatever you want. And some of these principles, what I like in particular is keeping functions at the same level of abstraction. And one thing is, if your code needs comments, it's not clear enough. Your function name should be clearly defined to say exactly, you know, what it does, so this way, even if your code is not clear inside the function, someone could easily rewrite it, because your functions are very small too. Which is helpful too in source control environments, where, you know, you avoid merge conflicts and so on. But having functions at the same level of abstraction, so it's either mostly English, which is calls to other functions, sort of a functional approach, or it's mostly APL. The other style is using dfns, you know, the advantages of dfns are... He says two of the hardest problems in programming are naming your variables, you know, that's one of the hardest problems, so two of them are solved for you. You have alpha and omega, left and right arguments. You know that that's taken care of; you don't need to worry about that. And the other interesting thing is, alpha and omega are very short symbols. I used to write longer variable names, but the idea that he has is you keep short, one-letter variable names; if your function is small enough, it's within scope and it should be clear. And the other magical thing about that is you get to see the relationships between primitive functions more clearly, you know, because the APL symbols are very short, so you can kind of get a very large-scale view of the function, what's going on, when you do that. So I've started to come back to that style of, you know, making sure your code is more broken up, modular and short. I mean, there's a whole lot to this. You know, the other thing is he was very big on testing. That was a big part of my job, making sure you have proper tests for everything, and it has lasting effects on you: when you try to do software testing, you're always thinking about edge cases of things and making sure things go right. But that's also a very important part of software development that the traditional APLer is not really expected to take part or participate in. So I was very happy to have that kind of foundation for learning about creating APL systems.
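[Illustration: a minimal dfn sketch of ⍺ and ⍵ as the ready-made left and right argument names Josh describes, assuming Dyalog APL; the function name and values are made up.
      Scale ← {⍺ × ⍵}     ⍝ ⍺ is the left argument, ⍵ the right; no names to invent
      2 Scale 1 2 3       ⍝ 2 4 6
]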

00:16:57 [CH]

I have, like, I don't know, 100 questions. That's actually an order of magnitude less than the usual number of questions I have, so that's actually not right, I have 1000 questions like always. Because I'm also here, and right now I'm like split, because clearly we have to have, or try and get at least, Paul Mansour. Is it "Mansour" or "Mansour"?

00:17:18 [JD]

That's it, "Mansour" yeah.

00:17:24 [CH]

We should clearly have Paul on the podcast to talk about The Carlisle Group, 'cause they actually have, like you mentioned, the GitHub repo, and I think actually Adám has mentioned this, maybe in one of the Dyalog webinars that happened recently [14]. It might have come up about one of their sort of APL to Excel or vice versa, or some sort of format to APL and back and forth. But it looks like they have a bunch of stuff, so you mentioned the DADO and it has a ton of APL stuff. It looks like there's also like a practical introduction to APL course [15]. Uhm, so clearly, and everything is like 100% APL, um, I guess. 'Cause we will have Paul in the future, it's probably best to save most of those questions for him, but do you mind giving like a brief summary? 'Cause you mentioned, it sounded like their small company was building software tools for financial managers. Like, you know, do you know how that business sort of started? Is it sort of consulting on a one-off basis, or are they building software that, you know, the clients or managers are purchasing? Or do you know much about that?

00:18:33 [JD]

Yeah, so maybe he could explain the origins of that, but, uh, their primary flagship software is the collateral analysis system, or CAS [16], known as CAS, and pretty much all these major banks across the world use it to do some analysis and reporting. He also has another product, which I worked on, called Flip DB [17], which is a relational database management system, also entirely written in APL. So his applications, that's the other thing, his applications are pretty much entirely backend and frontend APL, which is why, having this experience, and other experience too, it kind of drives me nuts when I hear people say, oh, APL, yeah, you could write snazzy one-liners with it, but you know you can't write production code with it. Because with my experience growing up, I see this work, and I've seen people with very small teams blow the competition out of the water just with pure APL. And yeah, there's a difference between these one-liners and production-quality code, but you probably shouldn't be doing the latter without understanding the former, and there's some nuance to that I could get into, but to go back to your question: uh, yeah, I'm really happy that, you know, there's been a sort of a push recently, really spearheaded by them; I can't think of too many other APL companies that are doing this open source library push as much as they have [18]. They have very useful libraries that I use in my day-to-day work. I got to do a presentation on DADO, Dyalog APL development operations. Which is an interesting name too, because a dado is actually an architectural piece on a wall, it's at the base of a wall, so the idea is to base your APL applications on a solid foundation. Which is a whole interesting topic to discuss, workflow in Dyalog APL, especially now that we have things like ]get. These are very powerful tools, and the whole idea of APL is to be a tool of thought, to express your ideas at the speed of thought, and it's hard to do that now with modern constraints like version control. How do I version an application? Things like this. So here's a library that goes back to the core of: I want to work in APL fast, I don't want to have to deal with that, I want my framework to take care of things like that. So these kinds of discussions are things that you have to start having when you start using APL in production environments, that, you know, if you're just solving a code golf challenge, you don't really care about, but there's layers to it.

00:21:28 [CH]

So I think we should definitely hop back to the one liners versus production code, but I know Bob you had, I think wanted to ask a question.

00:21:35 [BT]

Yeah, it was kind of related to that, because as an APL programmer, you do have a different time frame, I suppose, when you're working on a problem, compared to most of the other programmers who might be working in other languages. How do you interact, in a larger venue, with other programmers who might be working on a vastly different time scale than you are?

00:21:54 [JD]

So by timescale, do you mean deadlines or?

00:21:58 [BT]

Just the speed at which you can refactor and change your code and change your ideas. When I was going to school, I heard stories of people who were sent out to work, you know, for a company, and they were given an assignment, and then they came back half an hour later, and they were told, well, next time just take a couple of days, OK, 'cause this isn't working for us.

00:22:20 [JD]

Yeah, that's true. It's interesting, yeah, working in other companies where APL isn't the only language that's used. But you definitely had... me having been on the other side of the circle too, as a Java programmer primarily, I've developed in other languages too, and even in college when I would develop solutions, when I had problems, the first solution that would come to me was the array solution. So I would solve it in APL and then figure out how to write this in Java. So I started doing that a lot, and then eventually I realized I'll just cut out the middleman and just write in APL. Which is nice. The nice thing about working in production environments? Well, there's an asterisk there, but people don't care what language you use, they just want to see the job get done. Ideally that's how it should be; you know, for a class you might be required to submit a solution in a certain language, but in the real world, results are what matter. And you even see this a little bit with Advent of Code. It's kind of nice to see, in the Advent of Code leaderboards, you'll see some APL solutions. Jay Foad used to be the CTO of Dyalog, and he has some very nice repositories on GitHub showing his solutions to these problems in APL [19]. But yeah, I mean, there's definitely an advantage in using APL, especially Dyalog APL in particular; there's a lot to it that you don't appreciate until you have a problem to solve, like interacting with the environment, whether it be database interactivity, reading from Excel, controlling Excel, all these things, you could just get data in. Especially when the requirements change, it's very... having the REPL to interactively debug code and, you know, just that playground there. And it really gets into the whole idea of the array-oriented paradigm itself. It's not just APL, it's, if I could say this without being put into a mental asylum, it's kind of, the language itself kind of guides you or speaks the solution to you before you even get there, so I think that's really an important part of solving problems in APL too. It's not just the speed at which you could do things that interact with the environment, it's that you come and look at problems in a different way that people in other languages might not see the problem. Which is why, for some of these companies, I'm not even allowed to name them, because APL gives them such a competitive advantage, they don't want that to be out there. And it's really, I've seen it, there's an application where I've seen it create billion-dollar breakthroughs in a field, just because they structured their problem using an array-oriented style of thinking, and APL helped them to do that. Even now, some of these developers are not obligated to use APL, they can use Python or other languages, but in my last meeting I met someone who mentioned that, you know, using these other languages is fine, but there's nothing like just a jot dot expression in APL. And for some of these, uh, more, when you get into the way APL handles structures, you know, with rank and depth, it's very, very hard to recreate that in another language and deal with it in an orthogonal way, which I could talk about more, but I feel like I'm opening too many topics, so I'll pause.

00:26:19 [AB]

Let me just jump in there, just for the listeners: Josh was saying jot dot, and that's the outer product [20], which is kind of like a double for loop, or a double map if you want. It's taking everything from one set of values and combining it, in every possible combination, with everything from another set of values. And even though it obviously has high complexity, if that's what your task requires, it's really neat to be able to do that in APL. J has a similar thing.
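[Illustration: a minimal sketch of the outer product Adám describes, assuming Dyalog APL; the numbers are made up.
      2 3 5 ∘.× 1 10 100    ⍝ every value on the left combined with every value on the right
 2  20  200
 3  30  300
 5  50  500
]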

00:26:52 [CH]

Yeah, fun fact: was it, I think it was the very first episode of my other podcast, ADSP [21]. My co-host and I, we talk about algorithms, and we finished by talking about our favourite algorithm, and most people up till then, if you had watched my talks and stuff, know that I love inner product, or what's known as transform-reduce or map-fold in certain different languages, which exists in APL as a primitive. But that was my favourite algorithm up until I discovered outer product, and I absolutely just love... like, there's a problem, like a LeetCode problem, that's like: given a string of L, U, R, Ds, which stand for left, up, right, down, figure out if you end up where you started, which is just a matter of tabulating how many lefts, ups, rights and downs are there, and basically checking are the lefts and rights equal, and the ups and downs. And you can do this kind of trick where you use an outer product, where the left thing is a string of four characters 'LURD' and the right argument is just your input string, and you can do basically like an equals, and it'll generate like a four-row matrix where each row corresponds to the Ls, the Us, the Rs, and the Ds, and then you can just do a plus sum on that matrix and you immediately get the counts of the Ls, the Us, the Rs, and the Ds. And like, in any language, except for maybe languages like Python that have like a counter thing, you need to call like four different countifs or like four different counts. And it's just like so noisy. And in APL you can do it in like, literally, you know, six characters or eight characters or something. So anyways, yes, outer product, jot dot, is amazing. It's one of my favourites; is it still my favourite? Who knows, but it's definitely, it's very beautiful. Well, we'll link to some docs somehow for those that want to check it out. And Haskell also has it. There's very few languages that have it, but Haskell does have an outer product algorithm, if you're a Haskell programmer listening.
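[Illustration: a minimal sketch of the outer-product trick Conor describes, assuming Dyalog APL; the input string is made up.
      moves ← 'ULLDRR'
      'LURD' ∘.= moves       ⍝ four-row Boolean matrix: one row per direction
      +/ 'LURD' ∘.= moves    ⍝ row sums are the counts of L, U, R and D: here 2 1 2 1
]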

00:28:53 [AB]

Marshall Lochbaum did a whole introduction to APL at LambdaConf, all based around the outer product [22].

00:29:03 [CH]

Oh yeah, I've seen that talk. That's a very good talk. Yeah, yeah we can link that as well.

00:29:08 [AB]

Now that you mention the inner product [23] and outer product as your favourites, uhm, they were actually originally seen as kind of the same thing. That's why they use the same symbol in APL, and the outer product was seen as kind of like a downgraded version of the inner product. It's like, the inner product is a series of reductions over the outer product, so the outer product is like a partial result towards the inner product [24].

00:29:34 [CH]

Interesting.

00:29:34 [AB]

And eventually Iverson even generalized the outer product more, as what he called the tie operator, or adverb, where you choose how many axes are being combined, how many axes are being withheld from making all the combinations. So by default it's zero, so you combine everything. But you could also withhold certain axes like that.

00:30:06 [ST]

You referred to a couple of roles now for the APL programmer, which may be unlike most people's experience of coding. So the first one is fairly familiar in the APL community, and that's the gun developer: the guy who will solve the problem faster, with less code, that'll be running more efficiently than people working in the other brand X technology. I'll be round the corner on my second beer by the time you finish. The other role you referred to is some people coding, and coding not being their main thing; they've got domain expertise and they can use APL to translate that into software. And of course, that's a role in which you don't learn a lot of computer science stuff. You're maybe not particularly into algorithms, except the particular algorithm you might need for your domain. You don't know a lot about DevOps and administering stuff and organizing code and so forth. You referred particularly to actuaries. I worked for years with an actuary who's done 20 years or more of developing an APL application and has learned what he needs to know, and so I guess my role in that was of a professional APL developer who can show him, well, that was the idea anyway, new techniques, and can introduce him to other technologies, things that are going on outside the APL world. And that's something which, if you're a Java developer, as far as I know, doesn't happen; your interaction with the guy with the domain expertise comes in the form of specifications. As far as I know, a Java developer or a C# developer or a C++ developer is sitting in a room with a bunch of other developers. I wonder, Josh, what's been your experience of supporting other APL programmers as a professional programmer in that way?

00:32:28 [JD]

Yeah, that is a big distinction between APL versus other languages. There's kind of now this push in IT organizations to silo out the application into very distinct parts, so you have a UX team, you have a UI team, you have, you know, other people, and you take persona interviews of them and try to figure out what the end user wants, and, you know, every time you introduce a step away from what the client wants, you have this sort of entropy and you lose information that way. There's always a loss of stuff. So that's why I think it is important, when you see an APLer develop an application as the subject matter expert, they're the best person to know exactly how the user interface should flow, exactly how the user experience should be, where this button should be. And if you think about it, a lot of UI development is math based, which I think is another reason why APL is good at this, because it's very easy to do math in APL. And so, yeah, I mean, I respect these other languages too. I know I have a lot of friends who develop in these other languages, but personally I can't keep up with all the changing frameworks; you know, there's always a new one to learn. And so I respect those who can juggle all those different frameworks and everything, but to me, I just want to focus on the problem more, and that's why, you know, I just prefer to solve problems in APL.

00:34:15 [ST]

Paul Mansour did a conference presentation years ago. I missed it and I always wanted to catch up with him about it. I think the title of his presentation was something like "Why my mother has a particular plate, a special plate, for serving corn on the cob, and I don't" [25]. And I understood that the core subject of his talk was very much what you've been talking about today, about using APL as the only stack for doing everything. And I'm remembering now that in my last episode, Morten Kromberg called us, yeah, basically he called APLers Luddites. What I understood him to mean by that was that we're not chasing after every new technology. We're not, chasing is probably not the word I want, we're not spending a lot of time staying up to date. We've got a good language and we want to do as much as we can within it. So from his experience as an APL vendor, he's got users who wait very much for the APL vendor to provide access to the new technology and explain how everything can be done in a purely APL way. So with my huge respect for Paul, I'm kind of very interested in how he's been able to pursue this single-stack development.

00:35:52 [JD]

Yeah, and it's interesting now, because we're seeing, uh, a sort of shift, not a shift, but a new development on this front. Some people have been trying to come up with HTML solutions in APL for a while, and one of his latest projects, called Abacus [26], is figuring out how to control an HTML frontend from APL. And there's a lot of interesting things there; you know, we're not going to try to reinvent the wheel on HTML, it does a very good job at rendering things, CSS too. So now the idea is, yeah, part of the APL programming stack is going to have to, you're gonna have to know some CSS, you're going to have to know a little bit of that, but the important thing is doing the callbacks: instead of writing JavaScript, it would be better to have APL handle your user input, do your calculation. So that's the goal with the Abacus library, and it's still developing, and it's interesting to be a part of that, see how that turns out.

00:37:03 [CH]

I sort of want to jump back to, at one point you mentioned the difference, this sort of ties into what we were just talking about, production code, but you said that, you know, you could dive a little bit deeper on writing production code versus writing one-liners, and I think you said, you know, you can't do one without the other, and I can't remember which one you said was which. But do you want to, you know, delve into that a little bit? 'Cause I definitely know that I get a lot of questions from, you know, the social media verse of, you know, me posting one-liners and doing videos of, you know, one-liner solutions, 'cause those, to be honest, it's the easiest ten minute videos to make, is a LeetCode problem that is solvable in four characters. And first you show a couple of Python or Haskell or whatever, and then you go boom, you know, and then everyone says, OK, that's pretty cool, but like, can you write like a nontrivial application? And like, you know, I bet that's a lot more painful as soon as you have to do IO and stuff. So yeah, feel free to share your thoughts and experience and the comparison between those two.

00:37:58 [JD]

Yeah, so when you talk about a one-liner, there's kind of, there's a lot of different types of one-liners. You can have the code golf one-liner, which is, you know, just for fun, to see how short you can get things, but that doesn't mean... you know, one-liners, there's a lot of very useful ones in APL, and actually just recently I have an example where a problem I had to solve in production was a parsing problem, and I had to figure out the nesting levels of parentheses. And Roger Hui has this excellent page on the Jsoftware website, jsoftware.com/paper/50, that's a great introduction to what you could do with APL with 50 functions [27], and one of them, I think this is a very interesting story, is the fifth one, the parenthesis nesting [28]. He talks about going to England, this is Alan Perlis, the first Turing Award winner, he's at this conference in England, and Don Knuth is invited, Ken Iverson is invited, and Ken is up there showing this one-liner. And it's to do this exact problem that I had. So this is, you know, over 50 years ago, the same problem, and the solution is, you know, you take the right argument, you have a character string, and you index that into an open and a close parenthesis, which results in, you know, [inaudible] one, two, or three if out of bounds, and then you take that and you index that into the integers one, negative one and zero, and then to the left of that you just do a plus scan. That's it. That's the whole solution to this problem, and Alan Perlis saw the magic of this. He exclaimed, he's actually sitting next to Dijkstra on his right and Fritz Bauer on his left, and he exclaimed like, wow, isn't there something to this, or something along those lines? And Fritz Bauer on the left says, as long as I'm alive, APL will never be used in, I think he was in, Munich, Germany, and then Dijkstra chimes in, nor in Holland. So these are two... and then Perlis realized the magic of this expression. And then here I am in a production situation, I get to use this, character for character, as a solution, which is really a cool thing about programming in APL. I always think about programming being an ultimate balance of science and art. You know, there's the science to it: you gotta write an optimal solution, you have to make sure, you know, run your performance tests, make sure everything is performing well. And then the artistic aspect is very important to me too. Your code should look beautiful. And I think Roger Hui was one of the greatest examples of this, if you look at the code he's written, and he was a big proponent of this as well. But then it also adds this kind of third dimension when you write APL code, this historical dimension that I don't think many other languages have, that you could talk about: when you write a line of code, there's a whole bunch of history behind it. But these kinds of expressions, they're short, and there's a whole bunch of them. The Finnish book of APL idioms [29], that was given to me; I don't know Finnish, but I could take a look at one of those expressions and understand exactly what it's doing, and maybe if I ever go to Finland, I'll use that as a Rosetta stone. It's like, I'll write out some expressions, I don't know if that'll get me very far, but.
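[Illustration: a minimal sketch of the nesting-level expression Josh describes, assuming Dyalog APL with default index origin 1; the sample string is made up.
      s ← 'a(b(c)d)e'
      +\ 1 ¯1 0['()'⍳s]    ⍝ 0 1 1 2 2 1 1 0 0: the running parenthesis nesting level
]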

00:41:39 [AB]

It's funny you mention this parenthesis nesting. I mentioned in the beginning of the episode, you know, we have this chat event going on every week, and the very last one, just this past Friday [30], was exactly about checking whether the parentheses were properly nested, balanced, and this exact expression, as you say, character for character, came up as one of the solutions, and I'll put in the show notes a link to the video that I have to make this week about that problem [31].

00:42:11 [JD]

Yeah, that's interesting, 'cause I had that last month, and you could see how timeless these expressions are. And yeah, if you bloated this out into a bigger solution, it wouldn't be as general, and I don't think you would have someone talking about it in 50 years; you know, it serves a purpose.

00:42:29 [CH]

Yeah, this, I don't think it was the exact solution that you said, but I one time made a LeetCode solution video with a bunch of languages and then showed how I thought APL was the best, and it was for the maximum depth of parentheses, and I think it had like 16 or 17 characters, and I had just written it on my own, and not recently. But at the time I had also been reading APL papers and trying to just consume all this stuff, and I came across a paper from 1979 called Operators [32], where Ken Iverson is going through sort of scans and outer products and stuff, and he has a similar solution where he does an outer product with a two-character string of the left parenthesis and right parenthesis and does a plus scan and then a reduction at the end. And I'm reading this and I'm like, what the hell, like, this is the LeetCode problem that I just solved two months ago, and here Ken Iverson, literally like half a century ago, is out-LeetCoding me. And so then I made another video, being like, you guys aren't going to believe this, like, I solved this, it was 16 characters, and his solution is like 8 or 10 characters or something. And I think I had heard the similar story that Dijkstra was just like, you know, this is an abomination. But like, you know, I don't know how you can stare at that code and, even if you don't like the symbols and stuff, how can you not find that just super beautiful? And while I'm saying that, Stephen is holding up his copy of the Finnish idiom booklet that he has. It looks like it's actually an original, 'cause it's bound, it's not like printed off and cut up. Where'd you get that, Stephen?
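[Illustration: a minimal sketch along the lines Conor describes, an outer product, a plus scan, and a final reduction, assuming Dyalog APL; this is one way to write it, not necessarily the paper's exact expression, and the sample string is made up.
      s ← '(1+(2×3))÷((8-6)×2)'
      ⌈/ +\ -⌿ '()' ∘.= s    ⍝ maximum nesting depth: 2 for this string
]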

00:44:02 [ST]

Oh my goodness, I wish I knew. It's got genuine Finnish in it, so all the comments are in Finnish.

00:44:08 [CH]

And I think there is a PDF copy online. I've seen somewhere where you can get your hands on it, and there's a bunch of different great resources. Like, I think there's another one called 40 APL idioms or something, you know, that you must know, or very common ones, and I've always wanted to go through them, 'cause a lot of these were written sort of decades ago without a lot of the newer glyphs and operators that Dyalog APL has added. So I'm sure some of them will be identical, but I'm sure for some of them you can probably, you know, halve the number of characters, because, you know, Dyalog APL today is, you know, I don't know if it's twice or five times or ten times more powerful than the APL of the 80s. But anyways, yeah, well, we should throw it back to, or go ahead, Bob.

00:44:53 [BT]

Well, I was just going to say, do you think these idioms or these really short programs last so many years because it's almost like degrees of freedom when you're working with it? It's working so precisely with what you're trying to do, it would be hard to do something that didn't follow the same form, especially when you've got somebody who, you know, originated the language and knows exactly how it is. They're more likely to use it with much more precision. Once you've seen that precision, it's almost hard to get away from that and write obfuscated code.

00:45:30 [JD]

Yeah, that's an interesting thing, and I think that's one of the other reasons I really like the APL expressivity. The thing is, a lot of people talk about type systems and how APL, you know, should have a type system. Aaron Hsu does a great talk on this [33]. And that's one of the benefits: you don't have types, you don't need to declare things, which allows you to have a very small solution that is general across different types too. That's very important; one of the most amazing things is the marriage between types in APL to solve problems, like the idea that there's no real distinction between a Boolean and an integer. You could do math on them. And that really helps you, when you solve problems, to look at them a different way. You know, the primitives, a lot of them are built to work across types; like, there's no string dot index-of, there's just an iota, and then you can use that on, you know, an array of characters, an array of integers, or whatever you have.
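[Illustration: a minimal sketch of the point about ⍳ and Booleans, assuming Dyalog APL with default index origin 1; the data is made up.
      'cat' ⍳ 'tac'              ⍝ the same index-of primitive on characters: 3 2 1
      10 20 30 ⍳ 30 10           ⍝ and on numbers: 3 1
      +/ 'mississippi' = 's'     ⍝ a Boolean is just 0s and 1s, so you can sum it: 4
]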

00:46:41 [AB]

I wanted to respond to that timelessness of these small expressions. Conor mentioned Dyalog especially having added lots of features, but most of what has been added is kind of plumbing around interfacing with various things. The core language hasn't really developed a lot, because it's a diamond: you add anything to it, you subtract from it. And if we look at the core vocabulary of APL, or any of the array languages really, every primitive represents this fundamental concept of transformation on arrays, everything being represented by arrays, and that doesn't really change much. You just have to have that vocabulary and you build things up from that. So these expressions, their timelessness is because people still want to do the same type of work in what they do. And this is how you express them: these Lego blocks that are fundamental concepts of information transformation, you build them up like that. And even when we add a new primitive to do something that is commonly done, then it's just about taking the old expression and substituting some part of it that's now expressed more concisely or better with the new primitive. But it's still fundamentally the same algorithm, so these are in fact timeless.

00:48:15 [CH]

Yeah, I think there's points in time where you write a piece of APL code, I find, where there's just, so there's only one way to write it. Like, I mean, technically there's two ways: there's the dfn way and then there's the tacit way, and I always lean towards the tacit if it's, you know, less than a certain number of glyphs. But like, even, was it a couple days ago, I was solving a LeetCode problem that involved, you know, checking whether one string was a prefix of another one [34], and there's a classic sort of idiom that uses the atop, which is, you know, a first of find, or first atop find, so, you know, it'll basically find anywhere that a string is a substring of it, and if you just check the first element, it'll say, well, was the string on the left, you know, a prefix of the other one. But you run into problems where, what if the string on the left is longer than the string on the right? Then you end up with an empty list for the result of find. So I ended up coding another solution that basically does a take on the second string with the length of the first string, and if you end up in the situation where the string that you're checking, if it's a prefix, if that one is longer than your target string, it'll just, basically the behaviour of take [35], which I actually didn't discover for a while, adds the default value, which is a space for a string, to the end of it. So you end up with this, basically, uh, once again, I think an atop, where you have, or it's actually not an atop, it only exists in BQN, but you can spell it in APL differently, which is Before, which is sort of length before take. And it's sort of the flipped hook, I think, in J. And you can get it working by twisting and stuff in APL. But when I saw that, I was just like, it's three symbols: you've got length, or tally, or whatever it's called, and then Before, and then take, and that's it. The thing is, in like Python or C++ or Java, you could spell that, or write that, you know, sort of many different ways. You know, what are you calling the index on your loop? [37] You know, technically it's an infinite number of solutions, 'cause sure, you could call it `i`, but you can also call it `j`, `k`, you know, any number of things. Whereas there's no index, there's no looping, like there's only one sort of pure way to spell that in BQN or APL or J, and, uhm, it's like, once you see that, you're never going to write that mini expression any other way. And there's so many of those small idioms, you know, once you see the average [36], which I think is the first of the 50 functions that Roger Hui does in that paper you mentioned, Josh. And once you see that in sort of the fork form, you're never going to write average a different way, or at least I don't. I don't know, that's the way a lot of the idioms sort of impact me. Like, you see it once and it's like, oh yeah, that's clearly the best way, and sort of almost the only way, the right way to do that. I'm not sure if people have the same sort of impression when they see certain idioms. Bob?
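[Illustration: a couple of minimal sketches of what Conor describes, assuming Dyalog APL; the function names and test data are made up, and the prefix check is written as a dfn rather than with BQN's Before.
      Avg ← +/÷≢                 ⍝ the classic average fork: sum divided by tally
      Avg 1 2 3 4                ⍝ 2.5
      IsPrefix ← {⍺≡(≢⍺)↑⍵}      ⍝ take ≢⍺ items from ⍵, padding with spaces if ⍵ is shorter, and compare
      'app' IsPrefix 'apple'     ⍝ 1
      'apples' IsPrefix 'app'    ⍝ 0, thanks to take's padding behaviour
]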

00:51:24 [BT]

Well, one of the things you mentioned starting off is there's two ways to do it: one is, you know, the direct definition, the other is tacit [38]. I wonder, how is tacit regarded in industry? Like, to me it would, yeah, seem to be good in some senses. In other ways, you probably want to stay away from it because it becomes wizard work for the rest of the programmers.

00:51:46 [CH]

Well, so there's two questions for Josh here: what does Josh think and what does industry think? Or even maybe three: what does The Carlisle Group think, what does Josh think, and what does industry think? Because they could all be different.

00:52:00 [JD]

That's interesting. It's funny, the first person I saw use tacit in a production environment, well, a big tacit expression, was this newer programmer who didn't really know much APL. And then I was looking at some of the code he was doing and I noticed this very complicated tacit expression. And this guy didn't know much APL, and I was like, hey, this looks like something Adám would write. And then sure enough, I checked APL Cart [39] and it worked, you know, it came up there, so thanks to Adám we are now starting to see it in production. But there's a lot of different types of tacit functions. I think, like you mentioned the average one, my favourite tacit function is the split one, and that is one I think just makes a lot of sense as a tacit representation: it's, you know, not-equal, partition, right, or same. So that's something that I find I've used a lot, I use that a lot in production systems. But, uhm, I do find, even not just for tacit, like, if you took a long tacit expression, would I put that in production code? No, but I probably wouldn't even put that as a dfn [40] either. I think at a certain level, you shouldn't be shy of just breaking up your function into smaller parts. Uhm, or if you name it correctly, then I don't even have to open the function and I don't care what's in it, if it's abstracted that way. But yeah, I mean, tacit, I don't use them that much. I use dfns pretty much for everything. And I mean, there's even traditional APL functions [41], I don't know if you've had much experience with them, procedural functions, which is another style of writing them, which is probably over 95% of what you'll see in production APL systems if you don't work at The Carlisle Group, which uses dfns for everything.
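[Illustration: a minimal sketch of the split idiom Josh mentions, assuming Dyalog APL; the name Split and the sample text are made up.
      Split ← ≠⊆⊢                    ⍝ not-equal, partition, right: split ⍵ on the separator ⍺
      ',' Split 'one,two,,three'     ⍝ three vectors: 'one' 'two' 'three' (empty segments are dropped)
]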

00:54:12 [CH]

Do you run into limitations? 'Cause I think I discovered at one point that when you use dfns, you can't use, I know, I think, the :If control flow statements, and then I'm not sure if it's the looping as well. Not that, I mean, you're always trying to avoid loops, but there's certain cases where, I don't know, if you're trying to, the classic I can think of is like the guessing game: guess a number between one and 100, and you're just cycling through, like, reading in from the screen. I don't think you can use that stuff in dfns. Do you find there's limitations? Or basically is there a different way, anything that you can do with a tradfn you can find a way to do with dfns, it's just spelt slightly differently, or...

00:54:53 [JD]

Yeah, I think you could do everything with a dfn, besides some stuff like niladic functions, which I don't even know if they're a good idea anyways, a function that takes no arguments, which you can't do in a dfn. And also, this is something Paul taught me: if your code needs an if block, then your function's too long, that should be a separate function. And, you know, we do have guards in dfns to evaluate, uh, you know, a Boolean expression on the left, and if it's true, then something on the right, but I try not to even use that as much. Sometimes you have to, but, uh, you know, the whole idea of using Booleans, the APL way of doing a conditional is, you know, you can take a Boolean expression and that also works as an index. You could pick a result based on that Boolean, which I think is a lot of times a nice way. The other thing is, the power operator is another way to evaluate conditionals. This is one of my favourite operators, because one of the things I look for in a programming language too is how orthogonal it is, and that's something that's a strong suit of APL. I think, you know, you learn one function and it could be applied to so many different operators. Like, you learn plus scan and then, oh, maybe I'll put a max here, and max scan works, max reduction, and so on. But back to the power operator [42]. Yeah, I mean, the idea of raising a function to the 0th power means you don't run this function, so you could use that as a Boolean thing. Raise it to the first power, run it once, or you can raise it to the second or more, and you're running the function that many times and passing the result of the previous run in. And then on top of that, there's the negative power, which I actually got to use very recently in production code, doing some unit conversions, which is very nice. I used tacit for that. That's probably the first time I really used tacit functions, because you have to define functions tacitly to be able to use the inverse operator on them. Another interesting thing about the power operator: I think Roger Hui last year had a sort of pseudo-serious question about, maybe, what does it mean to raise a function to a fractional power? So that's not something implemented yet. I don't know, maybe your listeners or you guys might have an interesting idea on that, but that's an interesting problem to think of.
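[Illustration: a minimal sketch of the power-operator uses Josh describes, assuming Dyalog APL; the function names and values are made up.
      Double ← 2∘×
      (Double⍣0) 5             ⍝ applied zero times, so 5 comes back unchanged
      (Double⍣3) 5             ⍝ applied three times: 40
      CtoF ← 32∘+∘(×∘1.8)      ⍝ Celsius to Fahrenheit, written tacitly so it can be inverted
      CtoF 100                 ⍝ 212
      (CtoF⍣¯1) 212            ⍝ the inverse via the negative power: 100
]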

00:57:31 [CH]

So it sounds like a lot of the time, then, the answer to, can I do this a certain way, is like, maybe think about the problem differently, and there's a better array solution to whatever problem you're trying to solve, whatever solution you're trying to bend into a dfn that might not look like it fits. Potentially there's a different array solution to that that I just haven't discovered yet.

00:57:54 [JD]

And, you know, a lot of it, Adám has showed this a lot too in some of his talks, is you could take, if you have a zero or a one, and you're not thinking of that as a Boolean, you're thinking of it as, you know, in math, you could multiply that by a number to get rid of it, or, you know, if it's one, you keep the number. So you could do a lot of conditional expressions just by using pure math, instead of having to rely on these types of conditional statements where, you know, you get into branch prediction, and that can really slow your program down.

00:58:29 [CH]

Yeah, that's a really nice pattern. The first time I saw it... usually, in a functional language, you'll probably, you know, filter some list before doing a reduction on it and summing things up. But in APL, you can just multiply the mask by it instead of filtering, and the mask will correspond to ones being the things you want to keep and zeros the things you don't, so you can just zero things out basically and then sum them. And I think that actually tends to be more performant in some cases, because you're not reallocating, like, an unknown-sized array, which is really nice.
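[Illustration: a minimal sketch of the mask-multiply pattern Conor describes, assuming Dyalog APL; the numbers are made up.
      nums ← 3 ¯1 4 ¯1 5 9
      mask ← nums > 0        ⍝ 1 0 1 0 1 1
      +/ mask / nums         ⍝ filter, then sum: 21
      +/ mask × nums         ⍝ multiply by the mask instead: also 21, with no new, smaller array
]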

00:59:07 [AB]

But also, modern-day processors can use vector instructions, so it doesn't actually matter for the runtime, within certain limits of course, how much data you are processing; it takes the same time to go through the processor. But branch failures, branch prediction failures, they take time.

00:59:24 [CH]

One of the other questions, not to, I feel like we're skipping around here a bit, but one of the other questions I wanted to ask is: so you went to university and got a CS degree, but after having sort of discovered APL at a very young age, and not only discovering it, but, you know, doing internships, I think you mentioned, in high school. And so you'd actually had like a nontrivial amount of experience with APL before going into getting a university degree. And then while you were there, it sounds like you learned a bunch of different languages, you know, all sort of Algol- or Java-like. Do you have anything to comment on, like, how that experience of having learned APL from a young age informed sort of going through university? Like, the whole time were you just sort of scratching your head, being like, it seems like we're going backwards here? Or were you trying to convince your classmates of, like, hey, you know, try checking this out, and people just looked at you? Or was it just something that, like, was a big secret, and no one knew that you sort of had this, you know, not alter ego like Superman or whatever, but just, like, you know, no one ever knew that you used to code in APL, and the reason you were finishing your homework faster was unknown to people? Or is there anything to say there, I guess.

01:00:38 [JD]

Yeah, that's a... yeah, no, I'm actually known as the APL guy with a lot of my classmates, classrooms, and the professors knew it too, and they were kind of interested. You know, in our programming languages course, APL actually comes up in the textbook, so I got to go up to do a demonstration of it. I got to do some of my projects in APL too. But yeah, I'm happy I got to have that experience of, you know, more low-level languages too. Then APL, uh, it makes you more thankful for it, but also you do learn some tricks in a traditional CS degree; you know, I don't think you always have to have a completely APL solution to a problem. It's really good to go through that and learn about, what is big O notation [43]? How do I measure how this program is going to perform or scale? And learning these other data structures too is interesting, but at a certain point, you know, I get tired of, you know, we have to implement all the data structures at a low level, like linked lists, hash maps and so on. I'd rather spend my time focusing on one data type, which is the array that all these data types are really built off of, so I think it paid off more to just focus on that and just really focus on the APL set of primitives. But the CS degree, yeah, I like it, and I still use some techniques from that in programming. There's a lot of... I don't think there's a problem you can't solve in APL, if it's, obviously, if it's solvable; there's some problems you just can't solve in any language. But it would be interesting to see, I don't know if your listeners disagree, if there are some who say you can't use APL to solve a problem. Maybe email me, first name at Dyalog.com [44], and I'd like to take a look at those problems. Because there's other stuff in APL too, like we don't even talk about the idea of namespaces, which was very interesting, introduced by Ron Murray, who now works on the interpreter at Dyalog internally. So with that concept you could do a lot of kind of CS-y tricks. Like, we talked about Abacus, that library for having a virtual APL DOM to represent what's going on in the HTML engine, but one of the key things is you can represent those elements as an array of namespaces [45], so you can keep track of things like having pointers to parents or other things like that. But there's a lot, you know, it doesn't all have to be a simple 2D matrix; there's a lot of different ways to represent problems.
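[Illustration: a minimal sketch of representing elements as an array of namespaces, assuming Dyalog APL; the variable and field names are made up.
      nodes ← ⎕NS¨ 3⍴⊂''             ⍝ three empty anonymous namespaces
      nodes.tag ← 'div' 'p' 'span'   ⍝ distributed assignment: one value per namespace
      nodes.parent ← ⎕NULL           ⍝ the same placeholder parent for every node
      nodes.tag                      ⍝ div  p  span
]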

01:03:40 [BT]

Did the CS degree, the widening of experience, I guess that's maybe the way I look at it, does that help when you're actually looking at a production problem? Does it give you a way to look at how you might break down a problem as it's presented to you, and then almost use little embedded nuggets of APL to solve the variety of problems that come up? You've sort of meta-broken them down, as you say, into functions, and the functions can be quite terse APL expressions, but the way they link together is maybe also related back to what you would learn in a computer science course.

01:04:22 [JD]

Yeah, yeah, I mean, when you have real-world problems there are all different types of them. Some of them, you know, can just be solved a whole operation at a time, and for some of them there's nothing stopping you from mixing and matching, using some more traditional CS approaches. That's the nice thing about Dyalog: it gives you the freedom to work with all these different paradigms. You could take a functional approach, you could take an object-oriented approach, and I'm commonly mixing and matching different techniques, array-based or other ones that you might have picked up in CS class. I think that certainly helps, but sometimes it can hurt you more than it helps, because in the end, if you can think purely array-based, that will give you the best result in terms of performance and efficiency; a lot of times you fundamentally change how the problem is solved. And it might be interesting to see a resurgence of this. We're seeing the push to cloud environments, where people are very conscious of how efficient a solution is, or how many CPU cycles it takes to perform a task, because now there's a dollar amount attached to it. So that might be an interesting area where you see APL rise up again, because sometimes you can just express a problem without all this boilerplate stuff, and you get to see the problem more clearly.
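
(A small made-up illustration of that "purely array-based" point, in Dyalog APL: the same computation written as an explicit loop and as a single array expression. The function name SumOfSquares is hypothetical.)

      ⍝ Loop style, in a traditional function
      ∇ r←SumOfSquares v
        r←0
        :For x :In v
            r+←x×x
        :EndFor
      ∇
      SumOfSquares 1 2 3 4 5
55
      ⍝ Array style: operate on the whole vector at once, no explicit loop or boilerplate
      +/(1 2 3 4 5)*2
55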

01:06:16 [CH]

Alright, so we are a little bit over time, I think, but first I'll ask: are there any final questions from the other three panelists before I co-opt the last question for myself? This is sort of an open-ended question, but I'm interested if there's a question that we haven't asked, or something that you want to talk about. I've gotten the sense throughout this conversation that there have been a couple of times where you said, oh, you know, we could definitely dive a bit deeper. And it seems like you've definitely formed opinions on, you know, whether it's one-liners versus production code, or taking the way you think about and solve problems in array languages and using that to solve problems in non-array languages like Java. So I just want to make sure there isn't a question we haven't asked, or something you want to say, whether that's with respect to APL or programming in general, because I've found it, you know, super fascinating. From a very young age you managed to discover APL, whereas I didn't really discover it until much, much later in life, and you've basically built your whole career around it so far; you've worked at a couple of different places and now you're consulting and developing full time in APL. From my point of view that's like, wow, what did I do differently, other than not having a neighbour like that? So I'm not sure if there's wisdom to impart, but I'm sure there are a few listeners who are younger and might be thinking, "oh yeah, I'd love to work with APL, but there are way more opportunities writing Python out there." So anyway, that was sort of a super open-ended, long-winded ramble, but feel free to respond however you want to.

01:08:07 [JD]

I think that story, you know, that I had a neighbour who introduced me to APL, it just kind of shows that it's easy to get people into APL. It's easier than we think, which is why I'm happy we have all this sort of media work that you guys are doing, and Adám is doing, and Richard. I think people just need to be introduced to it, because the nice thing is that people don't realize they've been learning APL their whole life without knowing it. When you take math class as a kid, you're learning it, and there's really not that much to learn on top of that. It's optimized learning: if you learn take, then by virtue of knowing that you now know what drop does, and again, that's the orthogonal aspect of it. With a small set of primitives, and operators especially, you can mix and match them in ways that just make sense, that you'd expect. For me, I really, really like working with APL because it's basically solving puzzles all day, so it's really fun and I'm happy I get to do it. And there are opportunities out there for people who want to do APL, especially at our age, because you had a lot of older folks, now near retirement, who have been working on these APL systems, so there are openings for this type of work [46]. And I see a lot of smart minds in the Orchard and all these places, young people looking at APL. Don't be afraid to apply it to build something, because that's what I've seen: I've seen a lot of small groups, people just take an idea and run with it, and APL really holds your hand the whole way when you need to build something amazing.
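
(A quick illustration of that take/drop orthogonality in a Dyalog session:)

      3↑'PROGRAMMING'       ⍝ take the first three characters
PRO
      3↓'PROGRAMMING'       ⍝ drop uses the same syntax and gives the complementary result
GRAMMING
      ¯3↑'PROGRAMMING'      ⍝ negative arguments work from the other end, for both primitives
ING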

01:10:01 [CH]

Well, I think that's a perfect way to end it. If you want to have fun all day long solving problems, and have your hand held, and also not have to be confused about what public static void main means, come check out APL and build a career out of it. If Josh did, I guess anyone can, and you don't need a neighbour, I think. I guess maybe time will tell. If, you know, four of our next 10 guests say "also, we were neighbours with some prolific member", then I'm just going to start to get a bit disheartened.

01:10:31 [JD]

Yeah, I did figure out what public static void meant by the time I finished my career.

01:10:36 [CH]

Well, there you go.

01:10:37 [JD]

My CS career in college, that is.

01:10:40 [CH]

Bob, you were going to say something.

01:10:42 [BT]

I was just going to add the usual: if you'd like to get in touch with us, it's contact at ArrayCast dot com [47], and you can send emails there and we respond to them. We've had some pretty good suggestions, and upcoming shows should reflect some of that, although every time we do a show Conor seems to come up with a few new people that we really should have on, and he's absolutely right, but it does tend to expand the list of "we should have this person on" or "we should do this topic". But still, that's why you can get in touch with us: contact at ArrayCast dot com. And the show notes have been mentioned throughout the show; they are, I think, vital. If you're at all interested in what we're talking about, quite often the show notes go into much more depth and really give you an understanding of some of the concepts. So if you get a chance, check the show notes. And also we have transcripts of the whole thing, so if that's useful to you for whatever reason, searches, it's great: you can do a search for a hot topic and then find the part of the episode that relates to it. So I think that's about it for me.

01:11:50 [CH]

Yeah, I think our guest list is going to be like, you know, everyone's book list. You finish one book and then unfortunately that book mentions four other books you have to add to your list to read, and book lists don't ever get shorter. Incrementally, for one moment you'll subtract one, but in aggregate they're just constantly growing. So I'm pretty sure that's going to be the same thing with our guest list, plus, when you mention that we're bringing everyone back at some point in the future, yeah, it's a losing battle. But yeah, Josh, thank you so much for coming on and sharing your story. I think it's pretty inspiring, and it's also super interesting, because I don't think we've really had anyone on the show talk about writing production code in APL. We've talked a lot about the philosophy and, you know, personal stories, but not so much about how you go from writing one-liners, and the difference between the different kinds of code, and then just the different problems that all software engineers are solving. This is, I think, the first time I'm hearing that on this podcast, so it's super interesting to hear. And we'll definitely make sure to link all the talks that we mentioned about you winning your APL contest. Maybe we'll have you back on; maybe we'll have a past-contest-winners panel at some point, and we'll bring all of them on and do something special like that. And we'll also make sure we get your socials, so if people want to follow you they definitely can. So thanks for coming on, and with that we will say happy array programming.

01:13:16 [All]

Happy array programming.