Transcript

Transcript prepared by

Sanjay Cherian, Igor Kim, and Bob Therriault

Show Notes

00:00:00[Adám Brudzewsky]

I think I'd like to answer your first question as to which array language is the best one.

00:00:04[Conor Hoekstra]

Fantastic.

00:00:06[AB]

I think it's out there, kind of floating in the ether.

00:00:09[CH]

Cop out! It's not what I wanted. [LAUGHTER]

00:00:15[MUSIC]

00:00:23[CH]

Welcome to episode 87 of ArrayCast. We are recording on site at Iverson College 2024. We will maybe talk about which number edition this is, because I actually don't know. But well, Stephen's here. So we'll get into that in a couple of minutes. I'm your host, Conor, as usual. And we will go around and do brief introductions from our three panelists today. We'll start with Stephen, then go to Bob, and then go to Adám.

00:00:49[Stephen Taylor]

I'm Stephen Taylor. I do APL and q and occasionally get enthusiastic about them.

00:00:54[Bob Therriault]

I'm Bob Therriault. I'm a J enthusiast. And I'd like to take this moment to thank Stephen for putting on this whole conference, because it's amazing.

00:01:02[AB]

I'm Adám Brudzewsky. I do APL at Dyalog. But I'm very interested in all the array languages.

00:01:09[CH]

And as mentioned before, I'm your host, Conor, array language enthusiast of all the array languages, of which we have representatives of not every single array language, but almost all of them. [01] So what we are doing right now is we are in the-- actually, what is the name of this room? The Leslie-Stephen room? Thank you. Thank you. That's Devon. And we have, I think, 20, 25 people, including the four of us. And so what we're going to be doing today is having kind of an open discussion about whatever we want. And we will be handing my mic off to the folks that are not sitting-- the panelists that are sitting in front of their own mics. And I think we might start with kind of the major languages that are represented in the room. And those individuals can briefly introduce themselves. I think we've had every single one of these individuals on the podcast before. But because Henry's closest to me, I'll start with Henry, who is the main developer, or the developer, of the J language. And I will say, in the future, when I'm handing off the mic, if you haven't spoken yet, be sure to mention your name, if you want where you're from, and also the primary array language that you're working on, if you are working on an array language or working with it. So I'm going to hand this mic to Henry, and we'll go from there.

00:02:24[Henry Rich]

I'm Henry Rich, principal developer of J software.

00:02:28[CH]

Awesome. So we've got Henry. I'm going to move next to Kai, who hopefully you know already, because we've had him on a couple times as well.

00:02:33[Kai Schmidt]

I'm Kai. I develop Uiua.

00:02:36[CH]

And I guess-- I mean, we've already introduced-- no, we've already introduced Adám, because he's a panelist. He's the head of language design. But he pointed over to Aaron. So I guess Aaron's going to be our non-panelist representative of Dyalog APL. Say hi, Aaron.

00:02:51[Aaron Hsu]

Hi, how's it going? So I also developed the Co-dfns compiler, so that kind of is a different thing.

00:02:57[CH]

All right, and the question is, we have q and k people in the room. Does anybody want to be a representative of the q language? We've got a bunch of heads being shaken. Do you want to be a representative of the q language? No? All right, we've got q people in the room. We do have one volunteer.

00:03:20[Alex Unterrainer]

Hi, I'm Alex. I'm a kdb+ developer, and I also run my blog, defconq.tech, about kdb+/q. That's pretty much it.

00:03:29[CH]

All right, and are there any other array languages in the room that are representative that haven't been mentioned? Oh, yeah, we do have a representative of BQN.

00:03:35[AB]

Ohh yeah. All right.

00:03:38[Brian Ellingsgaard]

Hi, I'm Brian. I developed the rayed-bqn library, and also a few other things for BQN.

00:03:44[CH]

And I think every single one of those folks, except for Alex, has previously been on the podcast. So I will open it to questions. I do have one locked and loaded. But if there are questions that people would like to talk about first, we can start there. If not, I can see Aaron smiling. All right, we'll start with my locked and loaded questions. It makes sense, seeing as all these folks are in the same room. What is the best array language? [LAUGHTER] Make your pitch. I'm not sure who wants to go first. I'm just the moderator here. I'm looking at Kai. I'm looking at Henry. You don't have to argue that yours is the best, necessarily. But what differentiates your array language from the other ones? All right, Brian.

00:04:32[BT]

I like that question way better.

00:04:34[CH]

It's a little more click-baity, the first one. But what differentiates your array language from the others?

00:04:38[BE]

I mean, come on. Higher order functions are necessary. Like, come on. Right? [LAUGHTER]

00:04:46[CH]

All right. Aaron had something to say. Well, come over to Aaron. Wait till you have the mic, Aaron.

00:04:50[AH]

Already making me say something controversial. Higher order functions are way overrated if you get past the second order, thank you very much.

00:04:59

What?

00:05:01[CH]

I don't even understand what that means. Do you want to elaborate?

00:05:05[AB]

He's saying that you don't need hyperators.

00:05:06[CH]

Hyperators, famously from the NARS 2000 Bob Smith episode. I actually don't know what episode that was, but we will link it in the show notes. And Brian's got a response.

00:05:16[BE]

So Aaron, in rayed-bqn, I have drawing functions. And I need to hold the state after giving the inputs. So I just use the functions themselves to hold the state. How do you do that with just second order functions? You need--

00:05:31[AH]

So are you saying that you retain the state inside of closures? So you're also assuming first class procedures.

00:05:36[BE]

Yes.

00:05:37[AH]

Yeah, both of those lead to interesting performance problems, and memory management issues, and escape issues. So you can use things that imperative languages have been using for a long time to manage this instead.

00:05:46[BE]

OK, yes, there are performance issues related, yes. Yes.

00:05:51[CH]

Do you want to mention what these imperative languages, what's the workaround or the language facility that they use to do that?

00:05:57[AH]

Yeah, so if you don't have higher order functions, you just have to be a little bit more manual in where you place your state and how you manage it. So global state can actually be a very useful thing if you're doing game development, and you need to manage some worldwide state that everything needs to talk about. That is unpopular, persona non grata, in the functional programming community, but I think it's been taken too far. Imperative techniques do have their place for managing certain types of global persistent state.

00:06:29[BE]

I do feel like sometimes-- so I keep it very functional. So I give the state when the frame is done to the second iteration of the per frame function, meaning each frame gets the state of the previous frame. But if you have a variable outside that you modify, sometimes it tends to make the code a lot simpler. But I still avoid it for basically no reason, just because I think functional-- you know, the rumors on functional. I just believe it. And I feel like I could be wrong with that, but I haven't changed the examples because I don't believe in myself, I guess.
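
Brian's two approaches can be sketched in a few lines. This is a hypothetical Python illustration, not code from rayed-bqn or BQN: `run_functional` threads each frame's state into the next call explicitly, while `make_counter_frame` is the closure-held-state style Aaron cautions against (all names here are invented).

```python
def run_functional(update, state, frames):
    # Functional style: each frame's function receives the previous
    # frame's state and returns the next one; nothing outside mutates.
    for _ in range(frames):
        state = update(state)
    return state

def make_counter_frame():
    # Closure style: state is captured inside a first-class procedure,
    # which is convenient but can complicate memory management.
    ticks = 0
    def frame():
        nonlocal ticks
        ticks += 1
        return ticks
    return frame

final = run_functional(lambda s: s + 1, 0, 60)  # state threaded explicitly
frame = make_counter_frame()
for _ in range(60):
    last = frame()                              # state hidden in the closure
# final and last are both 60: the same result by two routes.
```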

00:07:10[CH]

All right, Brian has drunk the functional programming Kool-Aid. You've got a mic, Adám, so you don't even need to put your hand up.

00:07:19[AB]

I think I'd like to answer your first question as to which array language is the best one.

00:07:23[CH]

Fantastic.

00:07:25[AB]

I think it's out there, kind of floating in the ether.

00:07:28[CH]

Cop out! It's not what I wanted.

00:07:39[AB]

I think that all these array languages that are being invented, discovered, however you want to look at it, they are kind of orbiting the ultimate array language. And each one of them has some downsides, and each one has some upsides. And maybe over time, we can get to something. So it's not that one of them is best. Currently, the situation is that each one of them is probably best in its niche, you could say, and each is taking some piece of the cake. If you compare BQN and APL, then there are different benefits to BQN's more regular set of primitives and syntax. On the other hand, its set of primitives might not actually be ideal for those that actually try to use it. They tend to miss some, and there are some that they don't really get around to using much.

00:08:28[CH]

Encode, decode, partitioned enclose.

Yeah, exactly. But it goes the other way as well. Those exist in Dyalog APL, for those that are listening that aren't aware. On the other hand, you're missing a sort primitive and a better way to index. And the vocabulary size is something that can be discussed over time. There's definitely benefits to k's simple array model, but there are also downsides. There's some things you try to do with it where it's not so beneficial that you cannot mark what are the units of data that you're working on in a higher rank array. I mean, there's a whole thing about the spelling. k and J have a benefit of always being typable and printable without any further ado. And Uiua definitely needs its own font, otherwise it goes completely crazy; you can't see anything in the default font. And the languages are more than just what you think of as a core language. How do you deal with things further away: system services, interfaces to other languages, the way that it's done, syntactically, vocabulary, how much you rely on third party tools, how much comes bundled with the language. The interface is part of it. What's the normal way of development? I want to believe that there's an ideal array language somewhere, but we're just not there yet.

00:09:59[CH]

I was thinking that was a politically correct answer, so.

00:10:04[BT]

I'm going to make an ecological argument that diversity is actually a strength. If you try and go for one single array language, or you're aiming for one to the exclusion of other array languages, it's like in any ecology or environment: you actually become weaker, because a monoculture doesn't have the diversity and the ability to adapt to things that you wouldn't have expected. I can think of bananas, which are basically a monoculture. If a virus hits a banana crop, we lose most of the bananas that we usually eat, because they're literally clones of each other. When you have a variety, you avoid those kinds of problems. It also gives people an opportunity to take the array paradigm and explore with it, without being confined to any particular language, when the languages are diverse.

00:10:52[AB]

But I think there can be such a thing as too much diversity. We cannot cooperate on things if everybody speaks their own language. And so I admire J for having stayed a single language over time. It's not so old yet, but it has a respectable age. And it's pretty much just a single J, whereas k is a mess. [02] You never know--

00:11:22[CH]

There we go, Adám. There we go. This is the content I came for.

00:11:24[AB]

Absolutely. Sure, APL is also somewhat of a mess. There are lots of different APLs. But the core functionality, the core primitives of APL, you can rely on doing pretty much the same thing, maybe except for some edge cases. But it's all the same. Everybody has this backwards compatibility, more or less, with APL 360. But in k, I don't know if there's any primitive that you can rely on doing the same thing in every k.

00:11:55[CH]

Henry wants to weigh in here.

00:11:57[HR]

Well, the evolutionary pressure that Bob referred to is out there, with advances in hardware. All of these languages were designed for single processor systems, and that's 5% of your CPU power nowadays. Multithreading is a way to get at most of the power on a CPU chip, but then there are GPUs. And the whole array method of operating on arrays, an entire array operation at a time, sounded good on a single processor, but in a multiprocessing system that wastes a great deal of memory bandwidth. And that's something that these languages will have to cope with if they're going to utilize the hardware efficiently.
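
Henry's bandwidth point can be made concrete with a small, purely illustrative Python sketch (no real interpreter works exactly like this): evaluating `(a + b) * c` one whole-array operation at a time writes a full intermediate array and then reads it back, while a fused loop touches each element once.

```python
def array_at_a_time(a, b, c):
    # Whole-array evaluation: the intermediate a+b is fully materialized,
    # costing an extra pass over memory.
    tmp = [x + y for x, y in zip(a, b)]
    return [x * y for x, y in zip(tmp, c)]

def fused(a, b, c):
    # Fused evaluation: one pass per element, no intermediate array
    # written back to memory.
    return [(x + y) * z for x, y, z in zip(a, b, c)]

a, b, c = [1, 2, 3], [4, 5, 6], [7, 8, 9]
assert array_at_a_time(a, b, c) == fused(a, b, c) == [35, 56, 81]
```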

00:12:55[BT]

And I agree with you, and my argument would be that, having a variety of languages, you may find one over the others is better equipped. The others may learn from that, and they may also make adaptations. But if you've only got one, and everybody's on the same target, you might have more people working on the project, but narrowing the window of how you're looking at it might reduce your chance of being successful.

00:13:21[CH]

Alright, we've got a new individual, so please introduce yourself before you speak.

00:13:25[Devon McCormick]

Hi. I'm Devon McCormick. I'm a longtime APL programmer, more recently mostly in J. In favor of the diversity argument, I just would like to say Kai's talk on Uiua was very eye opening, and especially being stack-based. And it really affects the whole paradigm of monadic versus dyadic functions.

00:13:48[CH]

I'm not sure if Kai wants to follow that up.

00:13:49[KS]

Yeah. Thanks for that. I think that the power of the array languages that we like has to do with, obviously, manipulating such a simple data structure and having all these good primitives for it. This is not something that's inherently tied to the way we construct the syntax of these languages. Before I made my language, most of the existing array languages used infix functions, and that's how they conceived of most of the syntactic structure. Uiua uses a stack, so it's kind of prefix, kind of postfix. But those aren't the only ways you can construct the syntax of a language. There's so much unexplored space, not only in the design space of how we do array primitives, but in the design space of how we construct how the language is to write and to read; like Adám was alluding to, of what the platonic ideal could be.
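
To make the syntactic contrast concrete, here is a toy Python evaluator, purely an illustration and not how Uiua is actually implemented (Uiua also reads right to left, with functions written before their arguments): the same computation can be written as an infix expression or as a flat stack program where functions pop their arguments and push their result.

```python
def eval_stack(program):
    # Stack evaluation: values are pushed; a function token pops its two
    # arguments off the stack and pushes its result.
    stack = []
    for tok in program:
        if callable(tok):
            b, a = stack.pop(), stack.pop()
            stack.append(tok(a, b))
        else:
            stack.append(tok)
    return stack[-1]

add = lambda x, y: x + y
mul = lambda x, y: x * y

infix = (2 + 3) * 4                        # infix syntax
stacked = eval_stack([2, 3, add, 4, mul])  # stack program: 2 3 add 4 mul
assert infix == stacked == 20
```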

00:14:54[CH]

And before I pass this mic off to Josh, who will introduce himself, none of these talks-- because they're kind of impromptu talks. Some of them have slide decks. Some of them are more discussions. None of them are recorded, unfortunately. If you just heard Devon mention this Uiua talk, you won't be able to watch it online because it wasn't recorded. However, I do know that there is a Quest for Tacit, which was recorded and given at the New York List. Is that the best one you'd point people at?

00:15:26[KS]

No, I wouldn't point people at that one. I think the existing ArrayCast episodes are probably the better ones.

00:15:30[CH]

OK, so there you go. Not a talk. I still really like the Quest for Tacit. So if you are looking for a YouTube talk, you can go watch that one. But I guess Kai's pointing back to the previous ArrayCast episodes, if you haven't already seen those. All right, over to Josh. Please introduce yourself.

00:15:45[Josh David]

Yeah, I'm Josh, and I use APL. [LAUGHTER] Well, the concept of diversity of languages is an interesting one. But I think it's also a mistake to have a diversity of languages in your tech stack. I think what a good array language strives for is notation for solving problems. And the breadth of problems that it can solve is an important factor in finding the optimal or best array language. If you start putting constraints on your language, like you can only use it if your data is within this size, then I think that's a mistake. I think a good array language is one that handles problems across the whole range of data sizes you're working with, and also types of data you're working with, and across domains, too. I think that's important for the best array language. - Would you say it's possible, though, for different languages to specialize in different areas? So that they may be, to some extent, general, but there's a specific area that they would be working in that would be sort of their domain. - I think that should be behind the scenes, something the interpreter handles. I don't think, at its core, you should worry too much about performance when you're developing array solutions. I think you should focus on beautiful notation and let the interpreter catch up to solve those problems for you.

00:17:01[AH]

With regards to the best array language, I want to take a longer term view. So if we're thinking about language, we're first thinking about: what's the language for? And I want to say that array languages should always emphasize communication between humans as their core mandate. I think that we ought to embrace that, especially Iversonian languages. So then you have to ask, well, what is good human language that enables and fosters exchange of ideas and communication of knowledge and expertise and solutions amongst people? And what you want is a language that will change. You need your language to adapt, but it needs to be highly resistant to change. It needs to be not immune to change, but it needs to change slowly, so that the continuity of information exists across generations with your language. So when we think about these languages over time, I think you have to have experiments, because they help you to understand ideas, and they help you to change things. And they will change things, but you don't want that change to hit all of the community so fast that they have to constantly rewrite everything they're doing or change the way that they communicate all the time. So the best array language is the one that will change, but change very, very slowly, and that emphasizes and optimizes for the human communication problem, and over time allows the implementers the space to make that communication performant, and to help the users of the language also write performant code that's communicative.

00:18:37[BT]

So, on being resistant to change: I have had the opportunity to edit some recordings that were done in 1982 by a CBC reporter at IP Sharp. And in one of the stories that hopefully you'll be able to hear later on in the season, Ken Iverson actually talks about the fact that IBM kind of ignoring them during development was a strength, because they couldn't get the resources to change as fast as they might have wanted to. But he said, by the time we got to the point where we could make changes, we'd had a chance to think about things, and we often didn't do what we'd initially thought. So he said, when he looked back on it, that resistance to change was a plus: it allowed us to develop in directions that in the end were better choices than we would have made if we'd been able to make the decisions when they were coming to us.

00:19:27[CH]

Because I was also thinking in the back of my head that you're advocating for slow change, but I think there's a time, if you're pre-1.0, that rapid iteration and throwing spaghetti at the wall, which is-- I mean, I don't want to speak for Kai, but Uiua definitely is experimenting with a lot of stuff and putting stuff in, taking it out. And I think that has yielded massive dividends in the form of-- I mean, I'm personally biased. If I'm going to say something for BQN, I think the combinators that BQN has are richer than any other array language. That would be my one pitch for BQN. But Uiua has gotten a bunch more combinators.[03] And I discovered during this week at Iverson College that there's actually a bunch of experimental combinators that are potentially on the way. And so I'm not sure if you want to comment on-- slow change can be good from not breaking things or moving too quickly for your users. But also, if you're pre-1.0 and you're rapid, there's a benefit to both, maybe.

00:20:25[KS]

Yeah, I think speaking to Aaron's point about the slow change and giving people time to adapt but also not being too rigid, I think the reason that you want to try so many things out is because when you want-- because you're exploring, you have to have the space to try all the things and to get that communication out. Because if you want the syntax and the way of communicating these ideas, when you're in a totally different, in this case, syntactic paradigm, it's hard to know exactly what's going to communicate in that correct way and what's necessary, what's unnecessary, what are barriers to entry, what aren't. And so that when you do hit the point where you're ready to say, I want people to use this to communicate, I want them to write things with this language, that at that point, you can reasonably be confident that you're not very quickly going to be pulling things out from under them and that you will be at a stable phase of being able to have your slow, iterative change.

00:21:35[AH]

Yeah. I think you can think about it in terms of the size of your community. You can have pockets of highly volatile experimental change that you want to encourage within small groups first. And then you can gradually allow that to percolate out into larger and larger amounts of the community. But the larger and further out it spreads, the more danger that it ossifies within the community. And so you have to do a lot of early experimentation in small groups very fast that's not going to have too big of an effect everywhere else. And then allow that to, in a measured way, propagate out to the rest of the community.

00:22:11[CH]

New individual, please introduce yourself.

00:22:13[Brandon Wilson]

Hello, my name is Brandon Wilson and I'm an APL addict. No, I'm a newly minted APL consultant.

00:22:21[BW]

I think this discussion about slow change versus fast change is conflating two important concepts. One is being able to iterate quickly on the problem that you're trying to solve. And another is the pace of change of the tools that you're using to iterate on the problem domain. If the language that you're using to program is changing under your feet as you're developing, that is a huge handicap. But being able to quickly change your code and solve the problem, that's a bonus. So I think it's important to separate out the target of what you're trying to solve and the tools that you're using to solve that problem, and think about the speed at which you want each of those to change and develop. Is everybody on board with that?

00:23:10[BT]

I'm just thinking that the project you're working on when you're developing a language is the language. And so it's almost like you've got meta tools that you're working to develop your language. And that's what's changing quickly, at least at a certain stage. But you're right. We're sort of mixing two things that kind of crunch together at that point. One is developing a language, and then the other is using a language to solve other problems.

00:23:35[AH]

When you talk about your meta tools about developing languages, that's an interesting question. Because if we're thinking about developing language, and then we're thinking about the tools to develop a language, what might those tools be? And I think it's very hard to develop those tools, because they don't really exist. Because as a language designer, you're designing the thing. And that has nothing to do with really the code. It has to do with the syntax, the semantics behind the language that you're building, whether you can or cannot implement it. And when you reach that point, I think you're starting to get into the point of, can your community build principles that would drive or guide this exploration space? And if you're at the principal layer, now your community has to try to agree, what are the principles of our language that we want to preserve as we do the experimentation? And that's an interesting challenge.

00:24:26[BW]

I'd push back a bit on the idea that you can't iterate on your meta tools, that they're not as tangible. I don't think they're that ethereal in any sense. I mean, if you're using Rust to implement Uiua, for example, Rust defines the kind of cognitive tools that are very available and easy to reach for, right? If you're using Rust, you're not going to be immediately, at your base level, implementing multidimensional arrays in Rust to implement Uiua, right? So it's that kind of difference: the semantics of the concrete tools that you're using influence the way that you think about the problem. And then if those are quickly changing under your feet, it's hard to get a mental understanding of what you're even doing, right? But as Conor said, if you want to throw spaghetti at the wall, you don't want your spaghetti to be changing into other random things, dogs and rain or something. You're just doing other things. You're not solving the problem that you originally set out to do.

00:25:30[BT]

Well, and actually, it sort of bridges on something that happened yesterday, where we started asking questions about architecture. And we went for two and a half hours, like a spontaneous conversation. And I was sitting here thinking, we really would love to be recording this. Because it sounded exactly the same kind of conversation as you have in an ArrayCast episode. But it seems to me that that's the same idea. As your meta tools reflect an architecture, that if they're put together properly, you can start building into that framework, and you're not getting things changing all over the place.

00:25:58[CH]

All right, so circling back to my initial question, which we modified to what sets apart your language from the other array languages: we had Brian, who mentioned first class functions, higher order functions, in BQN, which is, I think it's fair to say, the most functional of the array languages. I'm not sure if there's any pushback. I mean, Kamila is, I think, out of the room right now. She might argue that KamilaLisp is a contender for that. We had Adám talking about Dyalog APL. I don't think you actually-- you mentioned a couple primitives, but I'm not sure, if you had to choose one thing, what would set Dyalog APL apart from the other array languages. Do you have a response? Or does anybody on behalf of Dyalog APL have the one top thing which they think differentiates it from the other array languages? Devon, the J programmer, speaking on behalf of Dyalog-- or maybe he's got something for J. We're about to find out.

00:26:57[DM]

Well, I did start out in APL for decades, so. - There we go. He's got decades of experience, folks, longer than I've been alive. - It is the granddaddy of them all. So it has longevity and experience going for it.

00:27:05[CH]

OK. Other thoughts? We're back to Brian.

00:27:06[BE]

Well, the biggest thing with Dyalog is that it has a metric ton of features, where Dyalog has been improving it for ages. So it has quad WC, and it has extreme amounts of quad functions, and also it has control structures. But of course, its longevity is also a minus, in that it's stuck with a lot of features from the olden days, where they made mistakes and they couldn't fix them, because-- what's it called? They can't remove them because, yeah. - Backwards compatibility. - Exactly, backwards compatibility. But they still have a ton of features. And it's really good software. And BQN, for example, doesn't have all of that, that insane amount of-- that huge platform, right?

00:28:02[CH]

Yeah, I'll add to that. One of the things that I always say is that Dyalog-- like, when I've had, what, four Tacit Talk episodes, my other podcast plug, link in the show notes. But when Adám came on as my fourth guest, I basically said, we're not going to talk much about the language, because there's two different YouTube channels with hundreds of videos where you can go watch those. Whereas you can't really say that about-- I mean, technically, q has some KXCon videos from last year that have been put on YouTube. And I know that there are some J videos. Like, you have your YouTube channel, and there is an individual that has a pretty popular one-- that he gave at a conference. I can't remember. Long hair.

00:28:38[BT]

Oh, Tracy Harms.

00:28:45[CH]

Tracy Harms, yes, gave a J talk that if you search for J language on YouTube, it's one of the top ones. So there are resources out there. But I think APL, or Dyalog APL to be specific, because there's many APLs out there, has the most resources. Even though the Uiua docs and the BQN docs and even the q docs, built-in functions page, fantastic. All fantastic documentation, but not as many videos and resources and books, if you will. I think there's three different APL books that have come out just in the last two or three years. And there's no BQN book yet. I'm sure there will be.

00:29:15[BT]

I've got to put a plug in for Stephen's Q201.

00:29:19[CH]

Oh, yes, yes. That's true. We've got some courses. So there is content out there. But if we had to rank by language, I think Dyalog APL definitely has the top of the charts in terms of the most resources. So we're going to go back to Devon, and then we'll come over to Brandon in a sec.

00:29:34[DM]

I'd say Dyalog by far is the best in terms of interfacing with the outside world. They've made a strong effort to sort of play well with others that the other languages probably aren't well developed enough to have to worry about.

00:29:49[CH]

This is true. - We'll come back to Brian in a sec. We're going to go in order of hands raised.

00:29:52[BW]

I'm going to bounce off of what Devon said. And I think that cultural conservatism of Dyalog is one thing that gives me confidence in using it as a tool to solve downstream problems. So in terms of overall unqualified best language, I can't really answer that for you, Conor. But I don't have much of an opinion even.

00:30:17[CH]

I'm sure you don't. I'm sure you don't.

00:30:27[BW]

What are you talking about? But in terms of, if I want to think about the mathematically beautiful possibilities of an array language, I might look at BQN or something. Or if I'm looking for rapid exploration of what different new conceptualizations of array languages can look like, Uiua, I think, is an excellent example of that. But Dyalog APL, I chose it because I really fell in love with the cultural conservatism around how the language evolves, and a strong focus on making the language-- or giving the users confidence to be able to use the language in the domains that they're using it.

00:31:04[BE]

So, about interfacing: I think it's impressive how good the FFI system in BQN is, and I'm very happy with it. Meanwhile, I guess Dyalog is improving theirs, especially now. But so far it hasn't been as good at interfacing as BQN. But that might just be because BQN is so rapidly changing with its FFI system; every time there's a complaint, it gets fixed. I mean, Dzaima and Marshall fix BQN, or improve it, very fast. And I'm very happy with that. So when it comes to interfacing, rapidly evolving stuff can improve interfacing way faster than something that's-- what's it called? That's way more static, because it's so big, right?

00:31:54[HR]

Ken Iverson [04] was justly proud of APL. But he realized that he could do it better, and that's what J is: APL with many of the problems fixed.

00:32:09[CH]

Hold on. Hold on. - Well, I was going to say, I'm not sure if the listener's been noticing, but I've been going around. We started with BQN, then we went to Dyalog APL. So J and Uiua and q are in the docket. So maybe J is the best, because it's Ken Iverson's second attempt at APL, APL 2.0. Over to Josh.

00:32:27[JD]

Yeah, I mean, one of the good things about Dyalog APL is that we take good ideas from J [LAUGHTER], and hire the J developers to work on our interpreter. But that's kind of an unfair advantage. Like what we've been talking about with Dyalog APL is the four decades of history, so corporations can rely on us to build software. And, you know, KX has a similar kind of reputation. Commercial success is an interesting metric to look at for how successful an array language is. But yeah, some of the newer languages are maybe more beautiful, or fix some of the bugs that we're stuck with. We try our best to incorporate those into the language at a conservative pace. So we are at an unfair advantage having this history. But we do need the other languages to further the development of APL, and we try to support that at Dyalog. But yeah, it'd be interesting to see if any k or q people have any thoughts.

00:33:30[AH]

I'm going to speak for the k people a little bit, because they don't really have to say much; their bank accounts are flush with cash. The rest of us can only wish we had the portfolios that anybody associated with k seems to have.

00:33:46[CH]

And as I walk over to Devon, I will quote Bjarne Stroustrup, the creator of C++. He famously has said a couple of times that there are two types of languages, those that people complain about and those that people don't use.

00:33:56[DM]

Don't use! Well, if we're going to measure things in terms of commercial success, Excel is the best array language.

00:34:05[CH]

All right, we've got a new contender entering the boxing ring, Excel, with 200 million users. All right, we've got a new individual. Please introduce yourself.

00:34:14[Sasha Lopoukhine]

Hi, I'm Sasha. I research compiling very high-level descriptions of programs to weird hardware. I would like to propose a different measure of success, and that is use case. One of the things that I do quite often is build up shell pipelines, and I use some really terse languages: awk and sed to manipulate text, and jq to manipulate JSON. One of the things I've been wondering about is that array languages let you express really powerful operations on tabular data really tersely. I'm wondering if any of the array languages tend to be used for these shell one-liner scripts, and which ones might be best for that? Because that seems like it would be quite a useful use case for them.

00:35:09[CH]

I can personally say that BQN has become a not-complete replacement for bash (because with certain things, like grep, I don't think you can compete). But for certain things (I don't know what a good example is) ... the length of each line in a file, or something like that, BQN is fantastic, because you can drop into it. They've got a function for getting files (it's a system function), and then you can just go from there. It's usually like three or four characters, which is always shorter than bash. I'm not sure if anyone else uses an array language as their kind of shell driver.
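As a rough sketch of the kind of one-liner Conor is describing (the file name here is hypothetical; `•FLines` is BQN's system function for reading a file as a list of lines, and `≠¨` takes the length of each):

```
≠¨ •FLines "data.txt"   # length of each line in the file
```

That's the whole program: one system function to get the lines, one expression to process them, which is the "three or four characters" appeal.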

00:35:44[JD]

I just want to say that Justin Dowdy has a really nice open-source "APL in the shell" repository on GitHub to do exactly that with APL in the shell.

00:35:52[CH]

Awesome, we will definitely link to that in the show notes. Back over to Devon.

00:35:57[DM]

I would say I use J as a shell language. In the case of lengths of lines of a file, I have a function that reads in the file and turns it into a vector of vectors. And also, while I have the mic, I might promote my own New York City J users group. We meet the second Tuesday of the month. We are going to finish our 20th year in operation this year. And we're on Meetup under "J dynamic functional programming".

00:36:21[CH]

All right, the plugging has started, folks! If you've got something else you'd like to plug, a user group ... [sentence left incomplete].

00:36:26[BT]

I've got a plug on a plug. And that's that Devon's been great about recording the minutes and the topics of the J user group going back 20 years. That's all on the J wiki!

00:36:35[CH]

Yeah, that's true. The J wiki is a great resource, recently enhanced by ... what is it?

00:36:40[BT]

[quietly] we're getting there [laughs]

00:36:41[CH]

We're getting there? I thought ... [sentence left incomplete]. Has it not been released? I thought it was released.

00:36:46[BT]

If you go looking for it, you can see sort of an alternate structure that I've put in place. But it's not the main one; it's not the one people are usually introduced to. We haven't made that switch yet. Yeah.

00:36:55[CH]

Well, we will announce that on a future episode when it happens.

00:36:58[BT]

Oh, yeah. Oh, you got that.

00:37:00[AH]

I think when we're talking about what sets array languages apart, one element that hasn't been discussed yet is Co-dfns' particular niche. There's lots of work on acceleration and performance, but I think no other array language implementation has been so explicitly designed to be read as its own artifact, as much as to be an implementation of the array language. One of the explicit goals of Co-dfns is to demonstrate array thinking in the implementation of array languages; it is designed to be read by people and to be consumed in that fashion.

00:37:43[CH]

You heard it here first, folks. We will leave a link to one of the source files of Co-dfns [laughter in the room]. Aaron Hsu saying that Co-dfns was designed to be read, which, I guess, you know? For folks that haven't seen the source code, maybe give a little verbal description, if you can, of what people can expect. And you've got a couple of different livestreams and videos if folks want to go see for themselves. Yeah.

00:38:12[AH]

Sure, it's just (one, two, three, four, five) five functions, all executed in a single pipeline. Each of those functions is a linear dataflow of APL expressions, commented throughout about what is being achieved as you transform an AST from your input source to your compiled output.

00:38:29[CH]

So this must be the most recent iteration on the compiler. This is not what was on the t-shirts that I see at conferences.

00:38:37[AH]

It's on the t-shirts.

00:38:38[CH]

It is on the t-shirts? I don't recall seeing comments on the t-shirts.

00:38:40[AH]

So the comments on the t-shirts are just the two characters on the right column [loud guffaws and plenty of laughter all round].

00:38:47[CH]

Can you repeat the number of characters (not words)? Characters.

00:38:51[AH]

Yes, the final two columns on that t-shirt are the comments. However [chuckling], in the expanded version, about every four to five lines there's a blank line and a comment that's a full line explaining what the next four or five lines do, more or less. And I think Brandon was the one who pointed me to it ([apparently to Brandon] you said it was about a 15% comment-to-code ratio?), which, according to a paper that scraped GitHub's code, is in line with the median. Yeah. So Co-dfns matches the median comment-to-code ratio of standard GitHub open-source projects.

00:39:30[CH]

I see. So if you do see a Co-dfns t-shirt in the wild, it's propagating fake news. The actual source does have full comments that have not been abbreviated. I'm not sure how two characters is an abbreviation [laughs in the background]. But, I mean, maybe it's an acronym. But yes, they do have comments. So, link in the show notes. When you gave your two-hour livestream (I think all of yours have popped off on Hacker News, but that one was like several years ago), how many iterations ago was that? And have you given, not necessarily a livestream, but a talk? Because I know you gave two talks at LambdaConf [05] a couple of months ago. Have you done the same thing that you did however many years ago? So how many years ago was it? And does a new one exist for this new version?

00:40:14[AH]

Yeah. So I have been meaning to do a new one. I have been waiting a little bit till everything sort of stabilizes, so far as such things go. But it is quite removed from what is in the architecture now. The same passes (the tokenizer, parser, transformer, code generator) all basically [keep] the same structure at that level. But what has changed is that the tree transformations now do not operate in any ... [sentence left incomplete]. I think when I gave that architecture, the tree transformations might have been expressed as purely tacit functions. So there was a version of the compiler in which all of the tree transformations were just a giant string of tacit transformer functions. And that was changed over to what is on the thesis edition of the t-shirts, which was released in 2019. And that makes the code much more readable on that side. In addition to that, we were using a PEG parser to do the parsing, and that went through three or four iterations since that architecture livestream. It was originally a manual PEG parser. Then we had a DSL-metaprogrammed PEG parser. And then we got rid of the PEG parser in favor of this linear dataflow (data-parallel) parsing structure that we do now. And we've given a couple of talks on that. And the code generator previously was a table-dispatch code generator. And now it is, again, just linear dataflow all the way down, using the same techniques. So we have unified the entire structure in the architecture to be top-down, linear on all the passes, where previously that was not the case.

00:41:54[CH]

Awesome. Well, if you ever do that livestream, I'm sure we will mention it in the show notes. Or we'll just have you back. Here's a little inside baseball: I had totally forgotten that we had Aaron [laughter for mistakenly calling him Andrew ... sentence left incomplete]. I had totally forgotten that we had Aaron on, because I think you were guest number five or six. But I was thinking, maybe we're going to save Aaron for episode 100. And then, once again at Iverson College, I was talking to Bob, and Bob was like, what are you talking about? We had Aaron on already. Anyway, so BQN, Dyalog APL, J, and Co-dfns have all been spoken for ... what kind of sets them apart. We've yet to hear from anyone on Uiua or q. Alex, start thinking; what's the main language thing? But we already got a mic for ... poof, look at that ... the mic shows up out of nowhere.

00:42:47[KS]

I think the things that set Uiua apart kind of reveal my difference in motivations compared to a lot of the things that other array languages are developed for. Solving business problems, while I do want it to be possible in Uiua, is not really a thing I care about helping with [chuckles]. When I first started listening to the ArrayCast and learning about the array languages, the thing that really strikes you when you start learning (just the aesthetics of the code itself, but also the way that you construct algorithms to solve problems with arrays) is the beauty of it. It's about reducing a problem to its simplest, most elegant form. And there's something about that that I attempt to capture in Uiua, which is why you can't write local variables in Uiua: you're forced to do everything tacitly. Because tacit code is just ... [sentence left incomplete]; when you get the right little tacit expression, it feels ... it's so nice. And so I force you to do that. So you have to figure out: "how can we reduce this problem so that data flows in a nice way" and conceive of the problem in the simplest possible terms. So yeah, forcing you to do tacit; that's what sets Uiua apart.

00:44:13[CH]

And we do love tacit. All right. Have you got an answer on what sets q apart from the other array languages? All right, over to Alex. And congratulations ... recently married. We can cut that out if you don't want it on the pod [some laughter in the background].

00:44:30[Alex Unterrainer]

No, that's fine; it can be out there. So my wife, she's going to appreciate that. I'll admit that she gave me permission to come here.

00:44:40[CH]

Will she listen to this episode?

00:44:42[AU]

Maybe. I will get her to listen to it [quiet laughter in the background].

00:44:45[ST]

Just to put that in context, the permission you got was because you came here days after your wedding, right?

00:44:52[AU]

Yes, yes. [lots of laughter all around]

00:44:55[AU]

Long, long, long week.

00:44:56[CH]

Is this the honeymoon?

00:44:57[AU]

No, no. I wouldn't do that without her [more hearty laughter]. It would be a short marriage, and a short life for me. Anyway, what I think sets KDB apart from all the other array languages is that it solves a lot of business problems. I think it's the one that's used the most in terms of clients. And you can build a full-blown application just with KDB and q: from your real-time data stream, to the tickerplant that processes the data and pushes it on to real-time subscribers. You can have your historical database where you store decades of data on disk. You have your real-time database that keeps the intraday data in memory. And with one language, you can solve all those problems. Other tech stacks, even non-array languages, don't offer you that flexibility to do it all in one programming language or database. And I think that's amazing. I cannot fully comment on the other array programming languages, because I haven't used anything apart from q, and k a little. But yeah, I think KDB has the potential to solve so many problems. It's mainly used in the financial industry, because that's where Arthur started and where it was implemented. But anything big data, pretty much, can be dealt with in KDB.

00:46:45[ST]

Alex, do you use q for basically shell scripting work?

00:46:50[AU]

Yeah, you can use it for that as well. Like parsing CSV files or doing little scripted analyses. It is mainly used as a data pipeline in investment banks or hedge funds, I have to admit. But that's mainly because you cannot find a lot of smart KDB developers, or people who can then maintain the actual business logic. Like, you could do so much more with KDB than what people are actually doing with it. They just see it as a data pipeline, because they are afraid that the developers who are working on the business logic will move on, or they struggle to find more developers who can work on the business logic. But I think the potential of KDB is much, much bigger than how it's currently used and exploited. And it's a little bit unfortunate and sad. That's why I think we should all try to expand the user community of KDB, share the knowledge we all acquired over the years, and make it a more widely used language. Because with the growth of data, like the information we have nowadays, it becomes more relevant to have a programming language or a database that can handle this amount of data and give immediate feedback. Sure, other databases can do analysis, but they run for hours. I want something where I get feedback immediately. And KDB, in my opinion, is the best thing out there for that.

00:48:34[AB]

Speaking of announcements about just being married and getting married and so on, I believe you have an announcement to make now?

00:48:41[CH]

Oh, yeah. Congratulations to me, too, because I also got engaged [laughter in the background]; not married, but it's the prerequisite for marriage last time I checked.

00:48:51[HR]

I myself am not a big fan of tacit programming. But if anybody is, they really ought to look at Uiua. It is beautifully presented and solves many of the objections that I have to tacit programming. It's great work.

00:49:09[CH]

What are the objections that you have that were solved?

00:49:13[HR]

Well, particularly the elimination of the monadic/dyadic distinction; the ability to have any number of arguments. But also, just the way it's presented on a stack, and you can easily see a large tacit program in one view. It just makes it easier to work with. I mean, I'm assuming this; I haven't used it. But it's much less forbidding than a long string of tacit expressions that are all limited to monads and dyads.

00:49:49[ST]

Now, well, I've got some confessions to make. I'm sitting here wearing a ball cap with a tacit expression in APL, which I adapted from Aaron, with his blessing. But I've been wondering for a while about tacit. When I first saw tacit, I immediately fell in love with the simplicity of it: "oh, look, my argument tokens have gone; I don't have little parens around them; surely, this is my way forward". But as I've listened to the conversation on it, I was seriously wondering about bringing a ball cap which simply said "shut up about tacit" [lots of laughter].

00:50:33[BT]

And we end the ArrayCast! [more laughter]

00:50:38[ST]

Why might one do that? Well, it seems like a classic case of a solution looking for a problem to solve. And I've been more and more impressed by what seems to be the paucity of use cases for tacit. But there are precedents. There are precedents for this. Tensor mathematics was invented long before there was a use case for it, and then it turned out to be just the thing. You were asking about what's the best array language. I've been thinking about a different question, which is: as a result of this meeting, what am I currently most interested in? And currently, it's Uiua, for two reasons, both aesthetic. And I'm infamous, I think, here for being driven primarily by the poetry and the expressiveness. I don't really care [laughs] about the computational side. I'm not that much about solving business problems. But I do love beautiful, expressive code. And I claim that (whether there may or may not be elephants in the room) there's space in the room for a hamster who's interested in expressiveness and the poetry. So what I've seen in Uiua is that practically everything is done by tacit. And it seems, from a superficial view, that it's your adoption of the stack framework (well, adding that in) that brings alive something Arthur Whitney told me about how Ken Iverson thought of even dyadic functions like plus and multiply. He'd see "3 times 2 plus x" as a "2 plus" on x and then a "3 times". That is to say, a sequence of unary functions. So I look at what's going on in Uiua and I'm thinking: "oh, we're getting that right-to-left flow, and somehow that seems to bring tacit alive". That's as far as I've got, but that's one of my two aesthetic reasons for wanting to spend more time looking at Uiua. The other is: at last, a program in color!

00:53:00[CH]

Do you want to say anything to that?

00:53:03[KS]

Yeah, I can say something real quick. I do see that desire to have things kind of flow from one direction to another. That's why in Uiua, when you write "minus 1 2", you're not subtracting 2 from 1; you're subtracting 1 from 2. And that's so you can read your "minus 1" as a single unit: I'm doing "2 minus 1, divided by 5"; things like that. So while at first glance it may not seem totally intuitive (it may seem backwards), when you try to read things in a coherent, scanning way, you can see functions as units. So that desire is something I do try to preserve.
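A minimal sketch of what Kai is describing, assuming current Uiua semantics where `- a b` computes `b - a` and `÷ a b` computes `b ÷ a`, so each "function plus left number" reads as one unit:

```
- 1 2      # subtract 1 from 2: gives 1
÷ 5 - 1 2  # (2 minus 1) divided by 5: gives 0.2
```

Reading right to left, `- 1` is "subtract one" and `÷ 5` is "divide by five", each applied as a unary step.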

00:53:50[CH]

And I didn't even bring up tacit. This wasn't even me. This was Henry. So this is now officially Tacit 6.1. And for the rest of this podcast, we're probably going to stay on this topic. And it's unfortunate Morten's not here (the CTO of Dyalog) [06], because on the first day (if this is him walking through the door ... no, it's Kamila), I think it was the first day that I was here, he said something that I wish he was here to comment on. He said that tacit has the power to destroy APL. He elaborated on that and said that he is worried that the more we focus on tacit programming, the more we risk getting a reputation for being an extremely complicated academic [language]. I don't want to throw shade at Haskell, but Haskell is a beautiful language that has a reputation for ... [sentence left incomplete]. Someone says Haskell; someone thinks the word monad, or category theory. And I don't think that's necessarily been a good thing for Haskell in terms of the success it's experienced. But yes, we're going to go to Devon, and I think we're going to go to Aaron after that, and then maybe Adám after.

00:55:02[DM]

As someone who was initially skeptical of tacit, I've come around. I find myself writing an expression and saying: "gee, I'm using the same one or two names three times in this". And I can get rid of them with tacit and just put them at the end. It's nice to sort of separate the code from the data in that way.

00:55:21[CH]

I couldn't have said that better myself.

00:55:24[AH]

Yeah, I think there are legitimate complaints about the danger of tacit introducing unnecessary complexity into the readability of your code. But arguably, all of array programming has that problem, because it's an expressive, terse language, which requires a certain cultivated skill. Not a very hard skill, but you have to cultivate the skill and taste and aesthetic and style of what a good, readable expression is, over time. And people were writing things about the elements of APL style; I think Perlis maybe did something along those lines in the '70s or '80s. So this idea of cultivating a good aesthetic sense of what makes a good expression has to be carried forward into tacit. And I would claim one of the main principles behind good tacit is understanding why you might want something that's tacit. And you want to eliminate ... [sentence left incomplete]. It's a sub-skill of subordinating detail. If you have a point in your code that is so critical to your domain that it's important to know about that point when you're talking about the expressions, that point needs to appear somewhere in your expressions. But very often, there are points in your code that don't have that same semantic meaning. And if they don't, tacit allows you to elide unnecessary points from the flow of the program. So as an example, I was talking with one of our members here who was trying to understand the removal of nodes in an AST, written in the style of the tree processing that I used in the compiler. And I was talking while I was walking, explaining the nodes. And I was naming certain things, and then I realized, well, the naming of the things was obscuring his ability to understand what it was. And so then I just read out the tacit, which is like: "right tack, minus, one, plus, interval index".
And when he only saw those, the structure became immediately clear, because of the flow and understanding what was around it, whereas the less tacit version became less clear and harder to communicate while we were walking on the street. Now, that doesn't always happen, because sometimes you really want that point sitting there. But I think learning to judge the quality of a tacit expression based on what information it's communicating and what waste it has in the expression is the right way to know when you've gone too far into tacit land, and when tacit is appropriate to improve readability.

00:57:56[AB]

Well, I think I can clarify a bit what Morten expressed: that we have the risk of alienating existing APL programmers. It's come up a couple of times now that APL has been around for a while; longer than most of us. And if the young, cool kids are all writing complicated tacit things, and the old-school APL programmers have a hard time learning this new thing (not necessarily because they can't learn it, but maybe nobody has sat down with them and taught it to them, or why should they take the effort to learn this newfangled thing), then I can see them concluding: "yeah, if this is what APL has to be, if I hire somebody, an intern, whatever, and they're going to write code that I can't read, then it's not sustainable; then APL is not for me anymore; I have to give up".

00:58:51[AH]

There is a danger in a new fancy language coming onto the scene in the broader community of it getting slurped up by people who want to use it as a signal of superiority against the rest of the programming community. And we have seen that happen in the past. And I think it is worth worrying about that to some degree.

00:59:12[CH]

Are you talking about Haskell? [subdued laughter in the background]

00:59:18[AH]

Well, Haskell might be one that you could point to. I think ThePrimeagen on YouTube has this joke that Haskell is the best programming language for writing white papers; that's the main thing that you produce when you write Haskell code. Obviously tongue-in-cheek. But I think that's been happening since even before that. I think the Algol 60 space had a little bit of that way back in the day; it was this ivory-tower language, and things like that. And some of the other languages in the '80s began to get that appeal, or some of the logic programming languages. You had these communities that ended up pushing away the broader community, because they came across as a form of elitism.

01:00:04[CH]

If you look at talks at Haskell conferences, or functional conferences, category theory [07] is overrepresented, which gives Haskell the reputation that you need to go learn it. There's a ton of people that say: "I don't want to have to go learn category theory". You don't need to know category theory to do serious Haskell programming. If you just want to write a simple program and consume Haskell libraries, sure, you'll see a Foldable in a type expression. But you don't actually need to know what that means in detail, other than that it's a thing that represents a list, or something that can be folded over. As much as I love to talk about tacit, I think Morten isn't wrong that we shouldn't hold a conference where every single talk is Tacit, Tacit, Tacit, Tacit, Tacit. And there's not a single person ... [sentence left incomplete]. I think there's a C++ conference that has three or four different tracks, but one of them is called the Back to Basics track. And it happens every year, even though they're covering some of the same topics. It covers the fundamental things that most people want to do and are going to use C++ for. And then there are three other tracks focusing on: Reflection! Coming in C++26, a standard that's not going to be out for another two or three years. Although I think it actually is implemented by a couple of compilers, all three major compilers probably aren't going to support it until like '27 or '28. And it's great to have those talks, but you don't want your whole conference to be about the cutting edge and the future, because then everyone's going to get a different idea of what is possible. And a lot of companies don't let people upgrade until like 10 years later anyway. So.

01:01:39[AH]

I'm running out of tokens here [laughs around the room]. But I was going to say, I think Rust is a good example of the opposite of that. Rust has a lot of complicated ideas, and yet they did a lot of really good work to make those complicated ideas, in use, as accessible as possible to as many people as they could. And I think that Rust has done very well because of that.

01:01:58[BT]

And just as a comment on Aaron's tokens: well, Aaron's been involved in many conversations, as you can imagine. At one point, we actually joked about giving him tokens so he had to limit how much he spoke. But he's a very controlled individual, and so, as a result, I don't think he's been dominating too much.

01:02:17[CH]

Yeah, we'll get you your own special sub-series on the podcast at some point. It'll just be him with the mic, folks. We won't actually be interviewing him.

01:02:25[BT]

Well, that was actually another joke that Conor made to me [laughs]: what we should do is give Aaron a mic and a topic and say: "come back in an hour with the audio for us" [everyone laughs].

01:02:36[CH]

And I said that out of love and respect, because I don't think many people could do that. There are a few podcasters out there who just literally talk to themselves once a week for an hour, and I think that's a skill that most people do not have. But of the people that I could think of, I think Aaron could not only do it, but would actually be educational and entertaining to listen to [laughs].

01:02:56[BT]

I wouldn't have to edit very much. It would be ready to go. Honestly, it would be ready to go. It would be great [everyone laughs].

01:03:04[CH]

All right, any other thoughts on tacit or we've got a hand? It might not be about tacit ... it might be. We'll find out.

01:03:09[BW]

Yeah, I'll actually derail from tacit and circle back to your original question a little bit, about what characteristics make each of these array languages stand out. One of the things I'm particularly salty about is trying to mimic the behavior of bash pipelines that are doing sed and awk in pure array languages. I don't want to use regular expressions (in quad S and quad R) in my APL, because it's this hard boundary between languages, and you have to explicitly pass information through. You have to think about it, and I just think it messes up the flow of the language. So I kind of have a question for the k guys: do you have a better way of achieving the same results as sed and bash pipelines in k? A better way of conceiving of that problem and getting out the same kind of results?

01:04:07[CH]

We're going back to our resident ... [sentence left incomplete] [laughter in the room].

01:04:08[BT]

He said: no, no!

01:04:09[CH]

I thought we were going back to ... someone.

01:04:13[BT]

The k guys are remaining silent in the room.

01:04:15[CH]

I mean, Aaron spoke for k earlier. We could get him to speak for k again. Well, we've got a hand.

01:04:20[Rory Kemp]

Oh, hi. I'm Rory. I've done a variety of array programming, mostly in APL, but now I'm going to speak on behalf of k a little bit. I'm not a professional k programmer or anything like that. Before Brandon spoke there, I was going to propose another metric for thinking about what the best array language would be. And I want to reference the Perlis quote, that "a language that doesn't affect the way you think about programming is not worth knowing". So maybe the best array language is the one that influences you the most. And I actually think APL maybe wins because it's the original and, in a sense, the best; but I actually think that k, because it's so simplified, really forces you to get to the essence of what your problem really is. And in response to your question about sed and awk and things: I think, unfortunately, a lot of k's have arbitrary and annoying restrictions. Like, q has a subset of regex which doesn't do full regex, and I don't know why not. Maybe for simple queries it's more efficient, probably. But I do actually use k as a REPL quite often. I think it's possibly the nicest array language for parsing, which might be controversial. It has built-in splits and joins, which I know Conor doesn't like implementing, so maybe that's a good thing. It has easy access to shell commands; I mean, most of them do. But I think it's a very ergonomic experience, and certainly one that really (a lot more than APL now, I think) forces you to distill your array logic down into the absolute bare bones and really refine it. So maybe k wins by the Perlis metric.

01:06:47[CH]

I will say that, if you're not a k professional (I don't know what that makes me, but I'm below that, for sure), when I've code-golfed in k, k has a bunch of stuff that none of the other array languages have. Specifically, I'm thinking about the different kinds of fixed-point-like scans, where you can do something until it meets some criterion, and it'll print out all the iterations. I don't think any of the other array languages can do that; and definitely not in fewer characters than k can do it. So there's definitely probably some truth to what was just said. Comments on the Perlis quote? (which is on my combinatorylogic.com website at the bottom, because it's a great quote)
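As a rough sketch of the converge-scan Conor means (in a k4-style dialect, a monadic function with the scan adverb `\` is applied repeatedly until the result stops changing, collecting every intermediate value; exact spelling varies between k dialects):

```
{_x%2}\10    / halve (with floor) until fixed point: 10 5 2 1 0
```

One expression both iterates to a fixed point and keeps the whole trajectory, which is the code-golf advantage being described.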

01:07:28[BE]

I think what changes the array thinking the most isn't just tacit, and it isn't directly the symbols (well, that could be debated). But I think there's the truth that k really does just have that changing of thinking. Meanwhile, I think BQN and APL have a lot more than that. Whether those influence thinking, I don't know. But when it comes to the array language thinking, I think k just has that. Well, OK. It has that as its core, rather. Meanwhile, I think BQN and APL have it as a side thing, where it also has tacit, and it also has other features, I guess. I don't remember the details of that. Yeah.

01:08:15[CH]

Other thoughts? Shout out to Kap, not represented in this room. But if you love the combinators and you miss the APL glyphs, Kap might be the language for you.

01:08:27 [BT]

And sorry, Elias, we're not going to do a four-hour podcast right now. Not going to happen. Still waiting for it. You wait on.

01:08:34 [CH]

And I will say, we were talking at one point about the legacy issues of Dyalog APL. That being said, one of the biggest hiccups you run into is the parsing ambiguity problem of compress: that you need to use the-- what is it-- jot in order to get it to compose. But I've heard from-- we'll call him out-- the head of language design. We can't, I don't think, say for sure what version it'll get into. But in a future version, we're getting the-- I think it's being called the behind operator, which looks the same as it does in Kap. And it is going to be adding not just two, but three combinators. And it's going to fix that ambiguity problem. Well, now, Adám's raising his eyebrows.

01:09:18 [AB]

I'm just wondering how you're counting three.

01:09:21 [CH]

Oh, three, because it's the sigma delta. And then that combined with beside is going to get you the D2 combination.

01:09:25 [AB]

If you're now counting combinations of combinators as a combinator, then as soon as you've got a couple of them, then you've got infinitely many, right?

01:09:38 [CH]

This is a great point. However, it only counts for D2, because that's the only one I care about. And all the other ones should be spellable, in my opinion, via a train or a single combinator. Technically, Kap-- I should mention that in Kap, you can only spell the B1 with the 2-train. They don't actually have an explicit operator for that. But they can still spell it. But I don't think for the other ones you should need to do a combination of them. Is that fair? You know, but it's my metric, so--
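
For listeners following along, the combinators under discussion can be sketched in Python; `behind`, `beside`, and `d2` are hypothetical names for illustration, not any language's actual primitives:

```python
from operator import add

def behind(f, g):
    """x (f behind g) y  ->  g(f(x), y): preprocess the left argument."""
    return lambda x, y: g(f(x), y)

def beside(g, h):
    """x (g beside h) y  ->  g(x, h(y)): preprocess the right argument."""
    return lambda x, y: g(x, h(y))

def d2(f, g, h):
    """Combining behind with beside yields the D2 combinator: g(f(x), h(y))."""
    return behind(f, beside(g, h))

double = lambda n: 2 * n
square = lambda n: n * n
print(d2(double, add, square)(3, 4))  # add(double(3), square(4)) = 6 + 16 = 22
```

This is the sense in which adding "behind" to a language that already has "beside" gives D2 for free: each combinator preprocesses one argument, and composing them preprocesses both.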

01:10:10 [AB]

And you're the host. You're the one that decides what's fair and not. That's true. I'd like to go back to what Henry was saying about Iverson making J to try to make things right after doing APL. I never heard from anybody who had spoken with Iverson about it. But Unicode [08] came out immediately after J. And I speculate that if the development process for J had taken slightly longer, then he might have canceled those plans for making it ASCII-based. Maybe not. Henry is shaking his head. I mean, yes, I can see the benefit of being able to type it with any font and any keyboard. But I mean, for those languages that use real symbols for things, it's just more beautiful to look at, I would say. But I also want to contest one thing that Henry is implying here, that J came after APL. That's not true. All these array languages that are living in parallel with each other, they have the potential-- and certainly some of them do-- to influence each other. So if you try to make a timeline of array languages, it's not really true that J came after APL. Well, it came after APL\360 and APL+ and so on. But for example, Dyalog APL has taken features from J and sometimes improved on them, sometimes improving on the J idea to such an extent that J then took that idea back from Dyalog APL into J. There's something about key here. Henry is saying, yes, he knows. And so, too, Dyalog APL inspired BQN. And then there are facets of BQN that we want to borrow back into Dyalog APL, like this additional combinator and other operators.

01:12:19 [CH]

And nothing. And nothing. And-- Hopefully. I mean-- Or cap in J, as they call it. I don't like saying cap, though, because that's cap with a c. And you can't tell, if you're just listening, if I mean cap with a c from J or Kap with a K as a language.

01:12:34 [BT]

You'll have to shout one of them. It's like a difference in volume. I don't want to suggest which one you would shout.

01:12:40 [CH]

We should just rename the cap in J to-- I don't like the name nothing, though, either, because it just is like, who's on first, what's on second, or whatever that thing is.

01:12:47 [AB]

Well, it's just a train filler. It's like an empty seat.

01:12:54 [CH]

All right, I think we're going to go to Aaron. And I think maybe Ray had his hand up.

01:12:57 [AH]

We'll go to him after. I do-- actually, I want to slightly rebut Adám's speculation about the Unicode stuff. I think Unicode is a prerequisite for making some of the trade-offs that J chose unnecessary and unlikely to be chosen again. But I don't think Unicode is enough. I think if-- and I'm speculating-- someone was trying to reinvent the Iversonian languages now, already understanding IMEs and these things that have now become very commonplace, and the sort of worldwide acceptance of IMEs, even in Latin-derived Roman-alphabet languages, then something like the APL symbols or the Uiua symbols becomes more accessible and more usable. But up until very recently-- maybe after the iPhone; it probably would have to have been five years after the iPhone-- up until that point, the acceptance for that level of input method was not there. And so even with the existence of Unicode, it was too much of a barrier to entry to type those symbols to make it a clear win. I think you could have made a very, very strong debate for digraphs all the way up until that point, at which point I would switch over and say that there's less of a case. Now I think we actually have a different danger, which we have to be careful of, which is that we are so accepting now of a proliferation of symbols, because of emojis and all of these other things. We just accept that if we need a new symbol, we just add it anywhere and proliferate them. And I have seen what happens when vocabulary proliferates to that degree in programming languages. And I think now we actually have to think very carefully about scaling back and making sure we don't just randomly add a symbol because we think it's nice. We need to be more disciplined and more principled about the cost of symbols in terms of shared understanding, onboarding, communication, et cetera. I think we've now gone in the opposite direction, and we should scale that back.

01:15:05 [AB]

Yeah, I think there's a danger with regards to Unicode of the private use area. Imagine somebody makes an array language, and they decide, why should I be restricted by the Unicode glyph shapes? I'll make the perfect symbol for every functionality that they have. I'll just put it all into private use area. And you need special fonts, special render, special everything to type in there.

01:15:30 [AH]

That might actually be a very good thing to do, but it should be very disciplined. Like, it shouldn't just be-- that's obviously what we might do. You need to be very careful about that. But I wouldn't want to say you shouldn't be able to do that.

01:15:41 [AB]

No. And there's often a discussion that comes up on, should you allow the users to give meaning to additional symbols that are not in the language already? And again, there's a danger of something you have to tread carefully. It's not so obvious.

01:15:57 [Ray Cannon]

I'm Ray Cannon. I'm an APLer. And I had a very nice three-hour breakfast with Ken Iverson back in the day. And I asked him--

01:16:06 [CH]

Very jealous. Wow.

01:16:09 [RC]

I asked him why he was creating J. And as far as he was concerned at the time, J was just an ASCII form of APL. J was APL just with ASCII. Simple as that.

01:16:24 [AB]

But what does that mean? Does that mean that ideally he would want it rendered with appropriate characters?

01:16:31 [RC]

It was rewriting APL in ASCII, without the symbols.

01:16:40 [AB]

The question is why.

01:16:42 [RC]

Because keyboards at the time, there wasn't Unicode.

01:16:48 [HR]

Dyalog has the advantage of having a long and rich history. And with that comes a disadvantage: every once in a while there's something that you've got that you wish weren't that way. Or maybe you don't, but Ken did. And Ken designed J to do those things the way he then felt they should be done at the time. The character set being one of them. It would not be difficult to have a J version with glyphs. But I guess Ken's bias has not faded away with this.

01:17:27 [AB]

I mean, we can look at it historically. And in the early days of using APL, where the character set really was a hindrance, even in hardware, there were various spelling schemes that were developed to allow APL to be represented in a smaller character set. And as far as I can tell from the historical papers, that is what eventually became the J spelling. It started off with having various ASCII symbols optionally followed by or preceded by the @ symbol. And the very first paper on the J software website of historical papers where you have a language that's recognizably J is named APL backslash question mark. So Iverson definitely saw J as an APL. It's not clear whether at the time the spelling scheme was just a spelling scheme or if it had its own purpose. But then you said, Henry, that, well, you could have J but with fancy symbols. But I don't think you really can, because the spelling scheme of J, where you generally have some ASCII symbol followed optionally by a dot or a colon, took on its own life. And this triplet of plain symbol, symbol with a dot, symbol with a colon created an association between symbols-- between a triplet of symbols-- that isn't easily available should you choose some more graphical symbols like those APL has.

01:19:16 [CH]

Yes, we'll go to Brandon and then we're going to come to Devin or go to Devin.

01:19:21 [BW]

I think the discussion hints at how technological limitations in our available input methods affect how we think about these languages. I think we can actually flip this on its head a little bit and ask: what limitations are imposed on us right now that don't allow us to explore other potential APL syntaxes? So one thing that I think-- like mathematical notation-- consensus mathematical notation is not linear. It has this two-dimensional layout aspect to it that has been honed primarily as a medium of communicating to other mathematicians. And so this sort of echoes what Aaron was saying. We're using APL primarily as a human communication medium. And I would love to hear people's thoughts on what you think-- if you had the ability to input sort of two-dimensional expressions and have an APL that looks like that, like do an integral symbol with the-- or have operators where the operands are subscript and superscript or something like this. Do you think that could be helpful? What are your particular intuitions on expanding beyond just this linear symbol notation?

01:20:34 [ST]

My intuition is we're going to get back to the original Iverson notation.

01:20:37 [DM]

Well, now you've built up a stack here.

01:20:39 [CH]

I was going to say, while you think about that question, I think you're going to pop one off and go back to the J versus APL and Unicode.

01:20:47 [DM]

OK. Three things in order. Aaron was referring to an IME. I had to look it up. It's input method editor where you can input characters that aren't ASCII with an editor. A major thing that Iverson said about J is he eliminated variable index origin, which he said was a mistake. And the third thing on that topic is that I once was going to show a friend the wonders of APL, and I bring my computer to his cabin, and there's no APL characters on the keyboard. You can't type it. You have to know how to type the stuff, whereas with ASCII, the keys are all there on any keyboard. To address Brandon's question, APL was invented as a linearization of mathematics, and I think for good reason. And I'll use an analogy here. I don't drive much, but when I do get on the road, I notice how terrible people are. And there's these people who want flying cars. I mean, if people are that bad in two dimensions, imagine how bad people will drive in three dimensions. And I can say the same about math going to two dimensions. I think that way lies madness.

01:21:55 [AB]

I think I'd like to counter that a little bit. The language has to be available for humans. Obviously, the computer doesn't care if you're using multiple ASCII glyphs or if you're using fancy Unicode characters. So this is for humans. And there's something with humans: we need things to be rather distinct for them to take on their own identity. If we make things too similar to each other, then things get blurred together. And I think, for example, BQN has a bit of this problem. It went with a very noble idea: to help the human reader of the language parse an expression, you need an indication of the syntactic role of every symbol. For all these languages that we're dealing with, even Uiua, I think, if you don't know what the syntactic role is of a particular symbol, you simply can't parse the expression. So if it's an adverb-- operator, modifier, whatever you want to call it-- then it captures things that are adjacent to it. If a symbol stands for a constant or a value, then it just sits there and it doesn't affect anything around it. And so BQN assists with some fairly simple rules. It says that a one-modifier-- a monadic operator, an adverb-- is written as a superscript, and a two-modifier-- a dyadic operator, a conjunction-- is written with a symbol that includes a circle that is not broken, as opposed to, say, the transpose symbol, which is a circle with a backslash through it. That's a symbol that's been broken by a line. And that sounds great. So now, just by learning a couple of rules, I can at least parse the language, even if I don't know what a particular symbol does. But I think it went so far in making the spelling regular that it becomes harder to distinguish the individual symbols; they just don't have much of their own personality. So I understand why, back then, they had to linearize mathematics to make it fit the technology they were using.
But I think that a 2D notation with some limitations, like traditional mathematics does to a certain degree, [09] has its merits in making things very distinct. It's very easy for humans to recognize relative position and sizing, and even the style of the writing. Is this bold? Is this double struck? Is it italics? All these hints at what things are make things more distinct, make it easier to latch onto and follow. An example we can see of this is in alphabetic languages where lowercase characters have a more unruly shape. They don't fit into a nice rectangle the way uppercase letters do. They have ascenders, descenders, some of the ascenders on the right, some of the ascenders on the left. And it's simply easier, faster to read such lowercase text than it is to read all caps because things are more distinct.

01:25:20 [HR]

I disagree that programming languages are made for human-to-human communication. I think of them as ways to describe computation. And I personally find it easier to read English comments than to read even a well-constructed APL sentence. For human-to-human communication, we've never gone to mathematical languages in our societies, and I don't think we need to with the computer. What makes an array language different is that the non-array languages are an attempt to express your idea in a way that the computer understands at a low level. Whereas Iverson's notation describes computation per se; it's a description of computation. But no more than that-- not a way of making you understand. You only understand it because you figure out what the computation is and then internalize that in your own way.

01:26:30 [AB]

Remember that Iverson's notation started as a purely human to human notation. It could not run on a computer at all.

01:26:39 [HR]

It's a method of describing the computation.

01:26:44 [AB]

But I and and.

01:26:44 [HR]

Precisely.

01:26:44 [AB]

I would say that English as a language is not a good description language. Yes, you can put comments in your code that speak about the overall goals, hopefully not describing what the language itself is doing. But we see an increased tendency to replace English or other natural human languages in user interfaces with various symbols and diagrams, arrows, icons. Apparently, either we have to say that these are better for the users, or some people just like putting them there without figuring out whether it's actually a better user interface or not. And everything is a balance. I mean, I've personally set some configurable user interfaces to use words instead of symbols, because they were symbols that I would rarely encounter, and it's hard for me to remember what they are. For the things that are very common, I'd much rather have my user interface have a little plus or minus for zooming in and out than having the words zoom in, zoom out, or bigger and smaller. And little arrows up and down for your font size selector and going to the next and previous page, a little X button for closing things, and so on.

01:28:02 [HR]

I won't argue with that, but those are, as far as communication is concerned, grunts. Big, small, up, down. When we're trying to define a computation, we have to have a richer language. But my contention is that it's me communicating with the computer, not with the reader, because the computer requires the precision that the reader does not.

01:28:32 [CH]

I know, we gotta, or, well, let Kai go first. - Okay. - And at the risk of blowing this up another 30 minutes, we're gonna give Aaron a few tokens back, 'cause I can see him twitching off to the side, and I know he has thoughts, and so we'll let Kai go first, and then I'm not sure if someone over, we'll go to Brandon after.

01:28:53 [AH]

I washed my hands.

01:28:56 [KS]

If we want to maybe unify some of these topics-- if we're talking about, one, what is the theoretical best array language, and how do we compare and contrast what can be constructed now versus what was able to be constructed as a language back in the day-- and when you think about how the original Iverson notation was just something you drew on a blackboard to communicate to people, I was thinking it's almost as if, using modern technology, you could revive something resembling Iverson notation. Maybe you have some entry method, because he would just draw arrows to things to show control flow; that's not something you can put in a normal editor. But if you have a more free form of input-- because when we communicate with each other, when I try to explain algorithms to people, say at my day job, I draw things on paper to say, okay, we have this here, and then this goes to this. You don't necessarily conceive of it, even in a well-structured symbolic way. It's kind of free-form, and it is two-dimensional most of the time. And theoretically, if we're pie in the sky, what can you construct? You could have a system where, I don't know, even drawings, or maybe something more structured, could both serve as your notation, your communication for others, and, through some complex mechanism, serve as a program for the computer.

01:30:25 [AH]

I have a quick comment on this. I did experiment with reintroducing two-dimensionality to the APL notation in a linear-compatible way. So there's a version of the compiler that was written in Microsoft Word that was manually typeset in a way that was copy-and-pasteable back out to the linear version, and also used typographical conventions. And if you want to see some experiments with that, you can go look at that document.

01:30:50 [CH]

And on the previous topic, no thoughts about the?

01:30:54 [AH]

Like I said, I washed my hands.

01:30:57 [CH]

All right, well, we'll be talking to Aaron probably, we're planning on doing a second episode, today is Wednesday, the 21st of August, I think is the correct month. - It is August. - It is August, actually for a second I was like, did I just get the wrong month?

01:31:13 [BT]

And I think-- - I can change that in editing.

01:31:13 [CH]

But we'll be doing another one of these with a different format on the 23rd that should be episode 88 or later, depending on how things get edited, but we'll hand it off to Brandon maybe for the last comment and then there'll be a quick last question, which will be rapid fire for whoever who wants to answer, which is, it was mentioned at one point that languages have been borrowing ideas, for a language of your choice, what is your favorite feature that has been stolen or borrowed from another language? That's the rapid fire after we hear from Brandon.

01:31:48 [BW]

Oh, I'd like to bounce off what you were saying, Kai, I mean, you were talking, conceptualizing of this question about 2D inputs or 2D representations as, let's explore the space of possibilities, right? And you were sort of pushing it out there, and I think even Devin pushed back on this saying, you know, well, if you're just trying to expand out and go to the next level and generalize, you know, we don't know that's helpful. But I'd like to really highlight that a lot of that exploration has already been done over centuries using pen and paper and mathematicians communicating with each other and themselves. And like, I mean, Henry, I think you're saying, you go back and you read your code bunches of times, I'm sure. So you're communicating with another human or with a human in your code, right? And so I think like, I mean, have you ever tried reading Euclid's "The Elements" [10] or something? - Yeah. - Yeah, and do you, I mean, I think, personally, I think it's pretty terrible experience 'cause it's all prose.

01:32:39 [HR]

Well, yeah, no diagrams, yes.

01:32:41 [BW]

And no algebra, right?

01:32:44 [HR]

Right, yes.

01:32:45 [BW]

And so I think we can sort of go back to the idea of looking at what features of mathematical notation are extremely helpful for the goals that it has. I mean, mathematical notation has different goals than APL or writing or programming does. And what features of those are pertinent and relevant and helpful to our array languages and APL and k. And how do we conceive about those features that we want in the first place? I think we're not, we're sort of trying to, our thinking is limited by the tools that we have at our hand. And I would like to sort of see, it would be nice to be able to start to expand out those things, how to speak about the things that we, the possibilities that have been explored, but we're not able to get a handle on right now.

01:33:37 [CH]

All right, has anybody got an answer? I'm looking at-- I'm looking at Uiua. Well, I'm not looking at Uiua, I'm looking at Kap. What's your favorite feature you borrowed directly from another language?

01:33:50 [KS]

Oh, I mean, I stole a bunch of primitives directly from BQN that I wouldn't have come up with myself, like group and classify. - Okay. - That's probably it.

01:34:00 [CH]

Anyone for Dyalog APL?

01:34:02 [AH]

So for APL, Roger introduced two concepts that didn't fully make it into the APL interpreter that I kind of very heavily borrow and kind of integrate into the compiler. One is the indexing operation. That's pretty trivial, easy enough. But the other is stencil as a function, not as an operator. And he introduced that very late into his space. And I consider that one of his best final hurrahs, if you will. I think it was really, really good. And I'm going to push for that in the future. But if I think about this question more broadly, I adore the fact that so many of the features that Scheme pioneered are now considered standard, everyday-- everybody expects those features to exist in every language. I think Scheme was possibly unmatched in the degree to which these features just sort of became universal. Like lexical scope, first-class procedures, closures, all sorts of early pattern matching-- all this stuff the Scheme guys were experimenting with very early on. And the fact that this is now so widely accepted throughout the whole space, that set of features is really great. I just wish proper macros were understood by the community at large.

01:35:12 [CH]

I think Adám, you were going to say something?

01:35:13 [AB]

Yeah, I'm a little bit back and forth between a couple of things, but Arthur Whitney, in the A language, came up with leading axis theory. And that was then propagated into J and mostly into modern-day APL implementations, including modernized APLs like Dyalog APL. I think that's a very valuable thing. Maybe it goes together with the rank operator, which I think Arthur also came up with originally.
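
A loose illustration of these two ideas in Python with NumPy follows (this is a sketch of the concepts, not of any APL implementation; `at_rank` is a hypothetical helper):

```python
import numpy as np

# A rank-3 array: two major cells, each of shape (3, 4).
a = np.arange(24).reshape(2, 3, 4)

# Leading-axis convention: operations act along the first axis by
# default, treating the array as a vector of its major cells.
print(np.sum(a, axis=0).shape)  # (3, 4): the two rank-2 cells are summed

def at_rank(f, k):
    """Hypothetical sketch of the rank operator: apply f to each
    rank-k cell of the argument and collect the results."""
    def apply(x):
        if x.ndim == k:
            return f(x)
        return np.stack([apply(cell) for cell in x])
    return apply

print(at_rank(np.sum, 2)(a))  # one result per 3x4 cell: [ 66 210]
```

The point of the pairing is that once everything defaults to the leading axis, a single rank operator recovers behavior on any other axis by choosing how deep the "cells" go.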

01:35:50 [CH]

And I've come over here to see if Brian has-- I mean, you're kind of our informal BQN representative here. Do you have any-- I'm not actually sure if you, you know, have a full list of all the stuff that BQN's stolen from other languages. I'll say for Kap, even though Elias isn't here-- I'm not sure so much that Kap stole this, but I asked for it and then Elias went and implemented it, and it was the sort primitives. In fact, I'd actually forked Kap and I was going to call it Kap++, and then I was going to go and try and implement the sort myself. But as soon as he found out that I wanted it, he just went and added it. And so we can say that he took that from BQN, and I know that Elias has taken a lot of stuff from BQN as well.

01:36:28 [AB]

But BQN got that from Extended Dyalog APL.

01:36:31 [CH]

Okay, well, yeah, there you go. There's another example. And while Brian's thinking about it, we're going to hand the mic over to Ray.

01:36:42 [RC]

Comments. I think the comments are the most important bit that's been copied from every language back to Assembler or machine code.

01:36:51 [CH]

That might win the day. Brian, any?

01:36:56 [BE]

I mean, I'm not familiar enough with the history there, because I often talk about, oh, it's cool that BQN has this feature, but then Adám corrects me every time. (audience laughing) I mean, there was array notation, but I know that was invented before. Well, I think so, right? I don't-- I mean, Adám should correct me.

01:37:14 [AB]

Yeah, I mean, Phil Last, that veteran APLer, did a presentation at the Dyalog user meeting in 2015 in Sicily, where he proposed a notation for arrays. And it wasn't, at the time, conceived as an integral part of the language, but just as a static notation that could then be used to represent arrays that could be imported. But very quickly, we morphed it into something that could become part of the language, and that's where it comes from. And BQN didn't appear until five years later.

01:37:59 [BE]

So wait, okay, I need to remember the question again. The question again was, what features BQN stole?

01:38:08 [CH]

Oh yeah, but, you know, if you have one to pick, to highlight, just as, like, you know, we're winding down the podcast-- as you know, we're all friends here, even though I was trying to, you know, start a bunch of drama. But I mean, you can say, I think, array notation, even if you don't know where it was stolen from-- whether it was, you know, the work that is being done at Dyalog, or someone before that. That's a good enough answer; that counts.

01:38:35 [CH]

Yeah, it counts. And also, technically, higher order functions, first class functions, that was stolen from every other functional programming language that came before. We'll go to Henry, then we're gonna get finished with Devin, and then I think we're gonna have an announcement at the end of the podcast for things to look forward to in the future.

01:38:50 [HR]

I think direct function definition, which we took from, to my knowledge, Dyalog APL.

01:38:57 [AB]

But John Scholes came up with that, inspired by Iverson's own direct definition.

01:39:02 [CH]

Yeah, and last but not least?

01:39:04 [DM]

I just wanted to comment on Arthur's leading axis default. I worked at the same company he did when he was working on A+, and he'd gone up to speak with Ken about this, and he came back and said, "Dad said it was okay."

01:39:20 [CH]

All right, so I think the last thing we're gonna mention is I got handed a note at some point during the podcast, and I was kind of latently aware of this, but, I mean, unfortunately, Kamila wasn't here, then she was here, now she's not here again, and she is one of, I think, four people that are gonna be either presenting talks or workshops at Lambda World in October, I think it's 1st or 2nd? 2nd or 4th, [11] I mean, it's a good thing I'm getting corrected here, but we have, are you an organizer of this conference as well? I know you're giving the workshop because we had to coordinate, but introduce yourself, tell us a bit about Lambda World. I think the schedule's actually all online now, but you can give us all the details.

01:40:02 [Jesus López-González]

Okay, thank you. My name is Jesus, I work at Habla Computing, and I must say that I am a q enthusiast. I love to say that. This conference is in Cadiz, in the south of Spain, and it's a party of functional programming. It's really lively, the city is lovely, the food, the weather-- it's also in October, so I recommend everybody to go there. And of course, this year, well, Habla Computing is a sponsor of this conference, and my boss is on the program committee, so he's also deciding talks. So this year, we had the opportunity to push harder to introduce array paradigm talks. So I think that we have Camila, we have yourself, Conor, Stephen, and also Stina, the CEO of Dyalog, is going to speak there, and myself. So I invite everybody to go there, because, as I mentioned before, it's a party of functional programming. So if you love that-- and maybe this year we can push harder on the array paradigm, so maybe in further editions we have even more content of that. So that's all.

01:41:14 [CH]

Yeah, I am super excited about this. It's obviously not everyone that's here, but I think that's-- I mean, Stina's not here, but four out of however many people are here-- it's a mini, you know, repeat of this for a couple of days. And I saw this on the schedule. I think it's very cool that-- I'm not sure if there's two tracks or three tracks-- two tracks, and one of the rooms is called the Ken Iverson room. So that's how much they're leaning into array programming. 'Cause I saw all these names popping up one by one. I knew behind the scenes that I would be giving a talk. I think, Stephen, your face was on there first, then I saw Jesus' pop up, then I saw Camila's pop up, then I saw Stina's pop up, then I think mine was added last, and I was like, wow, this is a lot of array folks presenting at this functional conference. And then I saw Ken Iverson's name on the schedule-- unfortunately, obviously not speaking, but still, a very nice, you know, salute towards him. So yes, links will be in the show notes. I'm not sure if there's anything else that we want to say. Obviously, if you are listening to this and you have a question for any one of the voices, I'm not sure if we're gonna post all the contact details, but you can definitely email Bob-- I'll get him to mention that in a second-- and we can forward that along. And we'll also ask, you know, if they don't mind, we can put it there, but probably it's easier to go through us, and then you can go back and forth between those people. So if you would like to reach out to any of the individuals that you heard, you can reach us at--

01:42:42 [BT]

Our version of the spam filter: contact@arraycast.com. So anybody who sends a message through, I think we'll probably forward most of them, but of course, if you're spamming us, it'll only get to us; we're not gonna forward it on to the different people that you're trying to spam. So, yes, contact@arraycast.com. Shout out to our transcribers, because I've met several people who interact with our podcast through transcription. Now that they've sort of been present in the room while it's being recorded, I don't know whether they're gonna change their habits or not; I can see advantages to what they're doing, and to the way we do it. But anyway, transcribers, thank you so much. And, as you mentioned many times, we'll have show notes with more information if you wish to have it.

01:43:23 [CH]

And yes, once again, thank you to our panelists, Stephen, for organizing this, we are only halfway through, I think this is about the halfway mark, we're 20 minutes away from lunch on Wednesday, which I think marks roughly the 50% mark, so we've still got half to go, but this has been awesome so far, and thank you to everybody that participated, even if you participated silently, this has been a lot of fun, and with that, we will say, happy array programming.

[ALL]

Happy array programming.