Transcript
Thanks to Adám Brudzewsky for providing the transcript.
00:00:00 [Adám Brudzewsky]
Oh, but it's… Dyadic Hook, it's very simple, right? So, in a functional language…
00:00:04 [all]
[laughing]
00:00:07 [AB]
Yeah, who am I to say that, right? Who are you‽
00:00:20 [Conor Hoekstra]
Welcome to the last Array Cast of 2021. My name is Conor. I'll be your host and today we have with us Bob, Adám, and Stephen. So, we'll go around and do brief introductions, hop into a few news items, and then hop into a discussion about tacit programming for the 4th time — and we've got a special holiday gift for you!
00:00:40 [Stephen Taylor]
I'm Stephen Taylor and I'm an APL programmer from way back and currently the KX Librarian.
00:00:47 [AB]
I'm Adám Brudzewsky and I am a full-time Dyalog APL programmer, and I also spend a lot of time teaching people APL.
00:00:54 [Bob Therriault]
I'm Bob Therriault and I'm a J enthusiast and I'm currently working with people, developing the J Wiki, which is a very challenging project, but we're also learning a tremendous amount and hopefully in the new year we'll see a few changes there.
00:01:10 [CH]
And as mentioned before, my name is Conor.
00:01:12 [CH]
I'm a professional C++ developer, but as any of the regular listeners know, I'm a huge APL and array language fan in general.
00:01:22 [CH]
And so yeah, I think we have 3 announcements. We'll go to Bob with his announcement first, then hop to Adám and then we'll finish with Stephen.
00:01:30 [BT]
On December 17th, which was the 101st anniversary of Ken Iverson's birth, the new version of J became no longer a beta. It's now actually an official release, so 903 is available with all its little changes and adjustments and more powerful things you can do in terms of tacit programming, which is certainly appropriate. So if you are interested in J, it's a good time to load 903 and find out all the things that you can do with it, and we can put links up to the Jsoftware website and all those kinds of things to find out where to go. Do that, and that's the announcement. And apparently, according to Eric, it was a coincidence. I have my doubts, but in any case, on December 17th we also got a chance to celebrate Ken's 101st birthday.
00:02:21 [AB]
I just want to put the word out there that the window for submitting your ideas/concepts/proposals for a common logo for all of APL closes at the end of the year, so that'll be another week from the release of this podcast episode. We'll put a link to the APL wiki page where you can submit or see the current proposals. You can also comment there on existing proposals.
00:02:51 [ST]
And I'm sorry to announce that for the second year running, the IP Sharp Associates London Office Christmas party will not take place. Of course, once again, because of the pandemic. For those of you who find it puzzling because I.P. Sharp Associates disappeared into Reuters in 1987, it's just worth noting that the Christmas parties never stopped. That was one amazing company to work for; the parties never died.
00:03:25 [CH]
That's pretty impressive, 'cause that's — what? — it's over 30 years ago now? That's almost… Is that longer than I.P. Sharp existed for, 'cause they were — what? — '64–'87 is…
00:03:37 [ST]
They ran about twenty… yeah, round about 25 years.
00:03:39 [CH]
Wow, so there you go!
00:03:42 [BT]
Makes me think of Hitchhiker's Guide to the Galaxy. They went out for lunch and just never came back.
00:03:51 [CH]
Well, maybe one of these days, if the I.P. Sharp Associates London party ever comes back, I'll have to fly out. I'm not sure — do Array Cast panelists get, like, you know, honorary invites? Maybe not. We'll have to see. Do you get plus-ones?
00:04:08 [BT]
There'd probably be some form of an initiation involved, and I'm not sure I'd want to be involved.
00:04:12 [CH]
Get DOSBox out and the old SHARP APL running and do some kind of computation. Well, I already have that all set up, so I'm ready to go. All right, well, with that — the announcements — out of the way, today we are doing my favorite topic to talk about on these episodes, at least when we don't have guests, and that is, once again, tacit programming. And because we are in the holiday season, we thought we would, off the top of this episode, give you a holiday gift in the form of finally talking about the one, the only, Dyadic Hook — which, all the first three times we were going to talk about this, we didn't have time for. So, this time, off the top of the episode, we're going to talk about it. I actually don't… We didn't talk about who's going to go… Are we going to let J — you know, Bob the J guy — introduce it? Or are we going to let Adám just tear into it right away, you know? What say the panel? OK, Bob will go first and then we'll go from there.
00:05:15 [BT]
I'll at least, for those of you who are joining late, explain: a Hook in J is 2 verbs juxtaposed to each other and enclosed in parentheses. You can have a monadic hook, which has one argument, and that argument is to the right, and then you can have a dyadic hook, which has two arguments, and that means that the hook — the two verbs enclosed in parentheses — is inserted between the two arguments, so you have a right and a left argument. So when you have a monadic hook — and most people, I think, get the idea of a monadic hook — it's interesting, because the right argument gets copied over to the left. I don't think it actually gets copied, but it appears on the left, and the way the verbs work is: the right verb works on the right argument, and then the result of that becomes the right argument to the left verb, and the left argument goes to the left verb. So, essentially, you're doing a preprocessing with the right verb before the left verb works on the original argument plus the processed argument.
With a dyadic hook, you still have the two verbs in the middle, but instead of the argument being copied across, the left argument is now not copied; it just comes in on its own to the left verb, and then the same thing happens with the right argument and the right verb. So the right argument goes to the right verb, gets processed, gets sent into the left verb, and then the left argument forms the other part. So that's how hooks work in J, and they are asymmetrical, which is why I like them — the same way I kind of like using a claw hammer if I'm trying to hammer nails and then take them back out again, because one side works one way and the other side works the other way. And that means that you have two functions, but you have to know which end to use. And so that's my explanation of dyadic hooks. And Conor is looking very confused right now.
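[A minimal sketch of the two cases in Dyalog APL, with × and - as stand-in verbs; the J spellings in the comments follow Bob's description rather than anything run on air:
      (⊢×-) 5       ⍝ like J's monadic hook (* -) 5 : 5 × (- 5) → ¯25
      10 ×∘- 5      ⍝ like J's dyadic hook 10 (* -) 5 : 10 × (- 5) → ¯50
– Ed.]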
00:07:36 [CH]
Well, so I actually understand dyadic hooks, and then I got confused while listening to that, so let's see if I can recap and people can correct me — maybe I don't. The hook is pretty simple, especially if you're coming from APL or, somehow, if you're coming from BQN. It's just a fork: a 3-train where you've got your left and right. So, you've got 3 verbs: a binary one in the middle, sandwiched between two unary functions [U B U – Ed.]. And if you fix the left unary function, or monadic function, to Identity [I B U – Ed.], that's what a hook is.
00:08:16 [AB]
That's a monadic hook, yeah.
00:08:17 [BT]
Yeah, that's a monadic hook, yeah.
00:08:20 [CH]
That's a monadic hook. A dyadic hook, I thought was basically… it's the same.
00:08:29 [AB]
It's like… A dyadic hook is like the APL operator Beside [∘ – Ed.], previously called Compose. So all it does is: it's the left function applied between the two arguments, but the right argument is preprocessed by the right function.
00:08:49 [CH]
Right, right, right, right.
00:08:51 [BT]
If you think of a fork, a noun-verb-verb fork [(N V V) – Ed.] — so in J, you're allowed to have the left argument being a noun, and then it basically just isn't processed. So you've got noun-verb-verb [(N V V) – Ed.], and then say you have a monadic fork. If you think of it this way: take that noun and just pull it out, outside the parentheses [N(V V) – Ed.]. And now you have a dyadic hook, because it's going to process it the same way.
00:09:23 [AB]
But that's the same… You could make it into a fork as well, by saying it's the…
00:09:29 [BT]
You can't do that with verbs, though. Like you can't do that with the verb-verb-verb form.
00:09:34 [AB]
Well, if the right verb is just, right, whatever function it is applied on-right [V⍤⊢ in APL; V@:] in J – Ed.], just so it's applied to the [right argument – Ed.], it's the same.
00:09:46 [CH]
It's the same thing. So this is why I sort of got confused, 'cause when I was listening to what you said, I was picturing something different. But this, actually… So, I don't necessarily recommend going to listen to this episode, 'cause on the last two ADSP episodes — my other podcast — we did live coding; the first one's in C++ and the second one in BQN. And rightfully so, we got a couple of comments being like: you know, talking about your random stuff, personal life stuff? Awesome. Talking about coding stuff? Awesome. Live coding solutions on a podcast? How about "no"? Because, even for the BQN one, I think we barely described the glyphs that we're working with, and we're just sort of excitedly coding, and you hear us being excited about coding and probably have no idea what we're talking about.
00:10:31 [CH]
That being said, at one point in our final BQN solution — and I won't explain the whole thing, 'cause I just finished talking about how walking through a Unicode symbol solution is not great — there is a point where we are basically trying to construct an Iota sequence, but it's shifted. So, say you want numbers 5 to 10, exclusive; that's going to be, you know, (⍳5) + 5, basically, so 5 + ⍳5. And the way you do this in BQN or in APL is: basically your + is your binary operation that sits in the middle, and then you're going to have two arguments, a left argument and a right argument — and this example was bad, so let's change it to 10 to 14, so you're actually starting at 10. Or, you're adding 10 and you're doing ⍳5. And let's assume this is 0-indexed. You basically then want to, with your right argument, the 5, do ⍳5 to get 0 to 4, and then, once you've done that preprocessing, you want to do 10 + the result of that. And the way you have to spell that is basically using the Left [⊣ – Ed.] and Right [⊢ – Ed.], so the binary versions of identity. So you go Left Plus Range — Range being the equivalent of ⍳ in APL — composed with Right [⊣+↕∘⊢ – Ed.]. And so you basically have to add the composition operator [∘ – Ed.] and then a Left and a Right. And then I made some offhand remark, 'cause I didn't actually really know what the dyadic hook in J was, and I was like, "I know that there's a dyadic hook thing in J and it may or may not be this, but potentially you can just go Plus Range, a.k.a. +⍳ [+ i. in J – Ed.]" — and that's exactly the pattern that you want, I think. And then I tested it out and it seemed to work.
00:12:22 [AB]
Oh no, this is right. This is right. But that also means… By the way, this was exactly the function that Stephen brought up as his favorite tacit thing. But so, in APL you can use the Jot [∘ – Ed.], which is Beside, so you would write +∘⍳. And I noticed that when I looked at your BQN solution, thinking, "yeah, it would be nice to have that operator there."
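[A sketch of the example in Dyalog APL, with ⎕IO←0 to match Conor's 0-indexing; the fork spelling mirrors the BQN ⊣+↕∘⊢ (BQN's ∘ is Atop, which is ⍤ in APL):
      ⎕IO←0
      10 +∘⍳ 5        ⍝ Beside: 10 + ⍳5 → 10 11 12 13 14
      10 (⊣+⍳⍤⊢) 5    ⍝ the same thing spelled as an explicit fork
– Ed.]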
00:12:47 [CH]
So that was… well, so that's my question: is there a symbol for the dyadic hook? Like, there are multiple ways, typically, to spell the B-combinator, the B1-combinator — a.k.a., sort of, not forks, but — like, you can go the ∘ symbol in APL, or you can just put two monadic functions side by side, and it's basically the same.
00:13:12 [AB]
We call it Atop.
00:13:14 [CH]
Yeah, does J have a symbol for dyadic hook, or is the only way to spell it a binary-unary with parentheses around it [(B U) – Ed.]? 'Cause I tried to look it up and I couldn't find it.
00:13:26 [BT]
Yeah, no, it's the same idea as the fork, which is 3 verbs together: when you put a combination of two verbs together and you put them within parentheses, that is the definition of a hook. Now, with the new version, 903, you can do this, because now you can start doing tacit definitions of conjunctions and adverbs. I've seen people describe how you could do a hook using that, and it's more complicated than just putting two verbs together within parentheses, but that's what you do. The interesting thing is, to get the same effect as APL with your combination of two verbs, all I need to do is put an At [@: – Ed.] or an Atop [@ – Ed.] between those two verbs, and I get the same effect as you guys do.
00:14:11 [AB]
Yeah, well, I mean, we also now, at least, have an Atop operator in APL, so you can either write F G, or you can write F⍤G, explicitly, like this. — Tacit, explicitly, eh!
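[A sketch of that equivalence in Dyalog APL, with - and × as stand-ins; the 2-train and the explicit Atop agree, and both differ from Beside when applied dyadically:
      3 (-×) 4     ⍝ 2-train: - (3×4) → ¯12
      3 -⍤× 4      ⍝ explicit Atop: the same → ¯12
      3 -∘× 4      ⍝ Beside, the dyadic hook: 3 - (×4) → 2
– Ed.]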
00:14:23 [CH]
So, is that saying… Because I don't think there is, but maybe I missed it… I don't know. Is there a way to spell a dyadic hook in APL using an operator, or no?
00:14:36 [AB]
But that in APL or in J?
00:14:38 [CH]
In APL.
00:14:39 [AB]
In APL, a dyadic hook is just F∘G.
00:14:43 [CH]
It is?
00:14:43 [AB]
Yeah.
00:14:44 [CH]
And it's not the case in BQN?
00:14:46 [AB]
BQN has an After operator, so you would write F⟜G. It's the same thing.
00:14:45 [CH]
Really‽
00:14:46 [AB]
It's this little Jot with a line to the side [⟜ – Ed.].
00:14:57 [CH]
Wow!
00:14:58 [BT]
So, the actual functionality is really useful. Everybody figured out a way to do it, but we've all figured out different ways to do it.
00:15:06 [CH]
Really‽
00:15:06 [AB]
Yeah. But it's, yeah, it's the same thing. If you want the full J functionality in APL, you can write F∘G⍨⍨. Now, it might sound silly to do commute-commute, 'cause, like, you're swapping the right and left arguments and then swapping them again, but the idea here is that if it's dyadic, it swaps them twice, so they go back to where they came from. If it's monadic, then the outer commute is actually a selfie, so it puts… That's exactly expressing the idea that Bob was saying before: you take the right argument — the only argument — and also make it the left argument. And now you can commute them again, because it doesn't matter — they're the same, so it doesn't matter that you swap them around. But I like that spelling much better, because it's explicit about copying over the argument to the other side. I find it very odd that the same argument is used twice without being mentioned.
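[A sketch of the double-commute spelling in Dyalog APL, with - and × standing in for F and G:
      (-∘×⍨⍨) 5      ⍝ monadic: selfie, then swap → 5 - (×5) → 4
      10 (-∘×⍨⍨) 5   ⍝ dyadic: swapped twice, back where they started → 10 - (×5) → 9
– Ed.]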
00:15:57 [BT]
So yeah, I must admit, when I first learned about hooks in J, that was something that just bent my mind. I just went, "wow, really?", and it's like you sort of feel like you're walking on quicksand; you don't know where the next pothole is, because how does that happen? And then it turns out it does happen, and then, once you know it's going to happen, it's very consistent, so you deal with it. But when you first find out that you're going to end up sometimes copying your arguments and sometimes not, depending on whether it's a hook or a fork… Oh, Conor is making exploration faces right now. This is amazing. We can just…
00:16:31 [CH]
Exploration, I'm not sure that's the word, but, uh.
00:16:34 [AB]
'tis a eureka kind of thing.
00:16:35 [CH]
I totally, I totally… I gotta be honest, I've been a bad host. I tuned out for the last, like, 60 seconds, 'cause I got so excited about the After thing in BQN, and so I tested it. It worked! Oh my goodness! Because that thing that I… No… I totally interrupted. But that thing where I spelled ⊣B U∘⊢ — it's just so… I was just like, "if I ever design a language, I'm going to put in a single Unicode symbol to spell this operator; I'm not sure why they haven't done that already." And sure enough, it's already there.
00:17:05 [AB]
I wasn't sure if I should comment on it on Twitter.
00:17:09 [CH]
Oh, you definitely should have.
00:17:10 [AB]
Yeah, but I didn't want to embarrass you, but now you brought it up so you…
00:17:13 [CH]
Embarrass me‽ Bro, oh, I… To the listener out there, and to the panelists: anytime anyone ever wants to, you know, embarrass me or make me look bad, and the cost of it is I get to learn something: please, please do. I care so little about, you know, whatever it is — you know, did I give the suboptimal or optimal… It's like code reviews. You know, there's this tricky balance of giving code reviews without making people feel bad.
When I get, like, a really, sort of — what's the word? — like, deep-dive review, like "You can do this better. You can do this better. Blah blah blah…", it's the best, 'cause there's no better form of improving your ability as a developer than getting really, really good feedback, which sometimes is like, "Oh yeah, maybe I should have known this already." But that lasts for about half of a second, and it's overwhelmed by, "Oh my goodness, look how much I'm about to learn!" I was hoping someone would… but I figured, like… I probably… 'Cause I thought I tried all the operators, but clearly I didn't try After — in BQN, that is — and I guess I just assumed that — I don't know why — I assumed APL wouldn't have it. But anyways, back to…
00:18:21 [BT]
To take a really brief interlude from all our tacit discussion — your request to be embarrassed: I have to admit something that Adám and I were talking about earlier, which is that Adám was on the Orchard, and people were asking, "Can we listen live and then ask questions?" And I said, "You know what would be really funny? We tell them all to wait till 4:30 their time, wherever you… UTC. And then, at that point, we'll give them the connection and then they'll all show up, but we won't tell Conor. So partway through the show, we have all these people suddenly pop into Zoom and start asking questions."
00:19:00 [BT]
And that, we felt, would embarrass you, so we didn't do it, but you know…
00:19:04 [CH]
Ah, no, I mean, I would have just been confused. I don't know why it would have been… I'd have been like… well, I would just assume, like, Dyalog, or, you know, Adám accidentally posted the link on some company thing, and… Because we've had that before — not in waves of people, but yeah. Oh, I'm so… This is so awesome; I can't wait to tell Bryce! As you know, I said going forward we're going to stop doing these live coding things, but this is so cool! Anyways, back to Dyadic Hook. I don't even know what I interrupted when I interrupted.
00:19:37 [BT]
Well, I think what we determined was that there are ways to do this in each of the languages, and J does it without an actual primitive: they do it by combining verbs, and then they move their arguments around in interesting ways which are asymmetric. APL has different ways of doing that. But I guess what it… So, one of the things that I heard — a really strong argument in the forks-versus-hooks debate, and this is a really good point — is: if you have 3 verbs together within parentheses, you have a fork. You add a fourth verb to that; now you have a hook, and your arguments are going to act differently. And you can say, "oh well, I can probably keep track of the difference between three and four" — and you can — but you've got to realize it's going to change. And that matters when you start to extend, because trains can be many verbs long, and then, if it's an odd number of verbs, it's going to be a fork, and if it's an even number of verbs, it's going to be a hook.
00:20:41 [AB]
I don't think it's clear yet how pervasive this is. So, one thing is when you're writing the code, but let's say you're reading somebody's code. So, here's a criticism that some K people have against the J-style — as they call it — trains, which is this odd-even pattern, right? So, every other function, beginning with number two from the right, is always applied dyadically. And the other functions, or verbs, are applied monadically if the whole derived function is applied monadically, or dyadically if the whole function is applied dyadically. And so, in order to read a train properly, you have to count; you have to always keep track of the parity of your carriages, basically. So, you start at the right, and the rightmost function is applied to the argument or arguments. The next one: put it on hold, put it on your mental stack. Then you go to the third one and apply it to the argument or arguments, and then you combine the two results with that function that's in between them. This is how the fork works. Fine. That's now part of the function, and that's going to evaluate. But meanwhile, you continue towards the left. Here's another function: this one is number four, so you put that on hold. Go to number five, apply it to the argument or arguments, and take the result of that original thing you had on the right, with this new result, and apply number four to them dyadically. And so, you proceed through the train, and you can keep track of that. And then you reach the end of the train, and either you will be at an odd number or at an even number. So, in APL, if there is nothing more on the left, then the last function is applied monadically, even though it would otherwise have been in a dyadic position — but it has no left argument.
00:22:37 [AB]
In J, it's more consistent, in the sense that it should be dyadic, and it will be dyadic, because we're just going to take the right argument — or the left argument, if it's dyadic — and put it there in that last slot. So, you… like an identity function: a Left if it was dyadic, and a Right if it's monadic. With one catch! If it's the case that you need this tacit addition of another argument on the left — if it's applied dyadically — you need to backtrack all the way to the beginning and start re-evaluating. Why? Because everything up until now was monadic-dyadic-monadic-dyadic, even though this train is applied dyadically. OK? This is a lot of words. Let's see if we can boil it down a little bit, meaning: you must read trains from the right; otherwise, you don't know what the functions do, because you don't know where they are in the parity game. However, if you read from the right, you don't know the meaning of the functions until you reach the left edge. So that basically means it's impossible for a human to read a long train in J in one go. You have to do a two-pass: first you separate all the characters — you have to parse everything; you can't parse as you go along. You have to parse everything and not evaluate it. Then you get to the end, look at your parity, and then you may or may not need — well, for evaluation, you will need — to go back all the way to the beginning and start over. There's no other way!
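[A sketch of the problem, modeled in Dyalog APL. Following Adám's description (elaborated below), a dyadic 4-train P Q R S in J behaves like APL's P∘(Q R S), so adding one verb at the left of a 3-train silently flips the verbs to its right between dyadic and monadic:
      2 (×-÷) 8       ⍝ 3-train, J and APL agree: (2×8) - (2÷8) → 15.75
      2 +∘(×-÷) 8     ⍝ J's dyadic 4-train + × - ÷ : 2 + ((×8) - (÷8)) → 2.875
– Ed.]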
00:24:07 [BT]
Well, there is another way, but it's outside of the way the language actually works: you can do it with documentation, and the documentation is, you always have the option to put parentheses around things, and you can separate them that way if you want to make it clear to yourself that that's what's happening. So, for instance, if you've got a hook, you can always take your first verb — or combination of verbs, because that first thing can be a hook as well — you can take that first verb and put parentheses around it, and if you do that consistently, that will tell you: if you see parentheses around the first verb, you've got a hook, and if you don't see that, you've got a fork.
00:24:48 [AB]
But if you don't know whether the writer of this code uses such a convention consistently…
00:24:54 [BT]
Yeah.
00:24:55 [AB]
… you cannot rely on it. If you see the parenthesis, sure, yeah. If you don't see the parenthesis: Tough luck!
00:25:00 [BT]
You can also do it with whitespace, if you want to separate the two a bit more; that'll be another trigger. But you're right: it has to do with the writer of the code. It was something that, actually, I think Bryce… Conor, you and Bryce talked about on ADSP. He was saying, at one point you were taking parentheses out, and you were going, "Well, you can read that." You can if you understand it, but if you don't understand it, sometimes you leave things in to make it easier for somebody else to read. And I think that's actually an intermediate form of documentation that is entirely valid, and nobody should call it down, because if it makes it easier for people to understand, I think it's probably worth doing.
00:25:39 [AB]
I disagree.
00:25:41 [AB]
If you need to put in redundant parentheses — because otherwise a human needs to do a double pass on your code to understand it; must do a double pass, there's no other way — then there's something wrong with this construct.
00:25:54 [BT]
Well, to take it back: the other area I look at is, I don't usually go more than about four or five verbs in a row in a train, just because that's kind of my limit on what I can actually parse and understand. If I go more than that, then it starts to get confusing to me, and I'm better off breaking those things up. Again, that's a programmer's choice; it's not the language that needs that. So, in other words, what you're saying now is that the language can push you to whatever level you want. But as a programmer, you sort of have to realize your own limits, and maybe your readers' limits, and that's, you know, an affordance you allow: you say, "OK, well, I'm going to make this a bit easier to understand, because I'm a human being and I can only juggle 4 balls, not 5 balls."
00:26:46 [AB]
But isn't it the same even for the interpreter itself? It also has to do 2 passes, no? There's no other way.
00:26:56 [BT]
Now, the interpreter, when it's parsing, goes through a stack of — what is it — 4 or 5 elements, and it's just going through a series of rules and pulling them off.
00:27:07 [AB]
But you can't! You don't know how to apply the functions. You don't know if your star means multiplication or sign until you've finished parsing the entire train.
00:27:18 [BT]
Yeah, but it's only one pass.
00:27:21 [AB]
Uh, and…
00:27:23 [BT]
When you get to the end, you know; you're working through your stack.
00:27:26 [AB]
First you have to parse the entire train.
00:27:29 [BT]
Yeah.
00:27:29 [AB]
And then you can go back to the beginning and apply it. Because otherwise you don't know which function to apply. Monadic or dyadic.
00:27:37 [BT]
Yeah.
00:27:37 [AB]
You don't know whether this star means one or the other, until you have finished parsing everything, whereas in the APL style trains you can start evaluation right away.
00:27:46 [BT]
Yeah, no, you're right. You have to parse before you evaluate.
00:27:49 [AB]
So you have to do two passes; one parsing pass…
00:27:51 [BT]
Yeah.
00:27:52 [AB]
… and one evaluation pass. Whereas in APL and BQN, you can do a single-pass parse and evaluate at once.
00:27:59 [BT]
So as soon as you hit your first verb, you can then start processing.
00:28:03 [AB]
Yeah. There's no question about what it means.
00:28:05 [BT]
Yeah.
00:28:06 [AB]
And for me that makes a world of difference.
00:28:10 [CH]
So, here's my… 'Cause I'm kind of a bit lost. I get there's some ambiguity; to summarize — and, sort of, I'm taking a guess when I summarize — you don't know, at the end of a train in J, whether your last hook is going to be monadic or dyadic? Is that what the confusion is?
00:28:36 [AB]
Firstly, I don't know what you mean by "first" and "last", but…
00:28:39 [CH]
You don't know the one on the furthest left.
00:28:42 [AB]
The one on the furthest left determines whether or not the ones to its right are applied monadically or dyadically.
00:28:48 [BT]
Let's think of it in terms of having 4 verbs in a line. So if you start with three verbs in a line, you would have the rightmost and the one in the next… So let's order them 1, 2, 3.
00:29:08 [CH]
Isn't 3 verbs in a line a fork?
00:29:10 [AB]
Yes, so that's the contrast here. Let's call them P, Q, R, and S, in that order, right? So, if you've got Q R S — and we're only talking about dyadic; the problem here is with dyadic, right? — so P Q R S is our 4-train and Q R S is our 3-train. Now, normally APL is concatenative, in that you can keep adding more things on the left, and everything to the right preserves its meaning; you just keep building your expression towards the left. So here, what's happening is: with the Q R S train, S is applied between the left and right arguments, Q is applied between the left and right arguments, and R is applied to their results. And then we proceed to the left if there's more.
00:30:03 [AB]
Now, in P Q R S, because of the P — notice the P is further to the left, so according to normal APL reasoning it should not affect anything towards its right; it takes everything on its right as its right argument. (That's always the rule I teach people when I teach them APL: every function takes as its right argument everything on the right, as far as it can see, unless it gets stopped by a parenthesis or statement separator or the like.) And it's not true here, because — it would be in an APL train: there, P would just be applied to the result of Q R S. But here, because of the presence of the P, then despite the existence of a left argument, S is applied monadically: it just ignores its left argument. And Q is also applied monadically: it just ignores the left argument. And then the two monadic applications of Q and S — their results are given as the arguments to R, and the result of R is used as the right argument for P. And then P uses the outer left argument as its left argument. So, the presence of something on the left governs entirely the meaning of what's on the right, which is, in my opinion, not APL-y. That's not how it works.
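[The two readings side by side, modeled in Dyalog APL with P Q R S ← + × - ÷ as stand-ins:
      2 (+×-÷) 8      ⍝ APL 4-train: + applied atop the fork → + ((2×8) - (2÷8)) → 15.75
      2 +∘(×-÷) 8     ⍝ J's semantics for the same four verbs → 2 + ((×8) - (÷8)) → 2.875
– Ed.]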
00:31:21 [BT]
And in J what it's doing is it's going to parse… It's going to create your whole verb before it executes the verb.
00:31:26 [AB]
It has to do a two-pass.
00:31:28 [BT]
Yeah, well, I guess if you call it a two-pass, but it's parsing and then it's executing and that's how it works.
00:31:34 [ST]
I will throw in a question here. What you two are describing sounds like inviting squirrels to live in your brain, and my invitation to you both is to explain, to our listeners who are not acquainted with forks and trains and hooks, the wonderful things that all this makes possible, because at the moment, I mostly get "Why would you want to do this?"
00:32:07 [CH]
Well, so before Adám goes, I will say that I've only ever really looked at these, like… Now that I know that BQN and APL have a glyph for this, like, I'm ecstatic; that's fantastic, because of what we talked about earlier. Like, I can delete 3 symbols in, I believe, a 5-symbol left-binary-unary-compose-right [⊣B U∘⊢ – Ed.]. I got rid of 60% of the characters in my 5-character expression. That is amazing. I love it. I love it. If I can get rid of a single character, that's fantastic. If I can get rid of 60% and three… Woo, it is a good day! So, I think that…
00:32:57 [AB]
Well, you had to add 1, though.
00:32:58 [CH]
Oh yeah, that's true. That's true. OK, 40% still sounds pretty good: 2 characters. And also… It's just so much less noisy. Like, it's 40% shorter, but you're reducing the ceremony and the noise around what I'm trying to do, and literally it's just a composition pattern; that After in BQN [F⟜G – Ed.] and Jot in APL [F∘G – Ed.] does exactly what I want. And if I know that, now I can see it, and I don't have to build up this little fork with Left and Right and a compose [⊣F G∘⊢ in BQN; ⊣F G⍤⊢ in APL – Ed.], which is way more difficult to parse, in my opinion.
00:33:30 [BT]
I think that's the key thing: it's not just golfing because you're reducing characters; you're actually doing it to make it more understandable, because you're reducing the concepts you have to deal with.
00:33:42 [CH]
Yeah, exactly. That being said, clearly the issue here is not actually this combinator, which — I actually looked it up: it doesn't look like there is a bird that is specifically used for it; it's a specialization, I think, of Dove, according to Marshall Lochbaum's sort-of birds table, so that's kind of sad. — I think the issue is the way that this affects building up 3-trains and 4-trains and N-trains, and also the fact that J, for their 2-trains, chose Hooks and Dyadic Hooks. Whereas for APL — in the hook conjunction paper that we talked about, which Roger initially wrote in 2006, he said, you know, it was probably a mistake to choose Hooks when you can very simply spell at least the Monadic Hook with a fork. So that was always, sort of, what I thought the drawback was. But now it seems, after listening to Adám's explanation, that it's even worse than that: the way that we're used to reading things as APLers — in, like, the APL sense, not just excluding J — yeah, it changes things. I'd have to play around with it in J more, but yeah, that was sort of my response. But yes, we'll throw it now to Adám, who was about to say something until I interrupted him.
00:35:09 [AB]
Well, what was I about to say?
00:35:12 [CH]
Well, Stephen asked, you know why? What's the point of these, like…
00:35:14 [AB]
Oh right, right, right, yeah.
00:35:15 [CH]
It sounds like we're having a squirrel party.
00:35:17 [CH]
You know, we've got maybe a couple of squirrels already, but you introduce these sorts of parsing techniques, and now we've got a whole party of 14 squirrels, and they're all just throwing acorns around, and, you know — Lord help us in trying to figure out how to read this J stuff. So, how do you respond, or how would we respond, you know?
00:35:37 [AB]
In defense of the hook. So, yeah, with all my attacking the hook, let me defend the hook a bit, in two parts: monadic and dyadic. So, a while ago, our panelists were asked to collect our favorite tacit functions in preparation for an episode, many tacit episodes ago. And what I found was that a lot of these neat little 3-trains, forks, are actually hooks, meaning one of the outer two functions is an identity function. So, you could say, well, it cuts down on the noise, right? And if you want to check whether something is a palindrome, then you can say "the argument equals its reversal", or you could just say "equals its reversal" — that's what a hook is! So, a palindrome is something that "equals its reverse", and in J, that's how you write it: Equal reverse, or Match reverse.
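[The palindrome check as a fork with an identity tine, in Dyalog APL; the corresponding J hook, mentioned below, is -:|. :
      (⊢≡⌽) 'racecar'    ⍝ 'racecar' ≡ ⌽'racecar' → 1
      (⊢≡⌽) 'apl'        ⍝ → 0
– Ed.]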
00:36:41 [AB]
By far the most useful such 3-trains I've found are the ones where one side is an identity, and then, when I write them in APL, I always have a dilemma: which side should I put the identity on, if the middle function is commutative? Should it be "the reversal is identical to the identity" [⌽≡⊢ – Ed.], or should it be "the identity is identical to the reversal" [⊢≡⌽ – Ed.]? J never has this problem, because you can only put the preprocessing function on the right. So, I can totally see why there is room for such a thing, such a concept. And for the dyadic one — well, you look at it in 2021, and APL has these various operators, combinators, and BQN has even more, or a fuller set of them. But if you go back in time: Dyalog, I think, was the first APL, or at least mainstream APL, to add any kind of compositional operator, and that was the Jot [∘ – Ed.]. And it was just called Compose, because it was the only composition; no other compositions were even considered, apparently. And it's very interesting, in what's written about it — and this applies also to the discussion of forks in the early days of APL\?, which was later named J — that it's functionally complete, somehow. There are things you can't really write, but for the most part, you can write anything using just this composition, the hook, and it's one of those combinators that have letters as well, right?
00:38:29 [BT]
Yeah, Starling.
00:38:32 [AB]
Yeah. And so, is there validity to this? Yes. It's complex, in a way; it's asymmetric. But it's that asymmetry that allows you to express anything, 'cause you can move things over to the other side, and then you can apply, and you can move it back again. So I totally understand why you would want to add this as the most fundamental type of composition, the one that's assigned the privileged role of no symbol. It's kind of like a first-class composition: the inherent composition, the tacit composition. All other compositions must necessarily have a symbol to distinguish them. I just don't think it's justified. It's like: you can build a computer from NAND or from NOR; nevertheless, lots of programming languages don't have NAND or NOR, because they're complex — they act in strange ways, and you don't really think in those kinds of ways — so people tend to build up logic from AND and OR and NOT, even if those aren't the most fundamental. The same thing here: the hook is maybe the most fundamental, low-level thing that everything else can be defined in terms of, but it's just not the tool of thought.
00:39:43 [CH]
Bob, do you want to add anything to Adám's defence of J's hook? Dyadic hook, I should say.
00:39:49 [BT]
I think the only thing I would add is, in terms of the two passes — the parsing pass and the execution pass — I don't think your parsing pass is really a significant time or space user. I mean, it's an interpreted language: whatever that sentence is going to do, that's parsed; it's going to happen pretty quick. And then you're going to get into execution, and you're going to do what you want it to do. I suppose it is two passes, one to parse and one to execute, but I don't think the parsing pass is going to contribute very much to the whole process, the time it takes.
00:40:30 [CH]
Yeah, the one thing I'll add — and it's somewhat sad for me, as the listener now knows that I care a lot about character count — is that even in J, with the hook, your spelling of IsPalindrome is still a character longer, because they have — what do they call them? — digraphs? So, 2 symbols, two ASCII symbols each; so it's 4 characters, even though the fork is technically more verbose — it requires 3 verbs. In APL, they're all one symbol, so my reverse-match-identity [⌽≡⊢ – Ed.] is spelled with fewer characters than my pipe dot whatever hyphen…
00:41:12 [BT]
Colon.
00:41:13 [CH]
Hyphen colon yeah, something like that, yeah.
00:41:13 [BT]
Yeah, match reverse. [ -:|. – Ed.]
00:41:16 [CH]
Yeah I would switch those around, yeah.
00:41:18 [AB]
But even then, to apply it in line, you have to put parentheses around the 2-train anyway.
00:41:23 [BT]
Yep.
00:41:23 [AB]
And in APL, if you wanted to use the hook form — so, the normal way to specify a tacit palindrome would be right-tack match reverse [⊢≡⌽ – Ed.], and then you have to put parentheses around that. But if you do it inline, you can actually write match jot reverse selfie — so, Match Beside Reversal Commute [≡∘⌽⍨ – Ed.].
00:41:44 [CH]
Why do you have to do selfie [⍨ – Ed.]?
00:41:46 [AB]
Because in APL you have to be explicit about using the same argument, also as left argument to match.
00:41:53 [BT]
Same reason you have to do double commute [F∘G⍨⍨ – Ed.] if you were trying to cover both cases.
00:41:57 [AB]
And so, think about it: Match Jot Reverse — the Jot here is preprocessing the right argument to the Match. Right, so what we're matching is: we're matching the argument 'racecar' with 'racecar'. The only thing is, we want to reverse the right argument to Match before we do the Match. So, what you do is, you say: I want Match Selfie, right? [≡⍨ – Ed.] It should match itself — only, right before we do that match, let's preprocess the right argument with Reverse. So we do Match After Reversal, Selfie [≡∘⌽⍨ – Ed.]. That's why it's called After in BQN. And it can also be called Beside, because they just happen as if they were written out straight: it would be like 'racecar'≡⌽'racecar', so they're beside each other. So the application of a tacit palindrome checker in APL is still only four characters. [≡∘⌽⍨'racecar' – Ed.]
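[The evaluation steps, in Dyalog APL:
      ≡∘⌽⍨ 'racecar'             ⍝ selfie: 'racecar' ≡∘⌽ 'racecar'
      'racecar' ≡∘⌽ 'racecar'    ⍝ Beside: 'racecar' ≡ ⌽'racecar' → 1
– Ed.]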
00:42:54 [CH]
Oh, interesting. So, I've just, yeah… I've always known about the B1-combinator, a.k.a. the Blackbird, which is unary-binary, which is Jot-double-dot — whatever — the Paw in APL [⍤ – Ed.].
00:43:17 [AB]
"Atop".
00:43:19 [CH]
And this is Binary-Unary.
00:43:21 [AB]
Yeah, yes.
00:43:21 [CH]
Which is just ∘, but you have to turn it into Dyadic.
00:43:26 [AB]
And I like the fact that you have to be explicit about it.
00:43:29 [BT]
Yep.
00:43:30 [AB]
That frown ⍨ is being explicit about the fact that I'm going to use this argument twice.
00:43:38 [CH]
Interesting. And that means that in the unary case, or the monadic case, both Jot, which is the little circle in APL [∘ – Ed.], and Rank [Atop – Ed.], which is the little circle with two dots [⍤ – Ed.], are the B-combinator, which is just unary function composition. But their dyadic versions go to two different combinators: for the Rank one [Atop: ⍤ – Ed.], it's the B1-combinator — performing a binary operation followed by a unary operation — and for Jot [Beside: ∘ – Ed.], it's the hook, the dyadic hook, where you're performing a unary operation followed by a binary operation. So, I was staring at this with the ≡∘⌽, and what I was thinking was, "How come this is not the B1-combinator?", which would be, basically: instead of it being a Reverse, it would be a 1-rotate on your string. So I have 'ADA', so it would be a 1-rotate, which would give you 'DAA', and then your match would actually be Tally [actually Depth; Tally is ≢ – Ed.], so you'd get 3. And I'm pretty sure… I actually… No, it's not a 1-rotate. What is — because 1 is the result — what does 'ADA'… That doesn't actually make sense. 'ADA' rotated by 'ADA' [≡⍤⌽⍨'ADA' is ≡('ADA'⌽'ADA') – Ed.] doesn't give you anything; it's just going to give you a DOMAIN ERROR. OK, forget what I just said. But anyways — I'm not sure if the listener got that — we've got something in the chat here.
00:45:25 [AB]
Well, we'll put it in the show notes, but there's an APL Wiki page with some nice diagrams. So, Dyalog APL has, as of speaking, 3 pure compositional operators, which we call Atop, Beside, and Over, and there's a nice diagram on the wiki that shows their relationship, and you can then also understand why they all exhibit the same behavior when applied monadically; it's only the dyadic form that's different. And this also explains why that symbol is used: the ∘ symbol comes from mathematics, where you have f∘g — that's function composition. The thing is, mathematics only really considers f(x), a single argument; there's very little of these infix operations in traditional mathematics. But as soon as you add a left argument, the possibilities of combining them in various ways kind of explode into a lot of different possibilities, and so Dyalog has these three different ways of doing it. And BQN adds one more, which is just a reversed mirror image of them. But there are actually even more.
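[A sketch of the three operators in Dyalog APL, with - and × as stand-ins; applied monadically they coincide, dyadically they differ:
      3 -⍤× 4    ⍝ Atop:   - (3×4)     → ¯12
      3 -∘× 4    ⍝ Beside: 3 - (×4)    → 2
      3 -⍥× 4    ⍝ Over:   (×3) - (×4) → 0
      -⍤× 4      ⍝ monadically, all three are - (×4) → ¯1
– Ed.]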
00:46:41 [CH]
It's too bad — I actually don't like these diagrams, because I was confused at first. And what I really think they need is for the little gray boxes to not be gray. So, what the diagrams are showing, for the listener, is basically: you've got f and g, which, across Atop, Beside, and Over, are sometimes binary functions and sometimes unary functions, depending on which one it is, but it always shows the argument boxes — where you pass your arguments to — as gray. In my opinion, they should be colored, like blue for binary and red for unary, because then it would pop out immediately that your binary function is the first thing that gets evaluated for Atop, a.k.a. the B1-combinator, and, for the two other ones, it's the last thing that gets evaluated.
00:47:30 [AB]
Oh, but then you're missing one thing, Conor. Notice that in every case, one of the two legs, so to say, is dotted, not solid, and what that means is that when this composition is used monadically, that entire leg disappears — and so, too, its gray box disappears. So, for example, in the top case, with g, you wouldn't be able to color g blue or red or whatever to show whether it's monadic or dyadic; it depends. It depends on whether there is a left leg or not. If there's a left leg, then g is dyadic. If there's no left leg, then g is monadic. That's how it works — f, on the other hand, is always monadic.
00:48:11 [BT]
What might make the diagram clearer is if the boxes that are dependent on being dyadic or monadic — which is the left boxes — were crosshatched instead of just a solid gray…
00:48:22 [AB]
That could work, yeah.
00:48:23 [BT]
… so that you knew that they disappeared as well.
00:48:24 [AB]
Well, yeah, that's a good point.
00:48:25 [CH]
I know there's an alternative version of this that is fuller, from Marshall Lochbaum — or at least that's where I've seen it. He gave a talk at Dyalog '18 or '19, where he was introducing operators in one of the Dyalog versions, and he had that diagram where, I think, it's like a three-by-two or four-by-two, and I think he has the monadic and dyadic versions separate. I still do think that's an overwhelming slide, but I like it better, because it's very unclear from this diagram — sure, it says underneath that when they're applied monadically, the dotted branch falls away — but you have to visually move each of these in order to show that they are identical in the monadic case. Like, just by erasing the dotted X, or the dotted arrow that's going from X, which is the 2nd argument — when you erase those, if you just, like, take a little eraser in your paint program, you have 3 entirely different… they look different, even though they're the same. You have to do a little morph animation mentally in your head to make them look identical, and that, I think, is completely obfuscated by this diagram. And actually, I didn't even… So, here's a good question — or is it a good question? I don't know; here's a question, and you can let me know if it's good or not. Is it confusing that for each of these — Atop, Beside, and Over; and Over is the Psi-combinator, so it's Atop: B1-combinator, Over: Psi-combinator, and Beside doesn't have a combinator in the dyadic case — in the monadic case, all of these are the B-combinator, just unary function composition? Is that good? Is that a mistake? What's the… Is there utility in that? In that, like, you can build up some small expression that is useful both monadically and dyadically, depending on the combinator that you use? 'Cause most of the time, I think, in my programming, whenever I'm building these up, it's intended to be used either just monadically or just dyadically, and if that's the case, having three different combinators that, in the monadic case, all do the exact same thing seems a bit confusing. But, like, potentially there's some utility there that I just haven't discovered yet. Like, I never use Over, the Psi-combinator, monadically. The only time I ever use it is when I want to use it dyadically.
00:51:00 [AB]
Well, I mean, obviously, if you know you're only going to use it monadically, then yeah, go for Atop, because that's the most natural one; it always does the same thing, kind of: it applies one function as postprocessing to the other function's result. However, I think there are cases… Beside, maybe not so much; that's the one that's the oddest. That's the hook; that's the one that's problematic — right? — where there's little relationship between the monadic and the dyadic form. For Atop, definitely, and as you said, we don't have to address that. For Over, the idea is simply that you can take one argument or two arguments: preprocess all arguments before you do your further work, OK? So, here's an example. Let's say we want to find the difference of the absolute values. Right, so then you could write that as Minus Over Absolute Value, or Magnitude [-⍥| – Ed.].
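[In Dyalog APL:
      3 -⍥| ¯5    ⍝ difference of absolute values: (|3) - (|¯5) → ¯2
      -⍥| ¯5      ⍝ monadic: - (|¯5) → ¯5
– Ed.]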
00:52:12 [CH]
Yeah, but monadically your Minus is going to become Negate.
00:52:16 [AB]
Ah, but that's exactly the same, right? Because that's just Minus with a zero as default left argument.
00:52:24 [CH]
Mm, I don't know.
00:52:26 [AB]
So if you only give it one argument, it does say the distance from zero, from the origin, to the absolute value of the argument. And if you give…
00:52:35 [CH]
Is there a better example where it's not Negate? 'Cause I think that's debatable… I think that's the thing: in all of these monadic cases, where you've defined some dyadic case, you're going to implicitly be assuming, like, you know… Division and Reciprocal — I agree those make sense for, like, monadic-dyadic. But when you're building up that dyadic version of something where you want to take the ratio — so, in the monadic case, when you're given 5, do you want 20%? Like, do you want 0.2? I understand that Reciprocal, 1-over, is a great monadic case for the dyadic Division, but if I'm building up some expression, inserting some default for the monadic version of a dyadic expression built up using the Over, or the Psi-combinator — that's a bit of a stretch for me.
00:53:28 [AB]
It depends on the left function being meaningfully ambivalent. And if the… For Over, the left function has to be meaningfully ambivalent; otherwise, there's no point going here. For Atop, the right function has to be meaningfully ambivalent. For Beside, the left function also has to be meaningfully ambivalent, but it's a matter of, like, taking a default left argument or not, kind of thing.
00:54:00 [CH]
Yeah, at first when you said that, I was like, "Oh yeah, that makes sense", but in my head I was actually thinking that you take either your left and right arguments, or just a list of arguments, preprocess them, and then follow with a monadic function. But that's actually not the case here: you're preprocessing either your single argument or both of your arguments, and then applying either the monadic or the dyadic definition of your glyph. Which is, I guess, what my whole thing is. Like, I don't necessarily see the use… I guess I need to see a couple of use cases that are like, "Oh yeah, this is definitely what you would want to do." But maybe I haven't gotten to the right point-free level in my black-belt journey of becoming, like, a triple black belt. And, like, at some point, you go, "Oh, there's a bunch of cases where you definitely want to define these functions such that they can be used either monadically or dyadically, and there's a ton of those cases that exist" — I just haven't started thinking that way.
00:55:01 [AB]
So here is — I mean, maybe it's a bit of a stretch — but one really useful, I think, use of Over is with the new, relatively new, ⎕C system function, which is case folding. Or it can be used with a left argument, and then it's case mapping. So, for example, you can do Match Over Case-fold — that is the equivalent of case-insensitive matching. Or you can do dyadic ⍳ — that's a lookup — ⍥⎕C, so that's a case-insensitive lookup. Or ∊⍥⎕C — that's case-insensitive membership, right? So, let's say that you have some HTML-building function as the left function to Over. It builds and returns an HTML tag; it takes as right argument some content for that tag, and it can take as left argument, optionally, a class that it's going to add to the HTML that it generates — but you don't have to give it a class, because you could make a tag that doesn't have classes. And for sanity, you want to be able to feed it any type of text as arguments, but it should lowercase everything before generating anything. So here you have HTMLTag⍥⎕C. And if you give it just some content on the right, then it will lowercase that content and stick it into the tag and spit that out. If you give it a left argument, it will lowercase that left argument, use it as the class, and generate the HTML tag and spit it out.
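[Sketches of the case-insensitive idioms in Dyalog APL; ⎕C is the case-fold system function mentioned, available from Dyalog 18.0:
      'Racecar' (≡⍥⎕C) 'RACECAR'    ⍝ case-insensitive match → 1
      'AbC' (⍳⍥⎕C) 'c'              ⍝ case-insensitive lookup → 3 (⎕IO=1)
      'aBc' (∊⍥⎕C) 'CAT'            ⍝ case-insensitive membership → 1 0 1
– Ed.]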
00:56:49 [CH]
So, what were the monadic and dyadic versions of your…?
00:56:51 [AB]
So, the… Without the Over it would just be a function called HTMLTag. It takes the content of the tag on the right and optionally a class name on the left, and then it generates the HTML tag.
00:57:06 [CH]
Oh, I see, I see, I see.
00:57:07 [AB]
Maybe even better than case-folding, actually — a better example would be HTMLEsc. So you would say HTMLTag⍥HTMLEsc.
00:57:17 [CH]
Yeah.
00:57:17 [AB]
So, it makes sure to escape all the content and escape the optional class name so that you don't end up in any trouble and then it generates the HTML tag. If you don't give it any class on the left, it's just Atop; it's HTMLTag⍤HTMLEsc.
00:57:34 [CH]
So, the pattern here is that your left argument — the one that's going to turn your function into a dyadic one — is a kind of optional parameter that you're specifying, which in a way… Ah, no, it is slightly different from, sort of, the 0- or 1÷ defaults.
00:57:56 [AB]
Well, I think it's a kind of common pattern, and experienced APLers would tend to program in this pattern, where the left argument bound with the function is a meaningful construct. So, the right argument is the main data, and the left argument is the parameter. And that is actually the reason why BQN uses 𝕨 and 𝕩 for its arguments, where 𝕩 is, like, the main argument, and 𝕨 — which is to the left of 𝕩 in the alphabet — is like the parameter, the width, whatever you might set up as left. That's why you have this everywhere in APL as primitives. Take: left argument is how much to take; right argument is the data to take from. Rotate: left argument is how much to rotate; right argument is the data being rotated. Transpose: left argument is where you want the axes to be mapped to; right argument is the data that needs to be shuffled around. And when APLers write functions, they will also write functions like this, where the main data goes on the right and the parameters go on the left.
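[The primitives mentioned, in Dyalog APL:
      3 ↑ 'abcdef'      ⍝ Take: left argument says how much → 'abc'
      2 ⌽ 'abcdef'      ⍝ Rotate: left argument says how far → 'cdefab'
      2 1 ⍉ 2 3⍴⍳6      ⍝ Transpose: left argument maps the axes → a 3×2 matrix
– Ed.]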
00:59:03 [CH]
Yeah, in your example, though, it was sort of… I can see where there's a little bit more utility there, because an optional parameter is different from, you know, a parameter that's going to completely change the meaning, like Reverse and Rotate, right? Like, Reverse is different from Rotate.
00:59:23 [AB]
Yeah, absolutely.
00:59:24 [CH]
And so, if you've got just a function like… I'm trying to think of a really good example. I'm sure there's one with, like, Execute [⍎ – Ed.], where you're given strings, either one string or two strings… It's a bad example. But, like, potentially, if you've got your last name, like, you know, 'Smith' [comma is catenate – Ed.], and then your first name is 'Bob', you could do a Catenate Over Execute [,⍥⍎ – Ed.] and that will turn your string… Or actually, no, this is not what you want, 'cause you want the strings to be numbers. Anyways, there's some example where you can turn your strings into numeric values, and then maybe you just want to turn those into a list of numbers. Once again, a bad example, 'cause you can just execute a list of numbers, which I discovered recently. Someone commented on a video where I was parsing, from Advent of Code, a comma-separated list of numbers, and I had, like, you know, split on the commas and then, you know, done a bunch of stuff, and then they were just like, "Hold my beer." Because comma-separated numbers — that's, like, you know, number catenate number catenate… You can just execute the whole thing, and I was like, "Oh my goodness." I turned, like, a ten-line expression into a one-liner — or 110 characters into 1 character. That was the best.
01:00:49 [AB]
So, I'll promise, now on the podcast — so that I have to get around to it — that this week I'll publish a video on YouTube with some tips on parsing those kinds of text data files, 'cause I've been seeing, lately, during December, people doing all sorts of strange things with Execute, and I fiercely dislike evaluating stuff that you download from online without inspecting it first. There are better ways to do this.
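[The Execute trick Conor describes, plus one of the safer alternatives Adám alludes to, in Dyalog APL; the numbers are made up:
      ⍎'199,200,208'              ⍝ comma is catenate, so this evaluates to 199 200 208
      2⊃',' ⎕VFI '199,200,208'    ⍝ ⎕VFI treats ',' as a separator and executes nothing
– Ed.]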
01:01:19 [ST]
I'll chime in on that too. We've been tracking the Advent of Code competition. This year, it's very good for vectors and I'm just publishing a wrap-up of the first week and each day shows you how to ingest from the text file.
01:01:37 [CH]
Yeah, it's definitely one of the trickier parts, and I think that would be very useful. Even, I think, in one of the videos… One of the problems — I have only done the first four, but the 4th one is super, super fun: it's like a bingo game, and it lends itself extremely well to array languages, as does the very first problem. But the 4th one is a little bit more involved, and you have to do some row-wise and column-wise stuff. But, like, parsing… The boards come pretty nicely formatted, and I thought, "Oh, this should be pretty simple," and I ended up, you know, spending, I think, 15 or 20 minutes having to, like, deshape and reshape and a bunch of stuff, and I knew that there was probably a much better way to do it, but I just wanted to get through the parsing part and get to the fun part. 'Cause parsing is not fun; it's honestly why a lot of people, including myself, give up on Advent of Code at a certain point. That's the kind of joke about Advent of Code: everyone writes it in their favorite language and ends up solving it in regex, because it's always just this, like, you know, parsing problem rather than, sort of, an algorithm.
01:02:41 [ST]
Turn them into vectors; that's the key for the bingo boards.
01:02:47 [CH]
Turn them into vectors?
01:02:49 [ST]
Yeah, they're 5-by-5 matrices, but for manipulating them, and while doing all your problem solving, they're much more tractable as vectors, and when you want to test whether a board has been solved, you just 5 5 reshape them.
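[Stephen's suggestion, sketched in Dyalog APL with a made-up board. – Ed.]
      board←5 5⍴25?25    ⍝ a stand-in 5×5 bingo board
      vec←,board         ⍝ ravel: solve the problem on a 25-element vector
      5 5⍴vec            ⍝ reshape back whenever you need rows and columns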
01:03:08 [CH]
Yeah, I'm looking forward. We'll have to include links in the show notes and…
01:03:13 [ST]
I'll put the links.
01:03:15 [CH]
I'll be looking forward to reading that.
01:03:17 [BT]
So, I've got a question, and that's: in APL, are you able to actually define the monadic and dyadic behavior if you're explicitly defining a verb? You can in J.
01:03:31 [AB]
Yes.
01:03:32 [BT]
Yeah?
01:03:33 [AB]
So, there are a couple of different ways to do it, and it also depends on your function style — we have a couple of different function styles — but at the very most extreme level, you can just check whether there is a left argument or not…
01:03:45 [BT]
Yeah.
01:03:46 [AB]
… and then branch there, by whatever means, for whatever function type you're using. But the really cool thing is when you… There's this syntax in APL — in Dyalog APL and some other APLs — that's called the dfn syntax. And normally in APL you do assignments with a name, then a left arrow, then a value. But there's a little thing there: the left argument in such a lambda, or dfn, is called ⍺. And normally you're not allowed to assign to these special names, but if you try to assign ⍺←some value, then that whole statement is recognized and is simply skipped if there is a left argument. But it's evaluated if there's no left argument, and this allows you to set default left arguments in a really neat way. You just say ⍺← whatever default left argument, and then you continue with everything else. And there's an even cooler trick, which is: there are no restrictions on the name class — that is to say, the role — that ⍺ has, so if it hasn't been defined, you can even assign it a function. OK, so let's say we wanted to write a cover function for minus. We could begin by saying, OK, ⍺←0, and in the next statement we say ⍺-⍵. You could also define it as ⍺←⊢, the identity function, and the next statement says ⍺-⍵. So, what happens now is, if there's no left argument, ⍺ becomes the identity function. In the next statement we have the negation of the right argument, and then we apply ⍺ to that — which is a function, which is identity — and so it is -⍵. So, there are two different ways to do it, and sometimes you can write some really cool code where you use either a function or even an operator as the default value for ⍺. And you can then go and do something special, and that's when it gets really fun, writing ambivalent functions where you can share code, but it takes two different pathways depending on whether ⍺ is the value that came in or ⍺ is whatever it got set to when you came into the function.
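[Adám's minus cover function, both ways, as Dyalog APL dfns; the names are ours. The ⍺←… statement is skipped whenever a left argument is supplied. – Ed.]
      Sub←{⍺←0 ⋄ ⍺-⍵}     ⍝ default left argument: the array 0
      Sub 5                ⍝ monadic: ⍺←0 runs, giving 0-5
¯5
      10 Sub 3             ⍝ dyadic: ⍺←0 is skipped
7
      Sub2←{⍺←⊢ ⋄ ⍺-⍵}    ⍝ default "left argument": the identity function
      Sub2 5               ⍝ ⍺ is a function here, so ⍺-⍵ evaluates as ⊢(-5)
¯5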
01:06:10 [BT]
Yeah, and so, as a programmer, having that power of defining what it's going to do — you can do it within the definition of your function, whether it's going to be monadic or dyadic; you have that level of control. So, when you were talking about how you won't know how it's going to react until you know whether you've got one argument or two: you won't know how it will react until you know whether you have one or two, but you can define what it will do when it has one or two.
01:06:50 [AB]
Right, but this way of giving the left argument, so to say, a function value — this allows you to take advantage of the pairings of primitives, the monadic–dyadic pairing of primitives, so you can have all this code that's ambivalent. Which means — I mean, it's kind of tricky; the reader would probably have to do two passes on it, reading it, saying, "OK, if this is being applied monadically, this is what it's going to do, but if it's being applied dyadically, that's what it's going to do." But it's very different from my criticism of the J hook, because at the time of application you know exactly what you want to get. You don't have to look up anything in advance.
01:07:30 [BT]
Well yeah, that's kind of the same as J, except — what you're calling the time of application would be J's point of execution — it knows what it's doing. The difference being, if you've programmed that way, now you're asking it to do essentially a parsing pass before it can do the application.
01:07:47 [AB]
Right — and I think APL's interpreter does the parsing as well, but the human reader doesn't have to do it. The important part here is, if you see this function inline, you can start from the right, immediately evaluating, like an APL train — which you cannot do in J, because you have to count first. Whereas here, with the lambda, you're saying, "OK, the default left argument. Well, do I have a left argument? Do I not have a left argument?" If there is one, when I'm reading it, I say, "OK, there's a left argument in this place," so I just ignore that statement.
01:08:19 [BT]
And that's what comes into play when you have a dfn and you have a function as your left argument — setting ⍺ to a function, yeah?
01:08:26 [AB]
Yeah, well, it's not really a function as a left argument, but you set ⍺ to a function value. So, I can see in APLcart — which is my collection of stuff in APL — I have 17… yeah, 16, I guess, entries where I have used this ⍺← something, and in by far most of them it's being set to a non-array, meaning usually to the identity function. And sometimes to a complex statement, and sometimes just to a simple thing like 1.
01:09:02 [BT]
So, dyadic hooks.
01:09:04 [CH]
All right. Yeah, dyadic hooks — turns out every language has got 'em. Well, not every language: J, APL, BQN.
01:09:15 [BT]
Does Python have them?
01:09:15 [AB]
Every language should have them!
01:09:19 [CH]
No, Python… I mean, I'm sure you could write some little library with all of these. I've seen some crazy things done in Python.
01:09:28 [AB]
Oh, but it's… Dyadic Hook, it's very simple, right? So, in a functional language…
01:09:32 [all]
[laughing]
01:09:34 [AB]
Yeah, who am I to say that, right? Who are you‽ It's very simple: in a functional language, you take two function arguments — right? — and then you return a function that takes two arguments. It applies one function to one of the arguments, and then it applies the other function between the other argument and the result of that first function application — and done! You can definitely define the hook in any functional language.
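[Adám's whole recipe, as a dyadic operator in Dyalog APL; the name Hook is ours. – Ed.]
      Hook←{⍺ ⍺⍺ ⍵⍵ ⍵}    ⍝ apply ⍵⍵ to ⍵, then ⍺⍺ between ⍺ and that result
      10 (-Hook|) ¯3       ⍝ 10-(|¯3), i.e. x f (g y) — J's dyadic hook
7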
01:10:06 [CH]
I want you to put that at the front of the episode, Bob: It's easy!
01:10:09 [BT]
It's easy!
01:10:14 [CH]
Yeah, that I think is a… Yeah, I learned a lot; this was great. Dyadic hooks: they're everywhere. It's After in BQN; I'm thrilled, I'm thrilled. And now I comprehensively understand Atop, Beside, and Over. Is there a good reason why all the monadic ones are the B-combinator? The jury is still out. But I think we'll have an update in a year or something when I'm… Oh yeah, I guess I'm technically, like, two APL years old now. December 9th I sort of consider my APL birthday, because that's when I really started to fall down the rabbit hole. And, say, maybe when I'm 3 APL years old we can come back and I'll see if I've discovered the truth behind the monadic definitions being the same. But yes, the Dyadic Hook — I hope this is everything the listener wished for during the seasonal holidays; in terms of holiday gifts, I think we've covered this. That's, what — we've been talking for more than an hour here? So I think the Dyadic Hook has probably gotten more attention than any other combinator or train. Anything else we want to wrap up the holiday special episode with?
01:11:43 [BT]
As always, contact@arraycast.com if you want to send an email; we certainly respond to them, and we've had some interesting emails. I forgot to mention at the beginning: Oliver Mooney put together — for a Clojure group — an introduction to J. So, if people are interested, it was actually pretty good. It's about a half hour long, and I think he does a good job of it. He's not trying to overwhelm his listener. It's a video, so he's showing stuff, and it's a good way to… I guess if you don't know the language, you can at least see the sort of things it can do. If you're familiar with array languages, a lot of it's not going to come as much of a surprise, but if you aren't, it might be a good way just to look at stuff. And I mean, there are so many other ways — the APL Wiki, and Adám can add the different areas that he's done that are good for beginners. There are a lot of different ways to catch on if you're interested. If you want to fall down that rabbit hole, there are lots of opportunities.
01:12:41 [CH]
And this is on YouTube. I assume the recording, or?
01:12:44 [BT]
Yep, yep, we'll put a link up. Oh, and actually, show notes — we always put the links up, so if people want to go to the show notes, we'll have links trying to explain all the things, and the diagrams and the dotted lines. And there should be hashtags here. All of that — all that will be there. Yeah, we'll put that stuff in.
01:13:00 [CH]
We should say… So, Stephen has a website too. Is it 5jt.com? 'Cause every single time I want to go to that — you have the blog that you recently wrote on tacit expressions and sort of evaluating them — and every single time (I think it was, like, 3 times until I finally remembered the URL of your website) I would search your name, and I'd go KX and K, and then I'd be like, "Screw it," and then I would just go to the show notes of the third tacit episode, where we would have a link, and that was the fastest way to get to Stephen's blog. And I could never remember it was 5jt, and for some reason I was always trying to put an S in there; I don't know why.
01:13:37 [ST]
Well, there should have been an S in there; I was just never fast enough to claim these things.
01:13:42 [CH]
It's still… I mean, a 3-character URL is still pretty impressive, given how, you know, you search for basically anything these days, whether it's a website or a Twitter handle, and they all seem to be gone. But yes, show notes for everything, including 5jt.com, and everything we've mentioned will be in the show notes. And I guess with that we will say, "Happy holidays and happy array programming!"
01:14:07 [all]
"Happy holidays and happy array programming!"