Transcript

Transcript prepared by Bob Therriault, Igor Kim and Sanjay Cherian

Show Notes

00:00:00 [Gareth Thomas]

I think the attitude of everybody in the world should be to stay curious and to respect language diversity. And whenever you try a new language, the odds are the first 10 hours, 20 hours, it's going to be annoying because it's very different to what you're used to. But I think it's really important that people hold on to those first 10, 20 hours, if nothing else, to solidify their own understanding of the language where they're best suited.

00:00:34 [Conor Hoekstra]

Welcome to episode 79 of ArrayCast. I'm your host, Conor. And today with us, we have a record three guests, and we will get to introducing them probably in a few minutes because we have a couple of announcements, a couple, a few, maybe many. But first, we're going to go around and do brief introductions. We'll start with Bob, then we'll go to Stephen, then we'll go to Adám and finish with Marshall.

00:00:55 [Bob Therriault]

I'm Bob Therriault, and I am a J enthusiast. I'm very, very, very enthusiastic about J.

00:01:00 [Stephen Taylor]

I'm Stephen Taylor. I'm not actually feeling very enthusiastic today, but I do work with APL and q.

00:01:07 [Adám Brudzewsky]

I'm Adám Brudzewsky. I am head of language design at Dyalog Ltd., which means I do APL.

00:01:14 [Marshall Lochbaum]

I'm Marshall Lochbaum. I've worked with a lot of array languages. I work on BQN and Singeli now.

00:01:18 [CH]

And as mentioned before, my name's Conor, host of the show, massive array language fan and polyglot in general, and super excited to talk about one of the topics, teaser. I mean, probably it's in the title. We're going to be talking about MATLAB and the MathWorks company and, you know, other descendant languages from that, Julia, et cetera. But before we get to introducing the three aforementioned guests, we've got a few announcements. So we'll go to Adám first, who's got six, and then I'll circle back to me and we'll go from there.

00:01:41 [AB]

Okay. So I can group them a little bit. First are challenges to the audience. So we mentioned before the APL Challenge, which is a reborn version of phase one of the previous Dyalog APL Problem Solving Competition, but easier. [01] And this one only runs for three months at a time. Round two has started. Deadline is July 31st. And then there is the APL Forge that I think we mentioned with Stine, the CEO of Dyalog. And it has now properly launched, which means all the information necessary is on the website. This year is a bit special, so the deadline is really soon, June 29th. Go check that out. And then there's something called the APL64 Conversion Contest. APL2000 will award $300 to the first six customers that can demonstrate that they have converted their APL+Win commercial application to APL64. And the deadline for that is June 30th. I mean, the award is $300, but the product itself costs thousands, so I'm not sure how that works out, but good luck. Then the next section is meetups. The APL Bay Area User Group, so around San Francisco, has a meetup on Monday the 13th online. And it's a special edition where they have their elections, but also just a round table discussion. And FinnAPL has their spring meetup on the 14th in Finland, so that's a real-life meetup. And finally, I want to encourage the audience to support a request that was opened against GitHub's new monospace font, which is called Monaspace. Somebody requested that it should have APL support as well, and you can go and add emoji reactions to show that you're supportive of that. We'll leave links to all of these things in the show notes.

00:03:47 [CH]

Awesome. And we've got, I think, 2.4 or 2 and 1/2 announcements. So we'll go to Stephen for the next one, and then after that to Bob for the last 1 and 1/2.

00:03:56 [ST]

Oh, cool. If you learned q from an introductory workshop, as so many of us did, and have never done any formal study since then, you might be finding that you don't feel like you've got a really thorough grip on the language. You're not always sure where to put parentheses and so forth. You might be interested in a project called Q201, which aims to teach the language systematically and formally to people who are already using it. Q201 will be announced at the kdb meetup in London on the 22nd of May. And it looks like it'll also be announced, or introduced, at the Madrid meetup in the middle of June. In the meantime, you can keep your eye on Q201.org.

00:04:52 [BT]

And I'll start off with my 1/2 an announcement. My 1/2 an announcement is, as we record this, there's been a lot of noise around Shakti and a possible imminent release. So by the time you're hearing this, it may have already happened. But keep your eye on that space. We'll put something in the show notes, just because there have been a lot of things being talked about with Shakti, and there may be something on the horizon, either in the very recent past or in the very near future. And then the second thing is, last show we had Christopher Augustus and Steve Thames on. And we forgot to mention that Christopher is having a meetup at a restaurant called Prontos in Caldas da Rainha. It's a one-hour drive north of Lisbon. It's June 1st, so if you happen to be in Portugal-- and by the way, the train stops right opposite Prontos. So I'm hoping I'm close with those pronunciations. But that will also be in the show notes. So if you're in Portugal or near Portugal and you're interested in meeting up with a bunch of people who are interested in all these programming languages, that's an opportunity. And that's June 1st. It's from 2 in the afternoon till 5 in the afternoon. And that's it.

00:06:05 [CH]

Awesome. So that was a lot of information. If you didn't catch all of it, just be sure to check the show notes, either in your podcast app or at our website. And if you're trying to find on Google Maps where that meetup was and you can't get it from Bob's pronunciation, we'll just link it in the show notes, folks. That's the easiest way to do it. With all the announcements out of the way, though, I am extremely excited to introduce briefly our three guests, and then we'll throw it over to them to introduce themselves. I'll go in the following order. First is Steve Wilcockson. He is the technical marketer at Quantexa. [02] At least that's his latest position. Next is David Bergstein, who is the VP of product at Posh AI. And finally is Gareth Thomas, the co-founder of VersionBay and also a fellow podcaster. I'm not sure, that might be the first time one of our guests has their own podcast. I might be misremembering, but it could be a first, folks. And I think we will throw it to them to introduce themselves in a little bit more detail. But my guess is that they all first met, or a lot of their backstory has to do with the fact, that they all worked at MathWorks, which is the company that famously works on the MATLAB language. So the dates of those in order: Steve worked there from 2009 to 2018, David from 2013 to 2020, and Gareth from 2009 to 2018 as well, the same timeline as Steve. So my guess is there was some overlap, and hopefully they were colleagues who worked together. I could be completely wrong. They might have been on opposite teams, never talked to each other, and they just happened to be on this podcast for a completely different reason. But maybe in order, we'll go Steve, David, Gareth. You can each expand on your sort of background, and then we'll go from there. We'll talk about MathWorks, MATLAB, and see where the conversation takes us.

00:07:37 [Steve Wilcockson]

All right. Well, thanks for the intro, Conor. And it's a pleasure to be here today. I'm Steve. I've actually been with MathWorks indirectly since 1996. So it was my first job after leaving college. And somewhere along the line, I picked up Dave and Gareth along the way. And we'll talk about that. I've always been on the marketing side. I was a financial services industry lead. And that was where I came across q for the first time, because people would back test in MATLAB. They'd develop their trading strategies in MATLAB. And then q was better, faster, quicker, more likely to be used in production. So it turns out that I've at least referred on three customers to q and KX during my time there at MathWorks. I worked at MathWorks for 20-odd years. Despite working with Dave and Gareth, I decided to leave. And I was very sad about doing so at the time. I went to work for a JVM specialist called Azul Systems. And more recently, I worked for KX, which is how I got to know some of the other reprobates on this call. Thank you very much.

00:08:45 [Dave Bergstein]

Cool. I'm Dave Bergstein. I feel like I need to introduce my language of choice, right? So I'm a MATLAB enthusiast, I'll say, but I spend plenty of time in Python. And yeah, going back, I got my start in engineering, electrical engineering. I worked in circuit design for a while. And I did use MATLAB back then, just doing simple math that had to be done. It was a convenient tool and seemed always available where I was working. I went back and got my PhD in electrical engineering, studying optics. And in that time, I did a little bit of Fortran, lots of MATLAB. Really, I guess, started my strong affection for MATLAB back then and the time I spent using it during my PhD. Yeah, went out into engineering, became an engineering manager, building optical systems. Then I moved to MathWorks. I had used MATLAB so much at that point, flipped back and forth between MATLAB and Python, and it just seemed like a natural fit, and I got to know Gareth and Steve. Coming out of MathWorks, so at MathWorks, I was principal product manager for MATLAB itself. And then beyond that, I went to three different startups, all in the AI space. The first one was Tesseract, doing medical devices, using AI to diagnose disease. The next one was Pinecone, where we built a vector database, a key component of many of the modern AI systems. And now I'm at Posh, where we build chatbots for banking, again, using a vector database and lots of AI. But yeah, that's it for me.

00:10:27 [GT]

So, hello everyone. My name is Gareth Thomas. I'm actually Portuguese by nationality, born in South Africa, lived in seven different countries, and I am an electrical engineer by trade, so I studied control theory. And that's actually where I first got exposed to MATLAB. So I was the MATLAB guy at university, and I was super happy about it. Worked in a couple of countries and jobs, but I spent a large part of my career at the MathWorks. So I was there just short of 10 years, and I joined as what they call an application engineer. So I was focusing on code generation. So this is taking MATLAB algorithms and Simulink blocks and putting them on embedded devices. So they call it production code generation, widely used in the automotive and aerospace industry. And then I moved on to V&V, formal verification with the tools. And as I grew in my career at the MathWorks, I thought, well, maybe the biggest bang for the buck is to help influence the strategy MathWorks takes with academia. So I was a big fan of getting MATLAB in front of all the smart people around the planet, quicker, faster, and easier. And that was kind of my role, where I kind of turned on the accelerator, if you will, for the education space that MathWorks is pretty strong in. But, you know, after nearly 10 years at the company, you say, OK, so what's next? Then I decided to create my own company called VersionBay. And we're a software consultancy company that focuses on helping people migrate from older versions of MATLAB to newer ones, but also from MATLAB to other languages. And more recently, we've also been developing products around measuring the usage of scientific and computing tools. As a side gig, as Conor mentioned, I'm also a podcaster myself, so I enjoy amplifying stories. And a fun fact about myself: living in the Netherlands, married to an Italian, with kids who speak better Dutch than I do. And I am very passionate about technical scientific computing. So given that, Conor, I don't know where you want to take us with this conversation. We can go in multiple dimensions.

00:12:18 [CH]

Well, you know, as a couple of people remarked pre-podcast, you know, I've got a thousand questions already. Well, I think Marshall said I should say a thousand and seven for some reason. But, Bob?

00:12:27 [BT]

Well, one thing, we're going to break precedent here, because when we did the pre-show, Gareth said he has an announcement.

00:12:34 [CH]

Oh, yeah. We can throw it right back to Gareth. Also, you can't do that thing I've heard on a couple of podcasts, where someone says they have a podcast but then they don't plug it. You got to say the name. I'm sure you can find it on all the apps, but say the announcement, then also plug your podcast, because if you're talking about engineering and computer science related things, listeners of this podcast might be interested as well.

00:12:55 [GT]

Yeah. So thanks for the shout out. So my podcast is Inspiring Computing. And the idea is to talk about where computation meets the real world, right? So where people in the computation space use MATLAB, Python and Julia, just to share stories of how they use that on a daily basis. So that's Inspiring Computing. You can check it out in all the usual places. But maybe what is more relevant is the announcement that I'm also co-hosting, co-organizing the biggest Julia conference in the world. So last year it was at MIT. This year, it's going to be in Eindhoven. It's actually going to be in a football stadium. So we're renting a football stadium for three days. We've got 152 speakers. We've got daycare. We've got prayer rooms. It's an event not to be missed. So if you ever want to come and hang out with a whole bunch of Julia fans, come to Eindhoven. That is the 10th, 11th and 12th of July. Tickets are for sale. Sorry, there are no more speaker slots available, but it's also co-hosted with PyData. So there's going to be a mini PyData there as well. So if you're a Python fan like Dave, you can also submit your proposals and we'll take them into account. But it's a big event at a football stadium.

00:13:56 [CH]

And in the Netherlands, you said?

00:13:59 [GT]

Yeah, in the Netherlands. So Eindhoven is the fifth largest city of the Netherlands, but you know, for the folks in North America, it's an hour and a bit away from Schiphol, the Amsterdam airport. So, you know, in some cities in the US it takes an hour to cross the city, while in the Netherlands you can go across almost the whole country. So the trains are really good.

00:14:16 [GT]

So the city is Eindhoven, but think of it as an extension of Amsterdam, if you will, from a transportation perspective.

00:14:21 [CH]

Yeah, I don't want to lean too hard on my geopolitical knowledge, you know; I could be wrong about this. But I do recall learning some fact about the Netherlands, because my grandparents are from the Netherlands, and so I like to think I know a bit about the country even though I've never been. I think they have a greater population than Canada, even though they're a fraction of the size. I could be wrong. I think Canada's like 37 million. We're the second biggest country in the world by area, but I'm not sure. Is it true that the Netherlands has more?

00:14:51 [GT]

No, so the Netherlands is about 16 million. But maybe the fun fact is there are more bicycles than humans in the country. So they've got about 17 million bikes, where they've got 16 million people. And that's how you get around the problem of people stealing bikes: just have more, flood the market.

00:15:06 [CH]

All right, well, so I was off by a factor of two. But still, if you look on a map at how small the Netherlands is, it's quite sad, because if Canada had that population density, it would have, I don't know, billions. Anyway, so yes, the first question I think should be the story of how all of you met. Steve, you mentioned that you picked Gareth and Dave up along the way at some point. I'm not sure if you want to maybe color that in a bit more. And I also apologize, I missed that you were there at the MathWorks for 20 years, because you have it listed under MathWorks Ltd versus MathWorks, and it had a different logo. So I'm not sure if there was like a corporate change or something like that that happened that caused me to miss it. But 20 years. And at the halfway point, it looks like that's when, you know ... I don't know if you were cherry picked; I'll let you tell the story of how you three came to work at the MathWorks.

00:15:53 [SW]

Yeah, it's very boring, but I'll do my best. No. So I, at that point which you picked out, 2009, I moved into what we call industry marketing for MathWorks Inc. That meant that, you know, I was very much part of the corporate hub, which is based out of Boston in Massachusetts. And while I worked remotely in the UK, I was very much reporting into the Boston core. And I was concerned with strategic questions. So for example, how do we develop the financial services sector, for instance? Which areas do we move into? What functionality do we need to bring out? All that sort of stuff. So think back, this is like 2009. Before that, I'd worked very locally in sort of the Cambridge domain on local activities, primarily sales and support. So there I was, working on the Natick side, 2009. What we see is we see the rise of open source technologies coming along on the way. We see R coming on first. [03] Then we see Python coming up. Wes McKinney has been working at, I'm trying to remember the name of the hedge fund now, was it AQR, where he was based? AQR, yeah. He's authoring pandas right at that moment in time and is taking it into production. It's an exciting time as well, because, you know, the world of high frequency trading, which I think many of us on this call have been affiliated with, is changing very fast because of all the regulations coming in. The world is becoming risk management oriented. It's an exciting time computationally and in the financial services discipline. So in terms of Dave, I'll pick on Dave first. Dave was the MATLAB product manager right at the time when Python was really gathering momentum. And Dave himself was a Python guy. And I was really enthused by what I saw coming up from the Python community. So I kind of latched onto Dave, pretended I was his best friend for a bit. And, you know, he kind of let me tag along. Gareth was doing some really cool stuff. He talked about code generation and he talked about taking MATLAB very much into production, primarily on hardware. He was also looking at cool stuff with simulating big insurance portfolios for insurance providers, for example. And we also kind of co-worked a little bit on some of the academic stuff. The academic space matters. If you teach people to use MATLAB when they're younger and enthuse them about its joys, then it makes it easier. And Gareth and I bonded a lot over that. And Gareth is just a crazy guy and he's fun to hang around with, as is Dave. So I'll try and meet up with them as much as I can whenever I'm traveling, but that doesn't happen very often. I think I saw Dave, what, three or four years ago back in New York, and that was a good time. So it's nice to be hanging out with everyone today.

00:18:48 [CH]

And so Python was brought up in this sort of 2008 to 2012 range, and definitely that was a massive inflection point. A lot of people don't realize, because Python, depending on the ranking you look at, is the number one or number two language in the world, but it was created, I think, in the early 1990s. It's a language that is, I think, potentially even older than Java. Java was, I think, the late nineties, and Python was before that. But people think of Python as sort of a more modern language. It's your first choice if you're doing data science stuff. And so maybe, I'm not sure who wants to take this, but what was it like at this time, Python versus MATLAB? Because pandas and NumPy, both sort of array-inspired libraries that have become absolutely massive, are definitely competing for a lot of the same kinds of applications. So at that time, you know, MATLAB, I'm not sure if it's viewed as having a way bigger piece of the pie, but then Python is sort of slowly building. Like, at the time, what was the feeling at the MathWorks? Were people excited about Python, or were they worried that Python's going to come in and eat everybody's lunch?

00:20:12 [GT]

So that's a loaded question, Conor. Okay, so I'll give you a different perspective, maybe more from my personal side. So I got hooked on MATLAB at university. So I believe that if you present the right tools to smart people, they latch onto them very quickly, and then that serves everybody well moving forward. So MathWorks has always been very strong in academia. They had lots of good things. And then along comes Python. So my role at the MathWorks, at the time when Python was becoming more of a thing, was to kind of say, well, hey, you know, in academia there are more and more groups, maybe in the natural sciences, maybe in astronomy, which are starting to use Python. So these are adjacent fields to where MathWorks is typically very strong, even though they had a presence there, including finance, potentially. And the idea is, well, I saw it in academia, saying, oh, this is a flagpole that there's something coming. The same could be said for Julia as well, right? But then Python kind of exploded. And I think at the time, the concern was, are we losing more people to Python than we are increasing the MATLAB pie? And actually it turned out that we could accelerate the number of new users of MathWorks tools faster than the Python ecosystem could grow. And the MathWorks ecosystem has always been a very controlled growth. So I don't think MathWorks has ever aimed for exponential growth. It's always been more about a slow, steady growth, because to run a company, if you have influxes too quickly, that actually can be problematic. So the DNA of a company can change if you triple the headcount from one day to another. So I think the history of the MathWorks has always been a very controlled, organized growth plan. And they've held that path very steady for the last 35 years or so. So even with the introduction of Python, I don't think it was like, oh no, the world is ending. We were like, yes, there's another technology on the block. Cool. We need to pay attention to it. And I think people such as Dave and Steve were like flagpoles saying, oh, there's some cool things to be learned. And they would take that back into development. And from there, you know, decisions would be taken, to the point that MATLAB has tables as well, very similar to the pandas and NumPy approaches. There's also lazy loading and lazy tables that people are not so much aware of, with the tall arrays in MATLAB. So there's a lot of technology in MathWorks that people are not aware of. And I think MathWorks tries to keep an eye on it and bring the best from different ecosystems into their ecosystem, just like any other language.

00:22:32 [SW]

That's a great answer. Sorry, Dave, you go.

00:22:35 [DB]

I guess I was going to say also a great answer. Thinking about what maybe changed, right? I do think the introduction of Matplotlib was quite significant. So it was first released in the early 2000s, I guess. But it didn't really pick up steam and wasn't really as capable until you get to the kind of time era you're saying, like 2007, 2008, 2009. And what was critical about that was that, for scientific computing and a lot of tasks, you didn't have a way to visualize easily. And MATLAB had wonderful visualizations. And so I feel like they weren't really even said in the same breath, maybe, you know, before, even though Python had been around for so long, and even though NumPy had been around then for some time as an array-based language or, sorry, array-based package. The point is, without a way to visualize easily, I don't know that it was catching on. But Matplotlib came and it made plotting and visualization just like you would do in MATLAB. And interestingly, since we're thinking about languages, I've had people, before I joined MathWorks, ask me: why is it that in Matplotlib you index starting at one, and yet Python is indexed starting at zero? Right. But that's because it was such a copy of MATLAB graphics. It was just one for one. They made it as close as they possibly could. So all the syntax was the same, even indexing: even though in Python you should index from zero, it indexed from one like MATLAB. And that, I think, turned the tide. And then you tie into that Wes McKinney and pandas, which came around at that same time. To me, that's what changed. But, you know, I also think that it did encourage some good movement. So, for instance, Gareth mentioned table and tall and a bunch of features. So when I joined, so let's say 2012, table had already been introduced, but the speed of development, I think, was slow. They had all these plans at MathWorks for introducing all kinds of wonderful capability that would compete with pandas and make it really easy to manipulate tables of data. I mean, pandas, I felt, was just such a runaway success in how you manipulate tabular data. And there were all these ideas. And then I think that was really kind of a push to get that stuff going faster. And in that way, I think it was a positive push for us at MathWorks to put more emphasis into table, tall, and all the things that came after, that I feel compete very well with pandas. Anyway, I know, Steve, you were going to say something.
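A minimal Python sketch of the 1-based convention Dave describes (illustrative only, not from the episode): Matplotlib's MATLAB-style interface numbers subplot panels from 1, even though Python itself indexes from 0. The plotted data is made up.

    import matplotlib.pyplot as plt
    import numpy as np

    x = np.linspace(0, 2 * np.pi, 100)

    # Panels are numbered 1..N, following MATLAB, not 0..N-1 as Python lists are.
    plt.subplot(2, 1, 1)
    plt.plot(x, np.sin(x))

    plt.subplot(2, 1, 2)
    plt.plot(x, np.cos(x))

    plt.show()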

00:25:37 [SW]

I was going to pick on the same points. Matplotlib, very similar to MATLAB. Coincidence? Not at all. Pandas was new. I think, looking back on what Gareth said about steady growth of the MathWorks and not rushing into things, the Python world at the time, on the back of pandas, was exploding. And you had just that culture of open source, everyone contributing into it. And pandas just came about very, very quickly. And then it became a subject of the meetups and it just gathered a momentum of its own. But I think the upshot of that, which I think has taken Python to being top of, I hope I say this right, the TIOBE charts, was the pandas and the NumPy and the SciPy stack being the heart and soul of the next wave of data science that was coming in at the Facebooks of this world, the CSPs, the cloud service providers, and Silicon Valley. And I think MathWorks took a deliberate decision to say, we are engineering; we're not into that sort of space. But then Python sort of came along and really kind of grabbed that mindset in that deep data science domain. And now Python is number one, overtaken Java, primarily on the back of that reason, in my view.

00:26:52 [GT]

But there are a couple of things to unpack there, right, Steve and Dave. So you could argue that the technology is actually not superior. So NumPy and its strings, it's not the most performant. So typically technical folk, they like the idea of the best tool for the right job. And that's what it is. But there's an emotional feeling of community that also coincided with open source, of being part of a community. So I think historically, folk working in the MathWorks ecosystem feel it's a closed ecosystem in comparison to maybe other communities. And that's another thing that played a role. I mean, even pandas, right? So Polars came along and then, boom, it's orders of magnitude faster. So it's not that the technology is outstanding. It was just easy to use. And I think there was a timing element, but also a sense of community, which I think is underestimated in language adoption. It's how welcome are you to join your peers? And I think you folks here on the ArrayCast are actually doing a really good job of making sure that folk feel welcome. So that's a big part of it.

00:27:50 [SW]

Just to pick on that theme, I had dinner with a friend who works at a hedge fund; long time MATLAB user. [He] moved into using Python because all the students came in and used Python. He's our age; he's sort of in his 50s. But he says: "by God, I wish I could just use MATLAB again; so much easier just to load the data and I don't have to worry about all this pandas stuff." Yeah, it's fascinating times.

00:28:16 [CH]

I think this is a great way to segue into one of the things I know that Stephen had articulated at one point and I as well. I'm curious to hear from all three of you. We talked a little bit about the libraries and sort of the rise of pandas and whatnot. But this kind of library focus ... maybe we can go back a few years to the beginning of MATLAB and the MathWorks and talk about sort of the fundamental differences between a language like Python (and even Julia and R) and what MATLAB offers that causes your colleague that you just mentioned to miss the simplicity or the ease of loading data and doing stuff. Every language has its different superpowers and things that it makes easy. Maybe if you want to tell a little bit of the genesis of the MathWorks and MATLAB and what problems it was trying to solve and why it leads people to still want to reach for it, if they had used it in the past and now have to do something. Because I myself, I do a lot of Python programming and it is a daily occurrence where my heart, my soul hurts for having to spell something a certain way. You always get the job done, but it's never as elegant as the array way of doing things. And even in NumPy (which is array inspired) it's still very clunky; you're always passing things in. Anyways, I'll throw it ... I'm not sure once again which one of you wants to take it or we want to do kind of a round robin of giving opinions of what makes MATLAB great and the origin story.

00:29:40 [GT]

Go ahead, Steve. You're the first to the MathWorks, so you take the lead and then Dave and I will correct you along the way [laughs].

00:29:43 [SW]

Yeah, as always, Gareth, nothing changes. I think there's two things you picked on there, Conor, to unpack. There's the origin story, and part of that (perhaps we'll come on to this) is how it intersects or doesn't with APL and affiliated languages. And then there's how it's changed over the years. MATLAB, when it was first unleashed (I think it was 1981) was very much a matrix, linear equation, solving system. Solving matrix linear equations; eigenvalue problems. It was a matrix algebra calculator, if you will, invented by a guy called Cleve Moler. It was based on some work done by a range of characters, James Wilkinson, various other key characters from the '50s and the '60s, who I know intersected with APL (Kenneth Iverson), in terms of the APL languages, [04] where someone built an ALGOL implementation around matrix algebra libraries. Wilkinson built some Fortran subroutines out of it. I think it was called EISPACK. Then Cleve rewrote it as LINPACK, and that became the basis of the original thing, MATLAB. So that was the origin story, just pure and simply, matrix algebra, which happened to find a bunch of domain spaces, linear equations being the amazing one, and Gareth can talk a lot about that, et cetera, and Dave. But then over time, it's evolved. I think a phrase that the CEO would use to describe MATLAB was [it] was a Ferrari at your fingertips, where you not only have the matrix algebra at your fingertips, but you have the ability to call it, to integrate it into use cases, to build it into graphical user interfaces, to deploy it where it's needed. All in one single, speedy, performant platform. I'll just leave it at that, and I'll let the other guys chime in, but that drove a whole heap of developments over the ensuing 20 or 30 years, which made MATLAB the excellent all-round tool that it is today.

00:32:03 [GT]

Yeah. So if you think of MATLAB, it stands for "Matrix Laboratory". So that's kind of the dead giveaway of it. But in the meantime, it has grown into many other products, so the MathWorks company sells over 130 different toolboxes or blocksets, if you will. So they've kind of grown into different domains. But I think if you go back in time, at least for me, when people would ask me: "why is MATLAB so good?", I would always bring it down to five things. When I got hooked, it was documentation: outstanding. The ease of making plots: super easy, super great. The ability to create apps (when I say apps, like a user interface): in the old days, GUIDE was ahead of its time. Then there was a whole wide selection of mathematical libraries at your fingertips, so [for] anything that you wanted to do with math, it was there in any domain. And that was just super convenient. The alternatives just weren't there. And then the last thing which was really good was this idea of task automation. So being able, from the command prompt, to glue things together; run things. It was empowering domain experts (who had a very weak or different background [from] computer scientists) to do things that they never imagined possible. And I think in the early days, where MathWorks started growing, the pitch was never to buy MathWorks or MATLAB's tools. It was like: "you should be using computers [for] getting insights". So the third pillar of scientific computing, if you will. You have theory; you also have your experiments; and then this third pillar of computation was appearing. And that's where MathWorks kind of grew. Now computers were on the horizon and people could actually use them for calculations on more than a 2x2 matrix, which opened up a whole wide variety of applications.

00:33:45 [DB]

Yeah, I guess I'll chime in, maybe adding to everything we heard. I think from my perspective, like you said, why is it a good language? To me, it turned around to: what is the task? Are we building a website? Are we building a database? Or are we doing computations, engineering calculations, engineering control systems, solving engineering problems? And I think that [for] solving engineering problems, it's remarkable. And I'm preaching to the choir here: how much array-based thinking, right? And matrices, and how you can cast the problems this way. It is natural and it makes sense. I think Cleve and others have said: "we like to think in math". Some of us, right? The engineering community. It's the way we think about problems in some sense, and so we want to express them that way. And so to me, that is the foundation of MATLAB: [it] is the fact that it lets domain experts express what they're thinking and the calculations and the designs and the control systems that they want in a way that's more natural to them. Maybe it's different than if you were building a website or building a database. And I think, though, in time, as the world of "software eats the world" (the famous saying), it makes sense that MATLAB grew more and more software, I mean, more and more general programming language capability on top of it. That just made sense as you wanted to integrate and do more things with MATLAB, to the point that now MATLAB in some sense competes with a general purpose language, although that's not its intent. That's still not its intent, right? It's designed for engineers and scientists and it targets that audience, whereas Python is more general purpose. And I think Python, just to draw the comparison, is where you bring in call outs. It's like a glue. If you go back to its foundation, in many ways it's that you're calling out to things like NumPy that bring you the capabilities you want, but because they're bolted on, they never give you quite that expressiveness. And to me, if there's a continuum there, I would say APL seems, you know, further on the side of MATLAB and even more so, right? [chuckles] Very domain specific, very rigid, if you will, in how you want to think about problems and cast problems and express them. MATLAB I see is more general, obviously, than APL, and Python is even more general than MATLAB.
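To make the "bolted-on" contrast concrete, here is a rough Python/NumPy sketch (illustrative only; the numbers are made up): solving a small linear system goes through a library call, while the MATLAB equivalent, noted in the comment, is a single operator that reads closer to the math.

    import numpy as np

    # Solve A x = b for x.
    A = np.array([[3.0, 1.0],
                  [1.0, 2.0]])
    b = np.array([9.0, 8.0])

    # MATLAB writes this step as an operator: x = A \ b
    # NumPy reaches into a bolted-on submodule instead:
    x = np.linalg.solve(A, b)

    print(x)  # [2. 3.]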

00:36:20 [CH]

Well, now that APL has come up, we have to ask, as folks that (combined) are closing in on half a century of working at the MathWorks: was APL a topic that the CEO ever talked about? Were there folks inside the company that had little APL interpreters, that they were on the side comparing the way calculations work? Was it acknowledged? Because I know I've seen online in places that MATLAB definitely borrowed some ideas, but was that a thing of the past? What was the presence, if we can call them that, of the old school array languages at the company MathWorks?

00:36:59 [SW]

I've looked a little bit into this and it's a fascinating domain. Iverson was, of course, [in the] 50s when he was building out the APL style, [05] which descended into all the other languages that we know and love. MATLAB came along in the 60s. At least the founder, Cleve Moler, came along in the 60s, leading to the foundation of MATLAB in the 70s and 80s. I think what I find interesting is that Moler was very much centered on the West Coast. He was out of Stanford; he was at Caltech; he was a professor at New Mexico, where he kind of honed some of his teaching skills around matrices. He was very much affiliated, or his professors were affiliated, with the INA (the Institute for Numerical Analysis). I hope I've got that right. So I did a bit of digging because I was curious about this. When I joined KX, I wanted to know how q and MATLAB interrelated. And I did come to the conclusion that they were cousins. So Ken Iverson went over and did a semester at the INA. This was pre-Cleve, but he was working with Cleve's supervisors from college and university. But then he went back, and it strikes me that you kind of had two separate domains. You have one on the West Coast, which is very free-thinking, expressive. And I think that fits Moler's approach. He designed MATLAB to be an expressive language. Whereas then you've got Ken Iverson over on the East Coast with all the gang there, working very much with the big firms. That was kind of how I saw it. But there is a case, and it's online. I think there's a Georgia Tech dinner where Cleve actually talks about when he met Kenneth. And I'll just read it here: "Iverson showed me J" (this is Cleve's words apparently recorded by a guy at Georgia Tech). "I wanted MATLAB to be understood by normal people", he said. He said to me (that's Iverson) that someone once converted a program he'd written in MATLAB into APL. "I asked what that was. They told me: 'that's your program'. I couldn't recognize it. APL is about being uniform about everything, but MATLAB is a mishmash of all kinds of things".

00:39:26 [SW]

So there was some overlap and there was certainly indirect overlaps because the supervising professors were the similar people. But I don't know, it feels to me like cousins rather than brothers and sisters.

00:39:38 [DB]

Maybe I can add on to that. We should also mention the CEO is Jack Little. Jack and Cleve started the company together and while Cleve is not as directly involved day to day, Jack very much is in everything, every language decision. I want to say that they really take a strong focus on usability toward their audience of engineers and scientists, so much so that there is a notion that they don't need to take so much inspiration [chuckles] from elsewhere. You know, just saying it, right? They really want to meet their users and provide what is going to be most useful. And part of that is having a solid mental model of how the language works and how you express it and consistency. But they're also not afraid to bend that a little bit in the interest of helping our domain experts get their job done, right? And to that extent, in terms of how APL is mentioned (I don't mean this in a negative way) but I think sometimes there's a sense that APL is cryptic. I'm sure you've heard before [chuckles]. And so while there's good inspiration; it has a strong mental model, right? My sense has been at times when it does come up and people do consider a pattern, it's not that we should take it at all per se. It's more that that's interesting, but now what can we do for our users? And I think to Steve's point, there's no doubt some crossover back in the original beginnings. But by the time I joined, I would say that, on the language side, they had a strong UX culture, right? Even looking at pandas, I mean, I can give examples where I was looking at things that looked easy in pandas, but just because something looked easy in pandas didn't mean that we wanted to do it, right? We always wanted to start (and you could argue with this mentality that we should take inspiration from things around us) ... [sentence left incomplete]. But I think it's true that the culture was such that we have a good base of our users. We know who they are. We have UX experts and we want to decide for ourselves what is best. And again, to the point that some things could be bent at times, and I'll even give an example and I might butcher this, right? But maybe you can bear with me. So when we were thinking about pandas ... or I shouldn't say pandas! Like I just said, we're focused on our users and what we want to accomplish. But we have all these people saying: "okay, basically tables with named rows and columns are extremely helpful". Okay. We have table and then within a table (now maybe I'm butchering this a little bit because it was a little while ago) but you can imagine that each datetime in an array should be its own object. There's no need to have datetime as an array of numbers: Datetime is an object and you have an array of datetimes, right? There was a point at which we bent that a little bit because, we realized that people could get themselves into trouble fast because a datetime can carry a time zone. And so the last thing you want is like a table of data where one element in a column accidentally has the wrong time zone, right? We decided that that was actually a bad possibility and we don't want to enable that or allow that. We want to do everything we can to protect our audience. But this butted up against the principle that those datetimes were each their own object and we shouldn't be able to enforce (does that make sense?); we shouldn't be able to enforce the time zone on all of them. 
But we decided to bend that because we felt that technically they're supposed to appear like their own objects, but actually we're just going to make them one array where we say: "actually it's one big datetime with many values in it". And that's because we don't actually want to allow them to have the flexibility to each have their own time zone, even though the mental model of the language would suggest as such. But that was a decision that MathWorks makes in meeting its users and understanding its users and realizing like: "oh my gosh, that's really a bad problem that can happen other places where you accidentally have one time zone that's different". I don't know if that helps, and I might've butchered it a little bit.
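The MATLAB internals Dave describes aren't shown here, but a small pandas sketch (an analogy only, not MathWorks' implementation) illustrates the hazard he is pointing at: per-element datetime objects let mixed time zones slip into one column, whereas a single time-zone-aware datetime array enforces one zone for all values.

    import pandas as pd

    # Two wall-clock-identical timestamps carrying different time zones.
    a = pd.Timestamp("2024-06-01 09:00", tz="US/Eastern")
    b = pd.Timestamp("2024-06-01 09:00", tz="Europe/Amsterdam")

    # Treated as independent objects, the mixed zones slip in silently,
    # and pandas falls back to a generic object column.
    mixed = pd.Series([a, b])
    print(mixed.dtype)    # object

    # Treated as one "big datetime" array, a single zone is enforced,
    # closer to the design decision described above.
    uniform = pd.Series(pd.to_datetime([a, b], utc=True))
    print(uniform.dtype)  # datetime64[ns, UTC]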

00:44:17 [GT]

Well, so maybe we can continue the story a little bit: so there's tables, there's timetables, event tables. So MathWorks at the time is kind of creating different objects to cater for that. But I think, maybe to go back to continuing the story where Steve started, if you keep zooming forward to how the MathWorks has iterated towards catering for how people think, there's this powerful thing called Simulink, [06] right? So the control engineers like blocks. So then eventually MathWorks started a new language on top of MATLAB. And then that continued: they created Stateflow, which is like state diagrams on top of Simulink. And then they said: "well, mechanical folk think in mechanical systems like joints; electrical circuits". Then they created another language, Simscape. So the MathWorks ecosystem has been slowly but surely building additional languages on top of it, right? And historically, MATLAB never had object oriented programming, but then they morphed it also to have objects. And then they created things like System objects to kind of streamline development between the different teams. So MathWorks has kind of been slowly but surely, I think, adding different languages and paradigms, not necessarily spawned from APL per se, but more from what their users want and how engineers think, right? So many engineers think in blocks and system architects and variants, towards the automotive and aerospace industry. And I think there's one thing that is really interesting [in] what they did (correct me if I'm wrong, Dave): [for] a lot of the development at MathWorks, they kind of think about these four large categories of people. There are newbies, who are people new to the language. There are casual users, who use it every now and again. Then you have your proficient users, who use it on a daily basis. And then you have developers. So these are folk building tools in the language for others to benefit from, right? So there's this very conscious sort of separating of the users and figuring out which features cater for which group and where the emphasis should or should not go. I think there's a fundamental difference from maybe Python and Julia, who have kind of spawned out of taking the best from MathWorks and also using it, right? In particular, Julia did that very clearly. MATLAB tends to help the newbies who are domain experts, but I don't think they tend to focus so much on developers, whereas maybe [with] the Python ecosystem and Julia, their entry point has always been empowering people to build packages and share them. And that's a bit of the open source philosophy. And that attracts a very different type of person to grow a community, right? And there's something to be said about this community of empowering the world to develop for you, versus you taking control of it, you controlling exactly what gets developed and how to make the decisions, which are a bit different philosophies. I don't know, if Jack were starting today, whether he would have done it the same way. I don't know. But it is interesting that there's this conscious decision, I think, of separating users, which I think more languages would benefit from, right? So I would even ask the folk in the APL crew, where does APL fit, right? Who gets attracted to it? It's not your average person who feels the magic of APL, right? You have to have a certain background, and once it clicks, it clicks and it's beautiful, but not everyone can do that [chuckles].

00:47:23 [ML]

Well, I think the idea that MATLAB meets the users where they are is, well, first, a really good illustration of MATLAB, but it also makes a good contrast to APL, because that was definitely never Iverson's idea. What he wanted to do from the start was rethink mathematical notation and have a different way to teach ideas. He was really into teaching things with APL. He wrote several textbooks that were using APL to teach different topics. So what he wanted to do instead was to get at the foundations and provide a way to write math as a whole that was better for, you could say programming, but also for generalization and that sort of thing. So what Iverson really wanted out of it was to give people a better way to work with these systems. And that really requires someone, if you're not going to, as Iverson surely would have wanted, replace the whole math used in the education system with APL or J. If you're not going to do that, that requires someone who's really ready to throw it all out and rethink their learning process. So a lot of it is people who are interested, not in any particular application, but in the programming aspect of it: in the way we express things. But there are also a lot of people who are experts in some domain who have come to APL. It's not really as cryptic as it seems at first. So a lot of people are able to find APL. But now, I mean, I think with so many good programming languages out there that you can pick up right from the mathematical notation you've been taught, there's a lot less people who are willing to see APL as a solution to their problems as opposed to a tool that they would like to learn.

00:49:15 [DB]

Maybe I could comment (and by the way, I'm more interested than ever to learn a little bit of APL and I started tinkering so just wanted to put that out there). And I do think, thinking about philosophies, Jack had often said that he wanted it to be like, you could look at an engineering textbook and the math that was expressed as it was on the page would be the way that you would express it in MATLAB.

00:49:40 [ML]

That's exactly what Iverson wanted with APL! Just he wanted to change the textbook!

00:49:46 [DB]

[Laughing] Yeah, exactly. So that's the part I was getting to, because you were saying, to change the textbook. And you know, I also think it's funny that, well, that was an inspiration, right, of sorts, but to take the textbooks as they were. It's not uncommon that you find MATLAB notation in textbooks that aren't referring to MATLAB, right? Like, I mean, there are little bits of MATLAB notation that I think have leaked out into the engineering space that weren't actually part of math notation. So many people became familiar with MATLAB. So I think they inform each other.

00:50:22 [AB]

Can you give an example?

00:50:24 [DB]

Trying to think, like, I swear I've seen like "ones" mentioned. Also, I've seen the ".*" where MATLAB wasn't mentioned.

00:50:34 [ML]

Oh, for element-wise product, yeah.

00:50:36 [DB]

Yeah, for element-wise. And I'm like: "that's MATLAB notation; that's not math, right?" [laughs] Like, it will come to me. I've seen a couple. And what I'm referring to is very slight. You know, just a subtle thing, like somebody just put a dot to mean it was element-wise, and my thinking is that that's not math, right? Yeah, that's an example, I guess.
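For readers who haven't seen it, a tiny NumPy sketch (illustrative only) of the distinction the ".*" notation encodes; the MATLAB spellings are noted in the comments.

    import numpy as np

    A = np.array([[1, 2],
                  [3, 4]])
    B = np.array([[10, 20],
                  [30, 40]])

    print(A @ B)  # matrix product       (MATLAB: A * B)
    print(A * B)  # element-wise product (MATLAB: A .* B)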

00:50:59 [SW]

I think, picking up on the rewriting of the maths books (textbooks), Cleve Moler made the point about APL. It's captured in that same blog about Mathematica and Stephen Wolfram, [07] perhaps being closer to the success of APL than perhaps MATLAB is. Don't know, throwing it out there.

00:51:18 [CH]

Stephen?

00:51:19 [ST]

Yeah, well, when Iverson called his notation APL, he was already working at IBM, and they picked a name which was taken from the one book he'd written about it, and which sounded like a programming language, APL (I think PL/I was the dominant programming language at IBM in those days). But Iverson's intention, as Marshall was saying, was clearly to reform the teaching of mathematics, the way people thought about it. And an early candidate for a name for what had hitherto been known as Iverson notation was simply Iverson's "Better Math".

00:52:03 [AB]

Yeah, but people were afraid they would get in trouble with the corporate ... [sentence left incomplete]

00:52:05 [CH]

I had not heard that.

00:52:06 [AB]

Yeah, initialism.

00:52:08 [ST]

That's why you have an old guy on the podcast.

00:52:10 [ML]

[chuckles] The statement I remember about it is: "this was deemed facetious". [everyone laughs]

00:52:16 [CH]

Yeah, I mean, if you drink the Kool-Aid (which of course, at least all the panelists have; I think I can speak for all the panelists on the show), it's undeniable that the Iverson notation was more uniform, more regular; it's better. Like, I don't know how many times on this podcast and other talks I've given, I go on my little diatribe of: we arbitrarily stopped with binary infix operators. There's plus, minus, divides, times. If you wanna talk about divides, there's like seven different ways to spell that, you know, X over Y; you've got the slash; you've got the divides. Let's just choose one; let's choose a lane. But then we stopped there, like min and max; two other very, very common binary operations. But in almost all languages, except for the APLs and BQNs and descendants of those, those are prefix functions with names: min, max. And then you get into this problem in programming languages like Haskell: what do you call the reduction that is a min reduction and a max reduction? In every language it is different. In Haskell, it's "maximum" and "minimum" for the reductions and "min" and "max" for the binary operations. You've got it in C++: it's min_element and max_element. And then min and max are functions in like a functional header, or maybe they're actually algorithms, and you get into overloading problems. It's just a headache, but you see in a language like APL, that is just another very common binary operation and you give it a symbol. It's like: "well, yeah, of course that makes more sense"; like, why did we stop? We just arbitrarily stopped. Who in history decided that [we'll have] plus, times, minus [and] divides, but then min and max don't make it. And [for] exponentiation, we're gonna do something different. We're not gonna give it a name; we're gonna do superscript. But subscript? Well, that won't be anything. It doesn't make any sense. And in my opinion, like I said, we all drank the Kool-Aid, but when you see what Iverson saw, like it was arbitrary decisions and he said: "I could probably do better" and make everything consistent: binary operations infix and unary operations prefix. It is so much more beautiful, so much more elegant, but for some reason it's just ... I don't know. Maybe it'll take another 200 years and we won't all be here (well, you never know; cryogenic freezing, could happen). But anyways [chuckles], I saw Dave unmuted but Adám put up his hand. We'll go to Adám and then maybe people can respond.
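A small NumPy illustration of the naming split Conor is complaining about (a sketch, not from the episode): the binary operation and its reduction get separate names, while the APL-style view derives the reduction directly from the binary function.

    import numpy as np
    from functools import reduce

    xs = np.array([3, 1, 4, 1, 5, 9, 2, 6])

    # Two separate names for one idea:
    print(np.maximum(3, 5))       # binary max of two arguments -> 5
    print(np.max(xs))             # max reduction, different name -> 9

    # APL-style: the reduction is just the binary function folded in.
    print(np.maximum.reduce(xs))  # 9, same as np.max(xs)
    print(reduce(max, xs))        # plain-Python spelling of the same fold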

00:54:35 [AB]

Yeah, well, about changing the mathematics and then having that as a programming language, versus trying to make the programming language be like the mathematics: the traditional mathematical notation also has severe issues. You can't necessarily formalize it to be a computer programming language because it relies on the goodwill of the reader. There are ambiguous notations, the same notation used for different things in different fields, and the computer doesn't know which field you're working in. There are even plain notations that can be confusing at best or ill-defined. Like if you write, inline, A divided by B divided by C, you get people giving you various interesting answers. Or, as I found out, various versions of the same type of graphing calculator from Texas Instruments give different results for the same exact expression. And so, if your ideal is to try to implement mathematical notation, you must fail. You can't do it. And so it's better to fix the notation.
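Adám's inline-division example in Python terms (a sketch with made-up numbers): the written expression "a / b / c" is ambiguous to a human reader, so a language has to pick one reading.

    a, b, c = 8, 4, 2

    print(a / b / c)    # 1.0 -- Python reads left to right: (a / b) / c
    print((a / b) / c)  # 1.0
    print(a / (b / c))  # 4.0 -- the other reading some readers (and some
                        #        calculators) assume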

00:55:47 [CH]

Just be like Iverson and reinvent it.

00:55:49 [ST]

I wanna chime in, before we go to our guests, with the consequences of drinking the Kool-Aid young, as Conor puts it. And I think Adám is perhaps the only person who drank it younger than I did. So when I eventually threw up my hands and said, okay, I guess I do have to learn Python, after that I found myself continually getting exasperated and having to go for a walk at the tediousness of looping through this and that. Everything seems so unnecessarily clunky when I've come from what seems to me an elegant and simple way. My other problem with learning Python is I'm sure I remember at elementary school, somebody calling me a numpy.

00:56:36 [AB]

I also have this when I have to write some JavaScript [08] here and there. But we're also mixing up various things. I find it very interesting, trying to step back from APL, that APL and similar languages are many things put together that are different from what people expect, at least today. Yes, we normalize and harmonize mathematical notation, but we also start using strange symbols that people are not used to. And we're also very much into, at least traditionally, interactive programming rather than just static scripts. A uniform call syntax, and applying functions, fundamental mathematical functions, to the individual elements of arrays. Any one of these things could potentially be put into a language without the other ones. But I think there's something special about putting them all together that you don't get by just adding one or more of them.
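A quick Python sketch (illustrative only) of one item in that bundle, applying fundamental functions to the individual elements of arrays, and what the same idea looks like without it.

    import numpy as np

    xs = np.array([1, 2, 3])

    # Array-language style: the function applies to every element at once.
    print(xs + 10)      # [11 12 13]
    print(np.sqrt(xs))  # element-wise square root

    # Plain Python spells the same idea with an explicit loop or comprehension.
    print([x + 10 for x in [1, 2, 3]])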

00:57:38 [GT]

So maybe to piggyback a little bit on that, I think it's important to call out that a successful language really, I think, boils down to how many people understand it the same way you do. So notation is really important. So if group A uses a language in one way, can group B understand the exact same thing without any difference? And when you can achieve that, you can now empower different communities to start collaborating. So I think part of the reason why MATLAB maybe took off is because a lot of cases ended up towards code generation, ended up in production. So when you get companies calling out specifically that they use the tools to add value, that kind of boosts the language very quickly, right? So the fact that you get away from just research to real-life production is a big deal. And I think a common place where MathWorks has grown significantly is where a group of domain experts write their algorithms and then minimize the path of giving them to a completely separate group of people to program it in, name it, C, C++, Java, Rust. That handover is time consuming: misunderstandings, different unit tests, the whole thing of throwing it over the wall for a completely different team to start again from scratch. So you win by minimizing that path. And I think the Julia ecosystem calls it the two language problem, right? Domain experts understand math, see it one way, but computer science folk don't see or necessarily understand the symbols. I think the value to companies is being able to empower different communities to collaborate together while minimizing the misunderstandings. And I think part of the reason why Julia kind of did well is most of Julia is written in Julia. In Python, a lot of Python is not really written in Python. There's a whole variety of languages behind there. So when the language evolves, well, it really depends on that package or that package ecosystem and how it evolves. MathWorks controls it a little bit more, but both MathWorks and Julia go to LLVM, and they've got this intermediate representation which enables them to do cool and nice stuff for acceleration. But it's not only the technical things. I really think it boils down to: completely different backgrounds, when they look at the same thing, do they understand the exact same thing? And by minimizing that gap, that's where an ecosystem can flourish.

00:59:47 [CH]

Yeah, I was just listening to a podcast yesterday about how Go was a better target language for LLM code generation because it's so uniform. Like from the get-go, they had Go format, or gofmt. Go almost does a better job of Python's principle, the Zen of Python, that there's one way, or one sort of main way, because obviously there's a bunch of different ways. But Go actually does that better than Python does: one way to code things. Sorry, I didn't even complete the thought. Because in Go, they've restricted it. It's a very opinionated language. It's a very small circle that they want you to stay inside. And because almost all Go looks the same and has the same flavor, when the LLMs are training on it, they don't have this proliferation of styles. It's all one. And so they can with high certainty generate what you're probably expecting. Anyways, it's a great point. And I think, Dave, you might have wanted to say something. It might be a couple of topics back. But yeah, I'll throw it over to you.

01:00:44 [DB]

I'll chime in. I hear the point about math notation being different, right? In different books. But I guess I just wanted to chime in that I think that's where I can appreciate what Iverson and APL are going for. And yeah, I see MATLAB as a step further away, where they try to meet the engineers and scientists who are using it and make the best decisions in an imperfect world for notation. And then, painting that spectrum, going even further again to a general-purpose language. I'll say this. When I was young and I typed J in MATLAB for the first time, it gave me delight. Because I was studying electrical engineering, and to me, J is, of course, the imaginary number. And it just pleased me. I felt like, oh, I have a tool for me, because it knows that J is the imaginary number. And those are the kinds of decisions around MATLAB that, yeah, J in other domains, I'm sure, means something different. But for electrical engineers, J is the imaginary number. I remember, too, one time I had written a paper and somebody was proofreading it before submission. And it was a student. And the student was upset because I didn't define J. I had equations, right? I never actually said what J meant. And then I laughed. And I think I upset the student because they felt naive in that moment. I said, no, I'm not defining J. It demonstrates who I am. It's part of my identity that we don't define J, because we all know that it's the imaginary number. I don't know. Maybe I'll stop there. Convolution is another one that comes up. Maybe that's the thing about convolution. That's one where, at MathWorks, I was there when there was a debate. This goes back a while, maybe 2012 or so. Some people at MathWorks didn't want to release moving average, which I think is the function called movmean, I can look it up quick. But there was a sense that we shouldn't do that, because in the domains that MATLAB grew from, it's obvious that this is convolution, right? And so why would you define a function like that, one that is just so obviously convolution? And there was some debate about that. But in some domains, people get used to rolling average, moving average. So anyway, they did release moving average. And I want to say that the financial tools maybe do have rolling average. I could be mistaken. Maybe Steve knows. But I know there was some debate about that.
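
(Editor's note: a minimal NumPy sketch of the equivalence Dave is describing, namely that a moving average is just convolution with a uniform kernel. The signal and window length are made-up toy values; this is only an illustration of the idea, not MATLAB's movmean implementation.)

    import numpy as np

    x = np.array([1.0, 4.0, 2.0, 8.0, 5.0, 7.0])  # toy signal
    k = 3                                          # window length

    # A length-k moving average is convolution with a kernel of k copies of 1/k.
    via_conv = np.convolve(x, np.ones(k) / k, mode='valid')

    # The same thing written as an explicit windowed mean.
    via_mean = np.array([x[i:i + k].mean() for i in range(len(x) - k + 1)])

    print(np.allclose(via_conv, via_mean))  # True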

01:03:26 [ML]

You have a lot more implementation options if you know that every non-zero value in the kernel is the same number. You can implement it a lot faster than if you're actually using a general convolution method. So I would think that's pretty important.

01:03:42 [DB]

Sure. Good point.

01:03:43 [ML]

Yeah. And you could detect it. You could check the kernel and see: is everything actually the same?

01:03:49 [DB]

It's a convenience. It certainly is a convenience.
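
(Editor's note: a sketch of Marshall's point, assuming NumPy. If the kernel is uniform, the whole operation reduces to a scaled running sum, which is O(n) instead of the O(n*k) of a general convolution, and the check for that case is cheap. For simplicity this check requires every kernel value to be equal; Marshall's "every non-zero value" case needs a little more bookkeeping. The function name convolve_or_fast is hypothetical, not a NumPy or MATLAB API.)

    import numpy as np

    def convolve_or_fast(x, kernel):
        # Fast path: a uniform kernel is just a scaled moving sum,
        # computed in O(n) from a cumulative sum.
        if np.allclose(kernel, kernel[0]):
            k = len(kernel)
            c = np.concatenate(([0.0], np.cumsum(x)))
            return (c[k:] - c[:-k]) * kernel[0]
        # General path: ordinary O(n*k) convolution.
        return np.convolve(x, kernel, mode='valid')

    x = np.random.default_rng(0).random(10)
    kernel = np.ones(4) / 4  # uniform kernel, i.e. a 4-point moving average
    print(np.allclose(convolve_or_fast(x, kernel),
                      np.convolve(x, kernel, mode='valid')))  # True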

01:03:52 [SW]

Just on the q thing. So Dave, you alluded to the moving average. It is there, movavg, and it's always been there in the financial time series toolbox in some form or other. Back in the early days of financial services picking up MATLAB, there was this time series toolbox that I think was universally unloved by the MathWorks development team at large, but was absolutely adored by traders, technical traders. It just had what they wanted, where they wanted it, how they wanted it. This was around about the early 2000s. And it was a great piece. And the Flaming Ferraris at Credit Suisse and a lot of those original prop trading desks, for better or for worse, used this stuff. But we never developed it, partly for the reasons that Dave describes: lots of semantic discussions about how these things matter across disciplines. And that was when q really came along. In the early 2000s, people were looking for better supported, better implemented versions of the time series tools. And I can think of a number of people who moved from our very basic, primitive time series application into q. And that was when I saw it, because q was lifting off. You know, everyone was getting excited. They were good times.

01:05:16 [BT]

It strikes me that part of what we're looking at is almost cultural. You have APL, which is very theory driven, I would say math driven, where if you latch onto that theory, there's really not very much leeway that you give to the real world, for lack of a better term. You're not interested in how other people see it. You want to say, if you looked at it this way, this would be very powerful. And then on the other end of it, you've got the Pythons, which are almost crowdsourced, which say, you know what, if I could do this, this would be great, I'll put that in. And it seems in the middle you've got MATLAB, which seems to be more the vision, as you say, of Jack Little and Cleve Moler, where they're thoughtful about every step along the way. So you're not really in one camp or the other, but your development is a little bit more: what does the world say? How do we use this, without bending it back to theory? Do you think that's sort of a way to look at these different approaches?

01:06:13 [DB]

I think so. Yeah. And meeting the user base of engineers and scientists.

01:06:14 [BT]

Well, and I guess it comes back to Iverson being a teacher, because I think, in addition to being a computing scientist and a mathematician, he was primarily a teacher. His approach would be, you know, if I teach you to do it this way, it's going to be way better for you. So he's going to change the world by education. Whereas places like MathWorks are going to say, well, how does the world work? How do we fit this? How do we dial the noise down? Is this too much noise? Is it too little noise? And again, I get into the areas where I think the more general-purpose glue languages ask: what do you want to do? And it's almost a free-for-all. Crowdsourcing, what becomes the most popular? And it does have more of a viral feel to it to me.

01:07:04 [CH]

So I guess maybe as we start to wind down here, because I believe we have blown past the hour mark as per usual, as predicted by Marshall. I don't think that was caught on pod, but he predicted it.

01:07:14 [ML]

I didn't predict it for this episode specifically, did I?

01:07:17 [CH]

I don't know. To be honest, it's not really. That's like saying...

01:07:21 [ML]

I'm going to blanket predict now that every future episode is going to be past the hour mark. Why are you...

01:07:27 [CH]

I was going to say, it's the equivalent of saying I'm going to flip a coin and it'll either be heads or tails, and the only way you lose is if it lands on its side, and the odds of that are very low. We've covered so much ground here. We've talked about, you know, MATLAB at MathWorks. You know, Julia's come up, Python's come up, q's come up. And I believe in the introductions, I think Dave mentioned you worked on a vector DB at one point, Pinecone, and now you're working at a gen AI company, working on chatbots. Steve, you're working at KX, and I'm not sure how closely you are working with the KDB.ai stuff. And I'm not actually sure what the consultancy stuff is, Gareth, that you're working on. But I guess, to start to wind down: what is the state of the union for array languages in these technologies? And, you know, what role do languages and technologies like MATLAB, and also Julia and q for that matter, have to play in 2024 and the next half decade, as compared to, you know, when MATLAB started back in the early '80s? I didn't actually even realize it was so early on. That was a completely different time, completely different problems being solved. You know, we even brought up the different communities. One of the thoughts I had in the back of my head is that, like, you know, open source wasn't even a thing back then. Like, you know, I don't actually know what was going on in the early '80s. I know that was right around the time Smalltalk got released. [09] But was version control even a thing back then? Were people still just sending each other, you know, I would imagine a lot of companies were shipping each other, like, oh, check this file out, and they would just copy and paste the code into the email. Or was email even out then? I assume it was. But it's just a completely different time.

01:09:04 [ML]

Didn't we have computers?

01:09:07 [CH]

I wasn't born then. I don't know. But yeah, so, like, compare and contrast, you know, then versus now. And I guess maybe we can go around and get folks' thoughts on, you know, what is, like I said, the state of the union, where these technologies fit in best, and the role that they have to play today.

01:09:24 [GT]

So maybe I'll take a step first, Conor. So first of all, I think languages have evolved based on communities' needs, right? And they're all evolving in different directions. So I think the state of the union is a tough one. But what I will say is, I think the attitude of everybody in the world should be to stay curious and to respect language diversity, right? And whenever you try a new language, the odds are the first 10 hours, 20 hours, it's going to be annoying because it's very different to what you're used to. But I think it's really important that people hold on through those first 10, 20 hours, if nothing else, to solidify their own understanding of the language where they're best suited, right? So that is the key thing: to have a curious attitude towards life and, you know, bear with it. It's okay if the first 10, 20 hours are painful; hold on just a little bit longer, reach out to the community. And if nothing else, you will have a better understanding of your own language. And I think it's good to respect diversity and understand that people from a math background are different from physics, from chemistry, to biology, to engineering, and that's okay. Let's just respect diversity and solidify our own understanding. So that's kind of my two cents, no matter which language is your preferred language.

01:10:34 [DB]

Maybe I'll go next. I feel like, being so deep into AI, we live in interesting times, with AI advancing as rapidly as it is, and with the way that so many technical people are grabbing hold of automation tools, you know, basically AI: ChatGPT, Copilot, things able to generate code. And so I think it'll be interesting to see how this all unfolds. I do believe that while it's great to automate a lot of the mundane aspects of coding, some of the core ideas we will want to express in ways that make sense to us as humans, and maybe reflect our understanding as we were taught, or maybe as Iverson would want us to have been taught. But I do believe that expressing ideas in a way that makes sense to us as humans will remain. And in that sense, maybe we'll see a greater rise of interest in, who knows, maybe APL, maybe MATLAB, maybe other languages yet, that will let us express our ideas as we most want to as humans, letting the, you know, robots take care of the rest. But that's an idea. I have no idea. I'll pass it off to Steve.

01:12:01 [SW]

Thanks. That's right. I think the first thing to say, going back to the '80s and particularly the '90s: when I started working for MathWorks in '96, the neural network toolbox had come out, what, 1993, 1994. I believe Geoffrey Hinton was a user of it, I think. Yann LeCun didn't like it very much and hasn't liked it throughout, hasn't seen it as necessarily a good thing for AI education. But a lot of that neural net generation of the 1990s, early 2000s, even earlier, built on top of MATLAB. A lot of people forget that without MATLAB and the neural net toolbox, you might not have had TensorFlow. You might not have had even ChatGPT being built how it was. Okay, it may not necessarily be the tool of choice now in pure AI, but it was then. That's my point number one. Then here's a good part that I think we all have. I mean, I think the power of arrays and array languages to store and analyze vectors, we know and understand, is key to those gen AI workflows. And I think the array-based languages, with a common storage and compute capability, have a lot to give to the wider world when it comes to understanding semantic meaning as well as mathematical meaning. And I was a little bit involved with the KDB.ai thing. Dave was involved with Pinecone. But we do see the array languages, through Python and NumPy, et cetera, being very much front and center of how people engage and communicate with each other in these languages. As to what the engine is: KDB.ai is a vector database, Pinecone is a vector database, and a lot of people are building those vector methodologies in Python, in MATLAB, in other things too. And I think one area of focus for the array-based programming languages is how we can better equip those AI domain experts to be using our languages, as well as the databases, for those AI computation and storage workflows. So that's my thinking on that. Dave, I'll spin it back to you in case you've got any other sort of Pinecone anecdotes you wanted to add.
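
(Editor's note: a minimal sketch of the vector-similarity workflow Steve is gesturing at, in plain NumPy rather than any particular product. The embeddings and query below are random toy data; real systems such as KDB.ai or Pinecone add persistent storage, indexing, and approximate nearest-neighbour search on top of this basic array computation.)

    import numpy as np

    rng = np.random.default_rng(1)
    docs = rng.normal(size=(1000, 64))   # toy "document embeddings"
    query = rng.normal(size=64)          # toy query embedding

    # Cosine similarity of the query against every stored vector, as array operations.
    docs_n = docs / np.linalg.norm(docs, axis=1, keepdims=True)
    query_n = query / np.linalg.norm(query)
    scores = docs_n @ query_n

    # The five most similar vectors, i.e. the shape of a "top-k" query in a vector database.
    top_k = np.argsort(scores)[-5:][::-1]
    print(top_k, scores[top_k])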

01:14:37 [DB]

No, other than, you know, working with embeddings does encourage working with vectors, and there's lots of opportunity there.

01:14:43 [CH]

Any final questions from all of our panelists? I mean, I could ask Dave, seeing as you're steeped in the world of AI. I mean, I'm also steeped, but I'm more in research, so I'm not releasing AI products. You know, what's your take on whether the world and all our jobs are going to end at some point in the next, well, it's what everybody's asking, and I don't have the answer, but you might be closer to it than most if you're in the startup gen AI world.

01:15:09 [DB]

Yeah, I unfortunately don't know that I have anything interesting to add, other than I do believe AI, I mean, it's evolving rapidly, and I think, as many others have said, we'll start to see it chip away at the most mundane tasks and jobs. I think we're already seeing it, and I think we should be ready for that to accelerate. But I'm not a subscriber to the idea that they're taking over the world just yet. I do think they're going to be coming for those mundane tasks and jobs, which can be a good thing or a bad thing, but I think that's coming.

01:15:40 [GT]

So maybe an area that I feel might get hit first is testing. I think, given the inherent nature of, I don't know, deep learning, reinforcement learning, the idea of test generation and validation, I feel that could be an early next step. Typically, engineers, scientists, they hate that, so I can see that latching on potentially.

01:16:01 [SW]

I think getting back to the KDB.ai and Pinecone thing, and to my new company, Quantexa, which works in graph analytics and knowledge graphs, that ability to provide context to the AI is going to define its success and the ability to create opportunities for AI to be successful, to either free ourselves or take our jobs, whichever side you stand on. But I think that ability to provide context in conjunction with the AI models, with the LLM models, will be a critical thing.

01:16:40 [DB]

You say that; it's often achieved with vector similarity.

01:16:43 [ST]

Yep.

01:16:44 [CH]

All right. Last chance, folks.

01:16:46 [SW]

I would just offer one final comment. It's a little-known fact about MATLAB. Gareth has talked about the community, and one of the things MATLAB did very, very early was build the MATLAB community, the so-called MATLAB File Exchange, which came about in the early 2000s. It was where people would share code sets, M-files, et cetera. This was long before GitHub came around. This was well ahead of its time. And I think there are other places, too, where MATLAB has delivered cultural benefits over and above the kind of core programming and mathematical capabilities. So, you know, MATLAB Central, File Exchange: they very much saw the future. Just a final comment.

01:17:35 [CH]

And I think we will make sure to, I mean, we always do, but we will leave links in the show notes to all of the different technologies that we've mentioned on this episode. It hasn't just been MATLAB, there's been others, Julia, et cetera. So if you want to go and try any of those, head to the show notes, we'll have links there. And with that, I think we'll throw it over to Bob, who will let us know how you can contact us.

01:17:55 [BT]

Well, it's really simple. It's contact@ArrayCast.com. And if you send that email out, it'll get to us and it will always be read. I can't guarantee it will always have answers to your questions, but they always do get read, and then, depending on the questions, they get sent out to the group to be able to respond. And anybody who has sent questions to us knows how that works: a couple of days later, you might get a suggestion or something. But we all do read them and we really appreciate them. So contact@ArrayCast.com is the way to get in touch with us. And I've found this hour and a bit a really fascinating and really interesting discussion.

01:18:37 [CH]

Yeah. Thank you to all of you, Gareth, Dave, and Steve. It's been great. I mean, I love the history episodes where we get to go back in time and hear about the genesis of different technologies and sort of where they fit in and how they've evolved. And MATLAB does not come up as often, as, you know, it's outside the Iversonian, per se, array languages of APL, J, BQN, et cetera. And so it doesn't come up as often, but we have mentioned that we've over time wanted to, you know, reach out to, maybe have on, I think his name is, I'm not going to get it, but he was the founder of NumPy. And I think originally it was called Numeric Python, and there was a paper that was released. And I've heard podcasts, we'll link it in the show notes. But I think it's a cool idea to reach out to folks that are outside of the pure Iversonian array languages, because, like you mentioned, Steve, there are these overlapping stories of, you know, people that maybe didn't directly talk to Ken, but they talked to their, you know, advisors, and they were at the same dinners and talks at times. And that history, you know, we need to capture that, so it doesn't, you know, fade away into the annals of time, and so that, you know, maybe 20 years from now, if someone's curious, they'll go and check out this podcast episode and be able to, you know, hear a story from, from eons, you know, past. But anyways, thank you for taking more than an hour, more than an hour and a half at this point, of your time to chat with us. This has been absolutely awesome.

01:20:01 [SW]

Thank you for having us. It's been great fun.

01:20:03 [GT]

Yeah, so big thanks from my side. And I think you guys are really cool. So I'm going to respect the curiosity and learn more from APL. I think there's a lot to be learned. So big thanks from my side.

01:20:11 [SW]

Yeah, you're the cool kids.

01:20:14 [CH]

I'm not sure about that. But we appreciate the sentiment. I think with that, we will say happy array programming.

01:20:21 [ALL]

Happy array programming.