
In this episode, Steve and Cam share their top three tech/AI highlights from 2024 and their top predictions for 2025.

FULL TRANSCRIPT

FUTURISTIC 34

[00:00:00] SS: Recording in progress. Welcome to the future. I’m Steve Sammartino, and introducing the great OG of podcasting, Mr Cameron Reilly.

[00:00:18] CR: Turning me on, Steve. Slow down. Um, don’t turn me on until later in the show. Welcome to The Futuristic Episode 34. We’re recording this 20th of December, 2024. How are you, my little buddy?

[00:00:31] SS: I’m good. I’m great. It’s big. It’s been a big year, Cameron.

[00:00:38] CR: It has been a big year, Steven. I, I suggested to you last week that we don’t do a show last week, because OpenAI started doing this thing called the 12 Days of OpenAI, where every work day for the last 11 days, uh, they have done a live stream where they’ve launched a new thing, a new feature, a new function of ChatGPT.

[00:01:00] CR: And there’s one more day to go, and I said let’s wait till next week because there’s probably something big that’s going to come up. As it turns out, Friday US time, which will be tomorrow our time, they’ll do day 12, and hopefully that’s going to be the biggest thing, and we’ve sort of missed the chance to do that. But even the last 11 days have been insane, like there’s been so much coming out in the last 11 days of the year, and while they’re coming out with stuff, Google have been coming out with stuff to one-up them on the stuff that they’ve come out with. So it’s just been this crazy period of leapfrogging going on in the last couple of weeks.

[00:01:38] CR: But it has been a crazy year, and your idea for the show this year is for us to do like our top things of the year and our predictions for next year, is that what you want to do?

[00:01:47] SS: That’s right. So I was thinking the top three from this year, there’s zillions, but we’re kind of trying to at least squeeze it into an hour or so. And top five each for next year. And interestingly, if I think of some of mine, there’s, there’s a bit of overlap, but I’m going to try and keep them separate. I just did my Tech and AI Trends for 2025 and did 20.

[00:02:08] SS: It could have been a hundred long. Uh, there is some overlap, but that’s what I think we do.

[00:02:15] CR: Okay, well, uh, why don’t you kick it off with your top three things from 2024?

[00:02:22] SS: Okay. So my first one is AI recursion, and it’s not a thing, but I think it’s kind of like a zeitgeist or an event. Never, ever have I seen things get so much better so quickly. Like the iterative. Uh, improvements and features that are coming in. And I think OpenAI has been extraordinary. The amount of new things that they’ve had this year, just from the LLM itself, getting better computation, uh, ability to do maths and not just language stuff, uh, imagery, video, Sora, uh, live video feed on what you’re looking at, web integration, which has been huge as well.

[00:03:11] SS: So. Just the recursion happening so quickly, and it’s even blowing my mind, and we know that change is exponential, but it feels like we’ve finally hit the exponential part of the exponential. If that makes sense, the doubling and the recursion is just quick. Let’s just think about Apple do annual events, and things didn’t change that quick, but in the kind of 24 months they’ve been in the Zeitgeist, it’s, like you say, it’s every other month there’s something, and it’s getting quicker and quicker.

[00:03:40] SS: You just mentioned their 12 Days of OpenAI.

[00:03:44] CR: Yeah.

[00:03:44] SS: That has never happened in tech.

[00:03:47] CR: No. Look, it, yeah, um, I think you’re right. Recursion is crazy, and this is the sort of stuff that, uh, Kurzweil has been predicting for decades, that when you get to the asymptotic part of the exponential curve, things just go insane. And it,

[00:04:12] SS: It’s incomprehensible speed and, and the recursion itself. And I imagine that this is part of it, is that the tool itself is inventing new parts of the tool. Like it’s, it seems that that’s pretty clear. And I don’t think we’ve really been there yet. I think this is the first time, because we’ve got, you know, code that understands code and can self-improve.

[00:04:34] SS: I think we’re actually hitting the technology self improvement element of that exponential. And I think that’s why everything is so radical right now. So for me, that was my number one thing this year that blew my mind. Uh, yeah, iterative pace on steroids. That was my number one. My number two. Alright, keep going.

[00:04:57] CR: Well, I was just going to say, it’s not just like software, you know, building software, but the ability for people, increasingly, to just have an idea and then develop it within a week. You know, I saw one of the, I think it was yesterday’s 12 Days of OpenAI thing. They launched 1-800-CHATGPT, which is

[00:05:21] CR: where you can call,

[00:05:23] SS: I don’t know what it is, but I love it already because I just love a retro. I hope they’ve got a truck that drives down highways with that written on the side. I hope that so

[00:05:31] SS: much.

[00:05:31] CR: They had it flashing up on the screen while they were doing the live stream in like this really cheesy late-night infomercial font. It was fantastic. If you’re in the U.S., you can call 1-800-CHATGPT and just talk to the AI and ask it questions. If you’re outside of the U.S., you can use WhatsApp to call it. And they were saying, if you don’t have internet connectivity but you need to talk to ChatGPT and you have telephone connectivity, you can do it in the US.

[00:06:02] CR: I’m pretty sure, I don’t know about, I’ve never used WhatsApp that much, but I’m pretty sure you need internet, you need data to use WhatsApp, right?

[00:06:10] CR: You can’t just use it over a telephone signal.

[00:06:13] SS: Yeah, data is what you’re

[00:06:14] SS: using.

[00:06:15] CR: sort of defeats the purpose. But, but one of the demos that they did is one of the guys used a rotary phone to call ChatGPT.

[00:06:26] CR: And he said, I don’t think I’ve ever used a rotary phone before in my life, so this is an experience on multiple levels. But I was explaining this to Chrissy later on last night, and I was saying, like, you know, try and explain that to somebody 10 years ago, 15 years ago. We were talking about how we remember when the height of cool was to call the time service on your phone.

[00:06:54] CR: I remember in the early 90s, I would do that just because it was cool. At the tone, the time will be 12:05 and 15 seconds. Beep. Beep. Beep. And you could call and get your horoscope read out. Now, we go, oh yeah, you can just make a phone call, and it’s for free. You can do it for free. I think they’re allowing, like, you don’t even have to have a GPT account or anything like that.

[00:07:23] CR: Like 15 minutes a month or something of free phone calls you can make to GPT from your number. Um, you make a phone call and have a conversation with an artificial intelligence, and it all just works. One of the demos they did is a guy said, um, ChatGPT, my friends and I are on a road trip, we’re in the middle of nowhere on this highway in California.

[00:07:44] CR: We see these houses. They’re like weird, round coloured houses. Any idea what they would be? And they go, Oh, that’s the famous Flintstone houses that were built. And they’re a tourist thing and blah, blah, blah, blah, blah. So just to be able to call AI wherever you are and ask a

[00:07:59] CR: question.

[00:08:00] SS: AI on traditional networks.

[00:08:05] CR: Yeah,

[00:08:06] SS: So not really using data. So you’re kind of not using data.

[00:08:10] CR: No, you’re using a telephone line to call it and it’s talking to you.

[00:08:13] SS: Phone calls are data, by the way.

[00:08:14] CR: well,

[00:08:15] SS: which I love. That was one of their greatest hacks. These are phone calls, and these are data. It’s like, yeah, they’re all ones and zeros flying through the sky, you liars.

[00:08:21] SS: I just want to point that out. I always knew, and I wasn’t falling for it. It’s.

[00:08:25] CR: Everything is ones and zeros. Um, but the, the, the point I was going to make about all of this was, they said that this idea came up in an OpenAI hackathon and the team developed it in like a week to go live with. And this is getting back to the recursion stuff that you’re talking about. To have the tools available now to go, oh, I want to build this thing.

[00:08:49] CR: And I’ve talked about this on the show, cause I do this all the time now. Oh, I’ve got this idea for a thing, an app I can build that’ll solve a problem of mine. And an hour later I can have it built. Like that is the era we are now entering into, where billions of people can build things in an hour or a day or a week that didn’t exist before, because they have an idea.

[00:09:13] CR: And we’re not quite at the stage, but we’re not far from the stage where you can just say to the app, Hey, build it for me, and it’ll do it lock, stock and barrel. It’s pretty close to that,

[00:09:27] SS: I do

[00:09:28] CR: not a hundred percent.

[00:09:30] SS: all the time. I was asking it to do something yesterday, which was I had a whole bunch of business cards that I wanted to extract all of the emails from because all of these people, when they give business cards, say,

[00:09:41] CR: Get closer to your mic, Steve. You’re sounding like you’re in the next room.

[00:09:45] SS: Okay. So I collect a lot of business cards at events.

[00:09:49] SS: And when I take the business cards off the person, I say, do you mind if I put you on my email list? I do a weekly tech thing. They’re like, oh yeah, cool, no worries. So I get their permission. And I normally would just type it into my Substack, but I went on to OpenAI, took a photo of it, put it in and said, extract the emails. But it struggled with, uh, some of the fonts, so it told me why it was struggling with its, um, observational, uh, metrics, and I said, what would fix it?

[00:10:18] SS: And it said, you would need this. And it gave recommendations of the software that could do it. I said, do you know how that software works? And it said, yes. I said, could you build it? And it said, yes. And then it built it, and I put it on my client and it worked.
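(For anyone who wants a feel for the kind of script Steve is describing, here’s a minimal sketch: OCR a folder of business-card photos and pull out the email addresses. The pytesseract/Pillow route, the folder name and the function names are our own assumptions for illustration; the actual tool ChatGPT recommended isn’t named in the episode.)

```python
# Hypothetical sketch, not the script from the episode: OCR business-card
# photos and extract email addresses. Assumes Pillow and pytesseract are
# installed, plus a local Tesseract binary.
import re
from pathlib import Path

from PIL import Image          # pip install pillow
import pytesseract             # pip install pytesseract (needs the tesseract binary)

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def emails_from_card(image_path: Path) -> set[str]:
    """OCR one business-card image and return any email addresses found."""
    text = pytesseract.image_to_string(Image.open(image_path))
    return set(EMAIL_RE.findall(text))

def collect_emails(card_dir: str) -> set[str]:
    """Scan a folder of card photos and de-duplicate the addresses."""
    found: set[str] = set()
    for path in Path(card_dir).glob("*.jpg"):
        found |= emails_from_card(path)
    return found

if __name__ == "__main__":
    for address in sorted(collect_emails("business_cards")):
        print(address)
```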

[00:10:32] CR: That’s

[00:10:32] CR: great,

[00:10:33] SS: So, so, so one of the things that I’m doing now: often it’ll recommend a piece of software or AI that can do the tasks that it can’t do, and I ask it to look at it.

[00:10:43] SS: To go and research it, even provide the link, and go, look at that. What does it do? What are its core functions? Now write the code. And it’ll write some code in Python, and then I download it on my client and I run it there. So it can kind of already do it at the moment. It needs us to tell it. Soon it won’t.

[00:10:59] SS: It’ll go, wait a minute, let me write some code to be able to do that thing. And I don’t think everyone has realized: the power in our hands now is insane. It’s insane. I just, every day I’m agog. I’m like, holy wow.

[00:11:15] CR: Magog.

[00:11:17] SS: My God. So, okay, number

[00:11:19] CR: You were there! You did a video from that alley recently, where Gog and Magog are, right?

[00:11:26] SS: Uh, maybe.

[00:11:27] CR: You did a TikTok in the little laneway in Melbourne where Koko Black is, and there’s the clock with the two,

[00:11:36] SS: do one there. I did do that. I

[00:11:37] CR: two ogres that are defending the clock, they’re Gog and Magog.

[00:11:42] SS: Okay. I did do that. I didn’t know they were called that. You’ve taught me something in my own city.

[00:11:47] CR: There

[00:11:47] SS: If you want to know more about the city you live in, just ask a tourist because they’ve been to all the places you never got around to.

[00:11:52] CR: Dude, I lived there for 20 years

[00:11:54] CR: and I used to do TikToks.

[00:11:55] SS: I know, I’m not calling you a tourist, but I’m

[00:11:57] CR: I did a podcast 20 years ago where I had a local Melbourne tour guide and we would go around and do tours of Melbourne and record the tour so other people could listen to it as a podcast. I did it for the City of Melbourne or Tourism Victoria or something.

[00:12:15] SS: right, we’re going to be here at 4am, which I love, which I

[00:12:18] CR: Sorry, keep going. Your top three things. What was number two?

[00:12:21] CR: And

[00:12:22] SS: And number two for this year was humanoid robots. That surprised me a little bit, it shouldn’t have, in hindsight, but what I love here is the marrying up of robotics, which have been around since World War II, making cars, manufacturing. I love the humanoid element, and I love that it’s the installation of LLMs that makes them functional and trainable with verbal and visual reasoning.

[00:12:44] SS: And we’ve spoken about it on the show before, within a decade we’ll all have them. For me, what they can do now is kind of like, here’s a computer in 1976 and it’s really going to change stuff. It’s like a huge, huge moment for everyone. And I think it’s going to be maybe even bigger than the Henry Ford revolution, where you’ve just got this incredible device with incredible power that just changes your life, your home, your everything, every workplace.

[00:13:14] SS: That one there was huge and it’s a little bit like what happened with the car. We had carriages for a really long time and then once we had fossil fuels and the internal combustion engine that we put those two together and we changed everything. It’s kind of like that with software and hardware on this.

[00:13:28] SS: It’s a marrying of the two, and I think it’s extraordinary. OpenAI and NVIDIA, I mean, two of the heroes at the moment, coming together with the Figure 01, super impressive.

[00:13:38] CR: I mentioned to you off air that last night NVIDIA, Jensen Huang, did a launch of a new piece of technology they have. It’s called the Jetson Orin Nano Super. And it’s basically a CPU or GPU, um, that’s basically the size of an iPhone. It, uh, is designed to run an LLM on device, on a robot or an edge device, something on a mine, something in a factory, something in your house, that can run a fully functional LLM locally on the device without having to go off to the cloud. And they’re selling it for $249 US, and it does 67 TOPS, 67 trillion operations per second.

[00:14:35] CR: Um, I did an analysis of that last night, uh, against my Pentium 400 that I had in 1995, just, uh, for shits and giggles. And that would have cost me two and a half, three grand, I think, back in the day. That Jetson device would be about 167 million times as powerful as the Pentium 400 that I had in 1995, nearly 30 years ago.

[00:15:22] CR: For a tenth of the price, and it’s specifically designed to run all of NVIDIA’s AI platform, the CUDA stuff. Um, just mind-boggling

[00:15:36] CR: stuff.

[00:15:37] SS: You can’t even get those numbers in your head. And I think I said to you as well, when we hear 167 million times, we just think 167. Yeah, you almost don’t even comprehend it. It’s incomprehensible.

[00:15:48] CR: Yeah, absolutely.

[00:15:50] SS: Number three for me was, and this one’s kind of got a sense of irony about it. Um, cause Google’s been called a monopolist, but I just, I’m calling it the post search society because Google has cancer.

[00:16:02] SS: Certainly their search division does. Uh, I saw some numbers which astounded me, uh, which Google isn’t really publishing, and it’s hard to find, but up to 80 percent of heavy internet users are now using GPTs instead of search. An 80 percent substitution ratio. 80 percent

[00:16:25] CR: That can’t

[00:16:25] SS: of, now, what they didn’t say, no, it is, of high-volume internet users are using GPTs and LLMs 80 percent of the time when they would historically have been using search.

[00:16:39] CR: That

[00:16:39] CR: sounds

[00:16:40] SS: And I think that, I reckon I’m 90%. Yeah, not high, high frequency internet users. So, I don’t know

[00:16:46] SS: what that

[00:16:47] SS: cohort is. I don’t know what that cohort is. It could be 5 percent of super users, but, but look, I know this for sure. And let’s just get anecdotal about it. How many things, how many times a day would you now go to the LLM instead of Google?

[00:17:01] CR: Oh, a hundred, a hundred, a hundred

[00:17:02] SS: Is it 50%?

[00:17:03] CR: A hundred

[00:17:04] SS: You just told me that that’s hard to believe, and now you say a hundred. It must just be the content-heavy internet users. How many times a day would you now go to the LLM instead of Google? Because

[00:17:11] SS: I reckon 80 90 percent of the people just don’t realize how powerful this is and they’re just not really using it yet.

[00:17:18] SS: It’s kind of like this is 1999 and no one’s really using Google. There’s a bit of a lag, uptake lag. But I think that this is my big one. I think we’re about to enter post search society, especially now that GPTs have live web and you can go to the web live. It’s like I almost don’t even use Google now. I just use it to find a location or a map or a particular website, but I don’t actually use it for anything that is knowledge based, information based, function based.

[00:17:45] SS: I don’t work with it like I used to. This is gone. And, I mean, Google obviously has Waymo and Maps and a whole other raft of important parts of their business, and they’ll be fine, and I’m not too worried if they’re not, uh, even though I’m a shareholder. But I tell you what, this is the post-search society. I seriously, I am not mincing my words here, traditional search is Yellow Pages. It’s already happened. I’m telling you, they’re fucking dead.

[00:18:14] CR: no, I disagree. I think you’re, you’re calling it way too soon. Um, but, and I’ll tell you why in a second, but

[00:18:20] CR: before we get into

[00:18:21] SS: I’m not saying that Google won’t, I’m saying, I’m not saying that Google won’t go and get their LLMs and transition it. And this is their challenge, is to adapt. This is their challenge. And the reason I don’t think that they will, is because they have an entire infrastructure based on the revenue streams, which is pay per click and ads.

[00:18:39] SS: And I think the future is subscription, and thank God for that, because it might just save the internet. Because the last thing you want is ads, because it changes what you see. I think subscription is much better. I think they’re fucked. I think they’re fucking, their search is dead. Absolutely fucked. Dead in the water.

[00:18:55] SS: I’m calling it now. I’m fucking calling it hard.

[00:19:00] CR: So one of the things that OpenAI launched during the 12 Days of OpenAI was the general availability of search inside of ChatGPT for free users, whereas previously it was available for Plus users. So they’re going hard on that. But again, the thing we have to remember is that Google has a massive ecosystem. They’re a bit like Apple, they’ve got a massive device install base running Android, where their LLM, Claude or whatever it is, will be the, not Claude, that’s Anthropic, um, Gemini will be the default on all of those devices. Um, depending on what happens with their, um, monopoly lawsuit, they may be able to block ChatGPT’s, uh, ability to be featured heavily on those devices.

[00:19:58] CR: But they’ve got a very strong tool, and they’re coming back really hard with a lot of this stuff. Gemini. They just launched a new version of Gemini in the last week that’s doing really well in the benchmarks. Their latest video tool, text-to-video tool, looks like it’s way better than Sora, which OpenAI also publicly launched in the last week or so.

[00:20:21] CR: But Google’s tool, I think it’s called Veo, is not publicly available yet. Um, I don’t know, man, I think, you know, because of their install base, their device base, and, you know, you’ve got Chromebooks and all that kind of stuff, and the Gmail user base, and all that kind of stuff,

[00:20:37] CR: that they’ve

[00:20:38] SS: I didn’t say Google was dead. I said search, as we know it, has got fucking terminal

[00:20:43] CR: well, it’s, no, but search is just going to evolve into,

[00:20:46] CR: search is just evolving into AI

[00:20:49] SS: I’ll tell you

[00:20:50] CR: AI

[00:20:50] SS: I’ll tell you why it won’t.

[00:20:51] SS: And it never does. It won’t and it never does and I’m going to tell you why because this happens fucking again and again and again.

[00:20:59] SS: It’s that companies always love their business model more than they love the technology they’re in.

[00:21:05] CR: Clayton Christensen’s innovator’s dilemma, I agree

[00:21:08] SS: 100%. It is still

[00:21:09] SS: what can happen.

[00:21:10] CR: But!

[00:21:11] SS: They are gone in that area. They’ll still be big. And by the way, Kodak and everyone that you’ve seen has been disrupted, had all the technology, had the footprint. Everyone said, Kodak won’t die.

[00:21:21] SS: They’ve got a footprint of 40 trillion

[00:21:23] CR: Google’s a fundamentally different business. It’s a technology business that has been at the forefront. We, like, OpenAI came out of Google. LLMs came out of Google and Google has Demis Hassabis, who’s no fucking clown.

[00:21:40] SS: Okay. I get it. But the difference has got nothing to do with

[00:21:45] CR: They take it, no, it’s about culture, but they

[00:21:48] CR: take it

[00:21:49] SS: It’s about business models. It’s not about culture. No, it’s not even about culture. It is about business

[00:21:54] CR: but it’s the culture that protects the business model. But it’s the culture of the company that protects, I mean, that’s Clayton Christensen’s whole thing about protecting the business model from era one till it’s too late to transition to era two and then you lose

[00:22:10] CR: the transition war, right?

[00:22:26] SS: Do you know what it is? Salesperson.

[00:22:29] CR: Right.

[00:22:29] CR: It’s like

[00:22:30] SS: They’ve got more salespeople than anything else in their business. You think they’re a technology company? They’re

[00:22:35] CR: By the way, I love this podcast because it’s the only podcast I do where we argue. All the other podcasts I do, we just agree on everything. I love this. You and I can have an argument. I love it.

[00:22:45] SS: Fuck you, Cameron. You are so

[00:22:46] CR: No, it’s

[00:22:47] CR: great. I love it. I love it that we have

[00:22:49] SS: They are fucking dead. They’re fucking dead, mate. Because everything

[00:22:52] CR: but you’re saying Alphabet’s not dead.

[00:22:54] CR: Google’s not dead.

[00:22:55] CR: Just, you’re saying

[00:22:56] SS: I’m saying

[00:22:57] SS: searches, they’re fucking dead. Now, here’s where I reckon, quickly, where I reckon they’re gonna go gangbusters and fucking slay the world.

[00:23:05] SS: with Waymo. Waymo is crazy. In every city they’re in now, they have 30 percent of ride share already. They’ve already overtaken Lyft, and they are eating into Uber so quickly, it’s not even funny. Anyway, so they’re my three, and I love that we disagree, and the beauty of it is, we’re going to talk every week and just see how this thing rolls out.

[00:23:28] SS: Now Cameron, over to your top three from this year.

[00:23:32] CR: I’ve done no prep,

[00:23:33] CR: because you

[00:23:34] SS: I love that. You don’t need

[00:23:35] CR: we were going to do like

[00:23:36] SS: were Born ready

[00:23:37] SS: born ready.

[00:23:38] CR: Yeah, that’s what I always say. Um, but, off the top of my head, I mean, it’s been a huge year, and I’ve probably forgotten a bunch of things. But, the things off the top of my head that I’m most excited or impressed about this year.

[00:23:52] CR: First is going to be Claude 3.5 Sonnet and coding. Now, funnily enough, just in the last, last 24 hours, o1, OpenAI’s new model, just beat Claude in the coding benchmarks. I haven’t tested o1 recently for coding because Claude 3.5 has just done such a great job for me, but my ability to code stuff in the last couple of months using Claude 3.5

[00:24:22] CR: Sonnet has just been, um, like a magical experience. Not perfect, but just insane. Like you were saying before, like when I wrote that news app a couple of weeks ago, I was going, hey, can you just like build me my own news aggregator? Yeah, yeah, sure. Here it is. Boom. Like it’s, it’s, that has been an insane experience for me this year.

[00:24:49] CR: Um, the fact that you could just think of something and build it now, like it’s, it’s crazy that we can do that. Uh, with a tool that cost me, I mean, I pay for the API, but you know, and I, and sometimes I, I, run up quite a bill on it. You know, it might cost me anywhere from 20 to 50 bucks a month to use the API, depending on how

[00:25:09] CR: heavily I’m

[00:25:09] SS: What a bill. Isn’t it funny what counts as quite a bill these days? Go out and hire a software developer in 1992 and mortgage your house to get a shitty piece of software. It’s

[00:25:19] CR: Yeah. Or even hire one today, it’s going to be the same thing. Like, to have somebody building apps for me for 20 to 50 bucks a month is crazy. My second one would be AI integration with apps. So, OpenAI just released a lot of this stuff in the last couple of weeks as part of the 12 days, further and further integration. Today in their live stream, they announced integration with Apple Notes, which I don’t use any more, I use Obsidian. By the way, if you don’t use Obsidian and you’re a big note taker, check it out. Obsidian is an absolutely magical tool.

[00:25:59] CR: One of the best pieces of software out there, in my opinion. But, um, we’re now at the era, I have AI integrated into, um, my code app, the development environment that I use, and I have it integrated in a couple of other different apps. So we’re now in the beginning of the era where it’s integrated into your apps.

[00:26:25] CR: It can see what you’re doing. You don’t have to copy and paste it from your app into the AI and then paste the result back. It’s just watching what you’re doing. It’s making suggestions as you go. It’s coding alongside you. That is the beginning of the future where AI is just integrated into everything that you do and all of your

[00:26:51] SS: It’s just in the walls. It’s just in the walls like electricity. That’s where AI is going. Ambient computing and omnipresence is when a technology has really reached its threshold. When you don’t have to go to it, it is just everywhere. That’s when you know technology has changed things.

[00:27:08] CR: By the way, this has been one of my rants for the last couple of weeks. I don’t know if you and I’ve talked about this, but let me ask you a question. Do you know how electricity works?

[00:27:16] SS: Uh, I’m gonna say no because I think my answer will be wrong even if I try it. So I’ll just go with no. Can you tell me? Don’t they look

[00:27:25] CR: Well, I think most people don’t know how it works. And it’s, it’s one of those things, like special relativity and the double-slit experiment, that I need to re-read myself every few years to remind myself, because I forget. I need to do a refresher course on it every now and again. But I think, see, the way that you kind of get taught how electricity works in school, in high school science, right, is you flick a switch and it opens the hose, this is a wire, and electrons get pushed down the wire to the filament in the light, and it heats up the filament and light comes out.

[00:28:06] CR: That’s not true. That’s not how electricity works at all. The electrons don’t move. They do move a little bit.

[00:28:16] SS: it’s like a wave in the ocean, which a lot of people think a wave is actually the same chunk of water. You know how a swell goes through the ocean, but the drops stay where they are. It’s the energy that

[00:28:29] SS: moves, and it moves the drops in that spot. So it’s actually a hump in the water. People don’t realize that you’re actually just riding energy.

[00:28:37] SS: You’re not even riding the same drops of water when you’re riding a wave.

[00:28:41] CR: but what is energy?

[00:28:44] SS: I don’t fucking know. You better tell me. I mean, I sort of know, but I, I, I’ve done the school stuff, but I kind of don’t really know. So can you please tell me and the listeners?

[00:28:55] CR: Well, energy is just the ability to do work. It isn’t anything, you know. It’s the issue I always have with new-agey types, when we start having conversations, they go, well, everything is made of energy.

[00:29:05] CR: Like

[00:29:05] SS: Oh, they love that. It’s their favourite

[00:29:07] CR: is just the ability to do work. So what are you fucking saying?

[00:29:12] CR: It doesn’t make any sense. You, you’re just using random words. They don’t mean anything. That it’s like from Princess Bride. I do not think that word means what you think it means. Um, but here’s the, I had this conversation with ChatGPT the other day where I was going down this thing about electricity and how it works.

[00:29:28] CR: And I said, well, you know, what happens is the electromagnetic field, uh, which is everywhere and permeates everything, that’s around the wire, carries the energy. The field carries the energy down to the filament. And we go, but what’s, what’s the field? What is the field? And then it started going, well, think about waves on the water.

[00:29:52] CR: Don’t give me fucking analogies. I’m asking you a question. What is the field made of? And they go, well, think of this analogy. I’m going, don’t fucking give me an analogy. I went on with this for an hour. I’m having a voice conversation with it when I was

[00:30:03] CR: cleaning the

[00:30:04] SS: By the way, you told me you don’t swear at AIs in a previous pod, and I’m starting to sense that now that they’re

[00:30:11] CR: I was, I wasn’t swearing, I’m, I’m, I’m exaggerating.

[00:30:14] CR: I was being very polite.

[00:30:16] SS: just wanted

[00:30:16] CR: I don’t like And I was like, what is it? What is the field? And it was fudging. So after about an hour of ChatGPT, I gave up and I went to, um, Gemini, cause the new Gemini had just come out. And I said, what is the electrical field made, electromagnetic field made out of? Fundamentally, what is it?

[00:30:36] CR: And it said, nobody knows. Science doesn’t know. It’s one of the great mysteries of science. We do, we know it exists. We know how to use it, but we don’t know. What it is, we just take advantage of it. We don’t know. I went back to GPT and I said, Gemini just told me we don’t know what it is. And GPT said, Oh, well, that’s true.

[00:30:56] CR: Yeah. And I said, well, why didn’t you just fucking tell me that an hour ago, instead of giving me the same bullshit analogies over and over again, just say, Hey, listen, we don’t know what it’s about. He goes, Oh, you’re right. I should have said that. So

[00:31:06] SS: This is a key, this is a key career, business, and technology insight for everyone. You don’t need to know how something works to make it work for you.

[00:31:16] CR: yeah, it’s like quantum mechanics. We still don’t know really what’s going on with quantum mechanics. We know how to use it. We do lots of stuff with it. No one knows how it works, or why it works the way it works.

[00:31:27] SS: It is nice to know how something works, it is nicer to get it working for you.

[00:31:32] CR: Here’s my analogy, though, with AI. We don’t know how AI works. No one knows how AI works, still, right? But we know how, we’re learning to know how to use it, and it will be like electricity, as you say. We, no one will know how it works, we’ll just do stuff with it. So that’s AI integration with apps. My third would probably be Apple Vision Pro.

[00:31:51] CR: Even though it’s had a minor... Steve, have you had the demo yet? Have you gone?

[00:31:59] SS: yeah, the demo, I’m like, how is this changing anything for me?

[00:32:03] CR: God, you’re

[00:32:04] CR: such

[00:32:04] SS: I went there and did it, and I actually meant to talk to you about

[00:32:07] CR: spiritually. You are so dead

[00:32:09] SS: I’m not spiritually dead. I’m like, yeah, I get it, it’s nice and you can do all these things, and I get that it has a lot of applications. Right? I get it. I get that it has a lot of entertainment and industrial applications and it’s incredible and it’s, and it’s good, but I just, I just, it’s just a bit of a so what for me.

[00:32:29] SS: I don’t see how I’m going to plug into that and fundamentally change my life for the better.

[00:32:36] CR: Wow. Okay. Well, they’re my

[00:32:38] CR: top three

[00:32:39] SS: Tell me why you love it though. Look, I just want to make sure this is not the Mutual Agreement Society podcast, because no one... So I did lean into it a bit, I gave a bit of Cheryl, it was a bit of a Cheryl, and I leaned into the disagreement. Now, come on, tell us what you love about this,

[00:32:53] CR: Well, you know, I think in terms of, I said this at the time when I did my demo, like, very few times in my life can I think of when I experienced a new piece of technology and it blew my mind and I went, this is, to me, a fundamental quantum leap on anything I’ve experienced before. And it’s completely reshaped the way that I think about the future and what can be done with technology.

[00:33:26] CR: For me, that was Apple Vision Pro. I had fairly low expectations going into it and the demo experience, I was like, Oh, holy shit. Like this just completely opens up new realms of what is possible. Interacting with technology can be like just, uh, the three dimensionality of my experience of the internet and photos and videos and, um, I can just imagine, it’s really just going into the Ready Player One sort of

[00:34:01] CR: space, the beginnings of that, you know?

[00:34:04] SS: and, and, and maybe it is five years from now. I mean, just keeping exponentials in mind, you know, not a huge amount of years. Mate, if, if this could transition into some kind of, Physical virtual reality holodeck style, you know, next gen Star Trek. I mean, for me that, that, that’s astounding, but look, I definitely see the applications and I think VR loosely and AR have had these applications, you know, training and pilots and surgeons and, uh, entertainment experiences, you know, being at a concert you’re not really at on the stage with Dave Grohl, all, all of that kind of cool stuff that could emerge from it, I think it’s still there, but, um, it feels like it’s a great piece of kit.

[00:34:47] SS: That hasn’t really got a home yet.

[00:34:50] CR: Yeah, no, exactly. It’s, it’s very, very early adopter y mode, but you know, I still think an amazing piece of technology, like incredible achievement.

[00:35:04] CR: What’s

[00:35:04] SS: We’re going to go through our top five. We’re going to

[00:35:06] CR: only got one.

[00:35:08] SS: predictions. Hey, you’ve only got one. I’m going to go through my five and rather

[00:35:13] CR: I’m sure mine’s going to be one of your five. So this is going to be

[00:35:16] SS: well, interesting because I, there was, I’ve got 20 from my podcast, which everyone should do, not my podcast, my post on Substack, which everyone should certainly be signed up to and go and read.

[00:35:26] SS: Uh, Tech and AI Trends for 2025. And I picked out the five that I think are the most interesting to me. There’s a lot that have overlaps. Okay, I’ll read them out and then we’ll just pick which ones you want to speak about. The first one is agentic AI, AI agents. That’s the top of my list, and it’s not a...

[00:35:46] CR: You

[00:35:47] SS: Oh, there you go, well said.

[00:35:48] CR: got my only one. Yeah, Agents. Wait,

[00:35:52] SS: and the, uh, well, the, it’s kind of the top tier one. The next one I had on my list was machine based

[00:35:59] SS: business.

[00:36:00] CR: Don’t move on yet. I

[00:36:01] CR: mean, are we going

[00:36:02] SS: No, no, we’re going to come

[00:36:02] SS: back.

[00:36:03] CR: Oh,

[00:36:04] SS: We’re going to come back. Okay. So one was agent AIs or agentic AI. Number two was machine based customers. I think this is going to be huge.

[00:36:12] SS: And that’s when it’s kind of an agent process, but it’s your purchaser, the person that goes and buys things, looks at things, does the research, negotiates all of that, everything from a car to whatever, goes through your subscriptions, does all of that kind of stuff.

[00:36:26] CR: just, that’s just,

[00:36:27] CR: a

[00:36:28] SS: Yes, but I think it has a specific business implication, because I think that businesses are going to have to get good at talking to and with AIs, and we’re going to have to get good at prompting them in that realm, within that specificity.

[00:36:41] SS: The next one I had was the AI creative explosion, a Cambrian explosion of capability based on the fact now that if you can think of it, you can do it. If you can imagine it, you can do it. I don’t even know where that’s going to go, but the fact that anyone can do everything that has ever been done is insane, because I think a lot of people have ideas and have never had the capability, and I think that’s extraordinary.

[00:37:03] SS: My fourth one was the global robotaxi ramp up. 10 years late but it’s finally here and I think that’s going to go global next year. The cities in the US that are doing

[00:37:13] SS: it. Yeah, I really do. I think they’re keeping that in their back pocket, and I think if Google or Alphabet are smart, this is something that they can do without any antitrust issues.

[00:37:23] SS: They can plug it in and then build out different business models based on logistics and moving people and payloads in a way that keeps some of their traditional business models of pay per click and advertising true within that logistical context. I think that’s going to be huge. And the other one is AI Eyeballs, which is the idea of holding up the video in a live moment, distilling, analysing, and helping you.

[00:37:46] SS: I did it with my PK Ripper BMX. I needed a new part for it the other day. I held it up and it said, hey, that looks like a nice BMX there. It looks like a bit of a retro one. Is that a PK Ripper? I’m like, yeah, it is. It’s like, cool. How can I help you with it? Well, my, uh, cranks are broken. Yeah, I can see those.

[00:38:04] SS: They’re the Redline Flight ones. They’re not cotterless cranks. So they, they do break down a bit easy, and I went through and I said, what part do I need? Cause one fell out. And it said, what you need is this here. And then I asked it, and it found a place that sells it locally that can ship it that day, gave me a link.

[00:38:18] SS: It was an insanely simple example, but this is kind of like what the promise of Google Glass was. But now with OpenAI, you have live video chat, it can understand exactly what it’s seeing, converse with you, solve real-time problems with AI eyes. That’s huge. They’re my five, brother.

[00:38:38] CR: Well, that’s, uh, yeah. I don’t know about Robotaxis, but Elon running the White House might be interesting how he pushes that along from a Tesla perspective.

[00:38:49] SS: Well,

[00:38:51] CR: Look, I think there are some predictions of AGI as early as 2025. I’m not sure about that, but I think, you know, I think Sam Altman has been saying lately that we will get AGI and it’ll be kind of a ho hum moment. And I tend to agree because people are ho hum about AI already, which boggles my mind.

[00:39:16] CR: I still

[00:39:17] SS: for us. Except for us. Because I’m just

[00:39:19] CR: yeah. I was saying to Chrissy last night, like, AGI. Like when we’re talking about the Jetson and I was like, it’s 167 million times as powerful as the computer I had 30 years ago for a 10th of the price. Like, it’s just, my brain can’t even begin to compute what, where we’re at. You can call an AI over a rotary phone and have a conversation with it about any topic.

[00:39:52] CR: Like it’s just, anyway. But I think he’s right that AGI will, when I say slowly creep up on us, I mean slowly in the sort of time frames that we’re looking at. AI is just getting better and better and better. And at some point in the next year or two, it’s just going to be better than any human at every topic.

[00:40:17] CR: And people will just go, yeah, so what?

[00:40:19] CR: People will just

[00:40:20] SS: don’t know how many years now. I don’t know how many years now. It

[00:40:25] CR: It’ll just, people will just

[00:40:26] CR: be blasé

[00:40:27] SS: is intellectually. It is intellectually. Intellectually it is, physically it’s not. On this TikTok that I was watching last night, I should have saved it. Anyway, it’ll be in my history. This guy was saying that the thing that’s interesting so far with AI and robotics is that robots find hard the things that humans find easy.

[00:40:45] SS: Like standing still and walking upstairs. And if we go back to the Atlas robots and Boston Dynamics and all of that kind of stuff, but computers and AI find easy, the things that humans find hard, you know, computation, complex, complexity, all that kind of stuff. I thought that was kind of an interesting insight.

[00:41:04] SS: Um, and maybe that’s why we’re blasé about AI, because computers, for a long time, in a computational sense, have far exceeded humans, you know, whether it’s a calculator or, or anything. And so I was just going, yeah, they’ve always been better than us. You know what I mean? Like, and, and, and I don’t know if I can talk now, whatever.

[00:41:23] SS: I mean, for me, it’s really interesting because it’s language based and that’s how, that’s how, that’s the fabric of all human knowledge.

[00:41:31] CR: Speaking of which, there was an interesting paper that I saw this morning from Meta, don’t know if you’ve seen this, where the suggestion is that we should write the LLMs in a way where they don’t use language, don’t use English to think. So the way that, like if we have chain of thought reasoning with an LLM at the moment, like an O1, it uses English, language, human language, to think through the process.

[00:42:04] CR: And they’re saying that uses tokens and it’s slow.

[00:42:08] CR: It would be faster just to let it do it in its own way, in code, and then push out the answer in language. It doesn’t, it shouldn’t have to think it through in English, and that would reduce the level of compute and the level of wattage that was required and all that kind of stuff.
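(To make that concrete: what Cameron is describing is the idea, explored in that Meta research, of letting a model carry its intermediate “thoughts” in its internal hidden state rather than decoding them into English tokens at every step. Here’s a deliberately toy sketch of the contrast, not the paper’s method or code; the model, weights, sizes and function names are all invented for illustration.)

```python
# Toy contrast between (a) chain of thought written out as tokens and
# (b) "latent" thought that stays in the hidden state until the final answer.
# Everything here is a made-up stand-in, not a real LLM.
import numpy as np

rng = np.random.default_rng(0)
HIDDEN, VOCAB = 16, 50

W_step = rng.normal(scale=0.3, size=(HIDDEN, HIDDEN))    # pretend model step
W_out = rng.normal(scale=0.3, size=(VOCAB, HIDDEN))      # hidden state -> token logits
W_embed = rng.normal(scale=0.3, size=(HIDDEN, VOCAB))    # token -> embedding

def step(h):
    # One pretend forward pass over the current state.
    return np.tanh(W_step @ h)

def decode_token(h):
    # Pick the most likely token for the current hidden state.
    return int(np.argmax(W_out @ h))

def reason_in_tokens(h, n_thoughts=5):
    # (a) Each intermediate thought is verbalised as a token, then read back in,
    # which costs a decode-and-embed round trip per step.
    for _ in range(n_thoughts):
        h = step(h)
        tok = decode_token(h)            # write the thought down "in English"
        h = np.tanh(W_embed[:, tok])     # read the written thought back in
    return decode_token(h)

def reason_in_latent_space(h, n_thoughts=5):
    # (b) Keep iterating on the hidden state itself; only decode at the end.
    for _ in range(n_thoughts):
        h = step(h)
    return decode_token(h)

h0 = rng.normal(size=HIDDEN)
print("answer via token thoughts :", reason_in_tokens(h0.copy()))
print("answer via latent thoughts:", reason_in_latent_space(h0.copy()))
```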

[00:42:24] SS: is really interesting. And that’s where, uh, our favourite linguist, the guy who did Esperanto. What’s his name again? He’s one of your boys, the linguist, Chomsky.

[00:42:35] CR: He didn’t do Esperanto. It was a

[00:42:37] CR: different guy, but

[00:42:37] SS: Did he do Esperanto? He was a linguist, wasn’t he,

[00:42:40] CR: Chomsky is, yeah, he’s, he’s one of the, you know, major linguists, yeah, in terms of understanding

[00:42:47] SS: I thought he did Esperanto. Who did Esperanto? Do you remember? Wasn’t

[00:42:50] CR: Uh, no, I can’t remember his name.

[00:42:52] CR: Different

[00:42:53] SS: It was Billy Espo. It was Billy Espo. And he put Ranto on the end of it. Anyway. So, Billy Espo. The thing with language, you know, what I think is interesting is that it has nuance in it, which I think enables all of the things that we would do creatively to be possible.

[00:43:13] SS: I think in the computation sense, it definitely would waste tokens. But I think at the output level, it probably has to do it in that language. And as someone who speaks three languages, how’s that for a flex? Um, there is something in languages that makes it impossible to transliterate exactly. It just can’t be done.

[00:43:34] SS: And the only way you can truly understand the language is to kind of understand the culture and the way of life and even be in that country, because then the words start to make sense because of the physical interactions and they become a function of that. And you just, some things you just can’t translate.

[00:43:52] SS: And when you understand both languages really well, you can explain it to somebody and say, look, we say this word, but it really doesn’t mean that. And that’s the only one that we’ve got to kind of translate. But it’s kind of like this, and you have to go into that. And I think that if the LLMs were doing it in computation, their capacity to give us the optionality of all the things that they can do would be lost in that translation process.

[00:44:13] CR: Esperanto was invented by L. L. Zamenhof, Ludwik Lejzer Zamenhof, a Polish Jewish ophthalmologist, in the late 19th century. He published the first book outlining the language, titled Unua Libro, First Book, in 1887, under the pseudonym Doktoro Esperanto, which means Doctor Hopeful. The pseudonym eventually became the name of the language itself.

[00:44:37] CR: Zamenhof’s goal was to create an easy-to-learn, politically neutral, and culturally inclusive auxiliary language that could foster international communication and understanding. Esperanto’s grammar is highly regular, its vocabulary is derived mainly from European languages, and it was designed to be phonetic and logical.

[00:44:57] CR: Despite being a constructed language, Esperanto has developed its own thriving global community of speakers.

[00:45:05] CR: There you go.

[00:45:06] SS: I’m gonna learn Esperanto late at night tonight after a couple of Campari’s at 1am.

[00:45:11] CR: I like that idea. Maybe we should do our next podcast in Esperanto.

[00:45:15] CR: We’ll

[00:45:15] SS: Esperanto!

[00:45:17] SS: Well, Cameron, given that you’ve got one, do you want to give us your thoughts on agents and why they matter? And I’m getting the sense, and I kind of, when I look through mine, it kind of is a capstone thing that really trickles through all of the others to an extent. So why don’t we kind of get your view on agents, what they are and why they really matter.

[00:45:38] SS: And that, that might even be a nice way for us to kind of round out the, the pod. Cause we have to do one before the new year as well, by the way. Yeah.

[00:45:47] CR: Yeah, so look, this isn’t an original prediction. Pretty much everyone in the space is saying 2025 is going to be the year of agents, and the way that most of them describe agents, it’s when you can give your AI a complicated task made up of, uh, a number of sub-tasks that have to be done, some of which might take a bit of time and effort, and it will just go off and do those things and come back to you when it’s done.

[00:46:23] CR: Now, trying to come up with examples of this is tricky. Like the one I heard Sam Altman use the other day was pretty mundane. He was like, well, you know, you could ask it to find a restaurant and make you a booking. And it might go off and ring three restaurants and find out what their availability is and then come back to you.

[00:46:51] CR: And when it’s got a booking, and check your friends’ calendars and do all that kind of stuff and interface with all these different things and come back to you with a result. And he said, that’s pretty, that’s interesting, but not that exciting. He said, but what if you got it to go and speak to a hundred restaurants and find out what all of their offerings were?

[00:47:12] CR: And, you know, if they catered to, you know, uh, what pricing they could do and if they could cater for a particular, I don’t know, vegans or whatever it is. And then he said, well, you might think that’s a lot of wasting people’s time, but it would probably be your AI talking with the AIs of all of those restaurants and having a negotiation and a conversation and then coming back to you.

[00:47:33] CR: And, you know, it reminds me of, um, talks I was giving at Microsoft circa 2000, 25 years ago, something like that, when I was talking about Web 2.0, and it was the early days of Web 2.0. The vision that we had for how that would play out back then was that if I wanted to build a, um, you know, a TV, if I wanted to buy a TV, I would say to my browser, I need a TV, and it would go off and not just find the best price from a bunch of TV retailers for me. It would talk to the retailers, and the retailers would use their own back-end systems to do a just-in-time build of that TV, and would go out and speak to all of their suppliers and negotiate the best pricing that they could get for all of the components, and it would be built on the fly, and I would get the best price based on the best negotiation that the best retailers and manufacturers could get with all of their suppliers, come back and give me the best price.

[00:48:54] CR: Not dissimilar to how Tesla, in a way, will custom-build a car for you these days. One of the problems that we had, this was all part of the idea that there would be interoperability between all of the back-end manufacturing systems, and they’d be able to have schemas that could talk to each other, and XML was going to be the basis of that, or .NET as Microsoft called their platform back then.

[00:49:21] CR: And what happened was the, uh, technical logistics and the management logistics required to get all of those schemas to interoperate and interface with each other was a way bigger project than anyone really wanted to take on just to reduce their margins by having more competition. And it kind of all fell apart and it didn’t quite get there.

[00:49:46] CR: But I see we’re heading into a world where every system, and I remember you and I talking about this at the beginning of this series, like a year or two ago, about agents that sit on the front of every business’s website, that can speak with their back-end manufacturing systems. And ultimately I think we’re heading into a world where my agent will go out and be able to speak to the agents of all of the retailers and all of the manufacturers and negotiate on my behalf.

[00:50:19] CR: You were talking about, what did you call it, machine based shoppers or

[00:50:23] CR: something before?

[00:50:24] SS: Yeah. Machine based customers.

[00:50:26] CR: So agents is basically really complicated processes that software, well, your AI, will be able to do on your behalf. And I think Sam’s point was, we tend to think of it in terms of things that we already do today that it might make easier, but he’s saying there are things.

[00:50:45] CR: That we haven’t even thought of because they’re not practical or possible today. That we will be able to do in the future when everyone has intelligent software that can talk to other pieces of intelligent software. And often, in some cases, it’ll be the same software. It’ll be my version of ChatGPT talking to your version of ChatGPT talking to another ChatGPT.

[00:51:12] CR: It’ll just be ChatGPT or whatever it is. It’s talking to itself, uh, cause it’ll be front ending a lot of these systems, but that’s the world of agents, complicated scenarios done completely by software and it’ll, it’ll happen slowly. I don’t think it’ll happen all at once. You know, it’ll, it’ll start with small iterations, but will gradually become more and more sophisticated over the next year or two, but definitely I think 2025 is where we’re going to start to see the beginnings of software, really.

[00:51:51] CR: This is where software as a service becomes real. These things will do services for

[00:51:57] SS: the, the current software as a service gets eaten up by an overriding agent-style thing. And the agent is multimodal, because it’s not like software as a service now, which is set in little verticals with functional tasks. But because it’s language-based, and this is the, this is why language-based software, the LLM, is so important, is that it’s multimodal.

[00:52:20] SS: It can do any task with any set of directions, and it eats up every little piece of software-as-a-service app that there is. And, and just on that, it’s interesting how technology has a way of building another layer to circumvent the problems. And the problem with the back ends speaking to each other is that people have, companies have proprietary systems and don’t want to open them up.

[00:52:43] SS: They want to keep them closed on purpose because they have an incentive. And then technology sits above it. So, okay, you’ve got your two different backend logistics or supply chain systems. Well cool, now we’ve got the agent that just speaks English, or Hindi, or Spanish, or Mandarin, or both. Actually, it speaks all of them, sorry.

[00:53:02] SS: Mainly Esperanto. Look, supply chains in Esperanto has always been what I’ve hoped for. So, it can speak at all of those supply chain levels. So it doesn’t matter that it’s not all in .NET or some system in the back end that has this ability for software to kind of interact, because you get the language layer on top.
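(Since both of them keep circling the same loop, decompose a task into sub-tasks, execute each one, possibly by talking to another party’s agent, then report back, here is a stripped-down sketch of that pattern. Every function, restaurant name and availability rule below is invented; no real AI, calendar or booking API is being called.)

```python
# Hypothetical sketch of the "agent" pattern from the conversation above:
# one high-level request is broken into sub-tasks, each sub-task is run
# (stub functions stand in for phone calls or another business's agent),
# and the results are gathered and reported back.
from dataclasses import dataclass

@dataclass
class Result:
    task: str
    outcome: str

def check_calendars(friends: list[str]) -> Result:
    # Stub: pretend we queried everyone's calendar.
    return Result("check calendars", f"{', '.join(friends)} are free Friday 7pm")

def query_restaurant(name: str, party_size: int) -> Result:
    # Stub: pretend this call was handled by the restaurant's own agent.
    has_table = len(name) % 2 == 0          # fake availability rule
    outcome = "has a table" if has_table else "is fully booked"
    return Result(f"ask {name}", f"{name} {outcome} for {party_size}")

def book_dinner(friends: list[str], restaurants: list[str]) -> list[Result]:
    """The agent loop: plan sub-tasks, execute them, gather the results."""
    results = [check_calendars(friends)]
    for name in restaurants:
        results.append(query_restaurant(name, party_size=len(friends) + 1))
    return results

if __name__ == "__main__":
    for r in book_dinner(["Steve", "Chrissy"], ["Example Bistro", "Placeholder Thai"]):
        print(f"{r.task}: {r.outcome}")
```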

[00:53:23] SS: And that brings me back to that paradox that I was trying to think of before. It’s Jevons paradox

[00:53:30] CR: Who?

[00:53:30] SS: And, uh, Jevons paradox. It was, um, coined by William Stanley Jevons in 1865, and it refers to the idea that as technology improves the efficiency of resource use, the overall consumption of that resource often increases instead of decreasing. For example, more efficient fuel engines lead to greater fuel consumption as travel becomes cheaper and more accessible.

[00:53:58] SS: It’s the same kind of thing with AI: the AI becomes better and more useful, and, and I think it kind of leads into unexpected use cases as well, because it becomes more omnipresent. It goes on further and it says, in a broader sense, this could also be linked to the idea of emergent utility. As we refine or cleave off parts of a material resource or concept, we discover new ways to use or repurpose what remains.

[00:54:23] SS: And so it kind of feeds into this, and it says this reflects the recursive nature of innovation, where refining or breaking down something often leads to greater complexity rather than simplicity and reduction. So it kind of adds up, and for me, Jevons paradox relates to the idea that we don’t need a UBI, because there’s all of these unexpected things that just layer out.

[00:54:44] SS: So I just thought that that was interesting and a guy was talking about it on my TikTok last night.
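(A quick back-of-envelope example of the paradox Steve is quoting, with invented numbers: the engine gets twice as efficient, driving gets cheaper per kilometre, people end up driving far more, and total fuel burned goes up rather than down.)

```python
# Jevons paradox with made-up numbers: efficiency doubles, usage more than
# doubles, so total consumption rises.
litres_per_km_old, litres_per_km_new = 0.10, 0.05   # engine twice as efficient
km_old, km_new = 10_000, 25_000                      # cheaper travel -> much more of it

fuel_old = litres_per_km_old * km_old                # 1,000 litres
fuel_new = litres_per_km_new * km_new                # 1,250 litres
print(f"before: {fuel_old:.0f} L  after: {fuel_new:.0f} L")
```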

[00:54:50] CR: Yeah, well I, look,

[00:54:52] SS: At 2am,

[00:54:53] CR: Oh man, I didn’t get any sleep last night. Fox was in our room sick at 3:30am and

[00:54:58] CR: throwing up all

[00:54:59] SS: Fox, what are you doing?

[00:55:01] CR: then I had to record a podcast at 9am, so I just got up, I had to make bread. Yeah, me too, so, um, yeah, look, I think, we can’t even begin to imagine the ways that we’re going to use this stuff in the next year or so, Steve, and just to wrap up, I had one last thing I wanted to say to you.

[00:55:17] CR: was great

[00:55:22] SS: Can I guess? Something, how are you going, um, have a happy new year, something in Esperanto. I know that much.

[00:55:31] CR: fun. Let’s do it again next week in Esperanto. Thanks, buddy. Have

[00:55:38] CR: a good week.

[00:55:38] SS: Thanks, champion, great chat.