AI translates chickens, Pizza Hut hacked, Alexa gets GPT, Microsoft Copilot launches, Bard gets Google plugins, Google nears release of AI software Gemini, Bing is back on GPT, Coca-Cola has created a futuristic soft drink using artificial intelligence, a Google AI breakthrough represents a huge step in finding genes that cause diseases, bringing back the dead with ‘digital necromancy’, where are our Jetsons products, and our forecast that GPTs all become verbal within 12 months.


[00:00:00] Cameron: Welcome to Futuristic, episode 13, Steve Sammartino, little Stevie, little Stevie Wonder with his trucker hat on today.

[00:00:18] Steve: I love that. You called me Little Stevie Wonder. Little Stevie, friend of ours, who was doing a record with Artists Against Apartheid.

[00:00:27] Cameron: Right. Yes, he did. Um, good track that too. We won’t play Sun City, or something Sun City, whatever that song was.


[00:00:39] Steve: Uh, it was,

[00:00:41] Cameron: Ain’t gonna play Sun City, yeah.

[00:00:44] Steve: yes.

[00:00:45] Cameron: Anyway, sorry everyone who wasn’t around in the 80s. Futuristic, Episode 13. What’s one thing of note you did, AI or emerging technology related, this week, Little Stevie?

[00:00:57] Steve: Little Stevie wrote a blog post, which just got [00:01:00] published, I want to say a bit over an hour ago. Your name featured in it.

[00:01:04] Steve: And it was about how I’ve learned a lot about technology from people that I just randomly met, and then they were a big part of how I saw the world and helping me see changes in it.

[00:01:17] Steve: And I called it reverse mentors, because the large majority of these people were younger than me. I think you might even be younger than me. I’m guessing. Uh,

[00:01:26] Cameron: I’m not. What year were you born?

[00:01:28] Steve: 73. Anyway, let’s pretend.

[00:01:32] Cameron: I’ll be 53 in a couple of weeks, mate.

[00:01:35] Steve: Well, you’re up there. But anyway,

[00:01:36] Cameron: at, look at the hair, look at the hair,

[00:01:38] Cameron: Steve, and tell us which one of us is older. Yeah, yeah, look at you, you’re brown, looking healthy, I’m long and gray, I look like. I look like, uh, Willie

[00:01:48] Cameron: Nelson, you

[00:01:49] Steve: Cameron, when I get a haircut, I look on the floor and I’m like, whose hair is that? It’s so great. So, you featured in this blog post [00:02:00] prominently, and you were one of the first people that ever gave me a voice on media, when I was on the G’day World podcast for the first time. This was very early doors. I even shouted out in the verbal version of my blog post, where I read it, a little bit Cory Doctorow-esque, uh, saying how you, you know, gave me a voice for the first time. And also that Modem, Melbourne Online Digital Media, was one of the first places I ever went to where there was a startup and tech nerds kind of saying, hey, we think something’s happening here, early Web 2.0. And I met Ross Hill there. He taught me about the future of media and social, and actually helped me be a bit more transparent and publish stuff, and, you know, not have this top-down media theory but be more bottom-up, and how I met all these different people. And my latest reverse mentor is, you know, 24-year-old Tom Macracanis, who I’m doing Macro 3D with, where we’re doing, uh, robotics and AI to build houses. And my next project is going to be the world’s first house entirely done by [00:03:00] AI, including design of the house, uh, the CAD programming, software, the materials, and the 3D printing of the house.

[00:03:08] Steve: I’m going to turn that into a viral video. So that’s my next project. So, reverse mentors, and how they help me understand the future.

[00:03:14] Cameron: Reverse mentor sounds like people who teach you what not to do. That’s what a reverse mentor is.

[00:03:21] Steve: Maybe I’ll need to work on the nomenclature. So, um, reverse mentors. There was this bad dude who taught me about drugs and how to live under a bridge. And

[00:03:29] Cameron: Yeah, Modem was a startup community slash get-together piss-up, really, that I started in Melbourne back in 2005, maybe, I think.

[00:03:42] Steve: I met all five, and you invited me to that, and I met a whole lot of people, and it put me on an entirely different trajectory of the type of people that I hung out with, because I was in corporate and consumer goods, and everyone was just on the battle to climb a ladder, and we were all about saying, let’s build a different ladder, and I love that.[00:04:00]

[00:04:00] Cameron: What it did for me was I got laid a lot, but that’s another story. Uh, server debugging, Steve, that’s what I did this week. I have premium RSS feeds for my QAV podcast. It’s had a tendency, I’ve noticed over the last couple of months, where the premium RSS feed will just stop working, usually on the day that I release the podcast, which is a Wednesday. It just randomly stops working for about 12 hours.

[00:04:27] Cameron: And then it randomly starts working again. I’ve asked my IT guy, the guy who runs my hosting provider, to try and work it out. He’s given me nothing. So I went into GPT this week when it happened, and I said, hey, listen, this is what’s going on. Can you help? And it said, well, you might want to look at the server logs.

[00:04:45] Cameron: I said, well, how do I do that? And it said, do you have cPanel access? Yeah, I do. Go to that. Tell me where to look at the server logs. And then I was looking through the server logs. I couldn’t make any sense of it. So I just cut and paste the whole server log, threw it in a text document, uploaded it to GPT, Advanced [00:05:00] Data Analysis.

[00:05:00] Cameron: I said, have a look through this. This is my server log. What do you see? And it said, yeah, well, I can see there’s these 401s happening, which is an authentication failure, followed by a 302, which is a redirect. So it looks like it’s an authentication issue. Fantastic. So what do I do about that? So it led me down another path.

[00:05:18] Cameron: I haven’t resolved it yet, but it was the first time I’d used it for… debugging a server thing. And I was able to go back to my IT guy and say, alright, I had a look at the server logs. I noticed there’s 401s happening before the 302s. I think this is an authentication issue. Can you drill down on that for me and try and figure out why the authentication would be randomly failing once a week and then randomly working again 12 hours later.

[00:05:42] Cameron: But again, it just gets back to this idea of superpowers. I can do stuff now, from a tech front, that I didn’t know how to do before I had GPT on as a co-pilot. So that excited me, made me feel like it was a good thing. It was one [00:06:00] of the better… I’ve had some wins and some fails this week with GPT, as always, but that was one of the good ones.
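For anyone who wants to try the same trick without pasting a whole log into GPT, here’s a minimal sketch of the pattern Cameron describes: a 401 (authentication failure) immediately followed by a 302 (redirect) for the same path. The log lines, the paths, and the log format itself are all invented for illustration; real access logs vary by host.

```python
import re

# Invented Apache-style access-log lines, purely for illustration.
LOG = """\
203.0.113.5 - - [20/Sep/2023:09:14:02 +1000] "GET /premium/feed.xml HTTP/1.1" 401 512
203.0.113.5 - - [20/Sep/2023:09:14:03 +1000] "GET /premium/feed.xml HTTP/1.1" 302 0
203.0.113.9 - - [20/Sep/2023:10:01:44 +1000] "GET /premium/feed.xml HTTP/1.1" 200 48213
"""

# Pull the method, path, and status code out of each request line.
LINE = re.compile(r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3})')

def auth_failures(log_text):
    """Return paths where a 401 is immediately followed by a 302
    for the same path -- the failed-auth-then-redirect pattern."""
    entries = [m.groupdict() for m in map(LINE.search, log_text.splitlines()) if m]
    return [prev["path"]
            for prev, cur in zip(entries, entries[1:])
            if prev["status"] == "401"
            and cur["status"] == "302"
            and prev["path"] == cur["path"]]

print(auth_failures(LOG))  # -> ['/premium/feed.xml']
```

In practice you’d point something like this at the real log file from cPanel, then look at the timestamps of the hits to see whether the failures cluster around, say, a weekly cron job or a certificate renewal.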

[00:06:05] Steve: Yeah, nice. So it’s funny. I mean, the word Copilot is so interesting, and that’s in the tech news a little bit as well, but it feels like that’s where we’re heading. A few of the pieces of tech that we’ll talk about today are just this augmentation process, I think. And that’s what’s so interesting about it.

[00:06:23] Steve: For me, I’ve really been thoughtful lately about, is ChatGPT really going to take away jobs? And the more I see, the more I think it still needs a lot of direction. It’s like everything: nothing is fully automated. It just isn’t. I don’t know if there’s anything in the world that is truly automated. I’m not sure.

[00:06:41] Cameron: Yeah. What do you mean, anything in the world that is fully automated?

[00:06:43] Steve: Anything that just goes about its own way, without any human input, that is truly automated. Once you set it up, it just does what it does. I don’t know that that even exists anywhere in the world in any capacity. That’s a massive statement. And [00:07:00] now I’m ready for you to chop it right down.

[00:07:05] Cameron: No, well, I think technology is used as a productivity tool for humans, and we do have to turn it on and direct it and those sorts of things. There might be some systems out there that, when you set them in motion, can run for a period of time without interruption. You turn on a lot of robots and manufacturing devices in a factory and they’ll go about their business until you turn them off again.

[00:07:32] Steve: Repeating, repeating, but, but my contention is that the more complex the system and the more nuanced the outputs can be, the more direction it requires.

[00:07:42] Cameron: Yes,

[00:07:43] Steve: ChatGPT is like a classic example of that. It actually requires an extraordinary amount of cajoling and working with and data inputs and reframes.

[00:07:55] Steve: And I’m wondering if the more complex a system, the more management it needs. [00:08:00] I mean, we can think back to the paperless office and the three-day work week, which were purported to happen in the mid-seventies. And I can remember watching a, um, film in science class or geography or something, in grade seven, going, oh wow, this is sort of how they saw the future. Didn’t happen. And it seems as though, with all this automation and productivity tools, people are just getting busier and busier.

[00:08:27] Cameron: But there are lots and lots of jobs that were a thing in the 70s that aren’t a thing now. So those jobs did disappear. A lot of them went to China and Vietnam, but a lot of them got automated by technology, and new jobs were created. Now, there are people that lost their jobs for a variety of reasons and couldn’t or wouldn’t be retrained to take on a different role.

[00:08:51] Cameron: It tends to be a generational thing. Their kids go to uni or get a traineeship in something and learn the new skill or the [00:09:00] new task. And I think that’ll be the same with this, to a point. But again, I think people are still placing the wrong kind of expectations on large language models. I don’t think large language models are what people assume they are.

[00:09:14] Cameron: I think large language models are a language user interface, a linguistic user interface. That’s what they are. And they can do certain things quite well, certain things they can’t do very well, certain things they’re horrible at, but they’re going to provide us with the tools to have an LUI on all of our other systems moving forwards.

[00:09:38] Cameron: So we will have an LUI that will be able to communicate with the outside world and then feed instructions or requests back into their internal systems. I think that’s where we’re going.
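To make the LUI idea concrete, here’s a toy sketch: free-form text goes in, a structured command comes out, and the command gets routed to an internal system. The intent parser is a hard-coded stand-in for an LLM call, and every name in it is invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Command:
    action: str
    target: str

def parse_intent(utterance: str) -> Command:
    """Stand-in for an LLM mapping free text to a structured command."""
    text = utterance.lower()
    if "play" in text:
        return Command("play", "last_watched")
    if "recommend" in text or "new show" in text:
        return Command("recommend", "catalogue")
    return Command("unknown", "")

def route(cmd: Command) -> str:
    """Feed the structured command into a (pretend) internal system."""
    handlers = {
        "play": lambda t: f"resuming {t}",
        "recommend": lambda t: f"searching {t} for suggestions",
    }
    return handlers.get(cmd.action, lambda t: "sorry, no idea")(cmd.target)

print(route(parse_intent("Hey, play the last show I was watching")))
# -> resuming last_watched
```

The point of the sketch is the shape: the language model only does the language part, and the existing systems behind it stay deterministic.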

[00:09:49] Steve: It’s almost like a new layer in the tech stack, right? And every form of the tech stack, it just never gets less complex, does it? There might be parts that get automated, but there are always new layers on top. And it [00:10:00] does seem as though roles in business and technology and development, or whatever realm you’re in, are now far less singular than they were before. And roles that were singular or repetitive were the ones that were replaced. And whether you’re an engineer or a lawyer or a marketer, you’re now involved in so many more parts of the business. And that complexity, it’s kind of like a reflection of the world. As the world gets more complex, the roles that remain jobs are more complex in the amount of tasks and variety that you need to have some level of mastery of.

[00:10:34] Cameron: And what always happens with technology stacks is, let’s say you’re hired by a company and there’s a job that is part of your function, and it takes you eight hours a week to get this particular process run. When a piece of technology comes along that enables you to do that eight-hour process in one hour, there are other processes that you can go and focus those seven hours you’ve just saved on, processes that [00:11:00] technology can’t do yet. So we’re always pushing human labor into areas that machine labor isn’t quite ready for yet, and then, when the machines catch up to that, it frees up that human labor and we can push it into the new thing.

[00:11:13] Cameron: But I do think we’re sort of speeding up with a lot of this stuff. I think LUIs are going to have a huge impact on a whole range of sectors, and jobs in sectors, over the next five to ten years, but we’ll see how it plays out. We’re not all going to be out of work just yet.

[00:11:33] Cameron: Speaking of, speaking of that though, let’s get into the news stories for the week. And Steve, you’ve got the first one,

[00:11:42] Steve: Well, here it is. I mean, there has been a claim made that AI can now interpret what chickens are saying. The word saying probably shouldn’t be used here. Barking, or whatever sound chickens make. This is, uh, not peer [00:12:00] reviewed, but I just want to explore the way that it says it does it. And it makes me feel like it’s not outside the realms of possibility.

[00:12:10] Steve: So the way that it works is they’ve developed a system which is capable of interpreting various emotional states and signals, including hunger, fear, anger, contentment, excitement, and distress, which they claim are visually easy to observe, in the same way that we’ve observed various animals and behaviors and studied them over the years.

[00:12:34] Steve: But what they do is they add a cutting-edge AI technique, which looks at the visual patterns and the slight nuances in their verbal, sub-verbals, not even the right word, in the sounds that they make. And so, uh, it creates what they call deep emotional analysis learning. So again, it’s rooted in complex mathematical algorithms.

[00:12:56] Steve: And the term that they’ve used here is chicken [00:13:00] vocalizations, which I like. I don’t think we hear enough about chickens in, uh, technical papers. But this is super interesting for a whole lot of reasons. Obviously the chickens are going to tell us they’re really upset that they end up in a Zinger burger from KFC.

[00:13:13] Steve: So that’s a great one for the vegans. And by the way, can I tell you my favorite vegan joke while we’re on the topic?

[00:13:21] Cameron: please.

[00:13:22] Steve: So, there was this vegan lady who wanted to prove that vegans could do everything that non vegans could do. And as a part of her plight, she decided to climb Mount Everest.

[00:13:33] Steve: And what happened, unfortunately, was she died climbing Mount Everest. But she did die doing what she loves most, which is telling everyone she’s a vegan. So, that’s my vegan bit. But I think this is interesting, and I wonder: if you had enough visual interpretations, and the ability of systems to get nuanced sound differences in any form of animal cry or vocalization…[00:14:00]

[00:14:00] Steve: I wonder if you could learn the languages, because it’s pattern recognition, and we know that large language models don’t really understand it, but they can create patterns. I just wonder if this is something that’s possible. Imagine the DoctorDolittle.ai. Sounds like something interesting. Now, again, not peer reviewed. Gee, I hope it comes back as positive.

[00:14:22] Cameron: Well, when you put this in our notes and I started to read it, I immediately thought it was a joke. I thought it was The Onion. Because, what’s the name of the researcher quoted in this, Steve?

[00:14:34] Steve: David Cheok.

[00:14:35] Cameron: Adrian David Cheok. It’s a story about chickens. And his name is Cheok.

[00:14:43] Steve: I love it though.

[00:14:46] Cameron: Now,

[00:14:47] Steve: It’s gotta be a,

[00:14:48] Steve: um,

[00:14:49] Cameron: I looked him up.

[00:14:51] Steve: Not

[00:14:51] Cameron: a, he’s a, he’s real and he’s an Aussie,

[00:14:55] Cameron: based in Tokyo. But in 2019, [00:15:00] he ran to represent the division of Boothby as a member of the far right Fraser Annings Conservative National Party. He’s been described as a sex robot expert.

[00:15:12] Steve: Wow.

[00:15:14] Cameron: Fairly controversial right wing dude, particularly for a scientist.

[00:15:19] Cameron: According to Wikipedia, he’s been criticized for his posts, and his use of Twitter has been unfavorably compared to Donald Trump’s tweets by critics. He was a member of Clive Palmer’s United Australia Party, but stood down from the right-wing party after being asked by Palmer to dumb down his policies. But: genuine scientist.

[00:15:42] Steve: this, but is the research real though?

[00:15:44] Cameron: Well, I assume that some level of it is real. The story that you link to actually links to a paper on Research Square, which is a genuine pre-print [00:16:00] research service. So, as you said at the beginning, it hasn’t been peer reviewed. But I did read a similar story to this a few months ago about whales, researchers trying to use AI to interpret whale language.

[00:16:17] Steve: Well, you know the Carl Sagan Cosmos bit where he does the whale sounds, and then they turned it into that song that went viral, the Carl Sagan whale song. And he was talking, way back then, about how they had tracked and understood, they thought, somewhat of what the whales were trying to communicate, and how they would sing the exact same song in the exact same place six months later.

[00:16:40] Cameron: So look, I think these things are feasible, and we can easily imagine a time in the future when your pet dog or your pet cat will have a little thing it wears around its neck, like they do in Adventure Time, like Lady Rainicorn, for any Adventure Time fans who’ve watched it.

[00:16:57] Steve: Adventure Time? Oh, me and my son love [00:17:00] that show. That show’s crazy.

[00:17:02] Cameron: We love that show. There’s a new spin-off, if you haven’t seen it yet, Fionna and Cake, which just came out a few weeks ago. A spin-off with all the same characters, but set much, much later, and they go into parallel worlds. Really, really good. Yeah, we’re huge Adventure Time fans. But you know, Jake’s girlfriend, Lady Rainicorn, has a little device she wears around her neck that translates her speech from Korean,

[00:17:27] Cameron: or Rainicorn, to English. So, yeah, maybe your household pets will have one of those in the not-too-distant future, and they’ll be able to talk to you and you’ll be able to talk back to them. That should be an interesting conversation.

[00:17:43] Steve: Well, you know what’s interesting as well, now that you mention it, because I didn’t look into the depths of this guy being a psycho or whatever. But you know what happened as soon as you said that? I didn’t want to believe it was true. And what’s interesting is that,

[00:17:58] Steve: In the modern world, [00:18:00] we so often worry about who said something instead of what was said. And that’s interesting, because it works both ways. In some ways, you should worry about who said it, because if they say a lot of stupid things, then maybe the next thing they say is stupid. But on the flip side, um, stupid people might have good ideas every now and again, and they don’t get listened to because of reputation. It’s just an interesting quandary. I don’t know how you approach that.

[00:18:22] Cameron: Yeah, look, I always try and separate the research from the person. I mean, if a person does have a track record of spinning spurious bullshit, obviously you need to put a bullshit filter over anything that they say, but everything, particularly if it’s scientific research, is open to peer review, and you should leave everything out there.

[00:18:48] Cameron: When I tackle complex topics, and we may have talked about this before, but I’m going to do a show maybe next week. [00:19:00] I was supposed to do it yesterday, but I didn’t get around to it. I’m going to do a show on the Voice referendum. Complex topic, a lot of strong views,

[00:19:08] Steve: The Voice? You know what I thought when you said that? Why would Cam Reilly do an analysis of a terrible singing contest TV show? That’s all. Oh, or I thought you meant John Farnham. No? Who has lent, who has lent his…?

[00:19:25] Cameron: My approach, when there are really, really complex topics that I’m not an expert on, and can’t be an expert on, and will never be an expert on no matter how much time and effort I spend on it, is a two-step process that I always try and follow. The first is epistemology. I try and figure out, well, for this particular topic, how do we know what is likely to be true?

[00:19:47] Cameron: Is it a scientific issue? Is it social sciences? Is it philosophy? What is the domain this sits in, and what is the process by which we normally determine what’s likely to be true in this [00:20:00] domain? Once I’ve worked that out, then I turn to my heuristics. A heuristic is a shortcut to getting to an answer.

[00:20:08] Cameron: And the shortcut for me is, well, who is a credible, independent body with expertise in this domain, with a long track record of integrity and honesty, and of being right more often than they’re wrong? And then I’ll turn to that body to see what they’re saying about the particular topic. And that’s sort of a quick way for me to try and figure out what I think is likely to be true or likely to be going on.

[00:20:38] Cameron: And so you can take any controversial opinion or research and run it through those two filters relatively quickly, and at least find out what the professionals in a particular field think about something. But anyway, moving on from Dr. Chook and his credibility, or lack thereof. Tell me about Pizza [00:21:00] Hut, Steve.

[00:21:01] Steve: Pizza Hut, Pizza Hut, Pizza Hacked. If you’ve, uh, ordered a Supreme or a Hawaiian, because now you order online with your credit card, they had a huge number of customers hacked again. The thing that’s so interesting to me is that companies get hacked again and again. And if we ever wonder why security isn’t taken seriously, it’s simply because there are no serious consequences.

[00:21:26] Steve: In my view, there are no serious consequences. And up until now, yes, there’s been a lot of financial downside, but I don’t know that we’ve had any of those fear campaigns of, watch out when we have autonomous cars, because a hack could mean that you could get murdered by someone behind a keyboard who’s going to drive that car off a cliff, or what have you.

[00:21:49] Steve: But it does seem as though hacking will never really be taken that seriously. I mean, we had the giant Optus hack. I don’t know that that much happened. Maybe the company gets a fine that it can afford, which [00:22:00] might be cheaper than actually having a good administrative system, which can’t be hacked into.

[00:22:05] Steve: Of course, anything can be hacked into if someone tries hard enough, but I just don’t think it’ll ever be taken seriously in a corporate way, because I just don’t think there are enough consequences.

[00:22:16] Cameron: Yeah. Well, it gets back to the old seatbelt recall algorithm. Have we talked about this on this show? The Fight Club seatbelt algorithm?

[00:22:26] Steve: not seen the formula.

[00:22:28] Cameron: Yeah. Tony and I, I think, talked about it on QAV not so long ago. There’s a great scene in Fight Club where Edward Norton’s character basically explains the formula to Tyler Durden.

[00:22:41] Cameron: He works as an insurance assessor for a large automobile manufacturer, and he basically says, the amount of fines that they have to pay if somebody dies as a result of a seatbelt malfunction, if that number is lower than the cost of a recall, then they [00:23:00] just wear it. Why would you spend the money on a recall? Yeah, that’s capitalist economics, you know.
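The movie’s formula is simple expected-value arithmetic, and easy to sketch. All the numbers below are made up purely for illustration.

```python
def recall_decision(vehicles_in_field, failure_rate, avg_settlement, recall_cost_per_unit):
    """The Fight Club 'formula': compare the expected payout from
    failures against the cost of recalling every vehicle."""
    expected_payout = vehicles_in_field * failure_rate * avg_settlement
    recall_cost = vehicles_in_field * recall_cost_per_unit
    return "recall" if expected_payout > recall_cost else "pay the claims"

# 500,000 cars, a 1-in-100,000 failure rate, a $2M average settlement,
# and $30 per car to fix: expected payout $10M vs. a $15M recall.
print(recall_decision(500_000, 1e-5, 2_000_000, 30))  # -> pay the claims
```

Which is exactly Steve’s point about consequences: until the fine (or jail) side of the inequality gets big enough, the cynical answer stays the same.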

[00:23:06] Steve: sure, and that used to be the case with Occupational Health and Safety, until they said that directors are responsible and can go to jail, and that’s

[00:23:13] Steve: when everything changed on OH& S.

[00:23:16] Steve: And until someone says that’s the risk. A non-financial risk can hide inside a corporation, but if it becomes a personal financial risk, and/or a personal, uh, criminal risk, then everything changes really quickly.

[00:23:29] Cameron: Jail time. There should just be jail time for, um,

[00:23:32] Steve: be, it really should

[00:23:33] Steve: be.

[00:23:33] Cameron: breach, breach of your responsibilities

[00:23:36] Cameron: as a director. Same thing, I’ve been arguing for decades, I don’t know if you remember this from my old podcast many years ago, I’ve been arguing for a long time that politicians should have to sit a lie detector test. And if, if,


[00:23:51] Cameron: If they make a campaign promise, and then they get elected on the basis of that campaign promise, and then they don’t deliver on that campaign [00:24:00] promise: jail. Automatic jail time, five years.

[00:24:03] Steve: I like the idea of delivering against a campaign promise. Um, the only worry that I’d have about that is if a promise is made in one economic circumstance, and then, by the time of its implementation, the economic circumstance is so different that it wouldn’t be to the benefit of the populace if it was implemented. That could be a bad idea. Now, there would be certain promises which I think are deliverable regardless of what happens in the world, and some that wouldn’t be. And maybe there’d be different categories within that. Um, but I

[00:24:40] Cameron: Well, you just build it, you just build it, into the promise, Steve. You just say,

[00:24:43] Cameron: if we get elected, If these are the economic circumstances that we project, if they are true at the time, then we will deliver

[00:24:52] Cameron: X.

[00:24:52] Steve: I agree

[00:24:53] Cameron: It’s just, it’s just a way of stopping politicians

[00:24:56] Cameron: from making bullshit campaign [00:25:00] promises, because they are real personal consequences if they don’t deliver.

[00:25:04] Cameron: It’s simple. It’s just, don’t say it if you’re not gonna fucking do it. If you don’t, if you’re not 100 percent sure you’re gonna do it, don’t

[00:25:10] Cameron: fucking promise it.

[00:25:11] Cameron: Or put a lot of very clear

[00:25:13] Cameron: caveats around it in the first

[00:25:14] Cameron: place,

[00:25:15] Steve: not what you need. That’s the

[00:25:16] Cameron: Anyway, I think the most terrifying thing about this Pizza Hut hack is the possibility that it will be made public that you eat Pizza Hut pizzas. Like, if that got out there,

[00:25:30] Steve: look, a lot of people don’t realise The hacking, the biggest issue is what people find out about you. I mean, that’s the issue there. And if you’re eating Pizza Hut pizza, we don’t want your ears on this podcast. I mean, we talked about that. We got it going way back, didn’t we, Cam? We said, if Pizza Hut gets hacked, what are we going to do?

[00:25:49] Steve: We’re going to have to be honest and say, if you’re part of that hack, don’t tune in, don’t share our stuff, don’t get involved. I mean, my last name’s Sammartino, that’s all I’m saying. [00:26:00] Pizza Hut’s not on the team. Okay, the word pizza is, but don’t put Hut on there. Change it.

[00:26:05] Cameron: it’s not even

[00:26:05] Cameron: food. Not, I tell you, it’s not any food.

[00:26:08] Steve: Exactly.

[00:26:09] Cameron: Anyway, moving right along. Tell me about Alexa, Steve.

[00:26:13] Steve: Yeah, really simple story. So, Amazon had one of their announcements today, and part of that was the addition of GPT and verbal-command ability to Amazon and their Echo Hub, so that it can answer questions. It can do all the ChatGPT things, poetry, and give big, long answers, and not just so much be a command-and-execute voice recognition piece of software. So they’re putting a pre-trained, you know, generative pre-trained transformer inside it, so it can actually answer a whole bunch of queries instead of just specific Amazon-ecosystem queries and orders. Which I think is kind of interesting, and I think it’ll be the start

[00:26:59] Steve: of the next [00:27:00] step of verbal commands, and GPTs exiting the frame and actually entering a whole lot of devices in edge computing and IoT.

[00:27:12] Cameron: So, it’s a large language model built into Alexa that enables you to do what exactly?

[00:27:22] Steve: But let’s talk about things that you might do, uh, when

[00:27:25] Steve: you’re on chatGPT.

[00:27:27] Cameron: It can write you a poem or a story via

[00:27:31] Cameron: Alexa.

[00:27:32] Steve: It can answer, I’m assuming there’s a couple of assumptions in here, but I’m assuming it can give you large and complex answers of what it finds that are generated rather than reading from the web. Now I know that Siri is particularly bad at this because when you ask Siri any of these questions which are outside of the Apple ecosystem, it says here’s what I found on the web or here’s a Wikipedia page, it doesn’t even read it to you very often, half the time it doesn’t do that.

[00:27:57] Steve: But if it wasn’t… Able to give you [00:28:00] verbal answers of what you are looking for. It comes a little bit. Star Trek. Okay. Computer doesn’t it? Where it starts to actually give you some answers, which are, I’m gonna say leaping outside of the ecosystem. I mean, one of the other things that they did was relaunch the Amazon Echo Hub, which is kind of like a.

[00:28:18] Steve: A visual thing that was somewhat like the Facebook portal. Again, I don’t think that that’s going to work that well. Cause I really think yes, right to the microphone, like a, it’s like a portal, a little bit like the Facebook one that failed, uh, but to me, it just points out that the verbal side of commands is far more important than the visual interaction.

[00:28:38] Steve: But I just thought that that was an iteration that we’re probably going to see a lot of.

[00:28:43] Cameron: Like, where I expect or

[00:28:49] Cameron: Siri or the Google one, whatever that is, is the ability to actually do more complicated things as well. Like, I turn on my TV. I’ve got an [00:29:00] LG Smart TV. Its operating system is dumb as dog shit. It’s really frustratingly bad for 2023. Like, you would expect

[00:29:10] Cameron: You would expect a smart TV would actually have the smart component in there, but it, they really don’t.

[00:29:16] Cameron: Like, it can’t even, it doesn’t even have a, a, a, a portal of all the shows that I’ve watched. I’ve got all my streaming channels in there. You would expect it would have an, some sort of a home screen that would say, play the next episode of The Bear or Seinfeld or pick up on that film you’re watching.

[00:29:32] Cameron: None of that. You have to go into the things and figure out where you were. Or, what were we watching a week ago? Oh yeah, that show, Only Murders in the Building. What fucking streaming channel is that on again? Oh, who can fucking remember? I have to go look it up again, because I've got 27 different channels and I don't know which one. But you'll be able to just go, hey, play me the last show that I was watching.

[00:29:53] Cameron: You’ll be able to talk to it. Hey, What shows were we watching last week that we haven’t watched for a while? What are the [00:30:00] shows that we were watching six months ago that we sort of stopped mid season? Oh, great. Can you play me the latest episode of that, please? And it’ll just take over. Or, hey, give me a new show to watch.

[00:30:11] Cameron: Based on everything that I've watched more than one episode of in the last couple of years, give me a show that I'll probably like, that has an IMDb rating of greater than 8.5, or a Rotten Tomatoes score greater than 85 percent, or whatever it is. Have that kind of relationship with your entertainment device, so it does a lot of that work for you.

[00:30:34] Cameron: I wanted to ask you though, in this article that you sent me, the little ad for the Amazon Fire Stick, the picture on the computer screen there of the TV, do you know what that’s a picture of?

[00:30:48] Steve: Uh, I'll have to click in and have a look.

[00:30:51] Steve: It’s going to be something weird now that you’ve mentioned it.

[00:30:55] Cameron: Nah, not weird, just cool.

[00:30:56] Steve: Uh, no,

[00:30:58] Cameron: that's Antelope Canyon [00:31:00] in Arizona. I was there, in that canyon, about a year ago, walking through it with Chrissy and Fox. It's on Navajo land in Arizona. Absolutely mind-blowingly amazing. If you're ever in Arizona… which, gotta say, have you spent much time in Arizona?

[00:31:20] Steve: no. I’ve spent zero, zero time there.

[00:31:24] Cameron: Arizona’s fucking amazing, man. Like landscape wise. Don’t go in summer because it’s like 50 degrees in the shade,

[00:31:30] Cameron: but,

[00:31:31] Steve: All

[00:31:31] Cameron: and we were there in summer last year, it was insane, but, absolutely beautiful, beautiful part of the United States. Anyway, that’s Antelope Canyon. Absolutely amazing.

[00:31:42] Steve: right,

[00:31:43] Cameron: moving, moving right along, Microsoft has finally announced CoPilot is coming.

[00:31:50] Cameron: it's rolling out to Windows starting September 26th, it'll be in Windows 11,

[00:31:57] Steve: Wow.

[00:31:57] Cameron: then it’ll be across Bing, [00:32:00] Edge, and Microsoft 365 over the next couple of

[00:32:04] Cameron: months,

[00:32:05] Steve: I’ve seen the Copilot demo. Did you see the video demo?

[00:32:10] Cameron: I watched the videos in this announcement. Very underwhelmed I was. Very

[00:32:16] Cameron: underwhelmed.

[00:32:17] Steve: So the ones that I've seen that I think are pretty good are for corporate output. So if you're in a company, and I'll use the Microsoft Office suite, it's very good at taking you from ground zero in the Office suite. I think that's what it's good at. You say, you know, create me a PowerPoint which is based on this product launch for the company, uh, looking at these things that we've done in the

[00:32:42] Steve: past, it’s good at that.

[00:32:43] Cameron: There were no videos that I could see in this announcement that had to do with Excel, which is the Microsoft product that I spend most time in. Actually, it's the only Microsoft product I use, really. And really, I'm looking for something that will superpower my Excel and integrate it into [00:33:00] Excel. As I've said on the show before, I use Excel all day, every day.

[00:33:04] Cameron: Not all day, but a lot of my day. And I use GPT a lot in Excel. And it's… pretty bloody good, but I expect them to integrate it into the whole thing, to make it so I don't have to ask GPT how to do something and then go to Excel and do it myself. I can just open Copilot in a workbook and say, hey, here's what I want to do.

[00:33:28] Cameron: Can you do it for me and have it write the functions, write the macros, write the visual basic code and do it all for me? But there was nothing in this announcement, nothing in the videos in this announcement that suggested that that’s what’s coming out. So we might have to wait for that a little bit further.

[00:33:46] Cameron: They're talking about how DALL-E 3's gonna be in it. And there's, yeah, help me respond to this email, help me do this PowerPoint, help me write this book. Yeah, it's all kinda, eh, alright, not, not,

[00:33:58] Cameron: impressive.

[00:33:58] Steve: DALL-E is the worst of the [00:34:00] image generation engines in my

[00:34:01] Steve: view. The only,

[00:34:02] Cameron: it’s

[00:34:03] Cameron: pretty bad.

[00:34:04] Steve: yeah, the only thing that I like about DALL-E is that it enables you to highlight areas and say, change this to this, and change that to that. I think it does that really well, but I think that Stable Diffusion and Midjourney are way better.

[00:34:17] Cameron: Yeah, I use Midjourney almost exclusively these days.

[00:34:20] Steve: I love your little Midjourneys, uh, that you put on your Insta. The sci-fi ones, they look great.

[00:34:26] Cameron: Well, I do them for every episode of this. Yeah, the art

[00:34:28] Steve: Yeah. yeah, yeah, they’re really great.

[00:34:30] Cameron: too is Midjourney, just to come up with something interesting. Well, moving from Microsoft to Google: big announcements on Bard this week. And again, kind of underwhelming. I don't know if you've had a play with this, but Google announced…

[00:34:45] Steve: I’ve read it, but I haven’t played with it.

[00:34:47] Cameron: The latest version of Bard, their generative AI, came out, and it says: Bard can now retrieve and help you work with real-time info from Maps, YouTube, [00:35:00] Hotels and Flights. You can pull together what you need across information sources and bring ideas to life easier and faster. These extensions are enabled by default and you can disable them at any time.

[00:35:13] Cameron: You can use Bard to interact with your information in Gmail, Docs and Drive, so you can find, summarize and answer questions across your personal content. Now, I've tested this a bit in Gmail, and it was shit. It could find nothing and do nothing. So,

[00:35:34] Cameron: it,

[00:35:35] Steve: two things that really stand out to me on that, Cameron. The first one is, notice how, what Google’s doing here is trying to keep you in a closed ecosystem, and how far Google is from where it started, which was, we’re going to help you find the best things that you need, and point you to them wherever they happen to be.

[00:35:52] Steve: And now Google is about, oh, what we’re going to do is help you stay with us so you don’t have to go anywhere else. [00:36:00] You know, they’ve become a mall of the internet. And it’s really just trying to keep you in their ecosystem, whether it’s right, wrong, or the best, or not. I mean, for me, that’s really interesting because that’s what happens to almost all organizations.

[00:36:13] Steve: They stop doing things for their customers and start doing things to their customers. But if they did get it right, it’s kind of like what we’re talking about before. If you had a language user interface that you could verbally talk to and it could do all of the things that you need just by directing it around, if it did it well.

[00:36:31] Steve: Then I see that it would have utility regardless of the fact that it keeps you in the Google ecosystem. And no one has nailed that yet. There’s, there’s nothing that can actually help you navigate any ecosystem at all. Even in closed ecosystems like Apple and Google, none of them

[00:36:46] Steve: really work. And I just wonder why that

[00:36:48] Steve: is.

[00:36:49] Cameron: Yeah, the implementation of it doesn't live up to, sort of, the promise. But I'll give you an example. So I just went into Bard and I [00:37:00] wrote, find the first email I received from Steve Sammartino.

[00:37:04] Cameron: And it said, the first email from Steve Sammartino in

[00:37:08] Cameron: your inbox is titled, My Next World First Samatron’s

[00:37:12] Steve: dude.

[00:37:12] Cameron: Blog.

[00:37:14] Steve: The first email I sent was, Hey Cameron, from The Podcast Network. And it would have been from Steve Sammartino at optusnet.com.au, in 2004.

[00:37:23] Cameron: So then, so it showed me that My Next World First is your most recent email. It came out just today. So it's saying the most recent one is the first. I said, okay, find me the oldest email I ever received from Steve Sammartino. And it's pulled up a bunch of emails from August, July and June 2023. Now, if I go into Gmail and do this

[00:37:49] Cameron: manually.

[00:37:50] Steve: and do a date.

[00:37:51] Cameron: says,

[00:37:53] Cameron: it says, 200 of many. Okay. 244. My first email from you is [00:38:00] actually,

[00:38:01] Steve: When was

[00:38:01] Cameron: that’s in Gmail. 15th of January,

[00:38:05] Cameron: 2008.

[00:38:05] Steve: Oh, there you go.

[00:38:06] Cameron: now you probably would have sent me emails before

[00:38:08] Cameron: that, but it’s, nice photo in the age today, by the way, I’ve had a lot of positive feedback from the podcast we did. It has generated some good results for Rentoid and I really appreciate it.

[00:38:18] Cameron: Cheers, Steve. So maybe that was not long after we met, because not long after we met, I had you on the show.

[00:38:25] Steve: Yeah, it might, maybe it was 2007 or

[00:38:26] Steve: something, I think, would have been.

[00:38:30] Cameron: Actually, yeah, if I drill back further, I've got a, like, oh, Stephen Sammartino, with a P-H-E-N,

[00:38:38] Steve: Wow, that’s I

[00:38:42] Cameron: of October 2007. Oh, you forwarded an email to me from somebody at Talking Business. But anyway, there you go. So there are emails in there, but it's not finding them when I ask it to find old emails.

[00:38:55] Cameron: So again, implementation like Microsoft’s implementation, a [00:39:00] little bit lackluster, a little bit underwhelming, but I’m sure it will get better.

[00:39:06] Steve: It doesn't even remember your name, man. Listen, like, just clicking on one of the buttons, going sort by oldest to newest, and getting a fucking better result than AI. Like, seriously? Let's just consider that for a mere moment. We've got this AI, world-changing AI, and a manual click on oldest versus

[00:39:26] Steve: newest gives a better result.

[00:39:30] Cameron: And look, I've seen a lot of people on Reddit, when these announcements have come out, going, I don't want this, I don't want AI snooping on my information, and all that kind of stuff. And I know that you and I have mentioned this before: everyone's got all your information anyway. The NSA, the Five Eyes, everyone's got everything anyway. Microsoft's already reading your emails, Google's already reading your emails, looking at your calendar. I'm not sure exactly what you're worried about. But for me, like, to have [00:40:00] our emails get to a point where

[00:40:02] Cameron: I can speak to it and say, hey, delete any emails that come in with unsubscribe in the footer. If Sammartino sends me a new email, send a text alert to my watch, because I want to get it quickly and I don't check my email that often during the day, usually. Just to have that linguistic user interface

[00:40:26] Cameron: on email will be

[00:40:27] Steve: You had that?

[00:40:28] Cameron: it working.

[00:40:28] Steve: If anyone had a plug-in where you could actually, uh, have a look at emails between 2008 and 2011 where I mentioned 3D printing, like, that would be crazy. Because we know intuitively, in our own mental database, certain realms or ideas where we might be able to just pick out one point or idea or person or timeframe, which can enable us to do sorting that is very personalized, that would be different for each person.

[00:40:57] Steve: But the ability to do that would create extraordinary [00:41:00] value, especially when, and I imagine you operate the same way I do, I operate on an infinite cloud. I don’t delete any emails anytime ever. I just leave them all there and I just use them as a searching database of

[00:41:12] Steve: all my digital communications.

[00:41:15] Cameron: Yeah. That’s my, that’s my backup memory bank.

[00:41:18] Steve: Yeah. Are

[00:41:20] Cameron: Speaking of Google: Alphabet's Google has given a small group of companies access to an early version of Gemini, its conversational artificial intelligence software, The Information reported on Thursday, citing people familiar with the matter. Gemini is intended to compete with OpenAI's GPT-4 model, according to the report.

[00:41:40] Cameron: So, expect to see that becoming gradually available. That's their actual competitor to GPT-4. I think Bard is more 3.5 Turbo, so Gemini should be coming out soon, by the sounds of it. And hopefully they'll be able to find your emails a little bit more effectively. Speaking of [00:42:00] GPT: Bing is back on GPT.

[00:42:02] Cameron: Search with Bing is back. Today, last night actually, they've figured out, or fixed, whatever issues they had with that over the last month or two. It's been out of commission,

[00:42:16] Cameron: and you know,

[00:42:18] Cameron: I’m, I’m hoping

[00:42:19] Steve: in ChatGPT or in the Edge browser or in Bing?

[00:42:23] Cameron: in

[00:42:23] Cameron: It’s in GPT. It’s built into GPT. Browse with Bing?

[00:42:26] Cameron: as an option. It was there a few months ago for not very long. Then they disabled it, and now it's back again. And that's good, because I've had varying success with the web browsing plugins. Yesterday I was pointing GPT, using one of its web browsing plugins, to a page on one of my websites where I had a bunch of notes, and I was asking it to read that page and create a prompt, based on my writing style, that I could use to create future podcast notes for my Cold War show.

[00:43:03] Cameron: And it kept telling me it couldn’t read the webpage. Sorry, either the server’s down or it must be blocking bots like me. And so I opened it in Claude, another generative AI. Claude could read it just fine. So I went back to GPT and said, Hey, it’s, it’s you. It’s not, Claude can read it. And it was like, well, that’s embarrassing.

[00:43:24] Cameron: This is awkward. but then I used a different plugin, web browsing plugin on GPT and it could read it fine. This is before Bing came back. So I haven’t tested it with Bing yet, but I’m hoping it’s, does a better job.

[00:43:38] Steve: Yeah, I

[00:43:39] Cameron: But being

[00:43:39] Cameron: Microsoft, I don’t

[00:43:40] Steve: the information

[00:43:41] Cameron: hopes.

[00:43:42] Steve: Search, the integration of GPTs and search, has been, I think, overstated, and they've sort of been on-off, the ability of one or the other. No one's really successfully put them together yet, I don't think. I don't know how possible it is in the tech stack [00:44:00] to separate search versus, uh, a generative pre-trained answer, or to combine them in some way.

[00:44:05] Steve: It feels like it almost needs to be a switch where it chooses one or the other. Like, let's imagine you have a query, and inside that, whether you're in Microsoft Edge or Bing, or whether you're in Google and it goes to Bard, it chooses whether a GPT will give you a better answer than a search.

[00:44:24] Steve: I dunno if that would be the right way, but imagine if it could combine answers from both, where it's sort of quasi search results and a little bit of a, you know, search engine information panel or something like that, that brings both up. I dunno that anyone's really nailed

[00:44:41] Steve: that yet. It just seems really binary at the moment.

[00:44:44] Cameron: Yeah, and I’m in GPT right now and I’m looking for Bing as

[00:44:50] Cameron: an option

[00:44:51] Steve: I just don’t see it. Because I’m in it right now and I just pressed a refresh and I’ve got the subscription model. And mine,

[00:44:58] Cameron: me too.

[00:44:59] Steve: [00:45:00] all mine has is the classic grey screen where I can choose between 3.5 and 4. It's got a couple of prompts down the bottom, and then a little bit of a disclaimer, and then the panel on the left.

[00:45:10] Steve: So I don’t know, unless it’s in the Edge browser, which I use sometimes. I’ll just click on the Edge browser and see what that gives me.

[00:45:17] Steve: Anyway.

[00:45:18] Cameron: well that’s sort of cheating, right? That’s not GPT getting access to

[00:45:22] Cameron: Bing. That’s

[00:45:23] Steve: Yeah, but it does have it here in the Edge browser,

[00:45:24] Steve: but…

[00:45:25] Cameron: Edge Bing getting access to GPT. It goes the other

[00:45:27] Cameron: way.

[00:45:28] Steve: Yeah, if it doesn’t, mine doesn’t have it. Maybe it’s in the US. That’s one thing that I’ve noticed too, is a lot of the updates, like when they launched the ChatGPT app, it wasn’t available in Australia. I haven’t downloaded it. I don’t know if it is

[00:45:41] Steve: available yet, but it certainly wasn’t at launch.

[00:45:45] Cameron: Yeah, it might be a

[00:45:45] Cameron: slowly rolling out thing. Anyway, moving right along. Coca Cola is pulling a Steve Sammatino. They’ve created a futuristic trademark soft drink using artificial [00:46:00]intelligence. Coca Cola is taking a bite out of the AI hype.

[00:46:04] Cameron: Oh, the person who wrote

[00:46:05] Steve: Oh, please tell me.

[00:46:07] Cameron: T E. Oh,

[00:46:10] Cameron: nice. With its new. Coca Cola Y3000 Zero Sugar, a limited edition AI powered beverage that imagines what the future tastes and

[00:46:23] Cameron: feels like. The new drinks, co created with Artificial Intelligence, marries AI generated insights with how fans envision the future through emotions, aspirations, colors, flavors, and more.

[00:46:39] Cameron: The company’s researchers began by having conversations with consumers about their preferences, trying to figure out what the future tastes like. Afterwards, all these insights were funneled into the company’s proprietary artificial intelligence

[00:46:52] Cameron: system

[00:46:53] Steve: this. It’s Billy from Marketing. It’s Billy from Marketing.

[00:46:55] Cameron: up the soda’s one of a kind [00:47:00] flavor. The beverage’s futuristic name hints at tasting notes from the year 3000, nearly a millennium away, it puts in brackets, just in case.

[00:47:08] Cameron: you weren't sure how far away the year 3000 was. While its marketing draws inspiration from AI-generated visuals of 2023, which were created by the image synthesis model known as Stable Diffusion. Like, fuck me if this wasn't written by Coca-Cola's PR team.

[00:47:25] Steve: This is great for a whole lot of reasons. It’s great because it’s fucking hilarious to read it. But I love it. You know what I like? I like it that a consumer brand has said we’re just going to lean into some of this stuff that’s happening. Wrap ourselves in the blanket of what is now, which is AI, and just generate something.

[00:47:45] Cameron: I actually like it. I actually can respect the fact that they've done it, because let's be honest about it: tech companies do this all day, every fucking day, giving us promises of things that are AI or whatever, when they're not. And, you know, there's someone behind the [00:48:00] curtain. Well, the

[00:48:00] Cameron: thing I like about this is the can actually has on it, co-created with artificial intelligence. And to me… this is a collector's edition. I'm getting one of these. Because this reminds me of, I can remember the actual moment when I first saw a poster on the side of a bridge in Melbourne that had a URL on it for the very first time.

[00:48:27] Steve: yes,

[00:48:28] Cameron: this was

[00:48:29] Cameron: circa,

[00:48:30] Steve: ttp. collincollin. ww,

[00:48:33] Steve: right? Remember it used to have all that?

[00:48:36] Cameron: yeah, all of that. This was circa 1995-ish, right? '95, '96. And I was working for OzEmail at the time, and I remember saying to my then wife, as we were driving through Melbourne: holy shit, look at that. It's starting to happen. Because I'd been running around [00:49:00] Melbourne trying to explain what the internet was to people, to IT managers, or MIS managers as we called them back then, Management Information Systems managers, and I had people telling me, there's no reason why our company will ever need email, we've got perfectly good telephones, thank you very much.

[00:49:20] Cameron: we, we’ll never need the internet, we, everything that we need, we have in our, our own local area network, all this kind of bullshit.

[00:49:28] Steve: Love of LAN, love of LAN, on our dumb

[00:49:30] Steve: terminals, on our

[00:49:32] Cameron: Yeah. And the first time I saw a URL, it was a big fucking deal. And I think this is the AI equivalent of that: when not the tech industry, but consumer marketing, starts to just throw fucking artificial intelligence in there just because they can. I mean, we all know that AI played next to zero fucking role in the development of this or the marketing of this, [00:50:00] but,

[00:50:00] Cameron: uh,

[00:50:01] Steve: that’s what

[00:50:01] Cameron: marketing, right?

[00:50:02] Cameron: It’s cool. It’s cool. I think it’s a milestone of what is to get like a year from now Every fucking product will have created with AI on it every single

[00:50:13] Cameron: consumer product.

[00:50:14] Steve: Don’t go too deep yet, because the technology time warp,

[00:50:18] Steve: we’re going to explore this further, Cameron.

[00:50:21] Cameron: I don’t get ahead of

[00:50:22] Cameron: myself All right. Well

[00:50:25] Cameron: speaking of things that legitimately

[00:50:26] Cameron: work

[00:50:27] Steve: please buy two cans because I want one

[00:50:30] Cameron: Okay. Things that were developed, have been developed by AI is, Google AI Breakthrough represents huge step in finding genes that cause diseases, scientists say. This is in the independent UK. Says, Google says it has made a significant step in identifying disease causing genes, which could help spot rare genetic disorders.

[00:50:55] Cameron: Now, listen to this. This is, I think, amazing. A new model named [00:51:00] AlphaMissense is able to confidently classify 89 percent of all possible missense variants in genes, identifying whether they are likely to cause diseases or are benign. That compares with just 0.1 percent of all missense variants that have been confidently classified by human experts.

[00:51:26] Cameron: So, Human experts have managed to classify up until this point 0.1% of a Missense mutation for people that don’t have a degree in d in genetics like I do,

[00:51:41] Cameron: uh, is a,

[00:51:42] Steve: what that is right now. Yes.

[00:51:45] Cameron: a Missense,

[00:51:46] Cameron: mutation is a, is a point mutation in a nucleotide, a single nucleotide change in, a codon that codes for different amino acids.

[00:51:57] Cameron: Right now you get a lot of these. in your [00:52:00] DNA. There’s always copying errors that happen, mutations that happen. Most of them are meaningless. Most of them do nothing. You don’t have to worry about it, but some of them can cause diseases. Now, as humans, we’ve been going through and trying to, as humans, human researchers have been going through and trying to figure out what those are so we can test for them.

[00:52:20] Cameron: We reckon we found 0. 1 percent of them up until this point. Google’s AlphaMissense has just, they claim, has found 89 percent of all possible

[00:52:32] Cameron: variations.

[00:52:33] Steve: race,

[00:52:34] Cameron: Very

[00:52:35] Cameron: close. It’s,

[00:52:35] Steve: close. mate. Look, a lot of people don’t realize how close some races are, Cameron. And I feel like 0. 189 doesn’t get the recognition it deserves as a close game. It’s

[00:52:47] Cameron: it’s like, uh, I don’t know what Greeks and Italians. Is that what you’re saying? It’s that

[00:52:52] Cameron: close.

[00:52:53] Cameron: Close

[00:52:54] Steve: It’s like bricks and very, very close. very close.

[00:52:58] Cameron: So like [00:53:00] that’s, this is fantastic. And a new system that they’re using, Alpha Missense, was built on Alpha Fold, which was their AI model that helped unfold proteins. Like, this isn’t generative AI, obviously, this is a different kind of AI, but this is expert system AI that has been, we’ve been building this for a long time, but this is what I’ve been saying since we started this show, like, things are happening at an exponential level.

[00:53:25] Cameron: Progress is happening at an exponential level at the moment, largely, I think, because of the power of chipsets, like NVIDIA's new chipsets. And this is what Kurzweil was saying in that talk that I talked about a few weeks ago: all of these things, like generative AIs, LLMs, weren't possible until we had the computing power that we've just started to get in the last few years.

[00:53:51] Cameron: These things probably weren’t possible. This is Google’s DeepMind division that’s doing this. The, as, as We hit [00:54:00] the exponential part of the curve with computational power. These things start to become possible and it’s just gonna get better. Like Moore’s law isn’t slowing down. When you look at the different ways of stacking these things together, it’s just going to continue to double every couple of years, the, our ability to crunch computation per dollar.

[00:54:23] Cameron: And these things that imagine if you can just, do a blood test. And it’s going to look at your DNA and go, okay, here are all of the diseases that you’re likely to develop in the next 30, 40 years. We’re going to start building custom medications for you right

[00:54:41] Cameron: now.

[00:54:42] Steve: that, I mean.

[00:54:43] Cameron: Go. They’ll be available for you when you start showing the

[00:54:45] Cameron: symptoms.

[00:54:46] Steve: Yeah, I mean. the population crisis then becomes a whole different argument where people say we’re not having enough babies. Well, you know, maybe nature’s finding a way given that people could live to 150. Who knows? I mean, do you die of old age? Is that, [00:55:00] this is separate, but do you die of old age? Or does old age give you some more things you can die

[00:55:05] Steve: of?

[00:55:07] Cameron: Well, that leads straight into my next and last story, Steve. I’m glad that you brought that up. So… My last story is about digital necromancy. This was in the conversation. Digital necromancy, why bringing people back from the dead with AI, is just an extension of our grieving practices. This is by a bunch of sociologists from the University of Liverpool.

[00:55:35] Cameron: Generative AI, which encompasses large language models like ChatGPT, but also image and video generators like DALL-E 2, supercharges what has come to be known as digital necromancy: the conjuring of the dead from the digital traces they leave behind. It goes on to say that in the 2010s there were advances in video projection, leading to the reanimation of Bruce Lee, Michael Jackson [00:56:00] and Tupac Shakur.

[00:56:01] Cameron: It also led to posthumous film appearances by Carrie Fisher and Peter Cushing, among others. Now I just read a book recently Steve, I finished it like a week ago, called Scythe. C Y t h e by Neil

[00:56:16] Cameron: Schusterman,

[00:56:17] Steve: S, S, how do you spell that?

[00:56:20] Cameron: ss c y t

[00:56:22] Cameron: h e, as in a big cutting sword. Si like they used to cut grain with s sife. Now it’s a, it’s a young adult novel and I, I bought it because I asked G P T for a recommendation for books about.

[00:56:38] Cameron: AI, like sci fi novels about AI. This is one of the ones that I recommend. It had good reviews and I read it and it was only after I started reading it that I realized it’s a young adult novel. It’s the first of a trilogy. I’m probably not going to read the rest of the trilogy, but the book is set in. The not too distant future, like 150 years from now, [00:57:00] and it’s in a world where there is a superintelligence, which they call the Thunderhead.

[00:57:07] Cameron: And it basically runs everything, runs all of human society, and humans are effectively immortal. if, if you die by some sort of an accident, it just brings you back, it can, we can fix everything, fix out your, and they have a backup of all of your thoughts and

[00:57:23] Cameron: memories and

[00:57:24] Cameron: all of that kind of

[00:57:24] Steve: Clouds sort of constantly upping the clouds in that

[00:57:27] Steve: space and then…

[00:57:28] Cameron: Everything’s captured, everything’s stored in the cloud, it’s watching everything, it sees everything, and it’s a benign superintelligence. it doesn’t allow anyone to come to harm if it can, avoid it. Like, there’s no crime, there’s no murder, there’s no mortality, all diseases are cured, everything’s perfect.

[00:57:45] Cameron: People get along with their lives, they still live lives and have children and do everything, but everyone’s effectively immortal. There is a group of people that are called the Scythes. Created by humanity, and their job is to [00:58:00] go around and kill people. Properly, kill people

[00:58:04] Cameron: permanently.

[00:58:05] Steve: Okay, questions.

[00:58:07] Cameron: when when somebody, we’ll get to it.

[00:58:09] Cameron: Let me keep explaining this. So when somebody dies by accident or deliberately, some people commit suicide just for shits and giggles. There’s like teenagers that jump off of a 30 story building and go splat just for the fun, the thrill of the falling and the instant of pain. And then they get revived and a few days later they’re back and they go, yeah, that was fun.

[00:58:31] Cameron: but they call that being deadish. If anything like that happens to you, you’re called deadish because you’re not really dead, you’re just deadish because they can bring you

[00:58:40] Cameron: back.

[00:58:41] Steve: Really, really interesting.

[00:58:43] Cameron: But to deal with

[00:58:44] Cameron: overpopulation, the scythes go around and kill people for good by permanent death and they do it and that they have complete free reign to do it however they want and [00:59:00]they, They’re outside of the Thunderhead, it has no, there’s no, oversight of them. And, they’re, they’re, they’re also effectively immortal, unless they take their own lives, which they are free to do.

[00:59:12] Cameron: But the way that they do it typically is they look at the statistics. of how people died pre the Thunderhead, in the age of mortality they call it, they’ll look at how people died from the statistics and then they’ll find somebody, like one of the first person who gets killed by a scythe in the book is a 20 year old young guy, just gets a car, got his driver’s license, and they go well that’s the typical demographic of people who used to die.

[00:59:38] Cameron: So a scythe turns up at his, actually he’s young, he’s 16, I think he’s in high school, goes, says sorry, you gotta go, and kills him. Because you probably would have died, in the, in the old world, so, so. But, here’s my point, when people are killed by scythes, their family and friends can still… communicate with them in the Thunderhead [01:00:00] because their personality, their memories, everything they ever said is recorded.

[01:00:05] Cameron: Their personalities can be reconstructed, and you can have an eternal relationship with that person as they were at that time. No one's ever really dead dead, even if they are dead, because they still exist virtually. That, I think, is a realistic sort of world that we're heading into.

[01:00:26] Steve: We sort of have it already. I can remember, in 2000, I want to say 14, reading in Wired magazine about a developer whose dad was dying of cancer. He spent months and months interviewing him and created a bespoke chatbot, which wasn't released to the public, where he could speak with his dad about particular issues.

[01:00:46] Steve: You might remember it, and that was well before LLMs. But you would assume that even with some of the things like Replika AI, which has the fake girlfriends and all that, which we spoke about in a previous episode, you could potentially, [01:01:00] instead of having a designer person, put Cam Riley in there, with all these political proclivities, you know, history, and the way that you speak, and all of those things. And especially someone like you, who has an incredibly deep, you know, biographical footprint of your words and your thoughts and published content, uh, it would be a pretty easy thing to do. You would imagine that we're not that far away from being able to have an ongoing relationship with your dad or your mum. And then you get all of these philosophical questions: okay, do I want my dad to remain in a form of stasis, where the personality doesn't change or evolve from the moment that they passed?

[01:01:43] Steve: Or do you want to be able to teach this new AI and evolve with your…

[01:01:49] Steve: what do you call it, necro, how do you pronounce it?

[01:01:53] Cameron: Necromancy.

[01:01:54] Steve: Necromancy. Would it be a new version of your father or your mother that actually knows about new [01:02:00] technology and the new world, so you can get their input and interpretation about what's happening in the world, and have a current-day, modern conversation with them?

[01:02:06] Steve: It becomes a really interesting idea, and I imagine it will become a big business at some point in the future: having ongoing and permanent relationships with people that you once knew. Or even weirder, it can get weirder, where maybe you split up with a girlfriend who you really loved, who said, it's not you, it's me, but didn't really mean it.

[01:02:33] Steve: And then you’ve got enough data points to then create an ongoing stalker relationship with the AI version of the girl

[01:02:41] Steve: who dumped You Wow.

[01:02:43] Cameron: I think, like, my dad died 22 years ago, and I've got barely any recordings or video of him, because that was the era that we lived in, right? But I think about Carl Sagan.

[01:02:57] Steve: Right. Yep.

[01:02:58] Cameron: But if you got an AI to go over [01:03:00] Cosmos, and all of the YouTube interviews with Carl Sagan, and all of his books, and interviews with his ex-wives or widow about him.

[01:03:13] Cameron: And you could probably recreate a fairly good facsimile of Carl Sagan, and people of that ilk. Orson Welles, Christopher Hitchens. These people who wrote and said a…

[01:03:31] Steve: Sure. Absolutely.

[01:03:33] Cameron: And then recreate them. And certainly in my case, look, I don't plan on dying. I plan on being immortal. But if some angry Christian should catch up with me before that happens, or just an angry American patriot.

[01:03:48] Cameron: There are thousands of hours of podcasts of me talking out there, plus the books that I've written and the blog posts and the emails and all that kind of stuff. I think an [01:04:00] AI would be able to do a pretty good job of recreating a fairly good facsimile version of me.

[01:04:07] Steve: Creating the persona that exists, and maybe, if it became really smart, understanding how your opinions and viewpoints evolved over time, based on the inputs that you had, what you read and what you did, and how that changed the personality. And then put in new inputs in your posthumous era and go, Cam probably would have evolved this way or that way. Or maybe if you had someone you loved but didn't like, you could change the gears and go, just make them a little bit more left leaning. I really did like the person, but I hated their political opinions, so you could maybe change the gears, or have five different versions of that one person, right?

[01:04:53] Cameron: well that’s what I was going to say, you could have the, the actual version of me at the time of my demise and then the [01:05:00] extrapolated version of what I would have been like 20 years later and

[01:05:02] Cameron: have them debate each

[01:05:04] Steve: Wow, that would be interesting, right?

[01:05:07] Cameron: But think about John Lennon, bringing back a version of John Lennon. A lot of interviews, all of his songs, all of McCartney's recollections about John, Yoko's recollections about him…

[01:05:21] Steve: He would have some great things to say about big tech and the state of the economy and, you know, crony capitalism. In fact, that would be one of the things that I would most love: to get the views of people like Carl Sagan or John Lennon, you know, thinkers and artists. I wonder what they would say now. Or, you know, George Carlin, some of the comedians who really brought a nice political proclivity to some of the ideas that they had.

[01:05:47] Cameron: Think about, I mean, some of my favorite artists: Bowie, Lou Reed, Leonard Cohen. These are guys that were with me my entire adult life. I got into them when I was probably 17, [01:06:00] 18, deeply, and they stuck with me for my entire adult life. All passed away in the last 10 years. And I miss them dearly. Like, the idea that I'm not going to get a new Bowie album, or a new Lou Reed album, or a new Leonard Cohen album, or a new Beastie Boys album, or whatever it is.

[01:06:16] Cameron: That's deeply saddening to me, because these people were my cultural touchstones that helped me deal with the world and appreciate art. Even a painter like Brett Whiteley. To be able to maybe create a digital version of Lou Reed: again, a lot of interviews, a lot of written works about him, all of his songs, all that kind of stuff.

[01:06:44] Cameron: Have a new song, written by Lou Reed and sung by Lou Reed. A new Bowie album, a new Prince album, a new Michael Jackson album…

[01:06:52] Steve: right? We’ve kind of got that now, right? You know, with the fake Drake and a few others that sound as they would sound [01:07:00] and have the certain styles. So that’s kind of,

[01:07:03] Steve: that’s here to an extent already. You know,

[01:07:06] Cameron: Yeah, but the faking of a voice is different from creating a digital version of Lou that thinks like Lou, that can write a brand new song as Lou would have written it. And that's, that's a…

[01:07:17] Steve: So far what you've got is, I mean, Ghostwriter is the most famous one, who's done a number of versions of songs which have the type of lyrics that the person would create, and AI creates those, the type of music style and the voice. So they're all facsimiles that are put together to create a new song.

[01:07:34] Steve: But then if you put it into an AI which actually has the persona, and then has that learning going forward like we've just described, like, what would Bowie be thinking about the world now, as it is now, and what might Bowie be singing about, or the Beastie Boys, or whoever. That is really cool.

[01:07:50] Steve: That’s sort of different to what the ais are doing now, but I would have to think that that’s coming. Like I don’t think that that’s impossible, and all you need to do is layer in. [01:08:00]The current zeitgeist or political or economic or social issues with the style and the type of thinking that that artist has.

[01:08:07] Steve: And for me, that's really exciting. Like, I'd love to have again some albums of bands that just aren't around anymore, because, you know, they've passed away or have split up, and just have an AI version of that. And, as I think we've spoken about before, I think the bio API is going to be big in the future.

[01:08:27] Steve: Where your voice or your ideas, you can license them out. And I think it would be cool to see some of this stuff licensed out to AIs creating things.

[01:08:36] Cameron: A new Beatles album, a new Zeppelin album, a new Elvis album, a new Doors album.

[01:08:42] Steve: Yeah, exactly. It would be extraordinary.

[01:08:46] Cameron: Possibly. Or it would suck.

[01:08:48] Cameron: It could go either

[01:08:49] Steve: But you know what? Let’s explore it and find out.

[01:08:53] Cameron: Talk to me about the Jetsons, Steve. This is the technology time warp.

[01:08:56] Steve: Yeah, so for me, uh, [01:09:00] I just thought, when we heard about the AI drink that Coca-Cola had launched, it reminded me how consumer-centric the future was, in the past. Whenever we saw anything from the past about the future, it was about products from the industrial world: it was white goods, it was production lines. Even George Jetson, when he woke up in the morning, he slipped onto a production line which got him prepared for work. It helped him get dressed, it showered him, then he ate breakfast, and then he brushed his teeth.

[01:09:35] Steve: And all of the products that they had, with Rosie the Robot, were all like TV-industrial-complex products. If we even think about Back to the Future: that was set in 2015. It was made in, I think, 89 or 90, but it was set in 2015. And that was also very, I'm going to say, TV and food based. It was the automated Nike sneakers, which wrapped themselves up and [01:10:00] wrapped around your foot and had the, you know, Nike Air pump-up.

[01:10:03] Steve: It was food growing in the kitchen that you just picked, from, uh, from above. And it was the instant food that went into that futuristic oven, which just sort of exploded out the food, irradiated or hydrated the food. And I think that Coke move is really interesting, because everything that we've got in the future at the moment is very digital and distribution and screen based and virtualized.

[01:10:30] Steve: It feels like, in this post-factory era, the physical world, other than automated cars, has been totally ignored, and consumer goods haven't really changed almost at all in the post-war era. I mean, shampoo is shampoo; everything is the same, except for the barcode. The only futuristic thing you ever got, and barcodes were the original IoT device, Internet of Things, was a barcode, right? But they haven't really…

[01:10:57] Cameron: You mean outside of iPhones and [01:11:00] AirPods and Apple Watches and tech devices?

[01:11:05] Steve: Outside of screen world, right?

[01:11:08] Cameron: Yeah, right?

[01:11:09] Steve: Everything that is consumer goods and white goods, they haven't really changed that much. But it was just so interesting in the time warp, because when we were in the production complex, everything was about the factory and efficiency and consumer goods and consumption in that way.

[01:11:25] Steve: All of the things from the future, whether it was in the Jetsons or whether it was in Back to the Future, which was set in 2015, were all consumer goods. And I just thought it was so interesting that Coke had jumped on the AI bandwagon and sort of moved towards that. I just wonder if we're going to see a little bit more of that.

[01:11:44] Steve: I just thought that was an interesting time warp. When you look at anything from the past which represents the future, it's just an extension of what you have today. And so often we end up on a different fork or trajectory, which is kind of where we ended up in the tech [01:12:00] world. Other than the TV and the, I'm going to use an old-fashioned word here, ICT.

[01:12:04] Steve: Yeah, information and communications technology, which is kind of where it all went.

[01:12:08] Cameron: Yeah, that’s a really interesting point you make, and I hadn’t thought about that much before. Like, the, the level of innovation outside of ICT in the last 70 years has not really… Kept up with the

[01:12:25] Cameron: Jetsons.

[01:12:26] Steve: Just zero. It's nothing. There's no innovation at all. And I've worked in that industry.

[01:12:32] Cameron: Yeah. I mean, the focus for the last 30 years has not been innovation in that sense. It's been: how do we make it cheaper, more disposable? How do we get them to make it in China rather than in the West?

[01:12:45] Steve: It's been geopolitically and profit focused, not consumer focused. It actually comes back to the old idea of, you know, economic rationalism and Friedman: maximize shareholder wealth at all costs.

[01:12:59] Cameron: [01:13:00] Which brings me to, it's not in the notes, but the book I started reading this week: Walter Isaacson's new biography of Elon Musk.

[01:13:08] Steve: which I’m not going to read,

[01:13:09] Steve: but anyway,

[01:13:10] Cameron: Man, you need to read it. You need to read it.

[01:13:15] Cameron: Oh, yeah.

[01:13:16] Steve: I've read a number of really bad reviews saying that he didn't ask any hard questions, and it was all just like fanboy documentation, is what I've heard. Now, I haven't read it.

[01:13:28] Cameron: Look, I think Isaacson is a pretty good biographer. His biographies of Steve Jobs, of Leonardo da Vinci, of Einstein, I've read, and I've enjoyed them all. He spent two years, on and off, with Musk, following him around, interviewing everyone in his life. He doesn't paint a rosy picture of Musk at all.

[01:13:49] Cameron: Musk comes across, I mean, his ex-wives and girlfriends refer to him as a man-baby or a man-child. I mean, he had a pretty fucked up [01:14:00] childhood; his father is a psychopath.

[01:14:04] Cameron: Sorry, what? You cut out.

[01:14:05] Steve: don’t we all?

[01:14:07] Cameron: Well, yes, to varying degrees, but his is pretty bad. It also points out, well, there's a lot of talk about whether he's really a genius. Yeah, he fucking is. He's a genius.

[01:14:18] Steve: I buy the genius thing, but I also don't like his behavior in certain ways, which we've spoken about before. I think those two go together. But I buy the genius, because even if you didn't invent the technology, the ability to bring it to market and finance it and market it and make it a reality, that's genius as well.

[01:14:36] Cameron: But when he was 14, he was writing computer games and selling them to computer magazines. When he was 14! Like, fuck, anyone who can do that, that's pretty good. And then it talks about how he started X.com, which then merged with PayPal. He got kicked out of his business before that.

[01:14:57] Cameron: He got kicked out of PayPal. [01:15:00] And then when PayPal was sold, and he had his 250 million that he took out of it… Like, according to his childhood friends that Isaacson interviews, since he was a teenager his obsessions have been putting humans on Mars, building electric cars and sustainable energy, and reforming the banking system, the global financial system.

[01:15:27] Cameron: He's been a big, big sort of…

[01:15:29] Steve: Small, small side hacks.

[01:15:32] Cameron: Yeah, yeah. So they've been the things that have been driving him. And so when they sold PayPal and he had his 250 mil, he decided he wanted to put people on Mars, and he just started devoting himself to reading everything about rocket engineering and space that he could get hold of.

[01:15:50] Cameron: He ended up deciding that he was going to buy rockets, old Russian rockets. So he hired, or he got, a couple of rocket engineers, one of whom ended up being [01:16:00] the administrator of NASA later on. And they went to Russia a couple of times to try and buy old Soviet rockets. And the Russians just basically toyed with them and fucked them around and kept putting up the price, and one of them actually spat on Musk in a meeting, because he was a capitalist pig.

[01:16:17] Cameron: And there's a story where they're on a jet flying back from Moscow. The two rocket engineers are sitting behind Musk, and they can see Musk on a laptop with a spreadsheet. One of them turns to the other and says, I wonder what the idiot savant is doing now. And then Musk turns around and goes, I figured out that we can build our own.

[01:16:38] Cameron: It's not that hard. It's just metal and fuel, that's all you need to build a rocket. Like, it's literally not rocket science. And they're like, you're crazy.

[01:16:47] Steve: Not rocket science. I love that. That's the quote of the day.

[01:16:51] Cameron: They're like, you're crazy. His friends actually staged an intervention to try and talk him out of it, that it was impossible for a private [01:17:00] individual, a private company, to build rockets. And the way he comes across in the book, the way Isaacson portrays him, is he has an appetite for risk, like a lot of psychopaths do.

[01:17:09] Cameron: And the more people try and tell him it can't be done, and it's crazy, the more determined he gets to do it. But, like, he doesn't come across as a well-balanced individual at all. He comes across as a mad genius, driven by demons, driven by the way his father treated him, and losing his first child, and all sorts of stuff that have driven his behavior.

[01:17:33] Cameron: But I think it's a must-read, man. If nothing else, he is futuristic. He is trying to build the future, and he's devoting his life to building the future. So crazy, not crazy, psychopath, whatever. He's worth a read. I highly recommend it.

[01:17:53] Steve: All right. Well, when you make a recommendation, I listen, you know, I always write down those book titles. [01:18:00] I haven’t read them all yet.

[01:18:01] Cameron: I know you write them down. I don't see you reading them, but I see you writing down the book titles.

[01:18:05] Steve: The camera's off, isn't it, Cameron? Let's, I mean, let's just be clear about that. Unless… are you watching me? Now I've got all these other questions, and I'm starting to wonder what you're really up to.

[01:18:14] Cameron: I don't watch you, I watch the people who watch you, Steve. I watch the watchers. Who watches the watchers? That's what Watchmen was all about. Getting back to the Jetsons, yeah, that's a really interesting point, that we don't have that Jetsony type future, for a whole variety of reasons. Like, we've made huge progress in a lot of areas.

[01:18:34] Cameron: Medical technology, a lot of those things that aren't consumer tech, we've made huge strides in. Quantum.

[01:18:43] Steve: Yeah.

[01:18:44] Cameron: Quantum developments have led to all sorts of things. Our understanding of quantum physics exists in all sorts of technology, but it's not in self-tying Nike shoes or hoverboards yet. So let's hope the superconducting LK99 finally is revealed and we all have levitating hoverboards soon.

[01:19:06] Cameron: Okay, let's wrap it up, Steve. Let's get to the final section of the show, which is your Futurist Forecast.

[01:19:15] Steve: So this one's pretty simple. I think that we're entering a phase where GPTs all become verbal within 12 months, where we just speak to them. I think you'll still be able to write; they'll all have backwards compatibility. But I think we're getting to the stage now where it's "Computer…", Star Trek style, with, you know, Picard, where we actually ask and can get complex answers and information.

[01:19:42] Steve: And I think that visual is next. I mean, the vision that I have is, I'll use the word app, but maybe you hold up your phone and you provide some context to the LLM, which looks at the visuals and the pictures, and you ask it to describe what it sees, or give you real-world feedback, or use this [01:20:00] visual and verbal interaction as a starting point for something that needs to happen or be done.

[01:20:05] Steve: Or something that you work with, and, you know, it sees and hears and interacts much like a person would. You know, just this idea that what you're talking about can be interpreted and done with an AI. I think the shift towards what Amazon are trying to do, adding a large language model to Alexa, is kind of the shape of things to come.

[01:20:29] Steve: And I think that it's going to jump off the screen, where you go to a GPT and it gives you information and it starts to interact with the visual world. And it might be that autonomous cars do a large part of this. And this is one of those areas, speaking of your boy Musk, where if you have millions of vehicles taking information in real time from the real world, you know how that could inform other models, whether they're visual models or real-world models and language models as well.

[01:20:57] Steve: I think that's where we're going to go. And I think that's the [01:21:00] seed of what's next: GPTs become verbal partners that you can speak to and get complex answers back. Add the visual layer to that, which we're seeing with generative AI image recognition, and we could be getting to a space where all of us have those personal AIs.

[01:21:18] Cameron: I think you're right, and you made me think of something that wasn't in my notes, but I'm going to share this with you. So a couple of days ago, bedtime. Fox always tries to delay going to bed as long as possible. And he said to me, hey, can I have your iPad to use GPT? And I said, what for? He goes, I just want to ask it something.

[01:21:36] Cameron: And I said, what? He goes, I don’t want to tell you. I just want to ask it. I said, okay. So I was in his room and he had my iPad and he’s typing away there for 10 minutes. Then he gives it to me at the end and I’m going to read you the conversation that he had with GPT. He says, hi. GPT says, hello Cameron, how can I assist you today?

[01:21:56] Cameron: Fox says, just to say, I am his son, Fox. Hi Fox, nice to meet you. [01:22:00] How can I assist you today? I have a question. It's not really a good question, but here it is. If you could meet anyone that is alive or dead, who would you meet? As a machine, I don't have personal preferences or desires, so I don't have anyone I'd like to meet, but it's an interesting question for people.

[01:22:15] Cameron: If you could meet anyone alive or dead, who would you choose? Fox says it would be a tie between Leonardo da Vinci and my grandpa George, my dad. GPT says those are both fascinating choices. Leonardo da Vinci was a true polymath with contributions to art, science and engineering. And meeting a grandparent can offer a deeply personal connection to your own history.

[01:22:37] Cameron: Why did you pick those two? Because my grandpa died before I was born and da Vinci because he is one of my favorite artists and I think he’d be really cool to meet. GPT says, those are heartfelt reasons. Missing the chance to meet a grandparent leaves a lot of unanswered questions and curiosities. And Leonardo da Vinci is indeed a captivating figure.

[01:22:58] Cameron: His influence spans multiple [01:23:00] disciplines. I can see why both would be at the top of your list. Fox says, I have a few more questions. Of course, Fox, feel free to ask your questions. What would you like to know? Of all the people you have answered questions for, what is most people’s favorite book author? I can’t recall specific interactions or aggregate data from previous conversations due to privacy reasons.

[01:23:21] Cameron: However, popular authors often mentioned include J. K. Rowling, George Orwell, and Jane Austen, among others. These authors have made a significant impact in their respective genres. Do you have a favorite book or author? Fox says, yes, it is Roald Dahl. And then they go on and talk about Roald Dahl; it asks him what his favorite Roald Dahl book is.

[01:23:41] Cameron: Fox says, that's a hard question, it's probably Danny the Champion of the World. GPT says it's a great pick. Fox says, okay, I need to go to sleep now. It says, no worries, Fox. Have a good night's sleep. Feel free to return whenever you have more questions. Good night.

[01:23:54] Steve: Cameron, can I just admit something really quickly? That's the kind of interaction I should be having with my son, [01:24:00] but instead all he hears me say is, you bloody go to sleep, you gotta get up early in the morning, you've been annoying me all day. Oh my god, the AIs are already outperforming me!

[01:24:07] Cameron: As if Bluey's dad wasn't bad enough, showing us all up on how to be a good parent, now it's GPT. But the thing that struck me about this is, so Fox is nine, and he was just having a conversation with GPT. There were really no big questions. Oh, who would you like to meet? Who's everyone's favorite book author? But it was…

[01:24:29] Steve: Because no one made him do it, it just was. Is that kind of it?

[01:24:32] Cameron: It was just a chat. Yeah. It was a chat with a friend, an AI friend. He's just having a chat about how he would like to meet Leonardo da Vinci and his grandfather. I thought it was a really sort of profound moment. And it reminds me, and I'm sure I've told you this story,

[01:24:51] Cameron: I've mentioned it a million times in my different shows. I had this moment, I don't know, 17, 18 years ago, with my son Taylor, who's now, [01:25:00] will be 23 in a couple of weeks. Back when he was about six or seven, I got up one morning, came downstairs, and Taylor was in his bedroom on his old IBM 400, and he was playing a game.

[01:25:15] Cameron: I can't remember what it was called, but it was like an interactive world-conquering game. And I said, what are you doing? He said, I'm playing this game. It was online, and he's with another guy. He said, this guy's helping me find these objects that he needed to find. I sat down and looked at what he was doing, and the guy was talking to him in Cyrillic.

[01:25:35] Cameron: And I said, well, how's he helping you? You don't even speak the same language. And he goes, oh, he's just leading me to places and sort of showing me where to find a magic sword or a shield or whatever. And I said, why is he doing that? And how did you get him to do that? He said, oh, I don't know. I just said hi.

[01:25:52] Cameron: And he just started helping me find things. And that was it. At the time, it struck me as really [01:26:00] profound that this is a millennial, and his way of using technology was to collaborate on solving a task, on going on a mission, with a person on the other side of the world who didn't even speak the same language.

[01:26:15] Cameron: I don't know what that other person's motivation was for helping him, or how old he was, or she, or whoever they were.

[01:26:20] Steve: In jail now, Cameron.

[01:26:21] Cameron: They collaborated. People from across the world collaborated on solving a project, solving a mission. And I thought that was such an internet-generation thing to do. Now, of course, Taylor today runs a global business where he manages TikTok stars. His girlfriend lives in LA, he has business partners in Canada and the United States, he travels the world doing deals with people. He lives like that. That kid became the 23-year-old who lives that experience now, does it as a [01:27:00] business, as an entrepreneur, because it was a natural experience for him to collaborate with people around the world to get shit done.

[01:27:11] Cameron: Now, I apply that sort of analogy to Fox and his relationship with an AI. For people of our generation, AI is this new thing: what are we going to do, it's the end of the world, how do we use it, and blah, blah, blah. For Fox's generation, it's like, hey, it's my friend that I can have a chat with, and it can help me do things, you know?

[01:27:31] Steve: Really telling. I mean, let's make it three things. The first one is, experiences in your formative years can shape you so much that you don't realize; much later they come back and you go, oh wow, in hindsight, that thing there really led to this. It reminds me of one of the great definitions of technology you would have heard: technology is something that came along after you were born.

[01:27:59] Cameron: I [01:28:00] hadn't heard that, but it's…

[01:28:00] Steve: It's a good one, because it sort of reminds you what technology is. Like, a shoe or a knife is a technology, but the ones that come after you're born are the ones that we struggle with and that we focus on as adults. So it reminds me of that. And then, you know, the third one: it's like you become the environment you swim in.

[01:28:16] Steve: And even if you look at Australia, 90 percent of our population lives within, you know, 10 kilometers of the coastline or something, and everyone knows how to swim here, right? So you become the environment you swim in. And if you swim in AI and technology, you just become that. And it's just normal.

[01:28:30] Steve: And I think it's just a really interesting, you know, way to close out the podcast. When we look at these things philosophically, in that structure, there's a lot of answers there.

[01:28:40] Cameron: And these kids, this next generation. Like, Fox, by the time he's Taylor's age, won't be able to remember a time when he didn't have an AI best friend slash personal assistant.

[01:28:57] Steve: It’s just like,

[01:28:58] Cameron: That will be his normalcy. [01:29:00]

[01:29:00] Steve: Be his normal. Just like we don't remember a time when, if you were by yourself, you couldn't just sit in front of that box and watch someone put a little performance in front of you that you could just sit and absorb. It's just, um, what's around you.

[01:29:12] Cameron: I mean, I struggle to remember life before the internet and mobile phones. I was in my twenties when that happened, but that was so long ago I barely remember it. My adult boys can't remember life before the internet and mobile phones; they've always had it. And these kids, I don't even know what you call Fox's generation.

[01:29:33] Cameron: Born in 2014. I don’t know which generation they are.

[01:29:38] Steve: I don't know what he is either. Is it Gen Z? I think Millennials are from 95 until 2009, and then it's Gen Z after that, I think.

[01:29:49] Cameron: Right. Born 2014… Generation…

[01:29:55] Steve: Alpha, there you go. Z maybe? Ah, I don't know. It's Millennials, Z, and then [01:30:00] Alpha.

[01:30:00] Cameron: We've gone all the way right back to the beginning, Gen A. Yeah. So, it's a fascinating world, Steve. And that's Futuristic for this week. Thanks for the chat, buddy. That was fun as.

[01:30:12] Steve: Really was a good chat. Thank you.