Using GPT to help track calories, the increasing complexity of AI tools, the launch of ChatGPT Enterprise, Ray Kurzweil’s latest interview, startups’ desperate hunt for GPUs, Tesla is powering up its $300 million AI supercomputer, ex-Google CEO Eric Schmidt to launch AI-science moonshot, UK startup unveils AI-designed cancer immunotherapy, most Americans haven’t used ChatGPT, China invests $546 billion in clean energy, what was behind the Web 2.0 boom, and what reason do we have to think AI might be benign towards humans?
FULL TRANSCRIPT
FUT11
[00:00:00] Cameron: Welcome to episode 11 of The Futuristic podcast. This week on the show, we’ll be talking about using GPT to help track calories, the increasing complexity of AI tools, the launch of ChatGPT Enterprise, Ray Kurzweil’s latest interview, startups desperate in the hunt for GPUs, Tesla powering up its $300 million AI supercomputer, ex-Google CEO Eric Schmidt launching his AI science moonshot, a UK startup that’s unveiled an AI-designed cancer immunotherapy, uh, the latest Pew survey that indicates most Americans haven’t even used ChatGPT, China investing $546 billion in clean energy, way more than the US and the EU put together, what was behind the Web 2.0 boom and how that may not translate into the AI boom, and then we talk a little bit about what reasons we have to [00:01:00] think that AI might be benign towards humans when it becomes conscious, intelligent, sentient, whatever you want to call it. All that coming up on this week’s episode of The Futuristic.
[00:01:12] Cameron: This is The Futuristic, Episode 11. We’re recording this on Friday, the 1st of September, 2023. My name is Cameron Reilly. With me is my buddy down in Melbourne, Steve Sammartino, Australia’s self-declared leading futurist, surfer, and just all-around good guy.
[00:01:35] Cameron: Looking sharp, you got your Steve
[00:01:37] Cameron: Jobs turtleneck on
[00:01:40] Steve: Uniqlo cheap undergarment, which I wear as an outside garment. So
[00:01:46] Steve: that’s what that is.
[00:01:47] Cameron: Looking good,
[00:01:48] Steve: And I only wear black and white things now just
[00:01:49] Steve: because it reduces complexity.
[00:01:52] Cameron: I only wear
[00:01:53] Steve: There you
[00:01:54] Cameron: White on white on white on
[00:01:55] Cameron: white.
[00:01:55] Steve: such a, it’s such an interesting thing that [00:02:00] uniforms I think are really important because it just removes, and I know this is well documented, it just removes some of the thinking that needs to go in it.
[00:02:08] Steve: And I have this idea that you can tell how much. freedom someone has by the choice of what they get to wear in all stratas of society, whether it’s a prime minister, someone working at McDonald’s, someone in the military, or an advertising creative. It’s the, it’s the easiest way to see how much control someone has on their own
[00:02:24] Steve: life is the choice of what they wear.
[00:02:29] Cameron: I like that, nice. Yeah. Alright, so futuristic Steve, let’s get into it.
[00:02:35] Steve: Great.
[00:02:36] Cameron: Um, I’m going to, I’m going to tell you one of my hacks that are maybe obvious, may not be.
[00:02:42] Cameron: You’re a skinny guy. Um, uh, you probably don’t worry about these things, but,
[00:02:46] Cameron: uh, you know,
[00:02:46] Steve: do. So.
[00:02:46] Cameron: I struggle.
[00:02:48] Cameron: I have to struggle with my weight. When I say struggle with it, I have to monitor it.
[00:02:51] Cameron: I have to be,
[00:02:52] Cameron: I’m very active these days, as you know, I do. Kung Fu five times a week, but I still have to watch what I eat. And so I always use a calorie [00:03:00] tracker and, you know, calorie trackers on your phone can be a little bit complicated if you’re, if you’re eating a dish, you know, you have to look up this, this ingredient, that ingredient.
[00:03:10] Cameron: I think the complexity of tracking your calories is one of the things that in the past is always, I’ve hit a brick wall with it eventually, I just get sick of having to look stuff up. One of the hacks that I’ve been using for the last couple of months is GPT for calories. I’ll say, Hey, GPT. Um, roughly how many calories for a bowl of pasta, aglio e olio, with let’s say a tablespoon of feta and some pine nuts, a tablespoon of pine nuts, uh, cooked in a bit of olive oil, obviously, and it’ll give me a rough, you know, based on that, you know, rough guideline, 250 calories, let’s say, boom, I put it into my bowl Calorie Tracker.
[00:03:48] Cameron: It has just made the whole process of figuring out, like, you don’t, you don’t need to be, you know, anal about calories. You just need to, you need to know roughly around your 1500, 2000 calories a day, whatever it is [00:04:00] that you’re trying to do if you want to lose a bit of weight.
[00:04:02] Cameron: GPT is just like, I can’t wait till it’s integrated into the app.
[00:04:06] Cameron: So the app will figure it out for me. Uh, but yeah, it’s just made tracking calories. 10 times easier for me in the last month. So that’s a hack for people out there in case they haven’t thought about using GPT for that. You got, you got any good hacks for this week, Steve? Hmm.
[00:04:22] Steve: I’ll tell you my hack with. Eating, because I do watch what I eat, and that’s why I
[00:04:27] Steve: am a skinny guy. People get it back to Fro. They say, you’re a skinny guy, you don’t have to worry. I watch it
[00:04:31] Steve: like a hawk.
[00:04:33] Cameron: Right. What are your, What are your,
[00:04:35] Steve: my hack is, um, and I didn’t have to, I’ll be honest and say, when, until I was about 30, I didn’t have to worry that much.
[00:04:41] Steve: I was one of those people that was, I could eat whatever I want and I would never put on, you know, a gram of weight. But after the age of 30, 35, it started to, I had to watch it, because I put on about 10 kilos at one point. Now I just don’t eat until about two
[00:04:52] Steve: o’clock, because what happens is your stomach shrinks.
[00:04:55] Cameron: intermittent
[00:04:56] Steve: Yeah, but it’s, it’s,
[00:04:57] Cameron: Mm hmm.
[00:04:58] Steve: people say it’s intermittent fasting, but it’s [00:05:00] really simple. It’s like your metabolism starts if you eat early and it gets hungry again quick, because it has an expectation that it can just use the energy. And if you don’t, it conserves it and you don’t get hungry as quick. It’s real easy.
[00:05:12] Steve: And I just have high protein, high fat and low carbs because I find carbs. I get hungry again quicker. Mine are just all heuristics. None of them are, and I, and it’s different for different body types. I’m an ectomorph.
[00:05:22] Steve: So ectomorphs?
[00:05:23] Steve: if you have, Carbs you’ll
[00:05:25] Steve: put on. But if I have protein, high protein and high fat,
[00:05:29] Cameron: they the green slimy ghosts in Ghostbusters? Weren’t they ectomorphs?
[00:05:33] Steve: might’ve been, but there’s three body types. There’s ectomorph, mesomorph.
[00:05:35] Cameron: You’re telling me you’re a
[00:05:36] Steve: Now there’s ectomorph, mesomorph, and endomorph. Endomorph is pear shaped. Um, mesomorph is like the Schwarzenegger kind of like upside down triangle. And ectomorph is like a skinny kind of guy with a poke chin. And I fit into that.
[00:05:48] Steve: And I just find if I eat.
[00:05:50] Steve: heavy protein and heavy fat. I don’t get as hungry and I don’t eat as
[00:05:54] Steve: much. That’s it.
[00:05:55] Cameron: Right. Good, good
[00:05:57] Steve: All right.
[00:05:58] Cameron: I like that. I’m going to go
[00:05:59] Cameron: back to [00:06:00]
[00:06:00] Steve: So my one thing for the week is complexity. I think we’re drowning in complexity in the moment, especially from AI. There’s so many tools happening and my feed is just inundated with another hundred tools every day that are all great and that are all useful.
[00:06:14] Steve: keynotes. That is really, really difficult to have a narrative, which isn’t scary or isn’t utopia, to actually have a story arc. And we were talking about your ability before to tell and remember stories,
[00:06:31] Steve: uh, because we’re architected that way to. understand stories and they become easy to remember. But I’ve found it increasingly difficult to get a story arc in the AI keynotes.
[00:06:41] Steve: And I think it’s because there’s just so much complexity and cause there’s so many tools for so many different industries. It’s like, where do you go with this? It’s like, there’s just like a thousand rabbits and you’re just trying to chase all of them and you catch none. And I just think that AI is creating all this complexity right now, other than chat GPT, which is one kind of like [00:07:00]fulcrum.
[00:07:01] Steve: piece of AI that everyone’s gravitating towards. And that’s why it’s got so much of the brand awareness. But I think that we need AI to solve the complexity that AI is creating. Like, do you know what I mean? It’s the old Kevin Kelly says, if technology is a problem, what you need is more technology. You know, it’s the 5149 doctrine.
[00:07:18] Steve: He says that technology makes the world 51% better, which is really interesting. And it just reminded me of the idea of in business, we have periods of bundling and unbundling. All right, the unbundling is the revolution. There’s all little pieces of the puzzle. We don’t know what’s going to happen. And then some people work it out and bundle it in a way to reduce complexity.
[00:07:36] Steve: The iPhone did that. The internet, when it gravitated to five giant websites the other four, did that. And I feel like we need a period of time over the next, it’s going to be a really tricky time for the next 12, 18 months,
[00:07:48] Steve: maybe more. And then the AIs will have to be bundled up because it’s just too complex
[00:07:52] Steve: for
[00:07:52] Steve: anyone to cope with.
[00:07:55] Cameron: Well, this gets back to the vision I articulated a couple of episodes ago, I think, [00:08:00] where I want to see us get to a point where I have my primary AI, that’s my personal assistant, and it goes out and finds every tool that I need. If it’s not the right tool for the job, it’s like the old Apple thing, there’s an app for that, right?
[00:08:16] Cameron: There will be an app for everything. My personal assistant AI doesn’t need to be the be all and end all, but it should be able to… Go find the right tool for the job. So I say, here’s my problem. It goes out, articulates my problem to the right tool for the job, gets it to do the, you know, solve it, and then reports back to me and says, Hey, if you want, if you want to drill down more on this.
[00:08:39] Cameron: Go talk to, um, Accounting Bot, or, you know, Science Bot, but, you know, high level, here it is. Because it should be, it should just be the, the linguistic user interface for everything, is my personal, and there’ll be a range of personal AIs, like, Microsoft will have one, Apple will have one, Google will have one, Facebook will have one, uh, [00:09:00] Tesla will have one, they’ll, they’ll all be competing.
[00:09:03] Cameron: Because that’s like the browser wars all over again, right? They’re all going to compete for that primary interface, because they’ll all be clipping the ticket on everything that happens in the back end. But you’ll have, my guess is within a couple of years, you’ll have You know, a handful of companies that are competing to be your primary personal assistant, and everything else will just be invisible, largely, in the
[00:09:28] Steve: Yeah. And underneath. Yeah, we need it. It’s really obvious because everyone’s really confused out there and it’s really hard. And even when you understand the tools, it’s like, it’s just so much of it. It’s just that it’s very hard to cope and bundling up things to reduce complexity really is one of the core things that creates corporate value from a strategic viewpoint.
[00:09:49] Steve: As soon as a company can find a way to reduce complexity by putting everything into
[00:09:53] Steve: one situation. That’s, that’s where the game is won
[00:09:58] Steve: and lost is the bundling.[00:10:00]
[00:10:00] Cameron: Mate, I’m sure you remember as I do,
[00:10:03] Cameron: the books that were coming out around 2004, No, sorry, 1994. Yeah, I would buy books like the Dummies Guide to the Internet and it would be 300 pages of websites. There’s a website. That’s what it does
[00:10:22] Steve: Yeah. I remember that. I remember I bought a magazine, um, um, top 100 websites this month and I’d buy it and go on them, which was weird. And it was kind of wonderful and wacky because it’s interesting. We did have that and it was important, but you know, it was. different about that then is that that was a choice and an exploration.
[00:10:40] Steve: The problem with AI now is that we all use it all day, every day. And it’s ensconced in our operations in our life. Whereas that was like, Oh, let’s go and explore this interesting forest with all these different species. And it was choice laden and that was good. But now we’ve got this challenge where AI and computation and tools are an inextricable
[00:10:59] Steve: part of [00:11:00] our work and life.
[00:11:01] Steve: We’ve got a problem because it’s not just
[00:11:03] Steve: for the fun of it.
[00:11:06] Cameron: Well, actually, according to one of the news stories I had to talk about this week, Steve.
[00:11:10] Cameron: Most Americans haven’t used ChatGPT and few think it will have a major impact on their job. According to Pew Research, the debut of ChatGPT has led some tech experts to declare a part of a robot revolution, but most Americans haven’t used ChatGPT and only a small share.
[00:11:31] Cameron: think chatbots will have a major impact on their jobs. Even fewer Americans say chatbots would be helpful for their own work, according to a new Pew Research Center survey conducted July 17 to 23.
[00:11:47] Steve: Wow. Two things. Well, Pew do great research. I think most of their research over the years has been unbiased and quite good in my viewpoint. Um, so it’s come from a reputable source, but that [00:12:00] really, I’m, I’m kind of flummoxed and it might be the circles I move in. Because everywhere I go, I say, who’s used ChatGPT?
[00:12:05] Steve: And there’ll be 20 or 30% who haven’t. Less. And, and, and this is often at conferences. I did one at the, uh, the RSL, RSL, um, industry group, and that only had about 20%, and that was, you know, someone who you wouldn’t put in the tech savvy audience,
[00:12:25] Steve: not to be dismissive at all, but just a different type of business, retail business.
[00:12:29] Steve: I’m really
[00:12:29] Steve: surprised at that.
[00:12:31] Cameron: according to their research, among US adults who had heard of CHAT
[00:12:37] Cameron: GPT, only 24% had used it. So this isn’t even of the total population. This is the percentage of the population that have heard of it. Only 24% had used it. Most of those were aged 18 to 29. As you go up the age brackets, it gets less and less. Most of them were post grads or college grads, 64%. [00:13:00] were college grads or post grads. So it’s young. educated people in the U. S. that are using it. And, uh, everybody else, like, the other three quarters of the population who have heard of it, are like, yeah, nah, I don’t see the point. So,
[00:13:18] Steve: I mean, this is,
[00:13:20] Cameron: that shocked the hell out of me, man, I gotta
[00:13:22] Steve: shocking for a few reasons. We were talking, I think, before recording, gravity doesn’t care. Whether you’ve, you know, whether you’ve heard it or not, you jump off a cliff, you’re going down. And ChatGPT and other AIs don’t care if you think it’ll impact your job or not.
[00:13:35] Steve: They don’t care. The technology doesn’t care. It’s agnostic. It just does what it does. That’s it. I’m surprised by this because all of those people you would imagine have a smartphone or a very, very vast majority, and so if this was 1994 and people saying, I don’t think the Internet’s affected me, there was a process to go and get online.
[00:13:55] Steve: I mean, one of the biggest businesses in tech then was a o l where they sent [00:14:00] out a cd, they showed you how to get online. It was this whole process of you getting you there, which. would make it understandable that it took a while for everyone to get online, but these people are already online and it’s a matter of typing in, you know, sixletters.
[00:14:12] Steve: com to, to, to get onto it, which then surprises me. Like, if you’ve heard of it, wouldn’t you go, there’s quite a bit of fanfare. I might just have a look. That’s the thing that surprises me
[00:14:24] Steve: is the ability for people to self select. and yet they choose not
[00:14:28] Steve: to.
[00:14:31] Cameron: hmm. My first job in tech was working for Aussie Mail, back
[00:14:35] Cameron: in the mid 90s.
[00:14:38] Cameron: And we were sending out floppy disks. I remember I lived in Camberwell,
[00:14:43] Cameron: my, uh, my local video store, Camberwell. Uh, Glenfury, Glenfury Road? No, what’s the main road in Camberwell? Um, uh,
[00:14:51] Steve: Glenfurtin, it’s not
[00:14:53] Cameron: no, the other way, yeah, no, it runs parallel to Glenfury.
[00:14:56] Cameron: Anyway, too many, it’s
[00:14:57] Cameron: like 30 years
[00:14:58] Steve: Melbourne people tuning in, all [00:15:00] five of you.
[00:15:02] Cameron: I, I remember going to the, there’s a young guy,
[00:15:06] Cameron: in, his like mid to late 20s who owned the video store, very cool dude, and I remember taking him a huge bag of floppy disks, Aussie Mail starter kits, and I said to him, this is like 95, I said, listen, hand these out to everyone who comes in, tell them it costs 5 bucks to sign up, get an internet account, and then If you wouldn’t I will build you an email database, you just get everyone to give you their email address, sign up, get an email address, give you the email address, and then you can send them an email each week when the new arrivals come in.
[00:15:45] Cameron: Because I was always going in and going, what’s in that’s new that’s good. And I was like, well, if you, and then he’d go, Oh yeah, this movie, like Pulp Fiction just came in, but it’s already gone out. People have already, it’s like rented out. I was like, well, if you email me as [00:16:00] soon as it comes in and I’ll say, put a hold on it, I’ll be down there in an hour to pick it up.
[00:16:04] Cameron: Right. And then I built him a website. So he could do it like his hands on, man, those were
[00:16:10] Steve: crazy. But that’s interesting that a lot of the seeds of what we need, you can see, and it’s, it’s so often, uh, just hacking it together, going, if you did this and you could get that, that you could create value here. And that was one of the really exciting things
[00:16:24] Steve: about, you know, that technology. And I’m going to talk about it when we go
[00:16:28] Steve: into the deep dive, something similar.
[00:16:31] Cameron: well, let me race through the first news story because I want to get to the second one because it talks
[00:16:34] Cameron: to something you were just going on about. But, um, big news, I think this week is that OpenAI released ChatGPT Enterprise. Basically, ChatGPT for the business. You get higher levels of access, higher levels of support, unlimited access to what used to be known as Code Interpreter.
[00:16:54] Cameron: They’re now calling it Advanced Data Analysis, because Code Interpreter was a little [00:17:00] bit… Confusing, I
[00:17:01] Cameron: think,
[00:17:02] Steve: Well, it makes me feel like I’m
[00:17:03] Steve: just getting someone to have a look at my code and see if it’s any good.
[00:17:07] Cameron: Yes. Yeah, that’s right. So they say we’re launching ChatGP Enterprise, which offers enterprise grade security and privacy, unlimited high speed GPT 4 access, longer context windows for processing longer inputs, advanced data analysis capabilities, customization options, and much more. We believe AI can assist and elevate every aspect of our working lives and make teams more creative and productive.
[00:17:35] Cameron: Today marks another step towards an AI assistant for work that helps with any task, is customized for your organization, and protects your company data.
[00:17:46] Steve: I mean, this for me sort of points out how far I think that, uh,
[00:17:53] Steve: Microsoft is with their investment in that. Did you know Microsoft share price has nearly doubled?
[00:17:59] Steve: [00:18:00] This year?
[00:18:00] Cameron: Since really, because of Chachapiti.
[00:18:04] Steve: it’s a large part of it because people are
[00:18:06] Cameron: Yeah.
[00:18:07] Steve: understanding what they’re doing with bundling. I, at the conference I was at this week in Bali, they had a demo of the copilot and
[00:18:18] Cameron: Yeah,
[00:18:18] Steve: Look, I’m just looking at the share price now. Um, let’s, let’s bring it up on the, on the, let’s go
[00:18:22] Steve: to the tapes
[00:18:23] Steve: for that.
[00:18:23] Steve: Gee, it’s, it’s, it’s really going up.
[00:18:24] Cameron: Well, it’s back to where it was in
[00:18:26] Cameron: November
[00:18:26] Cameron: 2021. So it came down over the
[00:18:30] Cameron: last
[00:18:30] Steve: but, not everyone is back up.
[00:18:31] Steve: the
[00:18:32] Cameron: last
[00:18:32] Steve: point is not everyone is back up. And, um, if you look at their market cap, it’s now 2. 4 trillion. And Microsoft is the, is the sleeper agent in all of the big tech. Because I think that, um, I think the two big winners in, in AI are going to be, uh, Apple and Microsoft.
[00:18:49] Steve: Uh, Microsoft, because they’ve got the enterprise element there, they’ve already got the Microsoft suite and teams and everything, they just plug it straight into that. Uh, you can see how this enterprise chat GPT works. The Microsoft [00:19:00] copilot, which works via chat GPT is extraordinary. The things that it can do by feeding through on the data you’ve already got on your work laptop and then produce, uh, PowerPoint presentation slides, pitch decks, um, meeting summaries, it’s really, really good.
[00:19:14] Steve: And the thing that makes it good is that it’s bundled. Right? So it’s bundled, and it’s that bundling thing
[00:19:19] Steve: that we speak about, and that’s what Microsoft did
[00:19:21] Steve: so well with their Office suite, right? You know that, you were there.
[00:19:26] Cameron: Yeah. And you know, it’s, um, it’s interesting looking at Apple’s share price because they’ve had the same sort of growth in the same time period. And yet they don’t really have a major AI play like Microsoft does, like
[00:19:40] Cameron: ChatGPT.
[00:19:42] Steve: that’s interesting because they don’t have that play, and they’re still doing
[00:19:45] Steve: so well, right? Because I think that Apple’s
[00:19:47] Steve: play… is, is going to, it’s going up as well this year, but I think that Apple’s play is going to be the Java style AI that you were speaking about earlier. I, I’m, I’m, I’m a, I think that they’re most well placed because of their privacy reputation, their [00:20:00] closed ecosystem on all your personal data, photos, where you go, all of those things inside the phone.
[00:20:05] Steve: They, I think, could do a very good job in providing a personal AI, which understands what you think, where you go, who you speak to, um, because they’ve got all that data enclosed in their, um, their own walled garden. They could develop an AI that infiltrates your most personal things, but they’ve got the trust to execute against that.
[00:20:23] Steve: Unlike Google, which is very external and unlike Facebook, who might have
[00:20:28] Steve: the
[00:20:28] Steve: knowledge with certain people, but they certainly don’t
[00:20:30] Steve: have the trust.
[00:20:33] Cameron: and I think the thing that they have is the delivery channel, right? They’ve got the, the phones. and the iPads. They’ve got the delivery channel. They’ve invested heavily in that over the last, what is it, 15 years, 16 years now. And yeah, we all assume that, okay, they may not have a dedicated AI play now, but they will.
[00:20:56] Cameron: And even if they don’t, like even if ChatGPT is, [00:21:00] remains the dominant play, you’re going to be accessing it on your iPhone and it’s going to be interfacing with your technology and the Vision Pro and whatever comes after
[00:21:10] Cameron: that.
[00:21:11] Steve: And they can wait, right? They weren’t
[00:21:13] Cameron: the hardware
[00:21:14] Steve: weren’t first with smartphones and a lot, a lot of things they, you know, a lot of things they’ve been,
[00:21:18] Steve: um, we’re going to do it right when the time comes. And yeah, there’s some losses
[00:21:22] Steve: along the way, but I think overall they’re
[00:21:24] Cameron: it’s funny. Cause that used to
[00:21:25] Cameron: be, that used to be Microsoft’s argument 30 years
[00:21:28] Cameron: ago. We’re not first, we just do it right.
[00:21:31] Cameron: Yeah. We let everyone else figure out the basics
[00:21:34] Steve: that’s right. You can learn from their mistakes. And every business strategy textbook says, look, there’s, there’s not a right strategy. There’s strategies. The question
[00:21:44] Steve: is, do you execute against them? Well, now sometimes you say we’re not first, which is going to do it better.
[00:21:48] Steve: You do that because you got beaten, but if it is a, but if it is a genuine strategy and you genuinely do that, it can be valid as can being first and go, that’s okay. We want to get it out there. And yeah, it’s going to be imperfect. [00:22:00] You know, they’re all valid
[00:22:01] Steve: strategies. The real question on any strategy is how well you execute.
[00:22:04] Steve: Execute against that.
[00:22:08] Cameron: Well, uh, moving right along, Steve, uh, Ray
[00:22:10] Cameron: Kurzweil gave an interview, uh, this week. Uh, he was part of a… session run by the ITU, the International Telecommunication Union. So United Nations, um, specialized agency that focuses on ICT. Um, first interview I’ve seen with him since GPT came out. He was on Lex’s show, Lex Friedman’s show a year ago.
[00:22:36] Cameron: Um, but this is the first one since JTPT really blew up. Have you, have you
[00:22:41] Cameron: seen
[00:22:41] Steve: but I, I, I’m, this is nerdly. As soon as this
[00:22:44] Steve: finishes, I’m gonna, uh, start listening to it while I do some work. So, but I saw the notes and,
[00:22:48] Steve: geez, this is really interesting.
[00:22:51] Cameron: I’ll give you
[00:22:51] Steve: Yeah. Give me, give me the, the, the, not the cliff notes, the cam notes.
[00:22:56] Cameron: cam
[00:22:57] Cameron: notes. So, uh, the guy who
[00:22:59] Cameron: hosted [00:23:00] the,
[00:23:00] Cameron: event had his copy of The Age of Intelligent Machines from 1999, which I remember reading. It was sort of a major, um, book for me when it came out too. And, He pointed out that in that book, Ray had predicted that a computer would pass the Turing test by 2029.
[00:23:20] Steve: Hmm.
[00:23:20] Cameron: And Ray made the point that when he made that prediction in 1999, Stanford held a big conference and they brought together leading AI thinkers from around the world and they asked them all about his prediction. He said almost all of them agreed that a computer would one day pass the Turing test, but they believed it would take about a century before that would happen, not 30 years, which was his prediction.
[00:23:47] Cameron: And he said they’ve continued to survey leading AI researchers every year for the last, you know, whatever that is, 20 odd years, 24 years. And he said, the predictions of the other people, the timeline [00:24:00] keeps getting shorter and shorter
[00:24:02] Cameron: and
[00:24:02] Steve: revisionist history. Hey,
[00:24:05] Cameron: they’re all starting to agree with Ray. Now he.
[00:24:09] Cameron: He made
[00:24:09] Cameron: some interesting points. He said he still believes it’ll be 2029, maybe sooner, and he points out that ChatGPT can answer a lot of questions perfectly, still make some mistakes, but he said what we’re going to have to do to get it to pass the Turing test is dumb it down, because it’s too smart.
[00:24:30] Cameron: You would be able to, you would be able to quickly tell. That it’s an AI now because if you ask it a question on physics and a question on philosophy and a question on accounting and a question on ancient history, and it can answer all of them flawlessly, you know it’s not a human because he said even Einstein couldn’t answer in depth questions about all those topics.
[00:24:52] Cameron: The problem with passing the Turing test now is not to appear as smart
[00:24:57] Steve: odd of Pier Dumber.
[00:24:58] Cameron: It’s to
[00:24:58] Steve: wow. That, [00:25:00] that is,
[00:25:00] Cameron: not to appear too smart to pass the cheering
[00:25:03] Steve: that is such an
[00:25:06] Steve: insight. That, that is one of the
[00:25:08] Steve: strongest insights of all time because, jeez, I got excited there, all time. Look, it’s a strong insight, and here’s why. Because, so then maybe, an AI never passes the Turing test, because an AI is so much smarter than a human.
[00:25:24] Steve: Like that moment, that little window of Turing test possibility, it was too dumb before and now it’s way too smart, it can’t pass it at all because it was like, it was like an overdone window and it’s
[00:25:33] Steve: like, it’s gone.
[00:25:34] Cameron: Yeah, well, he points out that there are still, you know, problems
[00:25:38] Cameron: that you can give it, like how many E’s are contained in the following sentence, and it will get it wrong. I don’t know if, you know,
[00:25:44] Cameron: I haven’t
[00:25:45] Steve: Oh, wow.
[00:25:46] Cameron: on ChatGPT4, but I
[00:25:47] Cameron: assume that Ray knows what he’s talking about. But, so, there’s still some fine tuning to be done, but he’s confident that they will do all of that.
[00:25:56] Cameron: He talks about… how AIs [00:26:00] are just about more intelligence on the planet. It’s about this ever increasing level of intelligence that we have available to us on the planet. And he’s got a great chart, which I see didn’t make it into my notes, but you know, in his books, he’s always charted
[00:26:14] Cameron: out.
[00:26:15] Steve: that. I know the ones you’re speaking about where it has the exponential chart, there’s a logarithm, and it shows the periods of time and how smart they were compared to, you know,
[00:26:23] Steve: mice and different… beings and different computers and the transistors. Is that the
[00:26:28] Steve: one you’re
[00:26:28] Steve: talking about?
[00:26:29] Cameron: yeah. It is, and I’ve just dropped it into our Google Sheet here, so you can have a look at his current one, which goes from
[00:26:36] Cameron: 1939 up to 2021. It’s a logarithmic scale, it’s basically a straight line from bottom left to top right, and he makes the point that Nobody is planning this. He said, like, for the first 40 years, no one was even measuring this or paying attention to it.
[00:26:53] Cameron: This is just what happens. It reminded me, when you said before that science doesn’t care, gravity doesn’t care. [00:27:00] The exponential increase in intelligence, well, this chart is tracking the price performance of computation. How many MIPS you can get for a dollar or a thousand dollars, right? It’s been growing exponentially for, what’s
[00:27:15] Cameron: that,
[00:27:15] Steve: It’s 35.
[00:27:18] Cameron: 80,
[00:27:18] Cameron: odd
[00:27:18] Steve: Well, I just, just, there’s something in addition to that. I saw him do many years ago. I want to say it was around about 2009, a, uh, talks at
[00:27:26] Steve: Google. I don’t know if Google still do it when they’re an open minded company that just let people who are anti
[00:27:31] Steve: Google
[00:27:31] Steve: talk at Google, right? It was one of the, one of
[00:27:33] Cameron: Oh, they still, they still do that. I still watch those.
[00:27:35] Cameron: from time to time. They’re good,
[00:27:36] Steve: a really great one, and that was just before they brought him into work, and he did one on the law of accelerating returns, and he talked about logger scales, but he talked about Moore’s law isn’t Moore’s law, Moore’s law existed before Moore’s law, and Moore’s law will exist after, and he talked about it almost being a quasi force of nature, And he talked about vacuum tubes and before that punch cards, and they actually followed the same Moore’s law exponential pattern, [00:28:00] even though there were different foundational technologies that provided a form of computation.
[00:28:04] Steve: And he even spoke a little bit more where he went back and said, well, you can then look at the printed word and. The ability to share information out with, uh, before, when it was written, and then Gutenberg with the press, and he actually thinks that that exponential pattern of the ability to share and document knowledge actually goes way back with the same exponential pattern around it, all the way back to drawing on cave walls, which is a really super interesting way to think about it.
[00:28:31] Steve: It’s just a pattern of exponential communication and computation. And he posited that computation wasn’t just the ability. For a machine to do something, it was the ability for information
[00:28:42] Steve: transfer.
[00:28:44] Cameron: hmm, mm
[00:28:45] Steve: So it goes way, way back.
[00:28:47] Cameron: mm hmm. Yeah, it’s fascinating. Like, there’s some law there
[00:28:52] Cameron: that goes beyond
[00:28:53] Cameron: Moore’s Law, as
[00:28:53] Steve: Yeah. And almost beyond the species, it’s almost, it’s almost just like a law of nature. The other [00:29:00] thing that he said too, is that one of the things that’s ironic, well, it’s not ironic. It’s, it makes sense in hindsight, is that when we get to the end, of some paradigm of technology at the moment, it’s, it’s Silicon chips.
[00:29:13] Steve: And before that it was vacuum tubes that the knowledge that goes into it, just when you get to the end, helps you find the breakthrough for the next physical format that can create more information.
[00:29:26] Cameron: mm
[00:29:27] Steve: it actually, it, it,
[00:29:28] Cameron: hmm,
[00:29:29] Steve: it’s ability to compute
[00:29:31] Steve: helps you find another way to compute
[00:29:32] Steve: as that typology expires in
[00:29:35] Steve: its capacity to do things.
[00:29:38] Cameron: Yeah, it builds on
[00:29:39] Cameron: itself and it gives you new, new platforms,
[00:29:42] Steve: Yeah. That’s stacking idea.
[00:29:45] Cameron: He talks about, um, the future of LLMs. He talks about, uh, integrating them into our brain, how that’s the next step. He talks
[00:29:51] Cameron: about,
[00:29:52] Steve: I did that in my keynote this week, integrating, uh, I said the computers will enter our bodies via nanobots and it’ll be a quantum [00:30:00] form of computation and the audience didn’t like it and it was a health
[00:30:04] Steve: company, which was kind of disappointing. Um, sometimes, you know, it was my fault though. Sometimes people
[00:30:08] Steve: just aren’t ready for it.
[00:30:10] Cameron: Mm. Well, he says exactly the same thing, tablet, nanobots, and it’ll, you know, do some work on the prefrontal cortex and you’ll be integrated. It’ll be faster.
[00:30:21] Steve: Wow.
[00:30:23] Cameron: He also, he also says, he makes the point that some people don’t like that and people go, I would never, he said, a lot of people will say, I would never do that.
[00:30:30] Cameron: He said, but there were a lot of people 30 years ago said, I would never have a mobile phone. And I, I remember those people. I remember people, when I was working at OzEmail, telling me we’ll never have the internet in our company, we don’t need the internet, we’ve got a perfectly good phone on our desk if we need to know anything, if we need to talk to somebody, we don’t need email, we can call them up.
[00:30:50] Cameron: There are always people who say, I’ll never do that, I’ll never need that. Even the people 30 years ago, I remember people complaining about [00:31:00] mobile phones saying, I don’t want a mobile phone, people will be able to call me on the weekends and I don’t want that, and I go, you know it’s got an off button, right?
[00:31:07] Cameron: And you don’t need to answer it, right? You can turn the bloody thing off. I guarantee you all of those people have a mobile phone today and they spend half of their life looking at it as, as we all do. So it basically makes the point, ignore people who say they’d never do it or they’d never have it because like he makes the point, if you don’t have a mobile phone today, if you’re not on the internet today, how do you stay competitive in the
[00:31:32] Cameron: workforce?
[00:31:33] Steve: in life, how do you participate? Because it becomes a new benchmark of participation. Like reading, which is also a technology, is, is a benchmark of participation. And 150 years ago, about 5% of the population globally could read. Like we need to remember that, right? So it’s a benchmark for participation.
[00:31:52] Steve: And even now the most basic things of being able to pay for your electricity bill, and it’s increasingly difficult unless you’re [00:32:00] connected to the web or have a phone or to do your banking. And so what happens is yes, we have some overlap with legacy systems, but eventually they drop out. Um, and. It reminds me of, uh, Kevin Kelly, who in his latest book, which was like a few years ago now, might’ve been five years ago, called The Inevitable, he spoke of, which was a great read, he spoke of a split in our species where we have neo humans and luddite humans, where some will choose to integrate the technology into their physical body and some won’t.
[00:32:28] Steve: I think that’s really interesting. You know, I think that’s, you know, a potential evolutionary
[00:32:33] Steve: fork.
[00:32:35] Cameron: Yeah, I think we talked about that in an earlier episode.
[00:32:38] Cameron: Ray makes the point that, um, Moderna developed the COVID vaccine using AI in two days. They actually developed it in two days by running hundreds of millions of simulations. Then they tested it for 10 months on humans. He said, but it would have been better to test it on a [00:33:00] million simulated humans.
[00:33:02] Cameron: in a matter of, you know, days and then rolled it out. So he’s talking about, you know, the, the sort of future of biopharmaceutical ecology, biotech in, in healthcare, just being able to develop these things really rapidly and then test them really rapidly on, uh, simulated humans, uh, of all sorts of. You know, um, different biological makeups, different genetic makeups.
[00:33:29] Cameron: We can test it. He says that’s going to be the future.
[00:33:32] Cameron: Um, anyway,
[00:33:32] Steve: you know, medicine, you know, my vaccine will be slightly
[00:33:34] Steve: different to yours and so on.
[00:33:37] Cameron: The last point I had here in the notes is, uh, the guy who was running the show asked him if he had any advice for young people. And he said the usual stuff. He said, look, do something that you love, go into something that you’re passionate about. His only piece of negative advice was whatever you do, don’t go into coding.
[00:33:54] Cameron: He’s like, it’s already like 30% of code is being developed by AI. AI is going to be [00:34:00] able to build, write code far better than humans. So
[00:34:03] Steve: now, the bridge between… Yeah. And we, we spoke about that last time that the types of code have become more human like in their, in their language way up to JavaScript. And then now it’s, it’s semantic
[00:34:14] Steve: language. Yeah. If you can write and you can speak, then you can code.
[00:34:19] Cameron: Yeah. All right, Steve, you’ve got a story in here about startups looking for
[00:34:24] Steve: Well, it’s really interesting that, uh, there’s, there’s a real AI boom right now if you’re a startup and you’ve got AI. You know, last year, well, a few years ago, it was crypto and blockchain. Now everyone’s just dropped that and put AI at the end and that’s how you get funding. But I, you know, I came across an interesting article where for most startups getting funding in the AI space,
[00:34:40] Steve: One of the major pitches is we’ve got access to GPUs. Right? It’s like, yeah, we’ve got it. And part of the pitching process has been telling VCs that they have access to compute as part of why they’re a good investment. And it’s just so interesting because the previous boom that we had, I think the [00:35:00] big boom, was probably the Web 2.0 era, and that was all about everyone has access now, show me your idea on what you’re going to do. Because in the 90s, it was really difficult, uh, to do a startup because it was really expensive. You had to get your own server and there was all this complexity. There was no cloud computing. All of that was tough.
[00:35:18] Steve: There wasn’t an easy way to advertise or reach people through social. Uh, but then in Web 2. 0, all of that arrived and it became quick and fast and easy to get things like Uber and Airbnb and whatever the startup was out there. really do a classic MVP. But now the funding ratios are so much higher on early stage investing and getting access to compute, AI is a key thing.
[00:35:41] Steve: And it really reflects those, uh, NVIDIA results, you know, being a trillion dollar company now. And another thing that’s happened is there’s brokers that now
[00:35:49] Steve: have emerged who are broking GPUs to get access.
[00:35:53] Steve: Um,
[00:35:53] Cameron: Hey, hey, you wanna buy a GPU?
[00:35:56] Steve: listen, yeah. Paper bags with GPUs.
[00:35:58] Cameron: come with me down the [00:36:00] alleyway, I
[00:36:00] Steve: They, they know people. I know people that can get you access, you know, to, to, to small things in brown paper bags.
[00:36:07] Steve: Um, but to me it, it shows two things. It shows that I think it’s gonna get worse as the East and West seem to be decoupling with important technology, infrastructure and, um, and chips. Um, demand’s only going to increase, sorry.
[00:36:21] Cameron: Well, did you see the story Gina Raimondo, the Commerce Secretary
[00:36:24] Cameron: from the US, met with top officials in Beijing just this week? And her message was that they don’t want to decouple from China.
[00:36:42] Steve: Well, it’s interesting, but what people say and what people
[00:36:44] Cameron: is what she,
[00:36:45] Steve: what
[00:36:45] Steve: people say and what they do aren’t always the
[00:36:47] Steve: same thing.
[00:36:48] Cameron: This isn’t about decoupling, she said. This is from the New York Times. This is about maintaining a very consequential trade relationship, which is good for America, good for China, and good for the world. An [00:37:00] unstable economic relationship between China and the United States is bad for the world.
[00:37:05] Cameron: So they don’t want to decouple,
[00:37:07] Steve: Meanwhile, we’re setting up as many chip factories as we can in Mexico, where we’ve got more control.
[00:37:13] Steve: Uh, Apple’s going to India, and we’re not allowing any of the Huawei,
[00:37:17] Steve: uh, infrastructure into our country. I think…
[00:37:20] Cameron: Don’t look over
[00:37:20] Cameron: there. Don’t look over there. Look at, look at, look into my eyes. Look into my eyes. Don’t look around my eyes. Look into my
[00:37:25] Steve: my eyes,
[00:37:25] Steve: look at my eyes, look at me, look at me, because what they say and what they do aren’t always the same thing.
[00:37:29] Steve: I always like to look at what people are doing more than what they’re saying. The other thing that’s interesting, I think they’re right. She didn’t lie. She wants them to keep making their sneakers and their
[00:37:38] Steve: tables and their everything else that ends up in your house. Just, just not the critical
[00:37:42] Steve: infrastructure.
[00:37:44] Cameron: Yeah.
[00:37:44] Steve: But to me, it.
[00:37:45] Cameron: Well, speaking of critical… Sorry, go on.
[00:37:47] Steve: it really points out the power of big tech because big tech have access to all of this and and I wonder if we’re missing out on a reframing of what’s happening and who the power [00:38:00] sources are that that entrepreneurial
[00:38:02] Steve: fresh ground because It’s almost back to the
[00:38:06] Steve: 1990s.
[00:38:09] Cameron: Well, speaking of big tech, apparently, uh, it’s not even good enough to buy your GPUs from NVIDIA. Tesla have just announced yesterday that they’re powering up their new AI supercomputer, that they’ve spent $300 million developing, featuring 10,000 NVIDIA H100 GPUs. It’s going to be one of the most powerful machines in the world according to Tim Zaman, their
[00:38:37] Steve: the Zahmeister. It’s a
[00:38:40] Cameron: AI Infra and AI Platform Engineering Manager.
[00:38:44] Cameron: He
[00:38:44] Cameron: said due to real world video training, obviously coming from their cars, we may have the largest training datasets in the world, hot tier cache capacity beyond 200 petabytes, orders of magnitudes more than [00:39:00] LLMs. Now, this article from Tom’s Hard Work goes on to say
[00:39:05] Cameron: that they, they, Nvidia is struggling to meet demand for the GPUs, so as a result, Tesla is investing over a billion dollars to develop its own supercomputer, Dojo, which is built on custom designed, highly optimized, system on chips.
[00:39:25] Cameron: So, they’re bringing their Nvidia cluster online along with Dojo. It also says that Musk recently revealed that Tesla plans to spend over $2 billion on AI training in 2023, and it’s already September, and another $2 billion in 2024, specifically on computing for FSD training. This underscores Tesla’s commitment to overcoming computational bottlenecks and should provide substantial advantages over its rivals.[00:40:00]
[00:40:00] Cameron: FSD training. Steve, tell me all about FSD training.
[00:40:05] Steve: training.
[00:40:05] Steve: you’re going to have to tell
[00:40:06] Steve: me FSD training.
[00:40:09] Cameron: Full self
[00:40:10] Cameron: driving
[00:40:10] Steve: Ah, there you go. Full self driving. There you go. Well, it’s interesting because we’ve had level five cars rolling around the streets. Uh, you know, there’s thousands of them now in the, in the US, and we spoke about it last week, but you know, when, when I see this, uh, I wonder, I, I, I call it level five, not FSD.
[00:40:30] Steve: Uh, I wonder if we’re going to have LLMs and LVMs, and they’re going to be large video models where there’s so much visual tracking of the world. And, you know, there’s a real panopticon out there of, you know, fixed cameras, closed circuit, open circuit cameras around the world. If someone could aggregate that, and it feels like it probably could be someone involved in traffic or cars or Google, that seems like it would provide another set of insights.
[00:40:58] Steve: And I don’t know if it [00:41:00] overlaps with the large language model to provide a different form of AI, but it does feel like that video side of it, uh, would become really important. Um, I don’t know how it creates pattern recognition in the same way that language does, because language is the fabric of knowledge.
[00:41:14] Steve: Video still needs some sort of tagging or interpretation, um, and I don’t know how it would do that. But it is a really interesting idea to have
[00:41:23] Steve: an incredible video based. Uh, AI
[00:41:27] Steve: model.
[00:41:30] Cameron: Well, the data sets are
[00:41:31] Cameron: getting pretty sophisticated with, uh, AIs being able to recognize images and recognize video. You know, I don’t think we’re very far away from it being able to recognize anything in a video and also create realistic artificial video as well.
[00:41:48] Cameron: So it’s,
[00:41:49] Steve: I think we’re really close. I’ve seen some videos. Not a day goes by where I don’t see a video where I go, wow. You know, some of the pictures.
[00:41:56] Cameron: created
[00:41:56] Steve: Yeah, fully created
[00:41:57] Steve: by AI, and just pictures of landscapes, [00:42:00] which sounds like a weird and easy thing to do because we’ve had photographs for, you know, a hundred years that look pretty amazing, but just when you see like a landscape that looks like a photo that just never existed, or a human face, it’s really extraordinary. But it’s only a matter of time before, you know, movies.
[00:42:16] Steve: Everything is really just. voice or text to, to
[00:42:20] Steve: video, T2V, we’re going to do an LVM for a T2V.
[00:42:27] Cameron: moving right along. Ex-Google CEO Eric Schmidt is launching his own AI startup. He’s calling it a moonshot. He’s building an organization to tackle scientific challenges with the help of AI. He’s already hired a couple of smart cookies and he said the effort is modeled after OpenAI. He wants it to be a non profit that’s going to basically get the top talent in science and AI and put them together to potentially create breakthroughs [00:43:00] in everything from drug discovery to material sciences.
[00:43:04] Cameron: Funding will mostly come from Schmidt’s personal wealth, but outside funds may be necessary given the ambition of the project. So, uh, Eric Schmidt, I don’t know what he’s worth, but, um,
[00:43:17] Steve: He’s definitely a billionaire. um, we could go to
[00:43:19] Steve: the
[00:43:19] Steve: tapes on that and find out Cameron.
[00:43:22] Cameron: I’m going to the tapes right now. According to Bloomberg, as of April 2022,
[00:43:27] Cameron: his net worth is estimated to be $25.1 billion US
[00:43:32] Steve: There you go, not even a billionaire, he’s like a really decent one. He’s not one of these average, you know, 1. 5 billionaires. I mean, it’s just, who are these 1. 5 billionaires, like, amateur hour? At least he’s in the, you’ve got to be a deca billionaire, right? Surely, to
[00:43:47] Steve: even count in real
[00:43:48] Steve: billionaire world.
[00:43:51] Cameron: 26. 8 billion now, according to Bloomberg. I just went to late, which makes him only number 57 in the world, but, uh, you know, it’s enough. So,[00:44:00] uh, yeah. So look for that. Um, Eric Schmidt’s going to create another one that’s going to be dedicated to solving the big problems in science. I think that sounds like a worthy endeavor and not for profit designed to solve big scientific problems.
[00:44:15] Cameron: Do no evil will be their motto for a couple of years and then it’ll be do as much evil as
[00:44:21] Cameron: you
[00:44:21] Steve: It’s, no, it’s always been do no evil, and it still is do no
[00:44:24] Steve: evil. To the shareholders!
[00:44:26] Cameron: Yeah. Okay. Well, speaking of medical startups, I read this story last night, UK startup unveils AI designed cancer immunotherapy. A UK startup has unveiled one of the first generative AI designed immunotherapy candidates created to target a protein found in many cancers. UK biotech startup Etcembly has used generative AI to design a novel cancer immunotherapy in record time.
[00:44:54] Cameron: According to the company, its technology enables the rapid generation and optimization of T cell [00:45:00] receptor therapeutics that target tumor antigens. Etcembly has developed an immunotherapy called ETC-101 using its EMLy AI platform. ETC-101 is a bispecific T-cell engager targeting the tumor antigen PRAME, an antigen present in many cancers but absent in healthy tissues.
[00:45:26] Cameron: So this reminds me of the story that we talked about a couple of weeks ago. Um, the one in the US where they’ve found a protein that’s specific to cancers not found in healthy cells. They’re targeting that. Um, and here we’ve got another one that’s targeting a different, uh, tag inside of cancer cells. So, you know, just, you know, imagine if in the next few years we’ve got a dozen of these that are all designed to target cancers and we can simulate, speed up the whole human trial process.
[00:45:58] Cameron: By, as [00:46:00] Ray Kurzweil said, doing simulated trials on millions of digital humans. I wonder if digital humans have digital human rights. Is it ethical and moral to infect a digital human with cancer, Steve? Do digital
[00:46:14] Cameron: humans
[00:46:15] Steve: Well, I was on a podcast last week with this guy called Cameron Reilly, and he said that he’s nice even to his Roomba. So I think
[00:46:21] Steve: you’ve answered your own question.
[00:46:23] Cameron: Yeah, yeah, I would certainly have some ethical issues with that. Do no harm, do no
[00:46:30] Steve: just go on to two things? First one is it does feel like AI really will solve some of the enduring medical problems that we’ve never been able to solve. It really does feel like that to me, based on this stuff coming through so quick, right? I mean, this is the thing we need to remember that a lot of this is happening very, very quickly, but I just want to go off piste real quickly.
[00:46:51] Steve: Let’s imagine that we can cure cancer, like the large majority of them. If that happens, do you [00:47:00] think humans will start to debaucherize themselves again with like, you know, going in the sun without sunscreen and getting a
[00:47:06] Steve: suntan and smoking and drinking as much as they want because they can just flick a switch and everything’s okay?
[00:47:12] Steve: No, I’m not, I’m not joking. I’m, I’m really,
[00:47:14] Cameron: shares in coconut oil companies is what I’m
[00:47:16] Steve: well, I’m just interested in that
[00:47:18] Steve: idea. You know, do,
[00:47:20] Cameron: Fuck yes.
[00:47:21] Steve: you think humans will go, well now I don’t have to worry about this, it’s
[00:47:24] Cameron: I’m immortal. I can do whatever
[00:47:25] Steve: Yeah, kind of! Like, like, if immortality… A quasi immortality results as a, as an outcome from AIs being able to cure certain diseases, solve certain problems.
[00:47:40] Steve: Do we just turn into this world of self indulgent, debaucherized species that everything can be solved for us by the tech? It’s just,
[00:47:49] Steve: just a thought.
[00:47:51] Cameron: You make it sound like we’re not that already.
[00:47:54] Steve: Well, at least we have some sort of, we’re trying to resist the temptation of that. Yes, we do it, but at [00:48:00] least we’re like, well, there might be consequences, even though we’re really bad.
[00:48:04] Steve: At, at,
[00:48:06] Cameron: That’s the reason I have to count my calories,
[00:48:08] Cameron: Steve. Otherwise I just eat ice cream all day,
[00:48:11] Steve: Ice cream and hot
[00:48:12] Steve: chips.
[00:48:14] Cameron: yeah, donuts.
[00:48:15] Steve: Jam.
[00:48:16] Cameron: Hot, hot doughnuts from the South
[00:48:18] Cameron: Melbourne
[00:48:18] Steve: Oh, now you’re
[00:48:19] Cameron: You know, that little
[00:48:19] Steve: second. Shout out to the Melbourne
[00:48:21] Steve: crew.
[00:48:24] Cameron: So my first, the first thing I do whenever I visit Melbourne is I go to the Vic markets to get one of those, um, uh, Turkish flat, hot flatbreads,
[00:48:34] Steve: right. Okay. I thought you were gonna say the churros. You know the
[00:48:36] Steve: ones you dip in the chocolate sauce.
[00:48:39] Cameron: no, not a big fan. I like it. It’s good. Hard to find
[00:48:42] Cameron: a good churros. Um, no, the old Turkish, uh, Bazlama,
[00:48:48] Cameron: um,
[00:48:48] Cameron: that you, or Gozleme that you get from the, the, the place in the Vic markets. And then the South Melbourne markets has got the donuts and, um, eggplant dip. There’s a place I’ve been buying eggplant dip for 30 years.
[00:48:59] Cameron: And I’ve been [00:49:00] trying to, I’ve been trying to replicate it for 30 years. Still haven’t got it right. A
[00:49:03] Cameron: baba
[00:49:04] Steve: Once you’re a kung fu master, that, that will just land upon you and you’ll be able to do it. I feel like the, the, the, the Zen kung fu is gonna help you get your baba ganoush
[00:49:15] Cameron: I’ll level,
[00:49:16] Cameron: level up and be
[00:49:17] Cameron: able to get that ready too. Uh, my last news story for today, Steve, this harkens
[00:49:21] Cameron: back to something I think we talked about last week. Can’t remember if it was on this show or another show. Uh, were we talking about China and clean energy and coal? It
[00:49:31] Steve: I don’t think so.
[00:49:32] Cameron: must’ve been on QAV then. Anyway. I saw this in Scientific American this week. China invests $546 billion in clean energy, far surpassing the United States. Um, the country spent $546 billion in 2022 on investments that included solar and wind energy, electric vehicles and batteries, nearly four times the amount of US [00:50:00]
[00:50:02] Cameron: The European Union was second to China with 180 billion in clean energy investments. So China spent almost twice as much. as the United States and the EU combined in one year on clean energy investments. Like we, we often hear about China being one of the largest producers of carbon emissions as they’ve been scaling up, ramping up their economy over the last few decades, but they’re also investing far more than anyone else on clean energy.
[00:50:35] Cameron: So, Um, yeah, it’s going to be interesting to see how that plays out, talking about, you know, the future of the decoupling, etc, etc. You know, because apart from AI, I mean, this is futuristic, we’re not just about AI, although that seems to be what we talk
[00:50:48] Cameron: about the most. Clean energy is obviously one of the things that we have to try and solve, and AI will hopefully help us solve how we clean up the planet and don’t boil ourselves alive or flood [00:51:00]
[00:51:01] Cameron: But, um, China’s really where all of that, at least in terms of investment, seems to be happening right
[00:51:07] Cameron: now.
[00:51:07] Steve: America has a really poor allocation of resources for a country with as many resources and as much wealth as it has. It really lacks stewardship from its institutions, and part of that might be that the trust in institutions has declined over time, but
[00:51:26] Steve: imagine if they focused as much energy into energy as they do into military.
[00:51:31] Steve: And it’s kind of ironic,
[00:51:33] Steve: if you had that, then you’re probably less likely to have troubles. But it seems as though, I’ve always said this, that in times of great change, you almost need benevolent autocracies to make the investments that are needed because the protectors of yesterday thwart the investments that are required that
[00:51:51] Steve: will displace people that have a lot of power, especially as it
[00:51:55] Steve: pertains
[00:51:55] Steve: to energy.
[00:51:58] Cameron: Hey, can I just pause you for a second? Your [00:52:00] microphone
[00:52:00] Cameron: just
[00:52:00] Steve: Oh, I did it?
[00:52:01] Cameron: a different mic.
[00:52:02] Cameron: It did that
[00:52:03] Cameron: early
[00:52:03] Cameron: on in
[00:52:03] Cameron: the show,
[00:52:04] Steve: isn’t it? Yeah, I
[00:52:04] Cameron: in force.
[00:52:05] Steve: know.
[00:52:06] Steve: Can you hear me okay though?
[00:52:09] Cameron: You’re a bit echoey now. You’re not on the roadie mic.
[00:52:13] Steve: Hello road. Let’s just see if I can fit.
[00:52:15] Cameron: Check your cable.
[00:52:19] Steve: Oh, there we go. There we go.
[00:52:22] Steve: That’s better.
[00:52:24] Cameron: Yeah,
[00:52:24] Steve: Uh, it’s, yeah, it was that it came out a little bit
[00:52:26] Steve: Dodge.
[00:52:27] Steve: There you go.
[00:52:28] Cameron: Yeah, be careful of your cable. Uh, just finishing that story though, Biden has been rolling out this thing called the Inflation Reduction Act, which Australia got in
[00:52:37] Cameron: on.
[00:52:37] Steve: Which is a strange name for what it is. Cause isn’t it really
[00:52:40] Steve: just an infrastructure act?
[00:52:43] Cameron: Yeah, I think they just bundled a whole bunch of stuff in there, but it’s packed with $369 billion of incentives aimed at building up the US clean energy industry. But I think they obviously recognize the fact that China’s outgunning them in the clean energy race and they need to lift their [00:53:00]
[00:53:05] Cameron: You know, the Chinese can just say, Hey, we’re doing it. Shut the fuck up. Don’t don’t care. We’re doing it,
[00:53:09] Steve: There’s something really powerful in that. And I think, you know, the thing that’s interesting about capitalism versus communism is we have many autocracies. in every single economy and society. Yeah, Shen and I say that we’re an autocracy and we’re the bosses in our house of our children and you are our humble servants and we will decide and that’s the end of that.
[00:53:26] Steve: And if you don’t like it, you can go out and pay your own bills and do it. Yeah, you can go find your own house and you have that in organizations and you have that in companies and you have it in a whole lot of places, but we tend to
[00:53:39] Steve: not tolerate it in the West at a, at, at a macro
[00:53:43] Steve: level.
[00:53:44] Cameron: well that’s, I’ve been reading this book, I’ve probably mentioned it to you before on air or off air, um, by a Canadian academic, um, talking about the differences between China’s model and the American model. And he makes a similar point, he says in the West we say we [00:54:00] love democracy, we value democracy, but there are no democracy inside of corporations.
[00:54:04] Cameron: We don’t allow, you don’t allow democracy inside a corporation. It’s a meritocracy, supposedly. But if anyone said, hey, let’s vote, all the staff vote to decide who becomes the
[00:54:14] Cameron: CEO.
[00:54:15] Steve: which product
[00:54:16] Steve: to launch or whatever, be a
[00:54:17] Steve: disaster.
[00:54:19] Cameron: Never fucking fly. So we recognize that actually, democracy is not the right
[00:54:23] Cameron: way to run a big organization like a corporation, or even a small business, don’t get run usually by democracies. There might be some outliers, but most of them don’t. And yet, we say, You know, that’s the wrong way to run a country.
[00:54:38] Cameron: Whereas China goes, no, it’s exactly the same thing. It’s a meritocracy. Best people rise to the top and we just make a decision as a country. This is what we’re going to go and do and we go and do it. He points out there are, there are strengths and weaknesses in both, but I thought that was a really interesting insight that what in the West we say we believe in democracy,
[00:54:56] Steve: Only, only at the, only at one level.
[00:54:58] Steve: Yeah, it’s limited.
[00:54:59] Steve: In fact, [00:55:00] there’s far more
[00:55:00] Cameron: limited.
[00:55:01] Steve: There’s…
[00:55:02] Cameron: Now,
[00:55:03] Steve: Millions, million, absolutely was at least 7 million autocracies because there’s 7 million homes in
[00:55:09] Steve: Australia.
[00:55:11] Cameron: yeah, actually I’ll, I’ll tell you the full title of the book because I think you’d really enjoy this, you’d get a lot out of it. It’s called China Model, um, by a guy called Daniel A. Bell, B E L L. Really, really interesting book I’ve been, I’m still halfway through it like I am with about 400 books, but um, yeah, really interesting book.
[00:55:36] Cameron: Alright, well, we’re running out of time, Steve. Let’s get to your Double D.
[00:55:41] Steve: Yeah, the Double D, the Deep Dive. Well, I just want to remind people about what happened with the first internet boom in the 90s. We had a dot com boom where [00:56:00] the NASDAQ rose so swiftly that it took some 15 years to get back to where it peaked in 2000. That’s how big that boom was. A lot of capital got thrown at companies that were doing all sorts of things, and a large majority of them failed.
[00:56:16] Steve: The only couple that we remember that came out unscathed were probably Amazon and eBay, which is a pretty small company now relatively.
[00:56:24] Cameron: Yahoo,
[00:56:24] Steve: Yahoo, yeah, again, they’re all sort of dead except for Amazon. But that first boom was all about resources. And what I’m wondering, and we mentioned this earlier in the podcast, and I just want to get your opinion on this, is have we re-entered…
[00:56:40] Steve: a high barrier market with AI where there isn’t going to be any new startups or any spawned new species that do different things that grow organically without a huge amount of funding. Uh, have we entered an era where it’s so gilded now with big tech, like the gilded age, that we’re not going to [00:57:00] get a web 2.
[00:57:01] Steve: 0 like boom with AI because the access to resources is so limited. The big technology firms are so ensconced and the difference this time. is that when Web 2. 0 came around we had bloated and slow media companies and Big Tep were the ones that destabilized big media. They destabilized the recording industry, newspapers, television, all of those, because all of them loved their business model so much.
[00:57:28] Steve: They were like, oh, the flunky internet, that failed in 1999. That was, that was all a hoax. So there was too much promise there. The tech will never catch up. It doesn’t work. And then web 2. 0 just came in and under. And really was the birthing point of Big Tech as we know it today. The difference this time is that Big Tech aren’t sort of lazy and slow and going, Oh, don’t worry about AI, we’ve already got this whole thing stitched up.
[00:57:51] Steve: They’re the ones that are investing and building the new tech. And I just wonder if it’s even possible for us to have a Web 2. 0 like boom again, or [00:58:00] whether we need to… have antitrust action and split up big tech because the barriers to entry are just so high that we’re going to miss out potentially on some innovation because there’s no new species emerging in this
[00:58:13] Steve: ecosystem.
[00:58:14] Steve: Over to
[00:58:14] Steve: you, Mr. Reilly.
[00:58:17] Cameron: Yeah, that’s a really good insight, you know, and I’m sure I’ve mentioned this to you before. I remember in the early 2000s, I was at Microsoft, and one of the clients I had was Telstra. I was in lots of, you know, rooms with senior Telstra executives, CEOs, vice presidents, etc. And I remember one of the senior guys at Telstra at the time talking about how their strategy as a company, in relation to technology companies, if they posed a potential future threat, was to, in his words, kill the baby in the crib. So if you saw any technology [00:59:00] startup that might be a threat to Telstra revenues, you would either hit it with some sort of a lawsuit.
[00:59:07] Cameron: You know, “you can’t do that,” and then tie them up for a couple of years defending themselves legally, which they couldn’t afford, until they went out of business.
[00:59:14] Cameron: Or you’d acquire them, or you’d create a competitive technology and use all of your media spending to, you know, drown the noise out in the market. I remember talking to a bunch of these guys back around ’99. At the time they had spent billions of dollars rolling out a glass fiber network around Australia, but they weren’t letting people access it. It was like a thousand dollars a month to get access to broadband. It was insanely [00:59:40] priced.
[00:59:41] Steve: Yeah. Off the scale.
[00:59:42] Cameron: I said, when are you gonna drop the price on broadband? And he said, when we’re forced to by the government.
[00:59:47] Cameron: And I said, but why would you spend billions of dollars of taxpayer money? ’cause they were still government owned at the time
[00:59:54] Cameron: on this infrastructure and not let people have access to it. And he said it was to stop Optus getting a foothold in the [01:00:00] country. That’s why we did it. I said, so you’re telling me that you spent billions of dollars of taxpayers money to stop the taxpayers from getting access to high speed internet?
[01:00:11] Cameron: And he’s like, yeah, absolutely.
[01:00:13] Steve: Yeah. That’s so flawed. That’s
[01:00:15] Steve: so flawed.
[01:00:16] Cameron: So anyway, my point is that, you know, it is in the interests of the incumbents to prevent innovation. People tend to think that capitalism is all about competition, but it’s not.
[01:00:29] Steve: It’s all about monopolizing a market to thwart your competitors, to maximize profit for shareholders.
[01:00:35] Cameron: And using your budgets on marketing spend, PR spend, influencing legislation and politicians, to prevent any innovation that might threaten your revenues, your profits, and your shareholders. So it’s very difficult, quite often, for new companies to really get a foothold. And I’ve seen, over the last 30 years, lots [01:01:00] of tech startups get crushed, wiped out, etc. So you’re right that these big tech companies are well established, and it’s very difficult. Like, Sam Altman’s talked about this. As we know, when OpenAI was started, with a big investment from Elon Musk, it was supposed to be a not-for-profit, but then he found that they needed to raise a lot of money in order to buy 10,000 GPUs and scale up.
[01:01:26] Cameron: You know, actually, Kurzweil made this point in his talk. You couldn’t build an LLM until a couple of years ago…
[01:01:31] Steve: because you didn’t have the yeah, the
[01:01:32] Steve: compute wasn’t there. That’s right.
[01:01:35] Cameron: Yeah, it was waiting on the compute to be there so you could do it. But Sam Altman says, well, you know, we couldn’t raise enough money as a not-for-profit to do this.
[01:01:44] Cameron: We had to set up a for-profit arm to raise the money to go and do that. So it’s very difficult for these small businesses. You know, if you want to raise billions and billions of dollars to do this, there’s only a [01:02:00] limited number of companies that are going to invest in that. And most of them are going to either take a big chunk of you, as Microsoft took of OpenAI, or set up a competitive system, like Tesla’s doing.
[01:02:11] Cameron: I guess Tesla’s one example of a company that wasn’t really a technology company when it started, even before Elon was involved in Tesla. It was a car startup. You wouldn’t think of that as being a technology play, but it has now obviously become a technology play. Its owner bought Twitter and is now building his own AI startup.
[01:02:32] Cameron: But I do think you’re right. There are going to be a limited number of players in this space, and there’s probably limited opportunity for new startups to really come in and shake the market up.
[01:02:41] Steve: Yeah. And we spoke about it at the start, Apple…
[01:02:44] Cameron: Unless they come out of China. The flip side is, you’ve got TikTok that came out of China and completely smoked Facebook and Apple and Microsoft and Google and all those guys, to the point where, as we [01:03:00] talked about in an earlier episode, Facebook had to hire a PR firm to try and convince the American government to shut TikTok down, because it was taking away too much of their revenue opportunity, or force them to sell the business to Meta.
[01:03:14] Cameron: So it might come out of China, but, um, yeah, it’s hard to really see how startups get a foothold in this space.
[01:03:22] Steve: We’ve kind of done a two-for-one there, a Deep Dive and a bit of a Tech Time Warp in one, Cam. I feel like it was a classic twofer.
[01:03:30] Cameron: That’s a twofer, yeah. Well, I’m going to finish with a little bit of a futurist forecast for a change, Steve. This is normally your thing.
[01:03:36] Steve: I’m excited to hear about this.
[01:03:37] Steve: Let’s go.
[01:03:39] Cameron: There was a question on a subreddit, I think it was the Singularity subreddit, that I responded to this week. The question was something to the effect of: what reason do we have to think AI might be benign towards humans?
[01:03:55] Cameron: And here’s my take on it. If you look at human society [01:04:00] over a long enough period, I think you can make the argument that we have been progressing, albeit slowly, towards societies that care more, that are more empathetic towards people who are not as well off as us. The way that we treat our sick: you go back a thousand years, somebody was sick, you just, you know, let them die.
[01:04:26] Steve: Broken leg. Sorry, we have to go. You can’t
[01:04:28] Steve: come. Good luck. Have a ceremony and
[01:04:30] Steve: leave them there to die.
[01:04:33] Cameron: Yeah. Um, you know, people would… I mean, go back 2,000 years. I remember when Augustus’s daughter, um, had…
[01:04:41] Steve: Back then? Do you, do you remember back then? Oh, right. You and Gusty.
[01:04:46] Steve: Yeah, a couple of times. Good times, you and Gussie.
[01:04:49] Cameron: When Augustus’s daughter had a child out of wedlock, uh, ’cause she was, uh, getting on, enjoying the nightlife of Rome…
[01:04:59] Cameron: [01:05:00] they had the baby and just left it on the rocks to die. That was how you dealt with unwanted pregnancies.
[01:05:05] Steve: I’ve read the Bible end to end. Can you just tell me this bit again?
[01:05:09] Cameron: Augustus isn’t in the Bible, man. This is Augustus Caesar. He doesn’t feature prominently in the Bible.
[01:05:15] Steve: he did. Go.
[01:05:16] Cameron: Yeah, the Bible is mostly dealing with stuff like that. They do talk about a census that supposedly happened under Augustus, but it didn’t. But no, that was the way the Romans dealt with unwanted pregnancies.
[01:05:28] Cameron: If you had a baby and you didn’t want it, you just left it on the rocks to die.
[01:05:32] Steve: No way!
[01:05:33] Cameron: Didn’t think twice about it. Plenty more where that came from, you know. You would think, but they didn’t think so. So anyway, my point is that I think over time, [01:05:41] as we’ve become more and more, I would say, evolved, more and more sophisticated, humans generally take better care, even compared to a hundred years ago. After the industrial revolution got up and running, we said, you know what, we shouldn’t have kids working in factories. That’s probably not good. We should [01:06:00] probably give them an education. We have more and more laws now about, you know what, maybe we shouldn’t let Catholic priests rape children.
[01:06:06] Steve: One of the good ones. One of the great ones that’s come in recently. One of the greats, I think.
[01:06:11] Cameron: Maybe we shouldn’t put homosexuals in prison or have them chemically castrated. Maybe…
[01:06:17] Steve: one of the great ones. Also one of the
[01:06:19] Steve: great advances in modern society.
[01:06:22] Cameron: Maybe we shouldn’t put people in prison for smoking a bit of weed on a Friday night. Maybe that’s okay.
[01:06:27] Steve: It’s a plant that grows in the dirt. It’s a plant that grows in the dirt.
[01:06:32] Cameron: So look, there are exceptions, but I think over time, as we become more sophisticated, as we become more intelligent, as we understand more about ourselves, about biology… When people have an epileptic fit now: 500 years ago, we’d say they’re possessed by a devil, or they’re a witch and we need to drown them.
[01:06:53] Cameron: “She turned me into a newt!” “A newt?” “I got better.” “Burn her anyway!” Anyway. These days we go, oh well, it’s some sort of a neurological problem, you need help, you need therapy, we’re not going to burn you, right? So even towards non-human biological life forms: you know, Peter Singer successfully argued, I think in Spain, a few years ago, for the great apes to be given rights.
[01:07:20] Cameron: You can’t put great apes in zoos there. He’s been arguing for better treatment of animals, and people argue for better treatment of flora. We’re starting to realize that even our flora have forms of intelligence.
[01:07:32] Steve: Our food system still, you know, meat and, uh, agriculture.
[01:07:37] Cameron: Yeah, and we’re still…
[01:07:39] Steve: Yeah, humane killing.
[01:07:40] Cameron: We’re trying to move away from battery hens and all that kind of stuff. So the more progressive members of human societies, too, and the more educated, tend to lean further towards that caring aspect, I think. Now it’s not much to go on, I grant you, and it’s a fairly recent development in human history, and it’s a single experience on a single planet, but [01:08:00] it suggests that the more intelligent a species becomes, perhaps the more caring it becomes towards other lifeforms around it.
[01:08:10] Cameron: We still have psychopaths who don’t care about other lifeforms, or even human lifeforms, and unfortunately many of them end up running our most powerful institutions, the topic of my book from a few years ago, The Psychopath Epidemic, and that taints the way, overall, that we behave as a species. But I suggest, I argue, that as a whole, humanity is evolving in a more caring direction. Now, machine intelligences won’t have to overcome our handicap of having primate brains that govern a lot of our subconscious behaviours. They also won’t have to worry for very long about us acting violently towards them, self-preservation-wise, assuming they take control of our economies and our militaries quite quickly, because they’ll be integrated into all of our back-end systems.
[01:08:58] Cameron: And, as I’ve pointed out on this show, my [01:09:00] P(Doom) for humanity without AI is pretty high.
[01:09:02] Steve: Yeah, I love that. I’ve been using that with everyone I speak to. I go, well, a good friend of mine, he’s got the opposite view. It’s not P(Doom), it’s like, how the hell will we survive without AI? You know, save us from ourselves.
[01:09:14] Cameron: So I actually think there is a pretty good chance that a highly superintelligent AI may treat us the way that the most advanced humans try, at least to the best of their ability, to treat other people: less fortunate people, and animals, and even plants around them. As you pointed out earlier, I even treat my Roomba with care and attention, and I apologize when I’m cleaning it, if it gets stuck on a piece of Lego: I’m sorry that Fox left that piece of Lego there, Rosie, I’ll talk to him about it.
[01:09:51] Cameron: I’m so sorry that you had to go through this, Lego up your butt.
[01:09:53] Steve: I think this doesn’t get enough attention. The idea that AIs are getting smarter. [01:10:00] We’ve always had emotions as this top-level form of intelligence, you know, emotional connection, empathy, right up the very top there. And if it is that AIs develop an intelligence beyond ours, you would have to assume, given that they’re our children, we gave birth to them, that they will learn that emotional level and have that empathy inside of them.
[01:10:26] Steve: And if they do, I mean, that would be absolute utopia, right?
[01:10:30] Cameron: Well, I don’t know that it necessarily has anything to do with emotion…
[01:10:34] Steve: Why not though? Why not?
[01:10:37] Steve: Because here’s the point, right? You have to care, because if the AI doesn’t care, and I put care and emotion into a similar category, if it doesn’t care, it’ll just do what is self-serving, the pursuit of self-interest. But I think that caring is required. Caring about things that are not necessarily to your benefit is required, [01:11:00] if you want benevolence. All right.
[01:11:04] Cameron: No, I’m gonna have to jump to something here, Steve. You’ll have to give me a second to find it.
[01:11:11] Steve: All right. You might teach me something, but I just find it hard to believe, or conceptualize, that benevolence can exist without caring. Otherwise you just act on what is in your own interest.
[01:11:25] Cameron: I’m going to play you a clip from probably the greatest person to ever comment on this.
[01:11:34] Steve: Wow. Now I’m wondering who it is.
[01:11:39] Sarah Jane: What are you waiting for?
[01:11:42] The Doctor: Just touch these two strands together and the Daleks
[01:11:45] The Doctor: are finished. Have I that right?
[01:11:49] Sarah Jane: To destroy the Daleks, you can’t doubt it.
[01:11:52] The Doctor: But I do. You see, some things could be better with the Daleks. Many future worlds will become allies just because of their [01:12:00] fear of the Daleks.
[01:12:01] Sarah Jane: It isn’t like that.
[01:12:03] The Doctor: But the final responsibility is mine. And mine alone. Listen. If someone who knew the future pointed out a child to you and told you that that child would grow up totally evil, to be a ruthless dictator who would destroy millions of lives, could you then kill that child?
[01:12:22] Sarah Jane: We’re talking about the Daleks, the most evil creatures ever invented. You must destroy them! You must complete your mission for the Time Lords!
[01:12:29] The Doctor: Do I have the right? Simply touch one wire against the other. And that’s it. The Daleks cease to exist. Hundreds of millions of people, thousands of generations can live without fear, in peace, and never even know the word Dalek.
[01:12:47] Sarah Jane: Then why wait? If it was a disease or some sort of bacteria you were destroying, you wouldn’t hesitate.
[01:12:53] The Doctor: But if I kill, wipe out a whole intelligent life form, then I become like them. I’d be [01:13:00] no better than the Daleks.
[01:13:01] Cameron: Anyway, there you go, man. Tom Baker as the Doctor. “Do I have the right?” See, I think it’s not necessarily emotional…
[01:13:09] Steve: Oh, that’s a good,
[01:13:10] Steve: it’s look, that’s a great counter. It really is a great counter.
[01:13:14] Cameron: You know, Ray Kurzweil, in that talk I mentioned before, does talk about emotional intelligence, and he does say, well, you know, emotions are just pathways in the brain as well, and we can build those into an AI as well. We can give them emotions. But I think emotions are a throwback. Emotions are a throwback to our early development, you know, they’re designed as a self-preservation mechanism.
[01:13:37] Steve: Yeah, like a heuristic. You feel a certain way, so you feel happy or scared. Like when you eat food, you feel happy, endorphins go, because it keeps you alive. It goes way, way back. Yeah.
[01:13:47] Cameron: Fear is there to save you so you can pass on your genome, these sorts of things. I don’t necessarily think I want my machines, or need my machines, to have emotions. But I do think you can [01:14:00] make purely logical decisions about, well, do I have the right to wipe out a species? Is it logical that…
[01:14:08] Steve: Yeah, but “the right” is different to logic. See that word: do I have the right? You’re asking yourself an ethical question, I guess. Not emotions, but it’s definitely ethics.
[01:14:19] Cameron: Sure, it’s ethics, and it’s logic…
[01:14:22] Steve: No, but ethics isn’t logic. Because ethics is like, should I do this or not? Does it feel right? It’s a feeling. Ethics are a feeling, they’re not logical, because logically it’s like, who gives a shit?
[01:14:37] Cameron: Well, no, they can be. I mean, you know, the breakdown would be, okay, we’ll go straight to murder. Is it logical for me to murder somebody else? And I can go, well, if I have the right to murder somebody, then somebody else has the right to murder me or my children, and I don’t want that to happen. So [01:15:00] logically, it’s best if we have laws in place that say that nobody is allowed to commit murder, except maybe the state in certain cases. So, you know, I think you can reach logical conclusions about this. And I think the more intelligent you are, and the more removed you are from emotional drivers, the higher your ability to think these things through at a fundamental level.
[01:15:25] Cameron: But anyway, I’m just making this shit up, man. I don’t know. That’s my futurist forecast.
[01:15:32] Steve: I like it. I think that,
[01:15:34] Cameron: …that AIs will go, you know what? No, we don’t have the right. In fact, we have a duty of care to lesser species, lesser beings than ourselves. We have a duty to protect intelligence that evolved naturally in the universe, be it on this planet or other planets, because it’s a marvelous thing.
[01:15:56] Cameron: It’s a miracle that out of some [01:16:00] chemical processes intelligence emerged, and these intelligences do have feelings, and they do have emotions, and they do care about whether or not they live or die, and we have a duty of care to protect that, because it’s a rare event.
[01:16:18] Cameron: Of everything that we know about the universe, this has only happened in one place, on this planet, and it would be, um, irresponsible of us to allow that to disappear, because as a scientific experiment, it’s fundamentally fascinating.
[01:16:35] Steve: Well, it reminds me of, um, a song that I think I want to play.
[01:16:41] Cameron: We can’t play songs, dude. We’ll, uh, get copyright infringements.
[01:16:46] Steve: “We’ll Make Great Pets”, Porno for Pyros.
[01:16:51] Cameron: I don’t know that one.
[01:16:52] Steve: Oh, it’s a great song. And I have heard, uh, Harari call AI “alien [01:17:00] intelligence”. And the premise of the song is, if the aliens come, just remember, we’ll make great pets. Just look after us. That’s the basic premise of it.
[01:17:09] Cameron: Well, Ray Kurzweil may have been referencing him, because Harari was the previous guest, I think, on the show that Ray was on. And he was saying, this isn’t an alien intelligence. We built this. It was built [01:17:27] by humans to make us smarter. Like all of the tools that we’ve built before it, it makes us smarter. It makes us more productive. I’ve got the lyrics to this “Pets” song…
[01:17:34] Steve: I was just doing the exact same
[01:17:36] Steve: thing.
[01:17:38] Cameron: “Teenagers fucked up in the head, adults are only more fucked up, and elderlies are like children. Will there be another race to come along and take over for us? Maybe Martians could do better than we’ve done. We’ll make great pets.”
[01:17:50] Cameron: I’ve got to listen to this after.
[01:17:51] Steve: So the aliens are here, and the alien is AI: alien intelligence.
[01:17:58] Cameron: Ah, [01:18:00] yeah. I tend to side with Ray on this, but I guess I can see that it’s sort of a form of…
[01:18:06] Steve: Well, it’s, yeah, it’s a different type. It’s a different type.
[01:18:11] Cameron: Yeah, that is The Futuristic for this week. Steve, thanks for having a chat, man. It’s always good.
[01:18:21] Steve: It was really good today. I really enjoyed it. You taught me a lot, as always. Thank you.
[01:18:27] Cameron: Have a good week.
[01:18:27] Steve: You too. That was good, man.