
This episode explores the transformative potential of artificial intelligence (AI) and nanotechnology across society and the economy. The hosts share insights on how AI-assisted coding in languages like Python is revolutionizing fields such as education, comparing these developments to past technological shifts and contemplating their future impact. A significant focus is placed on the theoretical integration of nanofabricators in homes, which could enable an age of abundance by addressing key challenges such as energy, health, food scarcity, and waste management. The conversation also touches on the evolution of internet usage, with an emphasis on the trend towards re-bundling services to create comprehensive ecosystems. The hosts also conceptualize ‘P Utopia’, a visionary society in which technology resolves fundamental human problems, prompting a re-evaluation of capitalism and of societal values as scarce resources like space and land gain prominence. The discussion ultimately raises critical questions about the societal implications of such advancements and the role of AI in shaping a future oriented towards abundance and technological innovation.

 

00:00 Welcome to The Futuristic: Catching Up

00:40 Diving into the World of AI: A Crazy Few Weeks

02:25 Coding Adventures and Audio Leveling Breakthroughs

05:36 Exploring AI’s Impact on Education and Personal Projects

09:49 Meta AI: Facebook’s Game-Changing Tool

22:40 The Future of AI Content and Its Implications

25:37 Microsoft’s VASA 1: A Leap Towards Lifelike Virtual Characters

37:41 Exploring Groundbreaking Tech: The Carbon Negative Nanogenerator

39:12 The Power of AI in Conversations and Research

39:58 The Future of Nanotechnology and AI’s Role

41:17 Debunking Net Zero: A Critical Look at Carbon Credits

43:45 The Potential of AGI: Transforming Research and Society

51:32 Navigating the Complexities of Global Media and Censorship

54:44 The Degenerative Potential of Generative AI

56:19 Reimagining the Internet: The Future of Content Curation

01:03:34 The Return of the Tech Time Warp: Bundling vs. Unbundling

01:06:42 Nanofabricators and the Future of Production

01:14:25 Envisioning a Post-Capitalist Society with Nanofabrication

FULL TRANSCRIPT

FUT 24

[00:00:00] Cameron: Welcome back to The

[00:00:09] Cameron: Futuristic. It’s been a long time between drinks, Steve

[00:00:14] Cameron: Sammartino. How are you, buddy?

[00:00:17] Steve: Hi, I’m well. Hearing your voice just puts a smile on my face. You’re so erudite and articulate that I feel like every time I’m going to learn something.

[00:00:29] Cameron: All my podcast hosts feel that way, man. Talking to me is the best. Uh, part of their week, they all tell

[00:00:35] Steve: This is definitely the best part of my day so far. Not even close.

[00:00:38] Cameron: I’m just that kind of guy. Uh, Hey, uh, it’s been, uh, I know we say this every episode, Steve, but it’s just been a crazy few weeks. I mean, I’ve got so many stories that we could talk about. We don’t have time. We’re going to pick the top few, but it’s been

[00:00:58] Cameron: a crazy couple of weeks in the world of AI, uh, in particular.

[00:01:04] Cameron: Uh, at least that’s how I felt. How about you? Have

[00:01:06] Steve: Yeah, same again. Yeah, same again. It just doesn’t

[00:01:09] Steve: stop. And I think

[00:01:12] Steve: it feels like each of those little

[00:01:16] Steve: periods of time when you get a technology shift, like a little fork in the road, this is one of those, you know, I felt that way when the smartphone first came out and different gadgets and things you could do.

[00:01:26] Steve: And when social first started coming back, it'd be like, Oh, this is website Reddit and there's, um, yeah, Delicious. Yeah. Get all your, all your sharing, your links and these different sites, everything Web 2.0. Yeah. You know, I felt like that when the internet, we were just talking offline about when people first started getting online with their dial-up, you'd go out and buy a magazine, the top sites to visit, which, when was yours from, Cam?

[00:01:50] Steve: You just said. A book!

[00:01:51] Cameron: Mine’s not a magazine. It’s a book, like a published

[00:01:54] Cameron: book. I think it’s the top websites.

[00:01:57] Steve: to research, publish, do a book, and these sites will be fine. No books for 404s there.

[00:02:06] Cameron: O’Reilly or one of those publishers back from the

[00:02:09] Steve: Oh yeah, yeah, yeah,

[00:02:10] Cameron: think it’s like 1994, the top 100 websites of 1994 or something like that.

[00:02:16] Steve: That’s great.

[00:02:18] Steve: Yeah, so it definitely feels like that. Yeah, it feels like it.

[00:02:23] Cameron: I’ve

[00:02:24] Steve: Continues.

[00:02:25] Cameron: I have been, uh, let’s talk about what we’ve done. I have been, um, you know, coding and scripting my ass off for the last couple of weeks. Quite often in the wee hours of the morning because I’ve also had a bout of insomnia. And I’m like, okay, screw it. I’m just going to code, get up at 2am and just code.

[00:02:45] Cameron: Do the stuff that I don’t, I can’t really justify spending too many hours on during the day because of, you know, there’s a, there’s a sunk time cost aspect to it. But what else am I going to do at 2am? So a lot of coding. Um, I won’t, uh, just one thing, which is a really stupid, simple thing, but. It was kind of a revelation to me.

[00:03:08] Cameron: So, um, when I'm doing podcasts, I have a sound library of things like this. I don't believe those figures. Please explain. That, you know, and for my history podcast, I've got hundreds of these, and some of them are clips from shows that I've done that I've pulled out with my other co-host Ray, and others are movie soundbites and that kind of stuff. But the audio levels on them are all over the place, and it's been bugging me, because, you know, when I capture them off of YouTube or whatever, the audio levels are, right?

[00:03:39] Cameron: Now, I've never had the wherewithal, well, couldn't be bothered, to go and fix all the audio levels. I suddenly thought, I wonder if I can just write some code that'll just fix all the audio levels. Boom! Jumped into one of the tools. Um, said, hey, can you write me some Python code that'll do this? It's like, piece of cake, man.

[00:03:57] Cameron: We wrote this Python code, just pointed it at a directory of hundreds of MP3 files, and a second later, or 60 seconds later, they're all audio balanced. Um, done. Boom. Oh, there you go. What else? And I actually, after that, I, I, one of the websites, I've got about 50 websites I read every morning. I have a little iOS shortcut that, um, I click and it opens up 50 web pages in Chrome, and the same 50 pages.

[00:04:31] Cameron: And then I just,

[00:04:32] Cameron: you know, I just, some of them, I’m just seeing if they like, they’re like Russian news sites and Chinese news sites and blogs and different things,

[00:04:40] Steve: Barak, you are a communist sleeper. You, I mean, you are, I mean, we know. Red is under the bed and there’s a bed in the background.

[00:04:46] Steve: Red’s under the bed.

[00:04:48] Cameron: There’s nothing sleeping about it. I’m very openly, very openly a communist. Um, but yeah, I’m reading all these things and you know, sometimes I’ll just glance at the headlines and then click, click, click, click, click. Anyway, what are the, what are the blogs that I read? A guy mentioned a guy, mentioned a guy, mentioned a guy.

[00:05:06] Cameron: And it led me to a guy who's written a bunch of books, called things like Automate the Boring Stuff with Python. Now, these are things that, you know, coders have been doing for years, but it's new to me. So anyway, that was a thing that I was like, oh, what other

[00:05:19] Cameron: boring things can I code? I haven't found anything that's had that big an impact for me yet, but
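
Editor's note: for readers curious what a script like the one Cameron describes might look like, here is a minimal sketch. It is not his actual code; it assumes the third-party pydub library (which requires ffmpeg on the PATH), and the folder name and the -16 dBFS loudness target are hypothetical placeholders.

```python
# A rough sketch of the kind of script described above: level-matching every MP3 in a folder.
# Assumes the third-party pydub library is installed and ffmpeg is available on the PATH.
from pathlib import Path
from pydub import AudioSegment

TARGET_DBFS = -16.0  # hypothetical loudness target, in dBFS; adjust to taste

def normalize_folder(folder: str) -> None:
    """Re-export every .mp3 in `folder` so its average loudness sits near TARGET_DBFS."""
    for mp3 in sorted(Path(folder).glob("*.mp3")):
        clip = AudioSegment.from_mp3(str(mp3))
        gain = TARGET_DBFS - clip.dBFS  # positive gain boosts quiet clips, negative trims loud ones
        clip.apply_gain(gain).export(str(mp3), format="mp3")
        print(f"{mp3.name}: adjusted by {gain:+.1f} dB")

if __name__ == "__main__":
    normalize_folder("sound_library")  # hypothetical directory of soundboard clips
```

Because this re-exports each file in place, keeping a backup copy of the originals before running anything like it is sensible.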

[00:05:26] Steve: You’re not

[00:05:26] Steve: another code world, hey? You can code us out of our global problems. That’s

[00:05:31] Cameron: that’ll, that’ll, that’ll get there. What about you? What have you done? That’s cool. Steve?

[00:05:36] Steve: Before I tell you what I’ve done that’s cool, it’s interesting because I want to do some

[00:05:40] Steve: fixes on things that are live in

[00:05:42] Steve: codebases I don’t understand, but I’m nervous about it. That halfway through using an AI to help me, it buggers it up and I don’t get it. That’s the only thing. If you’re creating something new, it feels okay when you’re clean slating and if it’s for you, but if it’s something that’s live and is functional, it has this extra layer.

[00:05:59] Steve: And I've been thinking about doing that on my own, a couple of my own sites, but I haven't got the courage to do it yet. Cause it's like, I used to be a marathon runner, but I don't run marathons anymore. So it's the same kind of idea. It's like, I'm a marathon runner. When'd you last run? It was 2004. Um, well, what I've noticed that was interesting, my daughter's come up to me twice in the last week.

[00:06:20] Steve: First one was, can you explain large language models to me? I’m like, okay. I thought she just wanted to know, and I just explained it to her. And then she rang me and she

[00:06:30] Cameron: How old? How old’s your daughter?

[00:06:32] Steve: in grade, she’s in year 8, 14, so in a science class. And then the next day she

[00:06:38] Steve: came to me and said, so what’s the difference between that and machine learning? And so I explained, you know, machine learning to her. And I said, well, what’s going on here? And she said, we’re studying them in science class. And I thought that, That was interesting because they’re studying computation and various forms of code and computation and artificial intelligence in science.

[00:07:03] Steve: Historically, that would have lived within a simple, a separate computer science course, or maybe a derivative of maths, and it’s branched out from the, you know, the three core science, you know, chemistry, physics, biology. And here we have artificial intelligence sneaking its way into science class in grade eight.

[00:07:20] Steve: What I thought was interesting was that they’ve gone beyond the, Oh, should we be using AI in classes? Um, yes, no, maybe the argument on how that affects education to just actually just learning about it and probably learning about it then informs how it should be used in class. I just thought that was pretty quick evolution for that to be in there.

[00:07:43] Steve: And her asking the questions, it really piqued her curiosity into some of the stuff that I do and talk about, and it enabled us to get a connection. You have that rare moment when you work with your mum or your dad, or you see what they do, and you, and you say, hey, my mum or dad aren't that stupid after all, they actually know something, no wonder they can pay bills. So it was a, it was actually a nice moment as well, and interesting from an educational perspective.

[00:08:07] Cameron: Mmm. Yeah, that’s, that’s so great. But even just the fact that you’ve got kids of that age that are starting to touch the perimeter of this as a learning tool, as an educational tool, um, you know, it’s interesting. I sat down with Fox, uh, just a day or so ago and I was playing, I’ve been playing around a lot with Meta AI, which we’ll talk about in a

[00:08:32] Steve: yes

[00:08:32] Cameron: but. And he’s like, can I ask it something? And I’m like, yeah, what do you want to ask it? Cause usually when I give him an AI to ask with, it’s like, Hoopy Poopy Bum Bum. Can you, you know, say that kind of stuff? I’m like, So I was getting it. And so he actually started coming up with some great questions about Pokemon cards.

[00:08:49] Cameron: That’s his big obsession is trading Pokemon cards. He’s like, ask it what the top 10 most expensive Pokemon cards are. And then we started talking about, you know, the designs behind them and he started asking it these sorts of questions, which is, you know, it could be a Google question, but you know, obviously the great thing is you’re able to interact with it and have a conversation.

[00:09:11] Cameron: But you know, it’s always interesting for me. And this is funny. My mom has been staying with us for a few weeks again. She’s a 66, sorry, 76, 77. He’s nine turning 10. And with both of them, I’m trying to get them to use AI. I mean, you know, both generations, the generation before me and the generation after me, I’m like, you got, I said, Mum, you got Meta AI now.

[00:09:36] Cameron: I mean, I put GPT on a phone six months ago or more and she hasn't used it. I'm like, Meta AI now, it's in Facebook Messenger, which you're on all day, every day. It's in Facebook now. You're on it all day, every day. You've got a free AI. So let's just talk about that. So Meta AI came out, uh, Facebook's AI tool built on Llama 3,

[00:09:57] Cameron: and, uh, it's free and it's, it's kind of pretty much as good as ChatGPT-4.

[00:10:07] Steve: similar to me. I tested it a bit, felt, felt pretty good

[00:10:10] Cameron: And it's freely available now, they're rolling it out in Facebook Messenger. As I said, in Facebook, in Instagram, uh, they're rolling it right across this stuff. It's just boom. AI tools everywhere. And you know, they do the usual things. Oh, they can make you a picture, you know, and help you write an email, help you write a message.

[00:10:32] Cameron: I actually, the first thing I used it for in Messenger, I’ve got one of my podcast listeners of many years. He’s a guy who lives in Germany. He’s dying of, uh, brain cancer and, um, you know, we’ve been talking about it over the last year or two as he’s been going and he sent me an email the other day saying, I have to unsubscribe finally from your podcast.

[00:10:59] Cameron: I just can’t listen. I can’t follow anything anymore. I can’t follow a story. I can’t concentrate long enough to listen to a podcast. And you know, he’s, Brain is basically shutting down and I was like, ah, I spent like five minutes trying to figure out what do I write and I just went into Meta and I said, here’s the situation, uh, long time listener, um, don’t know him that well, but you know, what do I say?

[00:11:27] Cameron: You know, help me, help me draft an email here and it did, it really, Helped me come up with something that was way more, um, I don’t know, articulate, than I felt I could be in that situation, so. But this is, like, I think this is, uh, uh, probably, uh, uh, a tectonic shift, Steve. Getting AI into the hands of Facebook’s 3 billion users, or whatever it

[00:11:58] Steve: I think it’s a big shift for a couple of

[00:11:59] Steve: reasons and just to go back to something I did two weeks ago. There's an electrical company that, um, I consult to, and that involves me going around the country to a lot of regional areas to do a speech on the latest on technology. And the rooms are usually filled with electricians and people from the electrical industry. I did one a month ago in North Melbourne. Put your hand up if you've used an artificial intelli, put your hand up if you know what it is, you've heard a bit about it. Yep, all the hands go up. Put your hand up if you've used a tool like a ChatGPT or a Gemini. 60 percent of the hands go up, alright, in North Melbourne, in North Melbourne, and this is tradies but they're pretty smart tradies as well, electricians are pretty, you've got to be pretty clever and they run their own businesses and they're using tools and their phone's one of their big tools.

[00:12:45] Steve: Uh, but then I was in Horsham, and this is not in any way to be dismissive, it’s just a geographic observation. Um, room filled with, I want to say, 200 people, less than five hands went up. And so, we circle this back to what Meta have done with free AI for all, and certainly what you’ve seen, it was noticeable.

[00:13:06] Steve: In fact, I noticed it on the app before I'd seen any of the news on the release of their AI. I noticed in Instagram, because I go in there a fair bit just to look at some tech news and stuff, and in the, uh, where you share things with your friends, was Meta.ai, ask me anything, right? And I just thought, well, it's interesting, that's going to get to the 3 billion people that these, that this company has contact with, like that.

[00:13:31] Steve: I mean, it’s, You know, if we thought ChatGPT was the fastest implemented app in the world, this might be the fastest implemented AI in the world, because basically, there’s gonna be probably 3 billion people use it within the first month. I would imagine. I don’t see why you wouldn’t type something in there.

[00:13:47] Steve: And that shows, you know, the power of owning your own platform. So, that was what was noticeable to me, and I would think that the people in that second talk, where very few had used AI, and I showed them some demos, it blew them away. They just fell on the floor, as we all did. Um, but you know, this is 18 months later.

[00:14:05] Steve: Uh, from when ChatGPT became omnipresent, and, and you'd think that Meta AI will really, really widen the use of AI to pretty much anyone who's on the web, I would say.

[00:14:19] Cameron: I think you're right. And I think that I'm interested to see the progression in how people use it and also how it prompts people to use it. Like, if you go to it, if you go to it now, I'll just bring it up, um, and, uh, you, it has the prompts that are pre-populated and there are obviously a range of them and they're rotating, but ask Meta AI anything.

[00:14:48] Cameron: Create a packing list for a trip. Make my email sound more professional. Play 90s music trivia. Help me with an assignment. Simulate a mock interview. Paint New York City in watercolor. Okay, not bad. Now, one of the things that’s really interesting

[00:15:04] Cameron: about this is it, uh, it is up to date with the web. Um, it’s using Bing behind the scenes, I think,

[00:15:12] Steve: And Google,

[00:15:13] Cameron: that on, oh, and Google, right.

[00:15:16] Cameron: I, um, I’m sure you’ll be fully aware

[00:15:20] Cameron: of this, but the, um, 2024 Chess Candidates Tournament, uh, finished on, uh, uh, yesterday, I

[00:15:29] Steve: I mean, that’s, I was, I barely slept. I mean, I was

[00:15:31] Cameron: don’t know. Barely see.

[00:15:32] Steve: tuning right in. I barely, I really was. I mean, I nearly had to cancel because I had to just come down from the excitement and the endorphin rush that I’ve passed for 24 hours has been

[00:15:42] Cameron: Now I know you’re being facetious, but my son

[00:15:45] Cameron: Taylor.

[00:15:45] Cameron: was texting me. Yesterday morning while I was trying to work going, holy shit, are you watching this? And I had the live stream on,

[00:15:54] Cameron: but I

[00:15:54] Steve: wasn’t with Taylor.

[00:15:55] Cameron: it. I had to keep muting it because I was trying to work and I couldn’t have it both going, but I was flipping backwards and forwards.

[00:16:01] Cameron: It was a big thing, huge result. Anyway, I won't go into it, but not long after the result, I was in Meta.ai and I asked

[00:16:08] Cameron: it, Who

[00:16:09] Cameron: won the 2024 Candidates Tournament, and it told me, it was like within half an hour of the result. No,

[00:16:16] Steve: Yeah.

[00:16:17] Cameron: no, I’m not, I’m, I’m not,

[00:16:19] Steve: And I go to September 2023. Yeah.

[00:16:21] Cameron: yeah, I, you know, my, my, my database isn’t

[00:16:24] Cameron: up

[00:16:24] Steve: brain turned off six months ago.

[00:16:27] Cameron: And I still get that from GPT, I still have times when GPT says I’m, I can’t look up stuff on the web, and I’m going, Yes, you fucking can! Stop gaslighting me, bitch! Look it up! And it’s like, oh, okay,

[00:16:39] Cameron: okay. So, so far with Meta, I haven’t had that. I’ve only

[00:16:43] Cameron: tested a couple of times of that, but it’s like real time.

[00:16:45] Steve: right? Because Google hasn't seemed to have done well in reverse. I mean, if Meta is doing well now by using an AI and then plugging it into a search, they're getting a twofer with this straight away. They're kind of doing two things. They're getting a search angle, and they're getting an AI angle, which is something you typically didn't do within any of their forums.

[00:17:07] Steve: Google hasn’t done really that well in getting you to use Gemini when you would otherwise search. OpenAI hasn’t done really well. Uh, Bing’s done pretty well with having Bing as an engine which has an AI plugged in, I guess you could say, but OpenAI on its own, and ChatGPT hasn’t done very well, as you say, converting to search.

[00:17:26] Steve: So that hybridization of search and AI, um, being combined, for something that is a learned model and can generate something, versus something that just is and just gives the answer and, and is recent and current. So that's a really interesting point, Cameron. It

[00:17:47] Cameron: What I was going to get to, um, with this prompting thing, is how long before they start to prompt people on, uh, like, a higher level of AI utilization, like, um, fact check a news story. Uh, you know, uh, research that, um, crazy, um,

[00:18:11] Cameron: fake science, uh, article that your, uh, uh, conspiracy theory friend just sent you.

[00:18:18] Cameron: Um,

[00:18:19] Steve: was real and put in the podcast. Well, here I asked

[00:18:22] Steve: a couple of things similar to that today. When I went in, I, I asked it just to do some basic stuff and it did pretty easily. I said, create a picture of Donald Trump. It said, oops, I can’t generate that image. Can I help you with something? I said, generate a picture of a political rally.

[00:18:35] Steve: And it gave me one, which was just a mashup of all different colors, ethnicities and flags. None of them real and words. And it wouldn’t do any political things. I mean, clearly with all the elections coming up, so it’s sort of got some parameters, but. It’s interesting. I get the sense that they’ll just go, all bets are off, rather than going, no, we can clarify what’s real and what isn’t.

[00:18:56] Steve: But I get the sense that they’re going to go, oh, politics, not touching it, you know, generate a nice for Instagram.

[00:19:04] Cameron: I just jumped into it and said, what are the big headlines around the world today? And it said, I apologize, I’m having trouble accessing today’s headlines. You can try searching online for the latest news updates. Alternatively, I can provide you with some of the recent notable news stories. For example, the situation in the Middle East continues to unfold, and there have been developments in the fight against climate change.

[00:19:24] Cameron: Let me know if you’d like more details on these stories. So it’s not just saying, no, I can’t tell you the news. It’s actually saying, actually, I

[00:19:32] Cameron: can give you some news stories that we can talk about. Um, I, I,

[00:19:37] Steve: I just asked here, I said, can you tell me if an

[00:19:39] Steve: Instagram post is folk or real, fake or real? It said, I'll be happy to help, however, I need more information from you. Can you tell me the

[00:19:44] Steve: username or link to the profile that posted it. I mean, at least it's asking you that. A description or screenshot of the post, and any additional context or information.

[00:19:53] Steve: I can guide you on how to fact check, but keep in mind, this can be challenging. It's difficult to know for certain if the post is fake or real. However, we can work together to evaluate, evaluate the post's credibility and identify potential red flags. I actually don't mind that. It's reasonable.

[00:20:09] Cameron: now, by contrast, I just jumped into GPT and asked it the same question. Um, what are the big headlines around the world today? And, um, well, I tell you what, let’s just get it to, uh, tell us.

[00:20:29] ChatGPT: Today’s major global headlines include a variety of pressing issues and developments. 1. Continued conflict in Ukraine.

[00:20:36] ChatGPT: The conflict between Russia and Ukraine persists, marking its 789th day with ongoing military operations and significant impacts on civilians, source. 2. Middle East tensions. In Israel, there is heightened tension with new developments in the conflict with Hamas, including Israeli military actions in Gaza and discussions around the safety and rights of Palestinian civilians, eighth source. 3. Plastic pollution concerns. Globally, there's an urgent call for action against the surge in plastic pollution, threatening environmental sustainability and health, sixth source. 4. Global health. The World Health Organization is actively monitoring various disease outbreaks and health emergencies around the world, emphasizing the need for rapid and coordinated international response.

[00:21:26] ChatGPT: 9 Source. 5. Financial Systems Reform. Discussions are underway regarding the need for a significant reform of the global financial architecture to better handle contemporary challenges like climate change, inequality, and economic instability. 6 Source. 6. Political Developments in South Africa. In South Africa, there are ongoing trials and political maneuvers, including the Meyiwa murder trial and debates over energy policies amidst electoral contexts.

[00:21:54] ChatGPT: 10. Source. These headlines paint a picture of a world grappling with complex conflicts, environmental issues,

[00:22:01] ChatGPT: and significant political movements, each requiring informed and decisive responses from global leaders and communities.

[00:22:08] Steve: That was pretty good. You take that.

[00:22:12] Cameron: Yeah. So number one, uh, Meta can’t talk to me yet. And GPT, uh, that’s in just in the browser too. That’s just straight out of the browser talking to me. And it, you know, did a pretty good job of giving me some new stories. I mean, I’m sure I could drill down. Obviously it’s given me sources against each of those that I could drill down into.

[00:22:33] Cameron: Um, so, but anyway, like my point with the prompts is I still think the prompts that we get, and it's the same with GPT. The, the, the proforma prompts that you get are still dummy AI stuff, like basic, you, you, you, you know, Oh my God, let me see what, you know, these things, help me write an email, right? As opposed to like a high, help me write code to automate boring stuff.

[00:22:55] Cameron: Give me, give me some ideas of a way, ways I could code the automation of tasks that I do every day that I shouldn’t, I shouldn’t have to do. That’s where we will be. In the next year or so, I believe, where you, when you open up your AI, it’ll say, Hey, um, how about we, uh, save you some time and effort and automate some things today?

[00:23:20] Cameron: Of course, eventually when Apple gets its shit into gear

[00:23:23] Cameron: and comes out with it integrated into your devices and Microsoft and Google with their own devices. It'll say,

[00:23:31] Steve: it’s suggesting based on everything it observes and so it.

[00:23:34] Steve: straightaway

[00:23:35] Cameron: or it’ll by,

[00:23:36] Cameron: by the way, I’ve, I’ve seen you do this thing for the last couple of weeks, just so you know, I just automated that for you and you

[00:23:42] Steve: not even asking you, just do it. Go next time instead of doing this, Hey, hey, hey dickhead. Listen,

[00:23:48] Cameron: Yeah. I’ve written, I’ve written a code. I’ve written a, an app for you that just does that,

[00:23:53] Steve: it is such a great reminder. I mean, we need to think back to the Amstrad and the Commodore 16

[00:23:58] Steve: ads where it said buy a, computer and

[00:24:00] Steve: you can do a budget. It is trying to teach you the use cases and so it, it. They are going to that lowest common denominator of, remember, like we just said, there’s 3 billion people who have no idea how or what to use this for, so let’s show them the most basic thing to get their hair wet, right, get some dirt under their fingernails of using, and then the job becomes their imagination.

[00:24:21] Steve: It’s a little bit like social media. We didn’t know what that would evolve into. When it first comes around, you know, Twitter and, and some of the, you know, internet relay chat lines, like, what do they become? Once we become more familiar, we, the audience invents use cases. And

[00:24:37] Cameron: fires.

[00:24:38] Steve: yeah, yeah, yeah, that, we didn’t know it was a great way to burn down civilization.

[00:24:42] Steve: But now that we know that, it's amazing how quickly we have burnt it down. And, but it is a little bit like, those early computer ads where it's just trying to show you like some simple ideas. You can write a Christmas list and have a database of, and they had family budgets, and still no one's done it in 2024, no one's ever done it.

[00:25:06] Steve: But it’s the same thing again, right, where we need to teach use cases, but the best use cases will come from those who go, well, my knowledge bank of the things that I’m working on that I care about, I attach that to the power of this AI, and then I create something new. And that’s where the new realm of creativity belongs.

[00:25:22] Steve: It’s your personal database. Plugging into this new brain that you have access to, to do all of these things that are relevant to you. You know, it’s one size fits one, which is really cool.

[00:25:34] Cameron: like that line, one size fits one.

[00:25:37] Cameron: Well, moving right along, uh, Microsoft, uh, came out with a demo.

[00:25:45] Steve: just one more thing before we do move on.

[00:25:47] Steve: I do wonder if, with the ability to generate images and captions, In something like Instagram or Facebook while you’re in there, if that whole thing just becomes this whole circular reference of AI generating AI and just spiraling into sort of who knows what, just, I, sorry to interrupt, but I forgot to

[00:26:09] Cameron: Yeah, no, you’re right. And I did see a stat somewhere in the last couple of days about

[00:26:17] Cameron: just the amount of AI generated content that is out there that AI is going to be reading and training on. I know they’re trying to prevent that, but we’re, we’re, We’re quickly reaching a point where AI generated content is

[00:26:31] Cameron: going to outpace human generated content, and I’m not necessarily sure that’s a bad thing, but it’s going to be a, an interesting

[00:26:39] Steve: exodus from the web. It almost becomes this new sphere of like AIs talking to AIs and we can just operate down in

[00:26:45] Steve: this human level underneath and just let them do that AI stuff up there. No one does anything, just plug them in. Alright. You

[00:26:55] Cameron: moving right along to plugging things in, Microsoft announced VASA 1, V A S A 1. VASA stands for Visual Affective Skills, V A S, abstract, uh, something. Um, this is probably the most shocking thing that I've seen, uh, recently and I think every, every commentary I've seen on this people are just like, Fuck off.

[00:27:23] Cameron: This is insane. How can we be here already? Really? This is, I mean, crazy. We've seen a lot of crazy things in the last 18 months. So here's the abstract from Microsoft Research. We introduce VASA, a framework for generating lifelike talking faces of virtual characters with appealing visual affective skills, VAS, given a single static image and a speech audio clip.

[00:27:52] Cameron: Our premier model, VASA 1, is capable of not only producing lip movements that are exquisitely synchronized with the audio. But also capturing a large spectrum of facial nuances and natural head motions that contribute to the perception of authenticity and liveliness. The core innovations include a holistic facial dynamics and head movement generation model that works in a face latent space.

[00:28:21] Cameron: And the development of such an expressive and disentangled face latent space using videos. But below that they say, note, all portrait images on this page are of virtual, non-existing identities generated by StyleGAN2 or DALL-E 3. So there's a bunch of, uh, demo videos. There's like a ton of them on this, uh, Microsoft site.

[00:28:43] Cameron: Usually somewhere between 10 and 60 seconds long. They are of talking heads. Um.

[00:28:53] Cameron: That are insanely, really, you would not really, I think, tell, unless you knew to look for it, you wouldn’t know that these aren’t

[00:29:02] Steve: you weren’t looking for it, you wouldn’t know.

[00:29:04] Cameron: Not only are the, the talking heads virtual, but as they said, the reference profile photo that the talking heads are based on was generated by AI as well.

[00:29:17] Cameron: So they're not even real photos of people that are being turned into real talking heads there. Fake profile shots, and they are all sorts of faces, men, women, young, old, dark, Asian, Anglo. And they are, uh, you know, these people are, I mean, I won't even play a clip because the clips are sort of nonsense. It's just audio, you wouldn't be able to tell, but you have to go and look at these things, like the, the, the, the intonation.

[00:29:48] Cameron: The expressions on their faces

[00:29:51] Cameron: when they’re doing this stuff is absolutely, um, insanely, insanely realistic.

[00:29:59] Steve: Yeah, incredibly so. And when I saw it,

[00:30:03] Steve: I read the comments in, in, there was a Reddit that, that you pinged me where there was some comments. And you know, one thing I keep reading again and again, there’ll be some saying, you know, buckle up, Kansas is going bye bye. But there’s always about half the comments saying, yeah, but if you look at the lips, you can see this.

[00:30:21] Steve: They're just not getting that we are in exponential times. And if the lips are a tiny bit off, if you're looking for it, you can find it. Yeah, maybe you can. Well, guess what? Well, two weeks later, that's, that's, that's not like that. And four weeks later, people just not understanding where this goes. You know, because, you know, if I expand my mind on this, if we can do this visually with an AI to create the facial expressions, eventually, that visual capability of moving the face goes on to a soft robotics version that gets put on top of a Figure 01 robot where you can get soft facial expressions that don't look like the weird Sophia robot that's trying to get expressions that are wrong.

[00:31:00] Steve: You're going to be able to transpose this visual element of the AI and moving the face to eventually move the actuators and the muscles on a soft robotics. And I'm not mincing my words when I say within 10 years, it is possible. Like, within a decade that we'll have soft robotics walking around which have an NND, a no noticeable difference to humanity unless someone told you.

[00:31:21] Steve: I’m telling you, we’re getting very close to the replicant.

[00:31:25] Cameron: Speaking of which,

[00:31:26] Cameron: um, Taylor’s mate, Kevin, who got laid

[00:31:29] Cameron: off from the Apple Car Project a month ago when they crashed that,

[00:31:34] Cameron: Taylor spoke to him the other day, he said, how are you going?

[00:31:37] Cameron: Huh?

[00:31:38] Steve: Pun not intended, Sorry, I couldn’t resist.

[00:31:41] Cameron: What did you say? I missed it. Sorry.

[00:31:42] Steve: Well, you said when they crashed that and I just said pun not intended, but then,

[00:31:46] Cameron: crash that. Yeah. Good one. Um, uh, Taylor spoke to him the other day. He said, have you got another job? He said, I did get another offer at Apple, but

[00:31:55] Cameron: I got a better offer outside of

[00:31:56] Cameron: Apple. He's going to Figure.

[00:31:59] Steve: oh wow.

[00:32:00] Cameron: Yeah. working on the head, apparently.

[00:32:04] Steve: the head.

[00:32:04] Cameron: on the head.

[00:32:05] Steve: do? Gets to his mate, he says, I’m working on head. I’m just saying, a very interesting way to start a discussion in a bar, maybe, young guy, I don’t know. I

[00:32:18] Cameron: Anyway, um, super cool, uh, and crazy, scary stuff. And the implications of this is, We can all imagine when you get a FaceTime call, you’re talking to a customer tech support, you’re talking to tech support or customer support at a bank, it’s a person who looks like a person, who sounds like a person, that’s all AI.

[00:32:43] Cameron: And it’s doing your support call. And of course there’ll be, you know, news on the television that’ll look like somebody who you know, who’s saying something. Commentators, expert commentators on what’s happening in Iran right now. Um, is it a real person? Is it not a real person? We’ll have AI tools that’ll

[00:33:05] Cameron: help us know that hopefully, but it’s, it’s a crazy world we’re moving

[00:33:13] Steve: it on LinkedIn and most of the people that read my stuff on LinkedIn are pretty across what's

[00:33:17] Steve: happening with AI, you know, the same kind of ideas that we share, and it blew their minds. It went crazy. I put one of those videos up and people were really like, whoa. I mean, obviously in an election year, you say to yourself, again, how do we know?

[00:33:32] Steve: Like, post truth, post truth era, anything you see on a screen, I don’t know, did it happen?

[00:33:41] Cameron: I like the way that we say in an election year, not here,

[00:33:46] Cameron: not in,

[00:33:47] Steve: Yeah, not in

[00:33:48] Cameron: know, China. We just know where you mean when you say it’s an election year. Could be Russia. Putin just got elected.

[00:33:54] Steve: Well, he, he really, he had a great, didn’t he have an astounding victory, Cameron?

[00:33:58] Cameron: Very good victory. You know, he’s, look, all bullshit aside, he’s very popular. He is very

[00:34:05] Steve: I don’t know. I actually don’t know.

[00:34:07] Cameron: He is very popular. Xi Jinping is very popular in China. These guys are, despite what in the West we might think of them based on

[00:34:15] Cameron: the Western propaganda that we get, these guys are, you know,

[00:34:19] Steve: Well, I know

[00:34:19] Steve: Xi Jinping

[00:34:20] Cameron: The home team is very popular.

[00:34:22] Steve: well, I know one thing for sure, if you’re in the 70s in

[00:34:26] Steve: China versus now, you’re much happier with your

[00:34:28] Steve: life if you live in one of The cities, that’s for sure. I mean, you may well, you may well be making iPhones and like jumping out to just bounce in the net and be ricocheted back in to make another iPhone.

[00:34:38] Steve: We don’t know. But if you’re not doing that, your life is better

[00:34:43] Cameron: I'm about two thirds of the way through a biography on Deng Xiaoping at the moment, which is absolutely mind bendingly interesting. Like, that guy, I think probably in the annals of human history, there's probably not many people I can think of who's had as big an impact in a shorter period of time as Deng Xiaoping had, like in terms of humans that have effected change, absolutely insane what he got done.

[00:35:21] Cameron: Um, and, and, you know, forget about the period from the twenties through to the seventies when he was a revolutionary. Um, when

[00:35:31] Cameron: Mao died and he got his hands on the wheels and he’s like, we’re completely reinventing.

[00:35:36] Steve: that he started to say at the first, it was

[00:35:39] Cameron: Well, we’re,

[00:35:40] Steve: when they had the first economic zones or?

[00:35:42] Cameron: You know, it was a bit earlier than that. When Mao died, Deng, actually, Mao died in 76, Deng didn't take over.

[00:35:48] Cameron: Hua Guofeng, um, was the person that Mao picked to, um, lead the party when he died, and Deng was still in the shitbox when Mao died, because he refused to publicly endorse the Cultural Revolution, and Mao was worried that when he died, Deng, they called him the, he was going to be the Chinese Khrushchev.

[00:36:08] Cameron: He was going to do to Mao what Khrushchev did to Stalin. Um, basically, uh, you know, blame all of their problems on him. Uh, and Deng refused to do it. So he was in the shit, he was in the shithouse with Mao when Mao died. Wasn't even allowed to give a speech at Mao's funeral or anything like that. Anyway, he sort of came back slowly and he wasn't the premier for the first few years, but he was basically in charge of reinventing the economy and the military.

[00:36:33] Cameron: And he just went balls to the wall. Um, was sending like 50, groups of 50 ministers to Europe and to Japan to go figure out what they're doing and bring it back here. And, uh, they came back from Europe, a group of guys said they were, Europe was willing to loan them 18 billion dollars to rebuild their economy.

[00:36:55] Cameron: And there was a lot of ministers that were worried about getting into debt slavery to European countries and that kind of stuff. And Deng's response was, 18? If they're willing to give us 18, let's go for 80. Let's just put the pedal to the metal because we are so far behind and we need to turn this ship around now.

[00:37:14] Cameron: And he said, we, the Chinese leadership, have failed the Chinese people. Um, you know, with the revolution up until this point, we’ve got a lot of, a lot of, um, undoing to do, you know, so. Big man, it takes a big man to, to do that, and Mao wasn’t able to do it or willing to do it, so, anyway. Moving right along.

[00:37:37] Cameron: Queensland researchers create device that consumes carbon dioxide and generates

[00:37:41] Cameron: electricity. Now we, look, we see a lot of these stories.

[00:37:45] Cameron: There’s always these stories

[00:37:46] Steve: words out of my mouth. Yeah, you know, world beating tech that does something extraordinary that will change everything.

[00:37:54] Cameron: And we see a lot of it come out of Queensland, actually, University of Queensland’s PR department, I think, does a great job of getting these stories into the ABC. I reckon, once a month, I see something groundbreaking coming out, if it’s not cancer research, it’s something like this. But! Um, you know, and all of these things, you know, they need to be scaled up and there’s commercial issues and production issues, all those sorts of things.

[00:38:16] Cameron: But this is a really interesting story, I think. Um, basically they've built a tool, obviously small scale, that does basically what it says. An electrical generator that consumes carbon dioxide and can turn it into electricity. It's a, they call it the carbon-negative nanogenerator. Not surprisingly, it's the work of Zhiyuan Wang and Zhihuang Zhang from

[00:38:43] Cameron: the University of Queensland's Dow Centre for Sustainable Engineering Innovation.

[00:38:47] Cameron: So it’s a Chinese Australian or Australian Chinese invention.

[00:38:52] Steve: ever see that meme going around where it had, um, the maths champions from every country?

[00:38:58] Steve: And so

[00:38:59] Cameron: And they’re all

[00:38:59] Steve: picture, it’s amazing, I have a picture of the American, um, top five maths and it was five Chinese people, Australia, everyone, it was five Chinese people, every single one of the maths championships, every country in the world.

[00:39:09] Steve: It was

[00:39:12] Cameron: Well, I tell you what’s interesting to me about this is, um,

[00:39:17] Cameron: one of the conversations I’ve been having with

[00:39:20] Cameron: ChatGPT over the last, uh, week

[00:39:23] Steve: I love it, even in one of the conversations, this goes, this harkens back to one of our very first Futuristics, where you said you’re nice to your robots and

[00:39:31] Steve: you're nice to the AIs, and you say, I was having a conversation with ChatGPT, anyway, I just picked up on that,

[00:39:39] Cameron: I, I, well, I say that in conversation all the time and people laugh at me like, you’re crazy, but it is like,

[00:39:44] Steve: Well, you know you are a bit crazy, you know you’re not normal, you know that, right?

[00:39:48] Cameron: I’ve never aspired to be normal, Steve. Um, what did David Lee Roth once

[00:39:53] Cameron: say? I’d rather, um, I’d rather live my life like an art project than just chickening out and buying one. Um, so I’m, I’m wondering where the current state of nanotech research is. And, and, you know, I, I, I read, um, stuff. I, you know, it’s one of the sites that I look at every day, but I jumped into GPT and I said, what’s the current state?

[00:40:15] Cameron: I don't know. And so it took me through some of the things that are going on. And then I asked it, what are the, what's the potential impact of AI on the field in coming years? And then it took me through a bunch of things. And then I said, um, uh, assume that we have AGI within five years, what happens then to nanotech?

[00:40:38] Cameron: And it gave me a roadmap five to seven years where basically we end up with, uh, you know, robust nanofabricators and nanotech. So when I read these stories, like the UQ story about this thing, you know, in my head, I’m thinking, that’s great. What happens to these sorts of, uh, this sort of research when we throw AGI at it in the next five years?

[00:41:06] Cameron: And the idea of carbon capture, generating electricity

[00:41:12] Cameron: through carbon capture, To me is, uh, fascinating. I mean, we already generate

[00:41:17] Steve: fascinating, look, it’s, the one thing it did make me think when I read that, was I pivoted off,

[00:41:23] Steve: because I’m an, I’m a non believer that net zero is possible, even the word net is a ridiculous joke. Uh, if you see how net is, is measured and created, Juice Media did an incredible video on net zero in Australia and, and what it means, and basically if there’s a forest that you didn’t cut down.

[00:41:42] Steve: That, that can be, that was planned to be cut down, all of a sudden you get the carbon credits for this, there’s all these carbon credits which are absolute bullshit where they’re counting things that have already been there, a forest that’s there, that didn’t get cut down, that might have, and there’s just

[00:41:59] Cameron: stops you from cutting down a forest. I

[00:42:01] Steve: No, no, but, but they’re counting it as a credit to offset.

[00:42:04] Steve: It’s already there. It already is breathing oxygen into the air, but they can’t, it’s like a double counting. You’re basically

[00:42:10] Steve: counting it because you didn’t chop it down. It’s not, it’s ridiculous. Anyway,

[00:42:15] Cameron: argue, but Anyway, I don’t want to get you off topic. So keep going.

[00:42:18] Steve: yeah, but, um, so net zero, there’s a lot of chicanery that goes on that can get

[00:42:24] Steve: your, uh, carbon, so called emissions down.

[00:42:29] Steve: Um, and I don’t think that, I think that the way we live our lives, it is in any way possible to have a net zero world at all. I just don’t think it’s possible. And that’s not me saying I don’t believe the climate science, and it’s not me saying that we shouldn’t do everything we can to try. I just don’t think it’s possible with the lifestyles we’ve become accustomed to.

[00:42:48] Steve: I think that the way we do it, um, and I'm sort of, uh, you know, I'm, I'm, I'm now disagreeing with myself, is new forms of technology which can capture carbon. Because I don't see there being a way that we can actually go down where we don't have emissions. And it's got to be something like this with the new technology.

[00:43:09] Steve: I guess, you know, nature catches, captures carbon. All we need is a greater sense of biomimicry where we learn to do what nature does only quicker. And AI is probably the answer to that. Because what’s a tree? It’s captured carbon.

[00:43:22] Cameron: hmm.

[00:43:23] Steve: That’s what it is. It’s a long time, right?

[00:43:25] Cameron: that’s what, that’s what a human is too.

[00:43:27] Steve: Yeah, exactly.

[00:43:27] Cameron: Everything

[00:43:28] Steve: Everything needs captured carbon, right?

[00:43:30] Steve: But that’s what we’ve got to do. The idea that we can net zero it, um, without some sort of radical

[00:43:36] Steve: advancement, just with the technology that we’ve got today

[00:43:38] Steve: is just not going to happen. Or the idea that we can move away from creating emissions is just absolute folly.

[00:43:45] Cameron: I wrote a blog post, I guess, uh, in the last week or so based on the conversation we had, uh, last time we did a show. Cause the thing that still is annoying the hell out of me in all the media coverage of AI is that nothing I see is talking about the scenario where let's say Altman and Musk and Kurzweil and Hassabis and Jensen Huang, et cetera, et cetera, are right.

[00:44:20] Cameron: And we have a good chance of having AGI in the next five

[00:44:24] Steve: 2029, yeah, five years, right? Yep.

[00:44:26] Cameron: maybe sooner. And then Altman’s scenario that he talks about all the time is what happens when we have a million extra climate change researchers in the world that are virtual? What happens when we have a million scientists working on cancer research?

[00:44:46] Cameron: What happens when we have a million people working on, million scientists, PhD scientists working on nanotech? What happens when we have a million

[00:44:56] Cameron: new PhD researchers working on, uh, uh, reducing inequality in capitalism

[00:45:04] Steve: abundance. You would think that you’d end up in an abundance society where you have, uh, Bucky Fuller, you know, abundance style, create everything from nothing.

[00:45:13] Steve: Just reorganize everything at the molecular level till you get what you want. 36. Don’t

[00:45:17] Cameron: so if it is, if it is, And I believe it is extremely likely that we’re going to crack AGI in the next five years. Then, therefore, it is extremely likely that we’ll be able to scale up research into all of these things to attune, you know how many climate

[00:45:36] Cameron: change, active scientists are working on climate change around the world right now?

[00:45:43] Cameron: It’s in the thousands.

[00:45:45] Steve: you asked me to guess.

[00:45:49] Cameron: It’s in the thousands.

[00:45:50] Cameron: If all of a multiply

[00:45:52] Steve: off. I

[00:45:53] Cameron: add a million.

[00:45:54] Steve: was pretty close. I was within the thousands. There’s 8 billion people and I was

[00:45:58] Steve: within thousands. That’s how close I was, Cameron.

[00:46:01] Cameron: that’s why you get paid the big bucks.

[00:46:02] Steve: That is why I get the big bucks Let me tell

[00:46:06] Cameron: have a million, million AI agents working on this stuff. And then on top of that, working on nanotech, how much progress do we make on nanotech when we’ve got a million? And it’s not only a million researchers working on it, but they can run billions of virtual experiments simultaneously to come up with The absolutely, uh, with the shortlist of experimental, uh, scenarios that need to be run in a lab.

[00:46:36] Cameron: You know, we've run a billion simulations on this. Here's the thing to try to make this work in the lab. That plus robots. I don't know if you saw this. I didn't put it in the notes, but Jensen Huang, CEO of NVIDIA, I saw a clip from an interview that he gave in the last week where he was saying you can buy a car today for 10,000 to 20,000, a reasonable

[00:47:03] Cameron: car. Within 10 years you'll be able to buy a humanoid robot for 10,000 to 20,000, assuming any of us have jobs

[00:47:11] Steve: I will buy one tomorrow. The

[00:47:13] Cameron: Yeah, so you’ve got a world in the next five to ten years with AGI, millions and millions and millions of scientists running billions of virtual experiments on all of the big challenges and billions of humanoid robots in society.

[00:47:31] Cameron: What are we doing to prepare for that world? What do we have to do in the next ten years to prepare for that? Eventuality, you know, if you did a probability on

[00:47:45] Cameron: that as a likely scenario in the next 10 years, what would you give it? What’s your, what’s your P?

[00:47:54] Steve: probability of that occurring, humanoid robots within 10 years.

[00:47:59] Cameron: AGI in

[00:48:00] Cameron: five.

[00:48:01] Steve: robots for 10k, 20k. Car, cost of a car, let’s say. I’m

[00:48:06] Cameron: No, I’m saying in the next 10 years, we have AGI, millions of, millions of AGI scientists working on all the big problems and humanoid robots. Steve’s walked away to, uh, a visual

[00:48:21] Cameron: aid. I think. I’m not sure, he just stood up and walked away.

[00:48:27] Steve: back. I had A

[00:48:28] Steve: humanoid daughter at the front door.

[00:48:31] Cameron: Oh, I thought you’d gone to get a visual aid to

[00:48:33] Cameron: explain something.

[00:48:34] Steve: No, I wish I

[00:48:36] Cameron: Sound in a funnel, supported by a string, a slight push,

[00:48:40] Cameron: so too Cadbury Dairy Milk Chocolate contains a cup of milk in every block.

[00:48:47] Steve: glass and a half of full cream dairy milk in every So the probability of AGI, humanoid robots for the cost of the car and what else? Just those two.

[00:48:58] Cameron: All those AGI’s working on, millions of AGI scientists working on all the big

[00:49:02] Steve: within a decade, for me, I’m giving it P90. It’s 90%.

[00:49:07] Cameron: Me too.

[00:49:08] Steve: Yeah.

[00:49:09] Cameron: Outside, like my big fear at the moment, honest to God, my biggest fear right now, Is that we’re a,

[00:49:15] Cameron: I was going to say, cunt’s hair away from getting AGI. Sorry, am I not on headphones? me on headphones. Uh,

[00:49:25] Steve: all good.

[00:49:26] Cameron: we’re an ant’s dick away from getting AGI in

[00:49:28] Cameron: robots and we end up in World War three and we reset everything back by a million years.

[00:49:35] Cameron: Like we get this close.

[00:49:37] Steve: AGI is World War 3! You’re not seeing this!

[00:49:41] Cameron: No, no, and Trump,

[00:49:43] Steve: All

[00:49:44] Cameron: Trump, and Putin, and, uh, Xi, land us all in World War III, and that's it. Like, in my lifetime, I got this close to getting AGI and robots,

[00:49:57] Cameron: and nanotech, and then, I get nuclear.

[00:50:02] Steve: went down. It was funny, I

[00:50:05] Cameron: That’s my big

[00:50:06] Cameron: fear right

[00:50:06] Steve: Terminator series. I just took

[00:50:08] Steve: him all through it on the weekend. I think

[00:50:09] Cameron: really? How old is he?

[00:50:11] Steve: Twelve.

[00:50:13] Cameron: I’m dying to be able to expose Fox to those. I mean, I think I showed my boys those films when they were like five and

[00:50:19] Cameron: probably

[00:50:20] Steve: careful. You’ve got to get them at a level where they’re old enough to be interested. If you show them too early,

[00:50:25] Steve: it can be not quite there. But, um, but you're right. There is that risk that, it's kind of a little bit like the Fermi Paradox. They say, why haven't we got a civilization? And one of the theories is that we get close to a point of having exponential, uh, super intelligence.

[00:50:44] Steve: And then we destroy ourselves before we ever get there. That’s one of the parts of the Fermi Paradox, isn’t it?

[00:50:49] Cameron: Yeah. There’s a, there’s a window of opportunity

[00:50:53] Cameron: between when humans develop nuclear weapons and develop AI. And the question is, do they, do they wipe

[00:51:01] Steve: And interstellar travel capabilities. Yeah. That, that, those two?

[00:51:05] Cameron: What, how big is that window? Are you watching or, or they develop, you know, they have their, their,

[00:51:10] Cameron: industrial revolution and then they burn themselves with a, they destroy the environment and they

[00:51:16] Cameron: wipe themselves out.

[00:51:17] Cameron: Are you watching the three body problem on Netflix?

[00:51:19] Steve: I am. And I watched at

[00:51:21] Steve: lunchtime today. I’m halfway through episode two. Don’t ruin anything because I’m loving it. You dare ruin a thing.

[00:51:27] Cameron: They have a different version of the answer to the Fermi paradox, but yeah. Well,

[00:51:32] Cameron: anyway, um, Moving along, because we’re running out of time, um,

[00:51:37] Steve: Let’s, let’s, let’s zip through the last little bit.

[00:51:40] Cameron: you want to talk about, uh,

[00:51:41] Steve: Yeah, well, I just wanted to, yeah, I mean, obviously this is relevant to our Australian audience: the geopolitical challenges that come with internet-based global media. It’s just, it’s almost cute that Australia has come out and said, hey, you, don’t you dare put this stuff on your platform.

[00:51:59] Steve: Uh, they’ve threatened to fine X, and I don’t know how they would implement it. I think the fines are potentially $750k a day. It’s, it’s quite heavy numbers. Uh, I just don’t know how our government can have any power over anything other than just turning off that domain in this country. I mean, obviously you’d be able to get around it if you wanted to.

[00:52:23] Steve: Uh, with VPNs, but I just don’t know that he would play ball at all, and he says he’s not going to, and I just don’t think that we have any power in that way other than turning things off. It feels like that’s the only way you can have censorship in real terms in a country, especially if it’s a private firm.

[00:52:43] Cameron: Tony and I, and Ray, are going to spend some time on this story on The Bullshit Filter that we’re recording on Friday, because, you know, um, it is a question about, um, what level of, uh, censorship versus what level of freedom of speech should people have? And how do you manage that in a global media landscape?

[00:53:08] Cameron: By what authority can the Australian government force a U.S.-based company to not talk about stuff? And vice versa, by what authority can the U.S. government threaten an Australian citizen, Julian Assange, with, uh, an eternity in jail for publishing content that, uh, made them look bad? So, uh, you know, it’s this issue of global media reach and governments not liking that, um, and having to just suck it up, I think, to a large extent.

[00:53:41] Steve: Well, it just seems to me it’s really simple. It’s like you either tolerate it or you turn it off. That’s the only sovereign option you have. ’Cause I don’t think you can have any real economic or censorship power over something which is global, where the traditional media rules are out the window, um, given.

[00:53:56] Cameron: But we can’t turn it off because then how do we criticize China?

[00:54:03] Steve: I don’t know.

[00:54:05] Cameron: And the US is trying to do it with TikTok, with exactly the same sort of justification: we don’t like what they’re posting. I mean, the whole “oh, they’re stealing our data” thing, everyone knows that’s bullshit. It’s really that they, they don’t like

[00:54:17] Cameron: the fact that, uh, there is content on TikTok which

[00:54:20] Cameron: is critical of Israel, critical of the US government, critical of their support of Ukraine,

[00:54:25] Steve: And also that the biggest-growth social platform is not owned by an American corporation, and they have their own corporate interests.

[00:54:34] Cameron: Hmm.

[00:54:34] Steve: Lobbying against that.

[00:54:36] Cameron: Taking advertising dollars out of the mouths of Mark Zuckerberg’s children.

[00:54:42] Steve: Okay.

[00:54:43] Cameron: Deep dive, Steve. What do you want to do?

[00:54:45] Steve: Well, I don’t know if we’re going to go too deep. We’re going to go deep into the shallow end, and that’s an oxymoron. But I was just wondering now, with AI, and especially with Meta, and we already discussed it, whether generative AI is going to become degenerative AI, given that there is so much AI content on the web that it sort of loses its value and becomes degenerative.

[00:55:05] Steve: Sort of like a, you know, a quasi-circular reference of content, which feeds on content, which is created by AI, which creates more AI content. And the AI learns from that, especially now that 3B... and people will start trying to make their lives look a lot better than they are on Instagram by generating photos of places they’re not.

[00:55:26] Steve: And, yeah, I just wonder if we’re getting a little bit closer to dead internet theory. You know, the idea that on the internet nothing is real, nothing on there matters, and we almost extricate ourselves from it because it just becomes awash with things that are generated, you know, whether it’s music, whether it’s video, whether it’s content, whether it’s, like, Microsoft’s, uh, VASA. Everything we’re talking about right now is fake versions of things.

[00:55:54] Steve: Like, everything. And, and at what point does the internet itself become like social media, this giant bin fire where nothing is of value, nothing is interesting or nuanced or human-generated? I’ll just leave that as open-ended. Maybe we don’t have the answer.

[00:56:14] Steve: Maybe that’s something for the listeners to postulate.

[00:56:19] Cameron: Look, it’s, it’s a genuine concern, but I still cling to the maybe utopian view that the AI tools we’re going to end up with in the next couple of years will help us curate and filter and navigate our way through the undeniable morass that already exists and will continue to exist.

[00:56:46] Cameron: Um, and, you know, if I say to my on-device AI personal assistant, you know, just automatically filter out all of the fake content and all of the Murdoch media content, which is pretty much the same thing, uh, just give me, give me stories with a, uh, probability-of-factual-content rating of eight and a half and above.

[00:57:17] Cameron: And, uh, you know, it just feeds me highly credible stuff if I want it.

[00:57:25] Cameron: Now, how many people are going to ask their AIs to give them that versus, hey, just give me stuff that exists in my, uh, bubble? Yeah,

[00:57:33] Steve: Give me stuff I like.

[00:57:35] Cameron: Yeah.

[00:57:37] Cameron: Stuff that makes me feel good about

[00:57:38] Steve: stuff that makes me feel good and keeps me here, or feel really bad. So long as it’s on the edge there. It’s either going to make me feel really good or really bad.

[00:57:45] Cameron: yeah.

[00:57:46] Cameron: yeah,

[00:57:46] Steve: Do not middle-ground me.

[00:57:49] Cameron: You know, I’ve been saying for decades, you know, since the early blogging days, um, you know, we’re now in an era where you need to program your own media, right?

[00:57:57] Cameron: You need to choose the blogs you read, the Twitter feeds you read, the podcasts that you listen to, um, the TV shows that you watch, versus how we grew up in the 70s and 80s, where you just got fed one radio station, two or three TV stations, one newspaper. Uh, and it’s a learned skill, I think, for humans to have to program your own media, particularly because, you know, in the 20th century we were taught you don’t have to do that, when broadcast media was created, um, and the large newspapers. You know, before that, everybody had their own newspaper.

[00:58:34] Cameron: At the beginning, you know, the late 19th century, early 20th century, there were hundreds and hundreds and hundreds of newspapers. Everyone had a printing press: every union, every business, everyone had their own printing press. And then there was this consolidation, and the advertising dollars, the price of paper went up, the price of ink went up, to, to, sort of, crush the, uh, non-capitalist, uh, newspapers.

[00:58:59] Cameron: We ended up in this world where a very small group of people had control over the vast majority of what you read, watched, and listened to. Uh, radio, television, record labels, film, newspapers, book publishing, magazines, et cetera. And then the internet came along and we thought, hooray, revolution, and then the big capitalist interests grabbed most of that as well.

[00:59:22] Cameron: You can still write a blog, but no one’s going to find it unless Google shows it to you, right? So we ended up in the same sort of situation. The big, the utopian version of AI is where it becomes the moderator for us. And it gives us what we want, but also hopefully doesn’t let us dumb ourselves down.

[00:59:44] Cameron: But I don’t know, man. I, I, you know, I have a very low degree of, um, faith in humans to take responsibility for their own brain development. But that’s maybe, maybe it’s a 20th-century generation thing. Maybe our kids

[01:00:07] Steve: we’ll

[01:00:07] Cameron: who grow up in this world will be better. You know, that’s always been my hope: that our generation struggles with this stuff because we didn’t develop the ability to think for ourselves at a young enough age, but that, you know, our kids will grow up smart enough to, um, protect their brains.

[01:00:29] Steve: You’ve got to see your mistakes too in the next generation. You grow up and you see your parents and say, when I grow up, I’m not going to be like this. You know, if they’re poor, I want to be rich. Or if they’re, you know, not loving, I’m going to be loving with my kids, whatever, whatever it is.

[01:00:41] Cameron: My parents were chain smokers. You know, my, my mom quit 25 years ago, 30 years ago. My dad didn’t, and died at 52 of lung cancer. But I grew up going, I’m never going to smoke, because I saw my parents chain smoking and I hated growing up

[01:00:58] Steve: Yeah, exactly.

[01:00:59] Cameron: around cigarette smoke, right? Yeah.

[01:01:01] Steve: Yeah. So you tend to learn, um, the opposite often. It takes a generation or two, I think, but I mean, but there is also something where you can see human frailty in something like the supermarket, right? Because what I see is the same patterns everywhere.

[01:01:16] Steve: You go down to the supermarket and there are three aisles with fresh food and fresh meat and vegetables and cheese, and there are, you know, eight, ten aisles filled with junk. And you get the choice of which aisle to go down and what to buy, and the internet’s the same choice. You can get broccoli for the brain or you can buy popcorn.

[01:01:35] Steve: It’s up to you, right?

[01:01:39] Cameron: Yeah, and it’s a skill. Like, I talk to Chrissy about this a lot. Like, I know 15 years ago I was spending a lot of time on Twitter, and then after that, a lot of time on Facebook, and I know Chrissy did as well. And now, I mean, I haven’t been on Twitter much for, for years and years. I go on Facebook only when I need to, you know, when one of the communities that I run for my podcasts needs me to go and post stuff or check stuff, and I pretty much stay off it outside of that.

[01:02:06] Cameron: My kung fu student community or the school community, whatever it is. We learned, you know, we, we got sucked in by the dopamine mining of the tools when they first came out, and then we felt really disgusted by that after a period of time and we weaned ourselves off it. Like, our brains adapted to that relatively quickly.

[01:02:27] Cameron: I think in, in terms of human development and hopefully our kids, you know, won’t get sucked in by that stuff as much

[01:02:36] Steve: I actually genuinely believe, and the only reason that I use the social tools now is for a couple of people that I am friends with and I do back-channel chatting with; it’s really just a way to stay connected. It’s like a phone call used to be when you were a teenager. You send each other a meme or something interesting or smart or funny or whatever, and you send different things to different people on whichever channel you’re on with them. So I have that, and then the only other stuff I do on the internet really is to learn things for my work or to promote myself for my work.

[01:03:04] Steve: If I was, you know, a trillionaire and had a heap of money and didn’t have to work, I promise you, I wouldn’t be on any of those forums other than just reading and sharing information for people who want it. I don’t, I’m not on there for any of that dopamine hit. I mean, there is a dopamine element when I put something up for work and a lot of people like it, but I’m only caring about that because it has an economic impact, not just for, oh, I feel good about myself because a lot of people

[01:03:27] Steve: like it. Yeah.

[01:03:30] Cameron: Technology time warps, Steve. Let’s, let’s move on.

[01:03:32] Steve: This is a quick one. Yeah. So look, we remember when Yahoo was the internet, the front page of the internet. And the idea for a really long time was, we want to build a website that has everything on it and keep everyone there. And then we sort of went away from that with the Google model: send someone to where they need to go, social media links sending everyone across the web, even though they had their own tentacles of power and platforms.

[01:03:58] Steve: Feels like we’re going back to that model where everyone is trying to suck an audience into a singular platform. And I thought that Meta AI was a classic example of trying to bundle again after the web was sort of unbundled. It’s re-bundling. And it feels a little bit like that’s coming again, that all of the big apps are trying to get everyone into the one space.

[01:04:24] Steve: I’ve just noticed that Google are doing advertising around the web and on TV saying, come to the Google app for all of your internet-y needs. You want to do search here, find images, do translation, get a recipe, everything you need is right here in Google. Now, I dunno if that’s a response to OpenAI and their own lagging in AI.

[01:04:44] Steve: We talked about Meta saying, ask me anything, no need to leave Instagram. Here, we’ve got all the answers, everything you need right here, no need. And that’s what it says on their page where they’ve announced Meta AI: you won’t need to go to other websites to find things, it’s all inside here. And I just wonder if we’re going back to the past.

[01:05:03] Steve: The tech time warp: the idea that brands are now trying to keep you in that one spot rather than have you use them as part of a portfolio of, uh, internet sites. A little bit back to the way it was, and there’s this pattern of bundling and unbundling, bundling and unbundling, which happens with tech.

[01:05:21] Cameron: Is that new, though? I mean, most people get their news from Facebook, um, which is what the Australian news publishers are fighting with at the moment. Um, you know, Facebook bought Instagram so people didn’t have to leave to go and post their photos and see their photos. They’ve got Reels to stop people from going to TikTok. I mean, they have already spent 20 years trying to keep you in the Facebook walled garden as much as possible.

[01:05:51] Steve: Feels like there was, but it felt like there was an oligopoly of, kind of, these are the social guys, these are the search guys, and now everyone is crossing those demarcations, which I think is good in a way, because it’s increasing competition, where they’re stomping into each other’s ground a little bit more, where they’re going to do their own thing.

[01:06:10] Steve: I mean, it’s interesting that Google is ensconced inside Apple and pays, you know, billions of dollars every year, 10, 15 billion dollars, whatever it is, to be the default search. But I just feel like we’re going to, uh, enter a period of hyper-competition where all the big tech companies start competing with each other more head-on, and AI is the start of that.

[01:06:31] Steve: So that was all.

[01:06:32] Cameron: Well, they’re certainly competing with each other for buying GPUs. I mean, that’s the big thing right now, right? How many GPUs can you get? The last thing I had was the futurist forecast, and I was going to talk more about nanotech, but I’ve probably already shot my nanotech load. I mean, just quickly though, you know, I said I had this

[01:06:51] Steve: You can’t just say you’ve shot your nanotech load. We have really R-rated The Futuristic today. Thank you, Mr. Cameron Reilly, the OG of podcasts. It is not Joe Rogan, it is Mr. CR up in Brisbane, Australia.

[01:07:08] Cameron: I was gonna, uh, uh, well, we’ll do this. I’ll just, I’ll just shoot to, um, I’ll, I’ll wrap this up. I’ll let GPT talk us through it. So I asked her to give me a theoretical roadmap for getting to nanofabricators in the home. So imagine a device, the size of a washing machine or a dishwasher, in the home that can build nearly all of your daily requirements, including food, clothing, technology, electronics, uh, you know, you need a vase, you need a glass, all of your whitegoods stuff. And then you’ll have your robot that will assemble anything that can’t be pre-built in the nanofabricator; the nanofabricator will build the components for you, and then you’ll have your robot assemble it for you while you sleep.

[01:08:07] Cameron: I asked GPT for a roadmap, assuming we have AGI within five years. Let me play GPT’s response and we can go out if you have any comments after that.

[01:08:16] ChatGPT: Assuming the development of Artificial General Intelligence, AGI, within the next five years provides a tremendous boost, here’s a theoretical roadmap to achieve the integration of nanofabricators in homes.

[01:08:28] ChatGPT: Phase one, foundational research and AGI integration. Years one to two. Enhanced research. Leverage AGI to accelerate nanotechnology research focusing on increasing precision in molecular manipulation and reducing production costs. Safety protocols. Develop robust safety protocols for handling nanoscale materials, especially those that might be toxic or reactive.

[01:08:52] ChatGPT: AGI-driven design. Use AGI to optimize the design of nanofabricators, making them more efficient and easier to use. Material Science Advancements. With AGI’s ability to process vast amounts of data, accelerate the discovery of new materials that could be used safely and effectively in household settings. Phase 2, Prototyping and Initial Regulation. Years 2 to 4. Prototype Development. Build prototypes of home-use nanofabs focusing on user-friendly interfaces, reliability, and safety. Regulatory Frameworks. Work with regulatory bodies to ensure that nanofabs are safe for home use, establishing standards for emissions, waste management, and operational safety.

[01:09:36] ChatGPT: Public Demonstrations. Begin public demonstrations to showcase the capabilities and safety of home nanofabs, addressing public concerns and gathering feedback. AGI Enhanced Simulation. Utilize AGI to simulate different home use scenarios to predict potential failures or misuse, improving system resilience. Phase 3. Market Preparation and Education. Years 4 to 5. Educational campaigns. Launch educational initiatives to inform the public about the benefits and limitations of nanofabs, preparing them for responsible use. Market testing. Conduct extensive market testing to understand consumer needs and fears, refining the product according to feedback.

[01:10:18] ChatGPT: Supply Chain Development. Establish a supply chain for the raw materials needed for nanofabrication, ensuring they are sustainable and readily available. Phase 4. Commercialization and Scaling. Years 5 to 7. Product Launch. Introduce nanofabs to the market with robust support systems in place, including user manuals, online tutorials, and customer service.

[01:10:42] ChatGPT: Scalability Improvements. Continue improving the scalability of nanofab production to lower costs and expand market access.

[01:10:49] Cameron: Okay, so years five to seven. It sort of maps a fairly, you know, motherhood roadmap for how we’re going to get there. The, the big issues I have with this, um, two things. One is, in terms of the, um, scalability of it, I keep saying to people, once you have one nanofabricator, you...

[01:11:09] Cameron: I was going to say, don’t you build one, and then that builds the next one, and that builds the next one? You get self-replication, right?

[01:11:14] Cameron: Yeah, it becomes self-replicating.

[01:11:16] Cameron: It’s a chain, isn’t it? Isn’t most of the... help me out here, Cam, I can’t remember, it’s grade 9 science. Isn’t, like, 99 percent of the universe made out of three things at the atomic level?

[01:11:27] Cameron: Can’t you just, like, take anything and convert it? Put dirt in the fab and reconfigure it?

[01:11:34] Cameron: Five things. I mean, nearly everything is made out of carbon, oxygen, hydrogen, nitrogen, and, you know, there are some other trace elements as well, some other things there. But nearly everything is made out of those things, right? So you think about, um, your crapper, your sewage system, a lot of, a lot of waste products, anything that’s a waste product.

[01:11:56] Cameron: You know, we, we have all of these, uh, recycling plants that don’t know what to do with old clothes. You know, we used to send them overseas and now they’ve got too much overseas. Even India can’t deal with all of our secondhand clothing. Secondhand clothing, boom, you just drop everything back into the nanofabricator.

[01:12:18] Cameron: It deconstructs them into carbon, oxygen, nitrogen, hydrogen, and whatever else, and then they go into containers underneath it, and then the next thing you want to build, boom, we just re... everything in the world is just reconfigured atoms.

[01:12:32] Cameron: Zara, iPhone,

[01:12:35] Cameron: well, see, when you get into, you know, those sorts of electronics, then you get your copper, you get, uh, boron, you

[01:12:43] Cameron: Tungsten, one of my favorites.

[01:12:45] Cameron: tungsten, Yes, Yes, there are rare

[01:12:48] Cameron: Lithium,

[01:12:49] Cameron: there are rare earth minerals, and and

[01:12:53] Cameron: Borox,

[01:12:54] Cameron: earth minerals that are harder. And some of those can’t be made out of the raw elements easily. You need some, you need a, you need a, a sun to go supernova to make a lot of, uh, uh, heavier elements. You also need it to make carbon, but we have a lot of carbon that, and this is one of the reasons Elon wants to have rockets mining asteroids and mining Mars and mining

[01:13:16] Cameron: places like that.

[01:13:16] Cameron: Right. So we get access to rare earth materials, or rare Mars materials as they might be. Or maybe they’re not rare on Mars.

[01:13:23] Cameron: See, now I like it. Rare Mars Materials, now that is a great name for a post-punk band, Rare Mars Materials. And you know what their first album should be called? Petrol Station Coffee. Like, I feel like that’s the kind of band you and I could play in, before we plug out,

[01:13:43] Cameron: All right. Anyway, nanofabs in 10 years. That’s my dream. Nanofabs and robots within 10 years. And again, as I’ve said before, what are you going to make? You’ve got a nanofab, you can make anything. What’s the first thing you make? You dirty, dirty man. I see where you’re going with this straight away. I’ll just go

[01:14:07] Cameron: yeah, yeah. Uh, yeah. Apart from sex toys. I don’t know. What are you, what are you going for?

[01:14:15] Cameron: with an Earl Grey. I’ll start with my boy from

[01:14:17] Cameron: Right. An Earl Grey.

[01:14:20] Cameron: Earl Grey.

[01:14:22] Cameron: Yeah,

[01:14:23] Cameron: Just to pay homage.

[01:14:25] Cameron: But what does the world look like, Mr. Futurist, when we have nanofabricators in every home and a robot in every home? What does capitalism look like? What does society look like?

[01:14:36] Cameron: The end of capitalism as we know it, because you’ve basically got abundance, and capitalism is about scarcity and about accumulating resources. Uh, because, you know, we live in a world of scarcity and limited resources, and this theoretically removes the limitations. Although I think the one thing that we will never have enough of, the limitation, again, always back to the future, back to the past again, is land and space.

[01:15:03] Cameron: You know, and so while everyone might be able to have something, land and space would become the ultimate scarcity again, I think, in some way. You know, we go away from a consumer goods society, where that’s the ultimate in consumption, to space, and I mean space on earth, so I don’t know. But it certainly is the end of capitalism, I know that much.

[01:15:25] Cameron: We just build up.

[01:15:27] Cameron: So, I’m going to call this episode P Utopia. Forget P Doom. What’s P Utopia?

[01:15:33] Cameron: Like it. We should get that going. Let’s get that going. I think that’s something we need to ask some of the leading AI researchers. What is P Utopia? What’s your P Utopia? If your P Doom is 5 and 10 and 20 percent, you know, Eliezer Yudkowsky and all these guys, what’s their P Utopia? If we do get to AGI, solve the energy problem, health problems, the climate problems, scarcity problems of food around the world, recycling, nanofabs,

[01:16:01] Cameron: P Utopia.

[01:16:03] Cameron: P Utopia. Thank you, Steve. It was fun

[01:16:07] Steve: Thank you, Cameron. Tremendous. Tremendous. That was good. Really good chat again.