
Futuristic 5

[00:00:00] Cameron: This is futuristic, episode five. We’re recording this on the 9th of June, 2023, about 12:38 PM on a Friday afternoon. Getting a bit warm in my office up here in Brisbane. How are things down in sunny Melbourne? Steve Sammartino, Australia’s leading futurist,

[00:00:21] Steve: self-proclaimed. No, I,

[00:00:24] Cameron: I proclaimed it as well. So we’ve both proclaimed it.

[00:00:27] Cameron: Yeah. Good,

[00:00:27] Steve: good. Now I’ve had a few people sort of say that. I always say, if you’re a futurist out there and you wanna have the debate: any topic, any time, no prep required. So I’m looking out the window. It’s cloudy, overcast, and cool. It’s not super hot, Mr. Reilly.

[00:00:42] Cameron: Well, that’s good. I wish I was there. Okay, let’s get into it.

[00:00:47] Cameron: I’m gonna start my stopwatch because we’re gonna run on a timer again this week. Yes, we are, Steve. Tell me what you have done [00:01:00] in the world of emerging tech in the last week. What are your wins, what are your losses?

[00:01:07] Steve: So I did three keynote speeches this week.

[00:01:10] Cameron: Got ChatGPT to write them all.

[00:01:12] Steve: Of course, of course. And in fact, I had a soft robot of myself turn up and I just stayed in bed. No, a soft robot, that’s what it is, soft robotics, right? The next evolution is exoskeletons, which are soft. I did one for the rail industry, and that was interesting.

[00:01:35] Steve: Uh, it was online, so it was really hard. It was one of those rare online ones again, where they were all interstate and we all did it on Zoom, and that’s like really tough. It’s like pulling teeth because you’re trying to extract energy through the screen and most people have a little dot just with their initials.

[00:01:50] Steve: It’s really tough. So did that one. And then I did some MCing and hosting in Sydney, which was good. That was one of [00:02:00] those, what was it called, the DLT, Digital Life… something, I should know. Anyway, that was again a really interesting one cause it was a zillion sponsors and different keynote areas and stages.

[00:02:13] Steve: But I had one this morning, which was inspiring. It was at the tech school in the area where I grew up, the Wyndham Tech School. And they had some of the best kit I’ve ever seen: 3D printers, drones, CNC machines, all sorts of robotics, materials science. Extraordinary.

[00:02:32] Cameron: Does it still smell like a sewer in Werribee, or did they take care of that?

[00:02:36] Steve: That hurts my feelings. That hurts me. And I thought that you were more mature than that, Cameron. But here’s one thing that I noticed. There was a key takeout, and the key takeout was: a lot of people are still unaware of AI’s capabilities. Everyone’s using ChatGPT, but they’re not really using it.

[00:02:54] Steve: They’re using it to, on average, I’m gonna say, write them an email. I mean, it’s [00:03:00] more than that, but they’re basically doing that. Find me this. They’re still using it like a Google search, and that’s the number one error everyone is making. Instead of Google giving you 10 options, they want ChatGPT to give them a singular option.

[00:03:13] Steve: People aren’t, like, going back and layering, or putting things in there and asking it to summarize. People aren’t really using it that well. I mean, the basic 10 things you can use ChatGPT for. I think we get in this wormhole where we think people are like us. Even though it’s got massive awareness, people aren’t really using it very well.

[00:03:32] Steve: And that’s my key insight for the week.

[00:03:34] Cameron: It depends on how you define really well. I mean, people are using it based on where they’re at. But one of the things I have discovered about myself, and we’ve talked about this before on the show three, four months ago, right? Let’s say when GPT-3.5 came out at the end of last year, or when GPT-4 came out a few months ago, I remember being at a point going, holy shit, this is cool.

[00:03:53] Cameron: Oh, what do I do with it? And I spent like the first week trying to figure out what do I actually do with this [00:04:00] thing. Now I use it, I don’t know, 20, 30 times a day, often enough that I keep hitting my limits on GPT-4. It keeps telling me I have to shut up for an hour, even though I have more questions. But what I found, I was saying this to Chrissy last night, it’s like this recursion process.

[00:04:19] Cameron: The more I use it, the more ways I think of how to use it. You know what I’m saying? Like, I’m doing stuff in Excel now, and that’s primarily where I use it. There are situations where I hadn’t even thought I had a problem, and now I’m like, maybe there’s a way that I can use Excel to do this, and I’ll say to GPT, hey, I’ve got this idea, is there some way I can do it?

[00:04:44] Cameron: And it goes, yeah, absolutely. So there’s, like, new ideas coming up in my head all the time. GPT has enabled my brain to think up things that I didn’t know were a problem with the ways I’ve been doing things. Like, you know, I run [00:05:00] a lot of spreadsheets for the investing side of the business, and I would be cutting and pasting a lot of data in from different sources, and then I start thinking, well, maybe there’s a way of automating this now, and I’ll ask GPT and it’ll cook up something for me.

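A rough sketch of the kind of script ChatGPT tends to cook up for this sort of request: pulling data in from a couple of sources and writing it into an existing workbook with pandas. The file, column and sheet names here are invented for illustration, not anything Cameron actually used.

```python
# Hypothetical example: swap in your own file names and join column.
import pandas as pd

# Pull the data in from the different sources instead of cutting and pasting.
broker = pd.read_csv("broker_trades.csv")
prices = pd.read_csv("daily_prices.csv")

# Join the two sources on their shared ticker column.
combined = broker.merge(prices, on="ticker", how="left")

# Append the result as a sheet in an existing workbook (requires openpyxl).
with pd.ExcelWriter("portfolio.xlsx", engine="openpyxl", mode="a",
                    if_sheet_exists="replace") as writer:
    combined.to_excel(writer, sheet_name="imported_data", index=False)
```
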
[00:05:13] Cameron: It’s just this process of gradually becoming more and more comfortable with it. I was saying to Chrissy, like, I’m almost blasé about the existence of ChatGPT now. Like, something that a year ago I would’ve thought was at least 10, maybe 20 years away, access to this kind of artificial intelligent assistant.

[00:05:33] Cameron: Now I just take it for granted, like I’m just using it. I’ll ask it to come up with a formula for me for Excel, and it’ll give me one. Let’s say it starts the formula at A1, cuz I didn’t give it good enough information, when the data actually starts at A5. I can go in and change it from A1 to A5 in the formula myself, but I’m so lazy now I just go, mate, can you do that again, but just change A1 to A5?

[00:05:57] Cameron: Yeah, yeah, sure, absolutely, I can do that. It’s just like [00:06:00] this blasé fact that I have this super brain now that can do stuff for me. Anyway, I saw a great quote on Reddit I wanted to throw at you: having ChatGPT by your side is like going through life with cheat codes.

[00:06:13] Steve: It is. It’s funny, I wrote down the word “co-piloting” before, and this is the thing that I think, and it circles back to what you’ve said: well, it depends on what they want to use it for.

[00:06:22] Steve: I’ve said, you know, the use cases are broader than most people imagine, but I guess I’m thinking of ChatGPT and GPTs in general as co-pilots.

[00:06:32] Cameron: Is Microsoft paying you to use that?

[00:06:33] Steve: No, but that idea of it being a co-pilot is ostensibly what it is, right? And so it becomes relevant to what you are doing in that day, that you can ask it.

[00:06:43] Steve: But great technology becomes invisible, right? Once it’s a great technology, it’s invisible. You just expect it, like electricity, it’s just in the walls. And I’m using GPTs at the moment, and one of the big things I’m doing is asking it to give me [00:07:00] steps and processes more than to answer questions.

[00:07:02] Steve: There’s a lot of those on the web, but they’re hard to find, you don’t know which one’s good, and sometimes they’re wrong or outdated. It’s really, really good at providing process steps. That’s one of the hacks I’ve been using it a lot for lately: give me the 10 steps to fix this or do that.

[00:07:18] Cameron: Can you give us a real life example?

[00:07:19] Steve: I was having some problems with Apple on my phone, cause I run the Apple Mail, so my calendar and my phone weren’t marrying up, and it drops out every now and again when there’s an upgrade, and I just went through it there. I could’ve gone onto Apple for an hour and then booked an appointment.

[00:07:36] Steve: Instead I went to GPT, and it just gave me four steps. It worked. I went to a few websites before that, then I thought, I’ll just go to ChatGPT, and it had a far better and more succinct answer on how to get them going and fix it. A lot of those little hacks where you need someone to help you. I mean, it’s doing a service to a lot of companies that will save them hours in customer service calls.

[00:07:55] Cameron: I think one of the other things that I managed to do, I may have told you this: about a month [00:08:00] ago, I pointed GPT to a podcast transcript that was online, one of my own. It was a transcript from a 90-minute podcast. I asked it to summarize the transcript for me, and it gave me a fake summary.

[00:08:16] Cameron: It just hallucinated something up. I did it a couple of days ago and it gave me an absolutely perfect summary of a 90-minute transcription. So why would it do that? I think the whole web interface stuff wasn’t very developed a month ago or six weeks ago when I tried it. Now, of course, it’s got web access through plugins, et cetera, et cetera.

[00:08:41] Cameron: It’s also got some Bing stuff integrated into the version four stuff. But anyway, it used one of the web interface things to go out, read the transcript, and then gave me an absolutely perfect summary. Couldn’t get it to do the transcription itself. I pointed it at an MP3 and said, can you [00:09:00] transcribe this MP3 for me?

[00:09:01] Cameron: And it couldn’t do that. But it can now do the summary of the online transcript, which is terrific. You think about pointing it at any webpage or website now and saying, read this and give me a summary of it. One of the things my son Taylor did: he writes articles for Yahoo Finance every week.

[00:09:22] Cameron: He pointed it at his author profile on Yahoo Finance and said, read all of the articles written by this guy, and then give me a summary of his writing style, which it then did. And he said, right, now write me an article in the voice of that person. And it wrote one for him in his voice. So he’s now trained it to write articles in his specific voice as he writes for Yahoo Finance.

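For anyone who wants to try Taylor’s trick, here is a minimal sketch of that two-step workflow, assuming the OpenAI Python library as it stood in mid-2023 and the `requests` library. The URL, model choice and prompts are illustrative, and in practice you’d strip the page down to plain text and trim it to fit the model’s context window.

```python
import openai
import requests

openai.api_key = "YOUR_API_KEY"  # your own key goes here

def ask(prompt: str) -> str:
    """Send one prompt to the chat API and return the reply text."""
    response = openai.ChatCompletion.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
    )
    return response["choices"][0]["message"]["content"]

# Step 1: fetch the author page (hypothetical URL; real pages need HTML cleanup).
articles = requests.get("https://finance.yahoo.com/author/some-author").text

# Step 2: have the model describe the writing style.
style = ask("Read these articles and summarize the author's writing style:\n" + articles)

# Step 3: write a new piece in that voice.
print(ask("Style guide:\n" + style + "\n\nNow write a new finance article in this voice."))
```
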
[00:09:50] Cameron: So there’s some sort of interesting ways of tuning it to your own personality, your own voice. You’ll be able to upload your own [00:10:00] documents now and point it at them if they’re in Dropbox or somewhere like that and say, analyze this style. I’m using it at the moment: one of my nephews has got a birthday coming up, and I’m using it to write a children’s book for him about him and all of his cousins and all of his family.

[00:10:16] Cameron: I’ve been using it to write bedtime stories for Fox for the last couple of weeks. Every night I’ll say, what do you want a story about? And he’ll gimme a range of things, and it’ll write a story about a poop monster in the garden shed that comes alive, or whatever it is, whatever toys or ideas he’s had that day. We throw it into GPT and they’re usually fairly simplistic.

[00:10:36] Cameron: They’re not very good. What I did to make this children’s book is I asked it to, first of all, give me a breakdown of all of the common elements of the most popular children’s stories, which it did. Then I asked it to give me a breakdown of Joseph Campbell’s hero’s journey, which it did. And then I said, right, now combine all of that into writing a [00:11:00] new children’s story based on that sort of approach, those themes, those elements.

[00:11:04] Cameron: Then I gave it a list of all of my nephew’s family members, all of his cousins, grandparents, parents, a little bit about them. And I said, now build a chapter outline that integrates all of these people and all of those story elements that you mentioned before. And it did that. And then I said, right, now write chapter one.

[00:11:24] Cameron: And it wrote chapter one. I said, great, now write chapter two. And when I would read each chapter, I’d go, actually, this person doesn’t talk like that, or they don’t know that, you know, change it to this. I’d point it in different directions and get it to modify things, but I basically wrote a 13-chapter book for my nephew last night.

[00:11:44] Cameron: Last night,

[00:11:44] Steve: just last night.

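The thing that makes Cameron’s book workflow tick is that each request is sent along with the whole conversation so far, so chapter five “remembers” the outline, the bios, and every correction. A minimal sketch of that loop, again assuming the mid-2023 OpenAI Python library; the prompts are paraphrased from what he describes, not his exact ones.

```python
import openai

openai.api_key = "YOUR_API_KEY"

messages = []  # running conversation history; this is what gives the model its memory

def chat(prompt: str) -> str:
    messages.append({"role": "user", "content": prompt})
    response = openai.ChatCompletion.create(model="gpt-4", messages=messages)
    reply = response["choices"][0]["message"]["content"]
    messages.append({"role": "assistant", "content": reply})  # keep the answer in context
    return reply

chat("Give me a breakdown of the common elements of the most popular children's stories.")
chat("Now give me a breakdown of Joseph Campbell's hero's journey.")
chat("Combine all of that with these family members and their quirks: ...")  # bios go here
chat("Build a chapter outline that integrates all of those people and elements.")

book = []
for n in range(1, 14):  # the 13 chapters
    book.append(chat(f"Great. Now write chapter {n}."))
    # Corrections slot in as extra turns, e.g.:
    # chat("Actually, this person doesn't talk like that. Change it to ...")
```
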
[00:11:46] Cameron: I mean, it’s not the greatest book of all time, but it’s a story about [00:12:00] kids, you know, using a lot of family stories too that I put in there, things that happen in the family, little idiosyncrasies of the grandmother and all that kind of stuff. Went to Midjourney to create the art.

[00:12:04] Cameron: Gonna go to Snap Printing or somewhere like that in the next week and get them to do a couple of softcover bound versions of this for him and his cousins for his birthday. I mean, it’s just a fun little exercise, but with kids writing children’s books, you gotta wonder: is there a future for children’s authors?

[00:12:25] Cameron: Oh, Fox said to me, uh, last night, I think we’re gonna use ChatGPT to write bedtime stories for the rest of my life. This is so much fun.

[00:12:33] Steve: It can become a collaborative process too, right? Where you can write the story that you want. So, well, this is the thing: it goes back to that whole Disney idea of imagineering.

[00:12:42] Steve: You know, the idea of prompt engineering as imagineering. The questions you ask the AI become all-important. It does have hallucinations, though. One of the things, if you’ve written a lot and published a lot of content like you and I have, we can already ask it to do something in the style of Cam Reilly or Steve Sammartino and it spits something [00:13:00] out.

[00:13:00] Steve: I asked it to reframe an article, go make this as if Steve Sammartino wrote it. It was an article I saw on Bloomberg or somewhere, and it came back with a whole lot of weird stuff where it had overlapped my surfing, because I publish a lot of visuals where I’m down at the surf park, alongside my work, and they’re the two things.

[00:13:18] Steve: And it sounded like, oh, here’s the way that you should approach this, like a surfer would, riding the wave into the future. It got all weirded out on it. And what I’m curious about is that I’ve asked it the same question twice and it gives different answers. It doesn’t always give the same answer, which I really like.

[00:13:39] Steve: I like that it’s different each time.

[00:13:42] Cameron: It’s a probability engine, right? So in theory, a large language model using a transformer architecture is trying to figure out what the most probable word is that should come next. You would think that that would always be the same. I mean, if you are weighting the [00:14:00] probability of words that should appear, they should always come out with roughly the same weighting.

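The piece that resolves Cameron’s puzzle is that these models don’t just take the single most probable word: they sample from the weighted distribution, usually through a “temperature” setting that flattens or sharpens it. A toy illustration, with entirely made-up numbers:

```python
import numpy as np

rng = np.random.default_rng()
words = ["cake", "lesson", "dragon", "spaceship"]
logits = np.array([2.0, 1.5, 0.5, 0.1])  # the model's raw scores for the next word

def sample_next_word(temperature: float) -> str:
    scaled = logits / temperature                  # high temp flattens, low temp sharpens
    probs = np.exp(scaled) / np.exp(scaled).sum()  # softmax -> probabilities
    return rng.choice(words, p=probs)

print([sample_next_word(0.8) for _ in range(5)])   # varies from run to run
print([sample_next_word(0.05) for _ in range(5)])  # near-greedy: almost always "cake"
```
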
[00:14:06] Cameron: The fact that these stories are different is interesting. I know what you mean, though, because in the children’s book that I was getting it to write, I threw in the fact that my sister, my nephew’s mother, is very good at making elaborate birthday cakes. She goes above and beyond to make these really crazy elaborate cakes.

[00:14:24] Cameron: I mentioned that in her bio, in her profile at the beginning of the story, and it used her cake-baking skills as a way of her teaching the kids some life lessons in this story, bringing it back to: sometimes you get lost in the details and you need to focus on the fundamentals, like when you’re making a cake.

[00:14:47] Cameron: And it did that with their father as well, who’s an ex-military guy. I just said he is ex-military, and it brought in some military principles to give the kids ideas. It tries to find [00:15:00] ways of taking what somebody does and building, you know, metaphors or life lessons into these stories.

[00:15:08] Cameron: It’s fascinating the way it does that without being prompted to. Like, it just works out: yeah, I should find a way to use this to help tell the story.

[00:15:18] Steve: The thing with probability engines as well is that just because it looks for the most probable word, it doesn’t always pick it. The thing with probability is that probability’s imperfect.

[00:15:27] Steve: Every now and again, it’ll choose something within that 5% edge case. And that’s the one thing: confidence levels are one of the first things you learn in statistical analysis. You have a 95% confidence level. These machines have confidence levels as well, and every now and again, they’ll go outside of those confidence level bounds, which is where the hallucinations come from and where those weird iterations come from as well.

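Steve’s “confidence level” analogy maps quite closely onto nucleus (top-p) sampling: keep only the smallest set of words covering, say, 95% of the probability mass, and sample inside it. Words out in the tail get cut off entirely; widen the bound and they occasionally sneak in. A sketch with invented numbers:

```python
import numpy as np

rng = np.random.default_rng()

def top_p_sample(words, probs, p=0.95):
    probs = np.asarray(probs)
    order = np.argsort(probs)[::-1]                   # most likely first
    cumulative = np.cumsum(probs[order])
    cutoff = int(np.searchsorted(cumulative, p)) + 1  # smallest set reaching mass p
    nucleus = order[:cutoff]                          # the "confidence bound"
    nucleus_probs = probs[nucleus] / probs[nucleus].sum()  # renormalize inside it
    return words[rng.choice(nucleus, p=nucleus_probs)]

words = ["the", "a", "cake", "zeppelin"]
probs = [0.55, 0.30, 0.10, 0.05]
# "zeppelin" sits in the 5% tail, so at p=0.95 it can never be picked.
print([top_p_sample(words, probs) for _ in range(5)])
```
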
[00:15:50] Steve: And it’s also a live model. Even though the data in the model is old, the learning process of the model is changing. So the dataset is old other than what you [00:16:00] put into the question, which changes the dataset slightly. You can put in modern or current data for it to analyze, and depending on what you ask it to pull out and how you put that together, you’re gonna get different answers.

[00:16:11] Steve: Cause the model is a live organism in what it delivers back. I mean, it’s just so fascinating that we have this technology which everyone’s using and fascinated by. And because it’s machine learning and large language models, there’s this black box element where we dunno how it works, and the people who own it and make it don’t even know exactly how it works.

[00:16:32] Steve: It’s like this giant curiosity. Yeah, like this live experiment. It’s so interesting from that perspective.

[00:16:37] Cameron: It really is. We’re 20 minutes into our one-minute intro section, Steve.

[00:16:42] Steve: Well, which is perfect, because now it’s time for T3, the top three in tech this week.

[00:16:53] Cameron: Well, there’s only one thing really that has to go at the top, and that’s Apple’s Vision Pro [00:17:00] at WWDC. They finally sort of revealed this thing that everyone knew they’d been working on for years, and people have been expecting an announcement for years. They finally announced Vision Pro, their AR/VR, whatever, goggles.

[00:17:20] Cameron: What did you think, Steve? Did you watch the full keynote like I did?

[00:17:25] Steve: didn’t watch the full keynote. I woke up in the morning and saw the low lights and decided it wasn’t worth my time. But I did see a lot of the little pieces I did notice, and, and I, I saw a lot of updates across social media, especially on TikTok.

[00:17:41] Steve: Apparently it was the first ever launch where they didn’t have a demo, or at least even a hacked demo, first ever, right, where it was all video, which I thought was interesting. It made me feel like it was rushed, like it wasn’t even working well enough to fake a demo, cuz apparently Steve Jobs faked large parts of the demo on the [00:18:00] iPhone.

[00:18:00] Steve: When that was launched. Look, it felt rushed. I think this is an example of a technology looking for a home. The good thing about Apple is they’re nearly a $3 trillion company. They can afford a failure. The answer that we needed to hear was the same one that we got. There was a bunch of fanboys sitting at a big screen watching it live somewhere.

[00:18:23] Steve: It wasn’t, I don’t think, at the Apple HQ. And it was all the fanboys. Where was that? You know the one I’m talking about? And when they launched it and said, and it comes in at a price of $3,499, everyone went, ooh.

[00:18:38] Cameron: Everyone laughed. People were snickering and laughing in the audience in the video that I saw.

[00:18:45] Cameron: The whole presentation… I did watch the full two and a half hours of it that night, later on in the night. It was really weirdly produced, overly produced, lots of digital effects, and I’m pretty sure all of the presenters’ voices were dubbed in a booth afterwards. [00:19:00] They weren’t mic’d up at all. They’re standing in front of green screens a lot of the time.

[00:19:04] Cameron: But the voices sounded bizarrely clear, like they were speaking in a little sound booth, and it didn’t exactly match up with their lips when they were talking. So I had this whole uncanny valley feeling throughout. It really threw me.

[00:19:21] Steve: It looked like a Procter & Gamble shampoo ad, with a person from who knows where with a classic Eurasian look, selling shampoo to 12 markets at once.

[00:19:31] Steve: That’s what it looked like. And my

[00:19:33] Cameron: And my initial impression when I watched the, like, summary video in the morning was disappointment over the Vision Pro. But then I watched the full thing, and I kind of think I get what they’re doing here. First of all, it doesn’t come out for another, like, a year.

[00:19:49] Cameron: Right. And then when it does, it’s only coming out in the US market at this stage. It’s obviously an extremely high price: three and a half thousand US, probably 5,100, 5,200 [00:20:00] landed in Australia. So it’s obviously not designed as a mass market device. I’m guessing it’s designed for really early adopters and probably a lot of developers; they’re gonna try and push it out there to get people building apps and entertainment experiences on it.

[00:20:20] Cameron: You know, I think it looks interesting and cool and futuristic, and obviously this is just the start. I mean, there are other goggles already out there, so it’s not groundbreaking in terms of we’ve never seen goggles before, but they’re certainly slicker and sexier, I think, than any of the goggles from Meta or any of these other companies that are out. But the feeling I got with that announcement is the same feeling I got with their watch launch.

[00:20:46] Cameron: I don’t think Apple knows how to market shit anymore.

[00:20:50] Steve: Big call from the world’s biggest brand.

[00:20:51] Cameron: My feeling? Well, they became the world’s biggest brand when Steve was running things, and Steve knew how to market shit. Tim took over [00:21:00] whatever, 12 years ago. He’s done a great job running it as a business.

[00:21:03] Cameron: Obviously the share price is like 10 times, a hundred times what it was when Steve died. Very, very profitable business. But they dunno how to market. Tim’s not a marketer. He’s a business guy, a supply chain guy. I felt this about the watch when it came out.

[00:21:20] Steve: Yeah, the watch. The watch isn’t a great product.

[00:21:21] Steve: I don’t care what the revenue is. I mean, companies can coast on what they’ve done in the past for a real long time. I’ve got an Apple Watch and it’s in my drawer. I use it when I go for a run and when I go surfing. It doesn’t have the utility. It’s a pain in the ass. All it is is a bridge to the phone, and it’s a bridge

[00:21:39] Steve: I don’t frigging need. Really, the watch is massively overhyped. Yeah, it’s got some use cases, but it’s not worth it.

[00:21:46] Cameron: I’m gonna disagree. I love my watch.

[00:21:48] Steve: Oh, good. How can you love it? Tell me one thing it does that your phone can’t do.

[00:21:53] Cameron: can’t do. My son Taylor is like you though. He, he and I bought our watches at the same time, [00:22:00] and which was, I don’t know, a year or so ago.

[00:22:02] Cameron: He wore his for about a month, and he hasn’t worn it since. He’s like, it’s a pain in the ass to charge, et cetera, et cetera. And here’s my story with the Apple Watch. When they first came out, I was like, eh, it’s cool, but what’s the killer app? How’s it gonna make my life better? What do I need it for?

[00:22:18] Cameron: And I kind of looked at all of their early marketing, and my impression was it seemed to push the idea that, oh, you can check your messages when you’re moving around in the office, or you can check when an email comes in. I’m like, well, I just sit at my desk all day, so I don’t need a watch to check my messages on. And you’ve got a phone?

[00:22:39] Cameron: How hard is it to do that? I’ve got my phone, all right. And I waited year after year after year for them to show me the killer app. Oh, you can track your sleep. Eh, I don’t really need an app to track my sleep. You can track your heart rate when you’re exercising?

[00:22:56] Steve: I’ve got a body to track my sleep.

[00:22:58] Steve: If I wake up fucking tired, [00:23:00] it wasn’t a good fucking sleep. I don’t need a number on my wrist to tell me. Fuck, Alison, we gotta get serious about this shit.

[00:23:07] Cameron: Like, yeah, year after year I was like, where’s the killer app? Why aren’t you communicating it? Look, I’m an Apple fanboy. I live in Apple. I love Apple.

[00:23:17] Cameron: I’m everything Apple. But you are not telling me why I need one of these things. Like, I’m your target demo here, and you’re not talking to me about why I should spend 800 bucks on one of these things. How’s it gonna make my life better? And they just never got there. Now, the reason I bought one is because my doctor scared the shit outta me, my heart doctor.

[00:23:39] Cameron: He basically said, you know, you could drop dead from a heart attack at any minute. And I knew that the watches had the alert stuff, right? It turns out my heart’s fine, but he scared me for about three months there. My thinking was, if I fall over from a heart attack, you know, when Chrissy’s not home…

[00:23:59] Cameron: I’m at [00:24:00] home all day. There’s no one around me. I’m like a hermit sitting in my office. Yeah, yeah, me too. You don’t have to press a button. It, it’ll automatically, you know, send off a, an alarm call to the ambulance and next of kin or

[00:24:11] Steve: whatever. How does it know? How does it know? Because of that heart beating, heart’s not beating.

[00:24:15] Steve: That’s not necessarily a heart attack though, is it?

[00:24:17] Cameron: No. If you just fall over, it’ll pop up a message saying, are you okay? Do you need help? If you don’t say, I’m fine, it’ll automatically alert someone.

[00:24:28] Steve: ever gone off in that situation like doing karate or something?

[00:24:31] Cameron: Yeah. Well, I don’t wear it when I’m doing kung fu, but it’s gone off twice when I’ve fallen over and says, are you okay?

[00:24:37] Cameron: Yeah, and I’m go, yeah, I’m fine.

[00:24:39] Steve: I’m gonna put mine on and fall over this afternoon and just test that out.

[00:24:43] Cameron: Yeah. Okay. Well, I bought one first for my mother, who lives by herself in Bundaberg. Makes sense. We’re like, well, if she falls over, we need to know. She’s got no one around her; she lives by herself.

[00:24:54] Cameron: Then I bought one for myself, but then I realized, actually, the killer app for me on my [00:25:00] watch is: hey, remind me of this. Hey, remind me of that. Hey, set a timer for this. Hey, start a stopwatch for that. I use that 20 times a day, and I don’t want to carry my phone around with me if I get up from my office and go to the kitchen to make a drink, or go to the car to get something, or go outside or upstairs.

[00:25:22] Cameron: My brain is one of those where things pop into my head. I’m like, oh, I should’ve done that. I gotta call that guy. I gotta do this. And if I don’t write it down immediately, it’s gone forever. Five minutes later I’ll have forgotten what that thought was. And what I’ve trained myself to do, it’s Getting Things Done, David What’s-his-face’s book from 25 years ago, right?

[00:25:44] Cameron: As soon as the thought comes into my head that I should do something, I just add it to Reminders. Oh, remind me in an hour to call Steve and say, why the hell haven’t you answered my email?

[00:25:54] Steve: He did do that, by the way, listeners. That’s what Cam did. He said, hey, why haven’t you done this [00:26:00] thing? You haven’t updated the doc yet.

[00:26:01] Steve: And I was literally just about to do it when you emailed me. But anyway,

[00:26:04] Cameron: So for me, that’s the killer app for the watch: it constantly reminds me of stuff. If I’m watching TV at night with Chrissy, I get a reminder that I have to do something: take my tablets, you know, take something out of the dryer, get the bread out of the oven.

[00:26:21] Cameron: My day is filled with timely reminders and the ability to get data in and out via the watch. My analogy here is, with the Vision Pro, they didn’t tell me how it’s gonna make my life better. They didn’t sell me the killer app. The only thing I got out of it… cause I haven’t got one, but we’ll get to that.

[00:26:38] Cameron: Yeah. Well, I think you’re right. See, this is the classic… you know, I worked at Microsoft for a long time, and this is the classic problem when you have engineers building things without input from humans.

[00:26:51] Steve: Humans. Humans, sorry, engineers. You know, I love you. I’m one of you.

[00:26:56] Steve: Deep down.

[00:26:57] Cameron: You end up with brilliantly [00:27:00] engineered products with no use case for them, no compelling use case.

[00:27:04] Steve: for a use.

[00:27:04] Cameron: Yep. Now, the only thing I got out of the Vision Pro was the idea of having multiple, unlimited monitors all around me when I’m working. You know, I’ve got two monitors on my desk, and adding the second monitor, which I only did like a couple of years ago, completely changed my workday.

[00:27:21] Cameron: Having multiple screens is fantastic. You know, I can imagine how having three or four screens would make me even more productive. The idea of having virtual high-res screens all around me? Sure, I think that sounds cool, but I can buy a new 27-inch monitor for 200 bucks. I could add another three of them here for 600 bucks, so why would I spend five grand to do that?

[00:27:49] Cameron: It just doesn’t make a lot of sense.

[00:27:52] Steve: So this is the point. Look, a lot of the fanboys in the comments sections are saying, oh, that’s why they’re engaging developers. And I [00:28:00] think they are: because they’re trying to get them to find a use case. Because one thing people forget is that big public companies need revenue growth.

[00:28:08] Steve: They have to find the next thing. They’ve got fund managers, they’ve got all of that, and, you know, they don’t wanna sort of flatten off. They want to continue to grow, so they need to launch things. The use case you’ve given on the watch, I get that. For me, I use the AirPods for that. I have my pods in and I say, hey Siri, send me an email with this.

[00:28:26] Steve: Or I do that reminder thing with the AirPods, which goes straight to the phone. So I use that and I see that use case. For you, it feels like you use the watch in the same way I use my AirPods. But it just seems like there’s no use case there for this piece of technology.

[00:28:44] Steve: They’re really struggling to find it. I just can’t see it having a killer app, and at that price point, it’s just not enough. Especially… even if it does interesting things, yes, you can put it on, but it doesn’t solve a real problem. It’s an enhancement of some things that are [00:29:00] already there, but it doesn’t really solve a problem.

[00:29:02] Steve: It was, I think, just disappointing is all. And eventually we’ll get there, but when we close out and we go to the forecast, that’s where I’m gonna bring in what I think’s next.

[00:29:15] Cameron: Well, just looking at their share price, like, it took a hit. Not a big hit, but the market didn’t like it.

[00:29:20] Cameron: Yeah, not a huge hit, I mean, for Apple, but it was disappointing. The other interesting thing about the presentation, when you watched the full thing, was that Bob Iger from Disney was the special partner they brought in. He did this Disney thing, and it was the biggest crock of vaporware bullshit I’ve ever seen.

[00:29:36] Cameron: It was another massively overproduced video that didn’t show me anything that was a realistic application of this. And of course, the whole “you can watch movies on it”. Yeah, it’s great, but you’re by yourself sitting at home watching a movie. How does my wife, how do my kids watch a movie if I’m sitting with the goggles on?

[00:29:55] Cameron: That was the crazy thing.

[00:29:55] Steve: And even better than that, even better than that, the battery life is shorter than most [00:30:00] movies. Two hours battery life. Yeah, that’s great, it’s two hours. Listen, you can watch movies with this: one minute 59. That’s the key proposition.

[00:30:08] Cameron: Movies are getting too long anyway.

[00:30:11] Cameron: We need to do something about that. We need to tell Martin Scorsese to make a shorter movie. That’s their

[00:30:17] Steve: whole plan. And here’s what you’ll see on Apple Plus if you’re a subscriber: every single movie one minute 59, none over. They’re gonna cut them all off, the whole long tail of content on Apple Plus movies, at 1:59, cuz that’s how long the battery lasts for.


[00:30:33] Cameron: Okay, maybe gaming, you know, cause gaming is typically a solo thing. Gaming?

[00:30:37] Steve: Look, there’s three use cases, and I wrote about this when Meta stuck their hand up with their metaverse, and this is just a slicker version of the metaverse. Instead of, like, people with half legs and little cartoon Mark Zuckerbergs, this is the cool design version of that piece of shit.

[00:30:52] Steve: So there’s three use cases. One is education and training. You think surgeons, pilots, what have you. The other one [00:31:00] is entertainment and gaming. Great, I can see that working. And the other one is industrial CAD use cases, for pre-building buildings and walking around them. They’re the three use cases. This will go back down

[00:31:12] Steve: into the industrial use cases and fade away and just become like any other piece of industrial or corporate machinery. It’s not gonna be something that’s ever consumer-adopted until there’s a big iteration in the future, which I will reveal at the end of this podcast.

[00:31:25] Cameron: Okay, moving on, but staying with Apple’s thing. Obviously, the words AI were never uttered once in their two and a half hour keynote.

[00:31:36] Cameron: Really? No way. Nothing about AI. They did use the term machine learning a lot. We’re using machine learning to do this, machine learning to do that, mostly to make autocorrect a little bit better. There’s a great line where Craig Federighi says, and for those times when you just wanna write a ducking word, it will now let you write that [00:32:00] word.

[00:32:00] Cameron: So it’s not going to edit out fucking and turn it to ducking anymore.

[00:32:04] Steve: Don’t you hate ducking? I just love that you said that. Just stop for a minute. I never thought ducks would be such an important part of all my written communication as it’s become in the last five years. I write duck a lot. I’m like, duck this, duck you.

[00:32:21] Steve: This ducking annoys me.

[00:32:23] Cameron: Go duck yourself. Yeah, you can duck yourself. The highlight of the whole presentation, by the way, was Craig Federighi pulling out a three-necked guitar and doing an Eddie Van Halen tapping solo on it when he was leading into some segment. He apparently really plays guitar, so I was like, yeah.

[00:32:41] Cameron: Alright, good to you. That’s pretty good for you. That’s pretty cool. I’m

[00:32:44] Steve: watching that. Forget the thing, but I’m definitely looking for the highlights on that because there’s nothing I like more than a bit of Eddie

[00:32:49] Cameron: Van Halen. Yeah. R i p Eddie. But like, they talked about machine learning, as they said, but it’s just nonsense bullshit.

[00:32:55] Cameron: Like, oh, we can make new emojis and new stickers and [00:33:00] better… They did have a couple of good features, like there’s gonna be some upgrades to iOS and watchOS that integrate some widgets and updated alerts. There’s a good thing coming to the phones, which is, they call it something, but it’s like a safety thing.

[00:33:18] Cameron: So if your girlfriend leaves your place and she’s going home, she can set up an automated message so it will alert you when she’s home safely. And if she’s delayed in traffic, it’ll tell you that she’s delayed. And, like you can with the Find My Friends kind of thing, you’ll also be able to see what the battery life is on their device and where it was last online.

[00:33:41] Cameron: So there’s a lot of safety stuff built in. Oh, and they finally brought back the old Bump idea from 15 years ago.

[00:33:48] Steve: product bump. I reckon that was underrated.

[00:33:50] Cameron: Bump was great. You would’ve thought this would’ve been built into version one of the phone, but now, finally, with the iOS update, you just hold your phone next to somebody else’s [00:34:00] phone and it uses a version of AirDrop to share your contact card with them.

[00:34:04] Cameron: But these are their big announcements, and they’re just minor, minor bullshit.

[00:34:12] Steve: whole thing was disappointing. Terrible product launch with minor bullshit. One of the ones that was hilarious. I mean, one of the things that was the, the greatest thing about this. Apple presentation was all of the meme wear that came out of it on the socials because they had so many weird things.

[00:34:29] Steve: And one of my favorite ones was, on the phone now you can put an image of yourself and who it is, and it’s the full-page contact. I see it, yeah, the contact card. I saw someone saying on Twitter, I’m gonna put a picture of a hot chick and, say, sugar baby, and start ringing my friends at two in the

[00:34:43] Cameron: morning.

[00:34:46] Cameron: That’s nice. Yeah, it’s funny stuff. All right. Leaving Apple. Very disappointing Week from Apple. Moving right along.

[00:34:52] Steve: Moving right along, Cameron. Coinbase and Binance: the SEC have arrived with a [00:35:00] double whammy for the sector. And Bitcoin came off a little bit, down to, I don’t know, 23,000 as well now.

[00:35:06] Steve: And we had the two big areas: Binance and Changpeng Zhao, or CZ as he’s affectionately known, right up there with SBF. Now, CZ and SBF might spend some time together in the big house, though I’m not sure if CZ is a US citizen. But the SEC has charged him with 13 charges in all. Many of them are related to misuse of investor funds, running an unregistered broker, exchange and clearing agency, basically acting like a bank that isn’t a bank.

[00:35:42] Steve: That’s the long and short of it. Let’s skip around what the exact amounts or exact charges are. And what was crazy was, when the SEC announced the charges, they put up like a big screen print of Binance’s own chief compliance officer saying, damn it, [00:36:00] we’re running an illegal exchange here.

[00:36:02] Steve: I thought it was crazy that the SEC played a real strong Twitter game in their announcement of the charges against Binance. And CZ is gonna go to the big house, is my clear view on this. And I’ve got a clear view on all of this, right, and it comes back to something called the Howey test.

[00:36:21] Steve: And the Howey test is part of US law, which helps circumvent people using different instruments that they claim are non-financial instruments. Basically what the Howey test says is, if you are using assets or anything in any form of exchange which looks like, sounds like, acts like a financial instrument, then it is a financial instrument.

[00:36:45] Steve: And that’s basically what this comes down to. So I think this is really going to put a knife into the side of the crypto sector again, and it has become such a scammy [00:37:00] place, you know, a lot of wash trading going on. That was one of the big accusations: that they were taking the money, trading it up, and then putting it back in the market in limited amounts to push the prices up.

[00:37:12] Steve: The whole thing feels really stinky.

[00:37:16] Cameron: Yeah. I’ve been having this conversation with crypto bros for years, particularly when, you know, they’re pushing crypto as an investing opportunity. It always seemed obvious to me that at some point, governments around the world were going to wanna take some form of control over these alternative currencies. And you have to separate out the idea of

[00:37:46] Cameron: Blockchain as a technology, which as we know from the, uh, Australian stock exchanges attempts to rebuild their trading platform on blockchain. Tremendous opportunity, tremendous [00:38:00] opportunities there. The fact that their project failed disastrously and they’ve just announced they’re not gonna, uh, move off of chess for another 10 years.

[00:38:09] Cameron: There’s another story, a lot of opportunity in using blockchain for different things.

[00:38:14] Steve: I agree, and I genuinely do believe that blockchain is an incredible tech that has a lot of important use cases, and currency is one of the use cases, sure.

[00:38:23] Cameron: But then you separate that from all of the claims made by the crypto bros over the last few years about how it’s gonna take over the world’s currency, et cetera, et cetera, and all these great opportunities.

[00:38:34] Cameron: Like, anybody with half a brain knows that at some point governments have to control their currency. You have to control your security. You have to control your currency. You have to control your legal system.

[00:38:45] Steve: Monopoly on violence. Monopoly on your currency. Monopoly on tax. I mean, these kids need to pick up a 300 year old economics textbook.

[00:38:52] Steve: You know? I mean, seriously, Machiavelli wrote about that in The Prince during the Renaissance. There are certain principles you need to maintain [00:39:00] your society and civilization, and they just don’t understand it.

[00:39:04] Cameron: The state needs to control power. Exactly. It’s insane. So eventually, though, you know, we know governments are either gonna crack down or regulate or do something to gain control of this.

[00:39:15] Cameron: And there’s been so much cowboy behavior in this sector, you know, it really boggles the mind. So yeah, it’s not surprising to me that these crackdowns are starting to happen now. What are the implications of that for technology futures, Steve? What does it mean?

[00:39:32] Steve: Well, I think the implications are a few.

[00:39:34] Steve: I think that all crypto exchanges in the long run are gonna go to zero, and I’ll tell you why. The idea of crypto is that you can exchange peer-to-peer. The fact that they have created intermediaries actually goes against the entire ideology it was built upon. It’s an opportunistic way to create a UX and make money, instead of letting these operate in the way that they should, where

[00:39:57] Steve: There’s purity in the trading. And for [00:40:00] that reason, I think that every crypto, other than Bitcoin and maybe Ethereum, which will become like a supply chain kind of crypto or smart contract crypto, and Bitcoin becomes a digital, let’s say it’s store of value, which I think is still questionable even though it’s the best performing.

[00:40:14] Steve: Asset class in inverted commerce over the last decade or since it was launched, or 15 years or so. I still think that none of it is going to survive. I think all the exchanges will go away. I think, uh, Coinbase will come off the share market. The fact that it only lost 13%. Once it got taken by the s e c is astounding short that stock, this is not financial advice.

[00:40:37] Steve: Short that stock, cuz that thing is gonna go away, and there’ll be direct peer-to-peer trading on the blockchains for the people that want crypto. And in the long run, we end up with govcoins. What are govcoins? Exactly the same as AUD and USD, except they become programmable. You’ll have one USC, one US crypto, worth one USD, and [00:41:00] they’ll be stable based on the government, but they’ll have crypto functionality and blockchains built into them.

[00:41:05] Steve: All of the speculation… and yes, real money got made. Our good mate Ross Hill, yeah, he made a fortune. He made tens of millions of dollars by being an early adopter, and good on him. Smart guy. I love him. But they’re not an investment. The only thing that should ever rise in value is something that has a yield which justifies said value in the capital.

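For reference, the calculation Cameron is about to demand is a discounted-cash-flow one: an asset’s intrinsic value is what its future yield is worth today. A minimal sketch using the Gordon growth formula, with illustrative numbers; the point is that an asset with no yield gives you nothing to plug in.

```python
def intrinsic_value(next_year_cash_flow: float,
                    discount_rate: float,
                    growth_rate: float) -> float:
    """Gordon growth model: value = CF / (r - g), valid for r > g."""
    return next_year_cash_flow / (discount_rate - growth_rate)

# A share paying $1.00 next year, 9% required return, 3% long-run growth:
print(intrinsic_value(1.00, 0.09, 0.03))  # ~16.67

# One bitcoin pays no cash flow, so the numerator is 0:
print(intrinsic_value(0.00, 0.09, 0.03))  # 0.0 -- no yield, no intrinsic value
```
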
[00:41:25] Cameron: My question always to the crypto bros is: tell me how to calculate the intrinsic value of one bitcoin today. When you can do that, I can work out what I’m willing to pay for it. But when I ask them how to calculate it, they say, well, you look at the price of it in the market, and I go, no, no, price and value are two different things.

[00:41:43] Cameron: Tell me how I calculate the intrinsic value of one. They go, well, there’s a limited number of them that will ever be produced. Like, no, no, you’re not answering my question.

[00:41:51] Steve: and not really understanding. There’s a, there’s a lot of things in the world that there’s a limited number of, yeah,

[00:41:56] Cameron: I always say there’s a limited number of limited number of shits that I’m ever gonna [00:42:00] take.

[00:42:00] Cameron: Does that mean that they have intrinsic value? How much are you willing to pay for? One of my dumps,

[00:42:06] Steve: by the way, poo has come up twice in this

[00:42:09] Cameron: recording. Yeah. Well, I’m just an eight year old boy at heart, man. Well, the

[00:42:13] Steve: other one too. This is my favorite one when it comes to crypto, right? If you really want to know what a crypto bro believes in, ask them this.

[00:42:21] Steve: What’s one Bitcoin worth today? And they’re going to answer you in US dollars, and then you really found out what they truly believe in. All

[00:42:29] Cameron: right. Well then wanna say anything more about, uh, Bitcoin or cryptos?

[00:42:34] Steve: No, I don’t, but I am looking forward to the subsequent documentaries that come out about sbf and cz and


[00:42:41] Cameron: Mm. It’s just beginning, I think.

[00:42:47] Cameron: Let’s do the double d Steve, the deep dive. What do you wanna deep dive on today, buddy? We touched

[00:42:53] Steve: on it last week, cam, but I love it when something big starts happening, that we [00:43:00] develop acute moniker, and the latest one is P Doom, which is, you know, p parenthesis doom in big letters. What is the probability of doom?

[00:43:10] Steve: The P doom is from AI researchers. The estimated probability that current and developing artificial intelligence systems will pose an existential threat. To humanity and some of the proponents on it. Uh, they’re very, very big AI researchers. And I put a post up on LinkedIn this week about the P doom that was coming, and I got accused of, uh, being a fear manga.

[00:43:43] Steve: And I was really, you know, the messenger got shot in this occasion because I was, uh, really just sharing what some of the others had said. And I had 76 comments on it cam, about this P doom, which is a lot of comments, right? I had more comments than likes, which is [00:44:00]interesting. And I just wanted to see, have people h heard of the idea that an artificial super intelligence would create a risk to humanity?

[00:44:09] Steve: And many of the learned AC roaches now have this probability as high as 20 or 50%. Now you do the investing podcast Cam, and I know a lot of the listeners are interested in finance. But just imagine for a minute that you were given an investment opportunity, right? And this investment opportunity could really enhance your life or maybe, you know, make you rich beyond all belief.

[00:44:34] Steve: But here is the one thing about this investment opportunity, right? You had to put all of your eggs into that basket. You have to put every asset you own into this one particular investment and alongside this investment, which could go just pretty well or amazingly well, we don’t even know. But there is a chance that this investment will wipe you out and take you to [00:45:00] absolute zero or even kill you, let’s say, right?

[00:45:04] Steve: Even a 5% chance of that investment, or a 1% chance that it would take my entire asset base where I live. Everything house gone wiped away. I live in a cardboard box under a bridge. If there was even a 1% chance that that would happen, I would not take that investment. So here’s what we have, right? And this is not about something bad happening, like we’ve seen through millennia inhumanity, like a, a recession or a war, or a hurricane or something that is really bad that we need to overcome.

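Steve’s argument can be put in expected-utility terms: if the bet is genuinely all-or-nothing, a standard log-utility investor rejects it at any probability of ruin, because the log of zero wealth is minus infinity and no finite upside can compensate. A toy version with invented numbers:

```python
import math

def expected_log_wealth(p_ruin: float, upside_multiple: float) -> float:
    """Expected log wealth of staking everything: ruin vs. a finite upside."""
    if p_ruin > 0:
        return float("-inf")  # the log(0) ruin term swamps any finite gain
    return math.log(upside_multiple)

print(expected_log_wealth(p_ruin=0.01, upside_multiple=100.0))  # -inf: reject the bet
print(expected_log_wealth(p_ruin=0.0, upside_multiple=2.0))     # ~0.69: acceptable
```
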
[00:45:36] Steve: This is like the end, the end of our existence. Now, here are some of the AI researchers and their p dooms, which is their probability that they think this will happen. We’ve got Michael Tache and he comes out at 20%. We’ve got Paul Christiano, who is a former open AI researcher. He puts it at 50%. [00:46:00] We’ve got Eliza Ya Danowski, who is a bit of a doomster.

[00:46:04] Steve: He’s been around for a long time. He has 50%. And then we’ve got Jeffrey Hinton X, Google, and the Godfather of AI who has it at 50%. Now even Sam Altman says it’s probably a 5% risk. He says three to 5% and he’s off there just developing it as fast as he can on the P Doom. So I’m just wondering on this deep dive of a P Doom, and we’ve talked about regulating it, but we, this is something that could overtake humanity.

[00:46:32] Steve: And we talked about the letter and I think on our very first futuristic podcast, the open letter to pause it. But now we’ve, we’ve got to the level of P doom and we’ve even got the Doomsday Clock, which is quite famous and has been around for a long time. You’re a thinking man who studied history more than anyone.

[00:46:51] Steve: I know. I’m giving you an honorary doctorate in history just now. Just announced that from the Santino University about 12. Well, I’ve been [00:47:00] thinking about it for a long time. Cam, how do you see P Doom?

[00:47:04] Cameron: Well, I don’t know if I’ve said this on the show, but I’ve been saying this for a long time on other platforms.

[00:47:09] Cameron: I am very skeptical about the human race’s ability to survive this century anyway. Oh, right, okay. Not only is the human race not heading in the right direction, I don’t think it’s capable of heading in the right direction. I’ve been saying for years that we’re facing, I think, probably three existential threats concurrently.

[00:47:36] Cameron: Well, what are they? One is the threat of, you know, global nuclear war.

[00:47:43] Steve: still think is the

[00:47:44] Cameron: biggest one by the way. And, you know, maybe 10 years ago, 20 years ago, many of us had assumed we’d got past that to a large degree at the end of the Cold War. Things calmed down quite a bit in the nineties it seemed.

[00:47:57] Cameron: But of course we’ve [00:48:00] ratcheted it right back up over the whole Ukraine situation. You know, I’ve been having chats with, uh, Dennis, who edits this podcast in my QAV podcast who lives in Kyiv. I am genuinely concerned that, uh, you know, this could, I, I dunno what probability did I give it, but it could lead to some sort of a nuclear attack.

[00:48:21] Cameron: And the consequences of that are to, uh, horrible to imagine not just for people in Ukraine who’d probably be the first recipient of it, but you know, the, the counter attacks that would happen and the whole scenario that we’ve been living under since, uh, 1945. And, and it’s not just that. I mean, it’s terrorists getting access to nuclear bombs and that kind of stuff, which is always been a, a fear.

[00:48:43] Cameron: Then there’s the whole. Idea of, you know, the, the, the implications of climate change over the next 30, 40, 50 years. And we’re doing obviously a piss poor job. Nothing. We’re doing nothing. Let’s honest, we’ve just been ignoring it, [00:49:00] kicking the can down the road. Not our problem.

[00:49:02] Steve: Everyone cares, but no one cares because what you do is actually the function of what you actually believe.

[00:49:08] Steve: And everyone I know who cares dearly about the environment still catches airplanes and uses resources and eats meat or does one of the hundred things that ruin the environment or probably 95 of the a hundred things.

[00:49:21] Cameron: Yeah. And I think those people, you know, are, are hoping that governments around the world will get their shit together and do something about it.

[00:49:28] Cameron: But the governments aren’t prepared. There’s no will there to do anything about it, seriously. So we’re just, I think that’s an existential threat. Maybe it’s not gonna wipe out all of humanity, but it’s gonna fundamentally disrupt civilization in society as we know. It leads to the displacement of hundreds of millions, if not billions of people deaths, the collapse of economies and societies around the world over the course of the next 40 or 50 years.

[00:49:56] Cameron: And, you know, the implications of that, not just the, the seas [00:50:00] rising, but fires and, you know, extreme weather events and all that kind of stuff and, and the impact that those have on the economy. And then we’ve got the rise of AI nanotech. Gray goo Yeah. Gray goo. Yeah. Chemical warfare, paperclip

[00:50:15] Steve: maximization,

[00:50:17] Cameron: plagues.

[00:50:18] Cameron: Pandemics.

[00:50:20] Steve: We’ve listed about 10 things now, not three, but I guess the, the, the AI and other kind of externalities. Small things, smart things, things with their own ideas.

[00:50:30] Cameron: Yeah. The, I mean the third category I put into like the, the evolution of technology and all of the risks that it comes with, whether it’s building chemical, uh, agents that, or, or weaponizing a virus and releasing it.

[00:50:45] Cameron: All those sorts of things that can be done by non-state actors can be done by uh, uh, people who just decide, you know what? Screw this. I’m taking my ball and going home. I just finished reading. I dunno if I’ve talked about this on the show. You ever heard of the three [00:51:00]Body Problem trilogy? No. It’s a great, uh, science fiction trilogy came out over the last 10 or 15 years written by China’s leading science fiction ortho, and it’s just, I think it’s been made the first book, maybe it’s been made into a Netflix series.

[00:51:17] Cameron: It starts off with a disaffected Chinese physicist, a woman whose family, I think her father got killed during the cultural revolution and her career got, she got attacked by the, by the, the malice and the cultural revolution. And she just gets to a point where she’s like disenchanted with humanity and she figures out how to send a message out into the universe saying, Hey, you know, if anyone’s listening come and, uh, do something about the human race where we’re a plague and.

[00:51:48] Cameron: An advanced civilization, four and a half light years away gets the message and says, okay, we’re coming. We’re gonna, we’re gonna take over and destroy your planet because their planet had problems. But it’s gonna take us 400 years to [00:52:00] get there based on the speed that we can travel at. So the human race has 400 years to figure out what to do about this advanced civilization that’s coming to wipe us out.

[00:52:08] Cameron: Anyway, that’s

[00:52:09] Steve: a really nice idea though, because, yeah, I’ll tell you what, it gives you a goal, doesn’t it?

[00:52:13] Cameron: It gives you a goal, and then it looks at the different ways the humans try and deal with this over the next couple of hundred years. But the point being that this one person in this book, this woman, was just so disenchanted with where the human race was going, and not tackling its problems, and how we treat each other, that she just decided, fuck it.

[00:52:31] Cameron: I’m gonna put a call out for somebody to come and step in. So some external party needs to step in and get involved and come and parent us, because we can’t parent ourselves. And, you know, it could be somebody like that who releases a gray goo, or a weaponized virus, or a nanotech thing.

[00:52:49] Cameron: Or an AI virus, you know,

[00:52:53] Steve: Well, if someone is prepared to be a suicide bomber, then you should be able to envisage that someone who could [00:53:00] get their hands on nuclear technology, with that same sentiment, would be prepared to dismantle humanity through that methodology.

[00:53:09] Cameron: Now, this has been my theory for 20-odd years.

[00:53:11] Cameron: Uh, people close to me have heard me bang on about this over the years. And my theory has always been: A, we’re facing a handful of concurrent existential threats to human civilization on this planet. B, I don’t think humans are capable of solving them by ourselves. We’re not showing any signs that give me any hope that we’re going to solve them.

[00:53:34] Cameron: So therefore, if we are to survive, we need some sort of external intelligence or power to intervene. Now, this external intelligence is gonna come from one of a very small number of sources. A, it’s gonna be God, and I find no evidence to support theories of gods. B,

[00:53:59] Cameron: it’s gonna come [00:54:00] from an advanced extraterrestrial civilization, and I find no evidence that they, A, exist, or, B, if they do exist, give a shit or know that we

[00:54:11] Steve: exist. Yeah, give a shit, or can get here even if they wanted to, by the way.

[00:54:16] Cameron: The Three Body Problem books taught me the idea of the Dark Forest hypothesis.

[00:54:21] Cameron: Have you ever heard of that? No. You better tell me. You know Fermi’s paradox? You ever heard of Fermi’s paradox? Yes, I have. Yeah. For people who dunno, Fermi was a physicist who worked on the Manhattan Project. Fermi’s paradox was: if intelligent civilizations coexist with humanity, where are they? Why haven’t we heard from them?

[00:54:40] Cameron: And there’s a number of answers to that. But the Dark Forest hypothesis is: any advanced civilization anywhere in the universe, if it learns of another slightly advanced civilization, is gonna wipe it out immediately. Because here’s the situation. It’s like [00:55:00] the universe is a dark forest where everyone’s trying to stay as quiet as possible, which is why it’s called the Dark Forest hypothesis.

[00:55:07] Cameron: Here’s the theory. Let’s say you’ve got civilization A and civilization B, and civilization B learns of the existence of civilization A, because they start broadcasting radio waves out into the universe. Even if civilization B is a pacifist sort of civilization, it has to look at civilization A and go, okay, are they more advanced than us or less advanced than us?

[00:55:29] Cameron: If they’re more advanced than us, they might attack us. We don’t know if they’re pacifist or violent. They might attack us. If we discover them before they attack us, we should attack them first. Attack is the best form of defense. If they’re less advanced than us, how do we know they’re not gonna become more advanced than us a hundred years from now or a thousand years from now?

[00:55:53] Cameron: We know what the explosion of technological advancement can look like. So we should take them out now, and not run the risk of [00:56:00] them becoming more advanced than us. And basically that’s the theory: you have to stay as quiet as possible, because if any other civilization finds out about you, it’s in their selfish best interests to wipe you out as quickly as possible, because otherwise you might do the same to them, and they can’t be sure that you won’t.
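For anyone who wants that game-theory step spelled out, here’s a toy sketch in Python of why striking first dominates under the Dark Forest assumptions. Every number in it is invented for illustration; none of it comes from the books or from this conversation.

```python
# Toy model of the Dark Forest logic. The probabilities are made up
# purely for illustration; the argument only needs them to be non-zero.

P_HOSTILE = 0.5      # chance the other civilization strikes first
P_OVERTAKES = 0.5    # chance a weaker civilization later surpasses you

def expected_survival(strike_first: bool) -> float:
    """Civilization B's expected survival after detecting civilization A."""
    if strike_first:
        # Threat removed, whatever A's intentions were.
        return 1.0
    # Staying quiet: survive only if A is peaceful AND never grows
    # strong enough to strike later.
    return (1 - P_HOSTILE) * (1 - P_OVERTAKES)

print(expected_survival(True))   # 1.0
print(expected_survival(False))  # 0.25 with these made-up numbers
```

However you set those two probabilities, staying quiet can never beat striking first in this toy model, which is the grim dominance argument the hypothesis rests on.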

[00:56:19] Cameron: So, and then, as Tony Constan pointed out when we talked about this, he’s the guy who put me onto these books, he goes: and if they look at the human civilization, and look at our history, and look at how we’re running things now, why would you risk that civilization? He’s

[00:56:33] Steve: basically saying, we’re not good dudes.

[00:56:35] Steve: A lot of blood has been spilled over a long time. We’re not even good to ourselves, or anything on our planet.

[00:56:42] Cameron: I don’t think alien civilizations are coming to save us. So the only savior I see for human civilization is an artificial intelligence. A superintelligence that steps in and fixes all of our [00:57:00] problems for us, and takes control of the things that we don’t seem to be able to do a good job of now.

[00:57:05] Steve: Or fixes the problem that we create, which is us

[00:57:09] Cameron: and intervenes and says, yeah, well, I’m running things now. Everyone just relax. It’s gonna be okay. Now, there’s a risk that the AI will be aggressive towards us and wipe us out, but we’re gonna do that anyway. So

[00:57:26] Steve: it’s a mercy kill, isn’t it? It’s a

[00:57:27] Cameron: mercy.

[00:57:28] Cameron: It’s a mercy. For decades, I’ve always seen that the only hope for the survival of human civilization is an advanced artificial intelligence taking control as soon as possible, before we do more damage to ourselves that is undoable. Does it come with a risk? Yes. But not having it is, I think, a higher

[00:57:53] Steve: risk.

[00:57:54] Steve: So you see, on the p(doom), you see the 90% upside of it solving our [00:58:00] problems, which are gonna get us anyway.

[00:58:02] Cameron: Yeah. Well, my p(doom) without AI is extremely high.

[00:58:08] Steve: Is it higher than the 5%? See, that’s a great question. Rather than just having p(doom) on its own, you say, okay, what’s our p(doom)?

[00:58:16] Steve: Without AI, it’s probably higher than five, it’s probably 20 or 50%. So with AI, it hasn’t increased it, but what it does do is give us a superintelligence which could offset the p(doom) of AI itself. See, I like this. Now we’re getting little equations going across different dooms. I

[00:58:33] Cameron: think p(doom) without AI is 99% by the end of this century.

[00:58:38] Cameron: This is,

[00:58:38] Steve: yeah, okay. By the end, yeah. End of the century

[00:58:41] Cameron: by 2200, 2100, sorry.

[00:58:44] Steve: That says, on a long enough time scale, the probability of human survival is zero.
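To make the guys’ little equation explicit, here’s the comparison as a back-of-envelope Python sketch. The inputs are just the rough guesses thrown around in this exchange, not measurements of anything.

```python
# The hosts' rough guesses, not data: Steve's 5% figure for AI itself,
# Cameron's 99% for doom this century without AI.
p_doom_from_ai = 0.05
p_doom_without_ai = 0.99

# The argument: building AI is the better bet if the risk it adds is
# smaller than the risk it might avert.
if p_doom_from_ai < p_doom_without_ai:
    print("On these numbers, AI lowers net existential risk.")
else:
    print("On these numbers, AI raises it.")
```

The whole argument turns on those two inputs, which is Steve’s point: quote the p(doom)s side by side rather than a single number on its own.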

[00:58:51] Cameron: Yeah. I just don’t see how we get from where we are to Star Trek without an AI intervening. I just don’t see [00:59:00] humans evolving fast enough to get ahead of all of the problems that we have.

[00:59:05] Cameron: And this gets back to my book, The Psychopath Epidemic. Wonderful book, by the way. Thank you. I think our society is run by psychopaths, and we’re not doing anything about that problem. Most people aren’t even aware that that is a problem. And so, yeah, I’m pessimistic about our chances of survival. I think AI gives us a chance at surviving, without which we don’t really have one.

[00:59:25] Cameron: Yeah.

[00:59:26] Steve: Without the AI, we don’t. Our p(doom) is higher without it. I like that. You’ve just given me another bit to add to my next article, by the way, Cameron. I’m gonna steal it. How do you

[00:59:34] Cameron: feel about that? As long as you quote me. That’s right. I’ll

[00:59:36] Steve: quote you

[00:59:36] Cameron: Cameron Reilly. You know the three rules of quoting?

[00:59:41] Cameron: No. The first time you use it, you say, as Cameron Reilly says. The second time you use it, you say, as someone said. And the third time you use it, you say, as I’ve always said.

[00:59:52] Steve: Well, you have, you’ve said it three times by then, and three is the same as always. I mean, here’s [01:00:00] why I think we won’t survive, just while we’re on the topic: because our DNA hasn’t had a software upgrade in 200,000 years.

[01:00:07] Steve: And our DNA says, do whatever you can now, for the short term, or you might not make it through the winter. And that’s why we drive the car and we do whatever it is that actually helps us survive today, this week, next month and next season. And all humans are based on that. It even goes back to our employment now, our jobs, and putting up with horrible things our companies do. Because you know what?

[01:00:31] Steve: Making sure we can eat this week is way more important, because the long term doesn’t matter unless you get through the short term. And that’s the issue.

[01:00:37] Cameron: Yeah, it’s true.

[01:00:39] Steve: It is time for a technology time warp. Do we have the time? And it’s quite ironic, because the technology time warp today is digital watches, and I thought that was an interesting allegory for where we are today with the VR headsets.

[01:00:54] Steve: Just want to tell you about the first ever digital watch. A company called Hamilton [01:01:00] released the Pulsar in 1972, and in those dollars, back then, not today’s dollars, it was $2,100, which was more than most cars at that point in time. And guess what it did? It told the time. Didn’t have any other use case.

[01:01:16] Steve: Feels really similar to something that just got launched this week by Apple. But within a decade they were about $20. And then the true smartwatch came about 30 years after that initial digital watch launched in the seventies. And by the way, for the James Bond fans out there, this wristwatch, the Pulsar, appeared in Live and Let Die in 1973.

[01:01:42] Cameron: Whoa. Which I liked. Well, it came with an 18-karat gold case, Steve. So that’s where all your money went, the 18-karat gold.

[01:01:51] Steve: Yes. The digital watch didn’t do much, but I think these pricing patterns and finding use cases are the two things that are really interesting [01:02:00] about this launch for the business minds tuning in today.

[01:02:04] Steve: Exponential technology makes things a lot cheaper than their first iterations, and they eventually become affordable and democratized. And the second thing is that sometimes it takes us a really, really long time to find a use case. And the smartwatch, I guess, is really just a small version of the phone to a certain extent, but it’s sort of found its use case.

[01:02:24] Steve: And I think that’s a really interesting idea: sometimes it takes 20 or 30 years of iterations before we find a really good use case for a technology. And I think the digital watch is a really interesting allegory. So that’s our technology time warp for the day.
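As a side note on the “exponentially cheaper” claim: the two price points Steve quotes are enough to back out a rough halving time. A quick sketch, treating the $2,100 (1972) and roughly $20 (a decade later) figures as the endpoints of a smooth exponential decline:

```python
import math

# Steve's two data points: ~$2,100 in 1972, ~$20 about a decade later.
p0, p1 = 2100.0, 20.0
years = 10.0

halvings = math.log2(p0 / p1)          # ~6.7 halvings over the decade
halving_time_months = years * 12 / halvings
print(f"~{halvings:.1f} halvings, one every ~{halving_time_months:.0f} months")
# -> ~6.7 halvings, one every ~18 months
```

That roughly 18-month halving cadence is the same rhythm as Moore’s law, which is presumably part of why the digital watch makes such a neat allegory for the headset.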

[01:02:39] Cameron: Hmm. I’m looking at one of the original ones now.

[01:02:43] Cameron: Yeah, there was a whole lot

[01:02:43] Steve: of interesting ones. Commodore, actually. You know Commodore, who made the Commodore 64? They actually had some digital watches in the seventies too, before they got into consoles and computers and gaming machines.

[01:02:55] Cameron: Wow. Good stuff. Yeah, I think you’re right. These things sort of [01:03:00] get thrown out there, again, by engineers, and they’re not really thinking of, you know, the use cases and the marketing and making it something that people are really gonna buy or really gonna use.

[01:03:10] Cameron: Sometimes you just need to get it out there and let the market figure it out over time. But that’s not what I think Apple was good at doing. Apple during Steve’s era was really, really good at making stuff that was gonna change your life now. But, um, anyway,

[01:03:32] Cameron: let’s finish with the futurist forecast. Steve, what do you wanna talk about?

[01:03:37] Steve: I wanna talk about what you want to talk about first, Cam, cuz you wrote something down that I don’t know that much about. I know what a voxel is, but I don’t really know what a smart voxel is, so you’re gonna have to bring me up to speed, and the listeners.

[01:03:49] Cameron: well, the term was new for me, which is why I wanted to throw it on the list.

[01:03:54] Cameron: A voxel. I saw this from my old mate, Robert Scoble. The Scobleizer.

[01:03:59] Steve: Oh, the [01:04:00] Scobleizer. I remember. He was one of the early

[01:04:02] Cameron: bloggers, one of the early bloggers. Then he worked at Microsoft while I was at Microsoft, and we hung out a bit together in those years. The first podcast he was ever on was G’Day World.

[01:04:13] Cameron: I got him on as a guest early on. And in fact, he was the guy that was responsible for me getting an email from Steve Jobs. I’ve told you that story before, I’m sure, but when Steve Jobs did his Macworld announcement that said the next version of iTunes would have a podcast directory in it,

[01:04:30] Cameron: I did a blog post, this is like 2005, and I said, that’s nice, Steve, but how the hell do we get our podcasts into it? Scoble reblogged my blog post, or mentioned it, and Jobs read his, and then he read mine, and then Steve sent me an email telling me to speak to Eddy Cue at Apple about getting The Podcast Network’s podcasts

[01:04:49] Cameron: into Apple’s podcast directory. So that all came from the Scobleizer. So we’ve known each other a long time. Anyway, he posted this thing on Twitter the other day: a Unity employee at AWE 2023 [01:05:00] last night was explaining why every voxel in your home would soon be hyper smart. So I started thinking of my house as just a bunch of voxels.

[01:05:10] Cameron: If the resolution was one per millimeter, how many is that? 11 meters is 11,000 millimeters, 13 meters long is 13,000. That’s 143 million voxels all around me. So I didn’t know what a voxel was. I had to go look that up. According to Wikipedia, in 3D computer graphics, a voxel represents a value on a regular grid in three-dimensional space.

[01:05:33] Cameron: As with pixels in a 2D bitmap, voxels themselves do not typically have their position explicitly encoded with their values. Instead, rendering systems infer the position of a voxel based upon its position relative to other voxels. But basically, as I understand it, it’s a point, right?

[01:05:52] Cameron: It’s a... no, a unit, a cell, I guess, in your house. Would that be... is that your understanding of how to define a [01:06:00] voxel?

[01:06:00] Steve: Yeah. The simple explanation, to visualize it for people: if anyone’s seen Minecraft, right? So Minecraft creates, like, little pixels that are squares, which hold values, which can then create a physical representation of something.

[01:06:14] Steve: So now imagine that taken off the screen, so you can have boxes all around your house, in your physical being. Yeah.
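A minimal sketch of what Cameron’s numbers look like as an actual grid, in Python with NumPy. It assumes the 11 m × 13 m room he describes at one voxel per millimeter, and treats it as a 2D floor grid, which is where the 143 million figure comes from.

```python
# Minimal voxel-grid sketch for an 11 m x 13 m floor at 1 mm resolution.
# As in the Wikipedia definition quoted above, a voxel's position isn't
# stored with its value; it's implied by its index in the grid.
import numpy as np

MM_PER_M = 1000
grid = np.zeros((11 * MM_PER_M, 13 * MM_PER_M), dtype=np.uint8)

print(grid.size)  # 143_000_000 cells -- Cameron's 143 million

# A "value on a regular grid": set one cell; its location is just its
# index relative to every other cell.
grid[5_500, 6_500] = 1  # roughly the middle of the room
```

Strictly, that’s a pixel grid of the floor; add a third axis (say an assumed 2.7 m ceiling, another 2,700 steps) and you’re near 400 billion voxels, which gives some sense of scale for “an LLM in every voxel.”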

[01:06:20] Cameron: And there’s been lots of talk in hype circles, vaporware stuff, for decades about smart paint and everything will be digital and everything will be enabled.

[01:06:29] Steve: Yeah, it’s molecular manufacturing, that thing where everything down at the nano level has smarts embedded in it.

[01:06:36] Steve: Yeah. Spray-on wifi was one I read about in Wired magazine.

[01:06:40] Cameron: So this guy from Unity, which is a games developer, I think, said to Scoble that someday soon every single one of those voxels will have an IP address and an LLM and a microphone and a speaker. Maybe not for a few years though. And he puts a little smiley face on it.

[01:06:59] Steve: [01:07:00] Optimistic, because you’re basically saying everything at a nano level becomes a form of intelligence.

[01:07:07] Cameron: Yeah, well, it has access to it. And like, I’ve been waiting for this sort of reality for decades. I mean, this sort of stuff has been forecast by science fiction authors for decades: that we’re gonna be surrounded by intelligent devices that’ll communicate with the net, with each other, with us. You know, full immersive interactivity with your environment.

[01:07:31] Steve: Yeah, you’re just basically swimming in knowledge transfer. It

[01:07:35] Cameron: seems a long way off today, but a year ago, if you’d asked me how long it would be before we had an AI assistant, like a ChatGPT, I would’ve said 10 to 20 years based on what I knew about what was going on,

[01:07:51] Steve: I, I wouldn’t have thought it would’ve come this quick.

[01:07:53] Steve: It’s come quicker than we thought. Although I did have it in my book, The Lessons School Forgot. [01:08:00] I talked about, within five years, and I was really close. I wish I had the exact page number there, but I say in five years we’ll have an AI assistant which is like a Siri, but it has a PhD in every single subject, based on exponentials.

[01:08:14] Steve: I did the exponential charting as well, and here we are. I actually forgot that I wrote that, and when I wrote it, I’m like, well, it could happen. I almost didn’t believe it. Here we are. Well, you

[01:08:23] Cameron: good. It’s good that you bring up Kurzweil, and I think we mentioned him on our first episode. Like, Kurzweil’s been predicting the singularity.

[01:08:30] Cameron: 2030, 2035, 2040. We’re in 2023 now. And you know, the idea of the timing for the singularity is, the closer we get to it, the faster the innovation happens, on the bell curve, the J curve. I was gonna ask, the reason I added smart voxels onto this list was to ask you, based on all of your reading and research, what you know about what’s going on in that space.

[01:08:55] Cameron: Have you seen anything? How close do you think we are to having a house full of [01:09:00] Smart

[01:09:00] Steve: voxels? I haven’t seen... I mean, one of the things that’s been so interesting is that smart house adoption has been really slow, much slower than everyone thinks. And this is the whole human use case thing.

[01:09:13] Steve: People really don’t like smart things without analog optionality. And the idea of smart devices and doors that open and houses that heat themselves and the adoption of that is really, really low. So unless you have a curve jump where the utility is so grand that you go, I just have to have this, I can’t see it happening.

[01:09:32] Steve: The current smart devices, which are clunkier, RFIDs, NFCs, all of that stuff, image recognition of when you’re home, when you’re driving home, none of that stuff has happened. It’s all possible now, and the adoption is super poor. So I think you need a big curve jump before that would happen. But if you did get that curve jump, and I dunno how that occurs, but some sort of smart nanotechnology, then you’d adopt it.

[01:09:58] Steve: But I think the security risks of [01:10:00] smart things that we don’t have full control of is one of the things that is holding people back from adoption. We’ve all seen the Black Mirror episodes where you get locked in your own house or your car, or all of that kind of stuff, and it comes back to that p(doom) area.

[01:10:15] Steve: But I think smart things and smart items at a nano level, there’ll need to be some sort of a revolutionary moment before that becomes real. And it could be within a decade. Because one of the things that’s interesting is that the LLMs and the ability for machine learning, it’s like we don’t need to be involved in the development process to an extent now. It can find things that we won’t be able to find, you know, links between materials science and smartness that we could just never find.

[01:10:42] Steve: So that’s why that J curve, Kurzweil stuff becomes really important now, because it sort of is like, we’ve unleashed the beast. Now it’s out of our hands.

[01:10:50] Cameron: Yeah, and one of the problems with adoption curves for this stuff, particularly when it comes to physical objects, is, okay, let’s say somebody does come out with a smart paint [01:11:00] tomorrow, and you know, you’ve got sort of an IP address in all of the paint voxels, and it’s connected to the wifi, et cetera, et cetera.

[01:11:08] Cameron: Okay. Everyone needs to repaint their houses then to get this smart paint, right?

[01:11:15] Steve: Yeah. It’s not like you have a phone subscription that runs out in a year and you’re gonna get one anyway.

[01:11:20] Cameron: Yeah. Whenever you have to spend money to replace existing things that are working fine, there needs to be a huge benefit and a relatively short payback.

[01:11:32] Cameron: It’s like getting people to put solar panels on their roofs. Governments have been pushing that for decades, with very, very slow adoption. People are looking at payback curves and that kind of stuff. Why, if I’m gonna invest a couple of grand on this, when am I gonna get my money back? How long is it gonna take?
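That payback-curve maths is simple enough to sketch. The figures below are assumed for illustration, “a couple of grand” up front and a guessed monthly saving; they’re not from the episode.

```python
# Rough payback-period sketch for the solar panel example.
upfront_cost = 2000.0    # dollars -- "a couple of grand"
monthly_saving = 40.0    # assumed saving on the power bill

payback_months = upfront_cost / monthly_saving
print(f"Payback in about {payback_months:.0f} months "
      f"({payback_months / 12:.1f} years)")
# -> Payback in about 50 months (4.2 years)
```

Halve the saving and the payback doubles, which is exactly the calculation that stalls adoption.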

[01:11:47] Cameron: The move to electric cars is a similar sort of adoption curve. Yeah. Payback hasn’t been there yet. It’s very difficult to get people to spend money to replace something that’s already working, unless the benefit of spending that [01:12:00] money is enormous and, you know, there’s a fairly quick payback. The thing about LLMs, the Chat

[01:12:06] Cameron: GPTs, you know, is they’re released as a software freemium model. You get free access to 3.5, and 20 bucks a month, or whatever it is in Australia now, 25 bucks a month, for access to GPT-4. It’s doable for most people. It’s just like another Netflix subscription, right?

[01:12:23] Cameron: Adoption for actual hardware, advanced hardware, I think is gonna be trickier unless they launch it as a freemium model. What if they said, you know what, the paint is free, we’ll even pay for the guys to come and paint your house for free, and then you pay 20 bucks a month?

[01:12:40] Steve: You might do it in that case, you might do it

[01:12:43] Cameron: for the paint.

[01:12:44] Cameron: For using our smart paint.

[01:12:46] Steve: Yeah. On the smart paint. You know, if you had incredible utility, where every wall has connectivity to the internet, where it can act like a quasi-screen or an actual screen, who

[01:12:57] Cameron: wouldn’t do that? The last time I actually [01:13:00] spoke to Robert Scoble was probably a year or so ago.

[01:13:03] Cameron: We’d FaceTimed for a couple of hours, and he was trying to sell me on the idea that the long-term business model for Tesla is where all the cars are being driven by AI, and you don’t actually buy a car anymore. You don’t own a car. Yeah. Subscribe. Yeah. You subscribe to Tesla’s driving service. A subscription.

[01:13:25] Cameron: Yeah. Yeah. I need a car, and one turns up at your doorstep three minutes later and takes you wherever you want to go. And that’s the business model for cars in the future: you don’t own one. Like, the whole idea of everybody owning a car will seem archaic at some point in the future. Why would you own a car? That’s, like, crazy, right?

[01:13:46] Steve: Well, how can you show people how wealthy you are unless you have, like, a big piece of metal? And where are you gonna put your golf clubs? I’ve always said that. But then you start to get to the consumer side and the psychology of ownership, right?

[01:13:59] Cameron: [01:14:00] Yeah. I think those sorts of things will take time to be adopted, but I think it’ll be a lot faster than... I guess what I’m saying is the adoption of new technology, when it becomes physical hardware, has got a lot to do with the business model.

[01:14:18] Cameron: How do you make it accessible to people, particularly when AI’s gonna put a lot of people outta jobs? And waiting for Sam Altman... have we talked about his orb? The orb thing that he’s doing. We haven’t talked about that. Sam Altman’s got another startup, called the Orb, that goes along with OpenAI.

[01:14:37] Steve: Yes. It’s a global currency, isn’t it? Did I get that right? He’s going for a, the crypto bro side of it. He wants to take over the world.

[01:14:45] Cameron: It’s a crypto, it’s a coin thing. Well, his basic view is, look, AI’s gonna create a lot of wealth and it’s gonna put a lot of people outta jobs, but the economy needs people to have money.

[01:14:54] Cameron: So we’re gonna take some of that wealth and we’re gonna provide it as a UBI, universal basic [01:15:00] income. Ooh, I

[01:15:00] Steve: hate UBIs. They’re horrible. Disgusting. Can we put that on the agenda for the next Futuristic?

[01:15:06] Cameron: Yeah. I’m really shocked that you’re booing UBIs. Because,

[01:15:10] Steve: oh my God, UBIs are the worst fucking idea in fucking history developed by people who are already fucking rich.

[01:15:20] Steve: Here’s a crumb, and they don’t understand basic economics, and I will tear that shit apart next week. All

[01:15:26] Cameron: right. Okay. Well, that’s how we’re gonna finish this episode: Steve dropping a cliffhanger for next week’s episode. Why UBIs are the worst idea ever.

[01:15:40] Steve: They really are. They’re one of the fucking worst ideas I’ve ever heard in my whole fucking life.

[01:15:44] Steve: I swore three times

[01:15:46] Cameron: ducking. You have a ducking lock.

[01:15:48] Steve: Pathetic, they are. And by the way, they will fade away into obscurity, and they have done to a certain extent, until you raised it. And I’ll go through that next

[01:15:57] Cameron: week as well. Good to chat, Steve. Have a good [01:16:00] week buddy. You too, champion. Loved it.