Self-Perfected Podcast

292 The Future of Philosophy, Time Machines, and Accelerationism

Mitchell Snyder, Cameron Cope, Drake Pearson


A stage mentalist flips a card onstage, Melania’s face tightens, and moments later the room erupts into a shooting scare at a White House dinner. That’s the kind of clip that breaks your brain in 2026, because you’re not just watching an event, you’re watching a narrative form in real time. We start with what can be verified, then follow the internet’s instinct to connect dots: Oz Pearlman, the “shots fired” line caught on camera, and a bizarre Time Machine banner image that appears to echo a famous Trump moment years before it happened. 

From there we zoom out into the deeper story: why trust is collapsing. We talk Uri Geller and the CIA-funded Stanford Research Institute experiments, and what it means that institutions have chased "paranormal" edges when power was on the line. We connect that to Palantir-style surveillance, the panopticon feeling of always being watched, and the way algorithms can become a hypnopticon that steers behaviour without needing force.

Then we hit the big question: are we building AI to help humans make better decisions, or are we building it to remove humans from decision-making entirely? We unpack AI accelerationism, the idea that capitalism behaves like an information-processing machine, and why some techno-optimists treat autonomy as the end goal. The red button blue button thought experiment becomes our mirror: how you vote reveals what you believe about other people, responsibility, and survival. 

If you care about AI ethics, media literacy, surveillance capitalism, and how conspiracy thinking thrives in uncertainty, this one will stick with you. Subscribe, share the episode with a friend, and leave a review, then tell us: red or blue, and why?

Cold Open And Fight Night Jokes

SPEAKER_09

Good morning.

SPEAKER_04

Good morning, world. Good morning, Vietnam. Yeah, I was I was I was kind of picturing we have some intro like that. You ever see out those UFC events?

SPEAKER_02

Like, oh no, the main event.

SPEAKER_04

As pictures, and Drake could probably do that for us. I'm just kidding. I think I would, out of all of us, I think I would have to be that guy, and I'm not gonna do that. Not yet.

SPEAKER_07

I I want to be one of the show girls that holds up the stuff.

SPEAKER_04

Drake can be the fighter in the ring. I'll be the announcer guy. Okay. I have to wear that like fancy tuxedo, though. You know, I forget the guy's name, but it's like an iconic thing. So hey, good morning, everybody. You have to fight your own ego.

SPEAKER_03

Bruce Bruce Drake versus Eagle. Hey. Bruce Segal? Bruce Buffer. I think that's his name. Bruce Buffer. That's his name.

SPEAKER_04

Yeah. Austin, you could you could probably do that. I don't even know who that is. Oh, okay.

SPEAKER_09

And he shouts just like I just like I was.

SPEAKER_11

And then I think his brother is the one who does the.

SPEAKER_09

You guys are worrying me that you know that.

The White House Dinner Shooting

SPEAKER_11

He he's the one his brother is the one that does the the uh the boxing ones. The the because they have trade boards.

SPEAKER_04

Okay, how about WWE? Okay. Let's go. The realest one out of them all is WWE. AKA, our political system. Speaking of WWE. Speaking of WWE, dude, last night before I went to bed, I'm like, let me just check X real quick. Same. Make sure, you know, make sure nothing crazy happens. Oh, something crazy happened. Something crazy happened. And it gets dude, dude. Do crazy. Do you know who the guy was on stage?

SPEAKER_11

Wait, wait, wait, wait, wait. Can we slow down and tell people what's going on? No, no, no.

SPEAKER_04

No.

SPEAKER_11

Not yet.

SPEAKER_04

Not yet. Okay, here we go. Let's just stop. Dude, you know who Oz Pearlman is?

SPEAKER_03

The guy who was on stage at the moment it happened? Uh I'm a s I I said some.

SPEAKER_07

You tell me because I don't know it for sure.

SPEAKER_04

He's a fucking mentalist hypnotist. He's like the Uri Geller of today. Yes, he was on. I didn't watch it, but he was on the Modern Wisdom podcast. And uh it's like mentalist hypnotist blows Chris's mind with knowing something about so he's like literally the fucking master hypnotist.

SPEAKER_07

For my next trick, I'll make everyone disappear out, I'll make the president disappear out of this room.

SPEAKER_04

Yeah, and then you see like the reaction because it looks like Melania's face is like, but I think it's like before the actual thing goes, I don't know, dude. We're gonna go deep down, deep down the rabbit hole.

SPEAKER_07

Yeah. Oh man, this is crazy. Like, I just want to say real quick, like, I've I've been doing my live streaming, right? And like two nights ago, so not last night, but the night before, I went into like a whole fucking deep dive on Uri Geller. Oh shit. Okay, like spoon bending and all the different CIA experiments that tested his psychic abilities and shit. Like, that's fucking weird, dude. Then it's all coming together, guys. I saw a tweet where somebody says Oz is a magician, and I was like, Dr. Oz is also a magician? I thought they were talking about Dr. Oz. I didn't know.

SPEAKER_04

Yeah, no, I guess I guess his name is Oz Pearlman. I never really looked into him, but he's like modern day The Wizard of Oz.

SPEAKER_11

Dude, okay, hold on. That's weird because I I've only just heard of Oz Pearlman like three days ago. I saw a post that was uh I'm looking for a magician to hire for a one-hour event. And uh this White House Correspondents' Dinner.

SPEAKER_04

Um, last minute, okay. Who do we get?

SPEAKER_11

This guy quoted me $165,000 for an hour. There's gotta be somebody cheaper, right? And then so I guess maybe somebody was making that a meme, and it was Oz Pearlman was the name, obviously. And just to like get people introduced to like who is Oz Pearlman, and like of course it's like the textbook fucking thing.

SPEAKER_04

It's like textbook, textbook false flag. So so anyway, in case anyone was not on X last night at 8 p.m. uh there was what do we objectively know is true? Trump was hosting the White House Correspondents' Dinner. Well, not hosting, but whatever he was doing.

SPEAKER_07

I don't even know if that was really happening at this point.

SPEAKER_04

So there's this thing called the White House Correspondents' Dinner, and uh at it there was a shot, a shot fired.

SPEAKER_07

Oh apparently. Apparently there was somebody rushing the lobby, and then the the Secret Service or whoever, or police, like shot him, and he even shot one of the officers, maybe, and so they killed him, and then that was during the actual dinner, and then they got Trump and fancy.

SPEAKER_04

Yeah, so they like rushed Trump and Melania out of the room, and then they killed him, Secret Service, like, and then they handcuffed him. Well, that's standard operating stuff. Just to make sure he wasn't a zombie. Um yeah, but then you clearly see like Secret Service and like the guys in full-on machine guns and helmets and stuff like come on the stage. So it was very it was definitely it looked dramatic. There's a funny CNN feed then of where all the people are kind of like crouching under their tables, and this this one congressman is just like taking a bite of his dessert. I love that one.

SPEAKER_07

It was like this guy, he was sitting there, like everyone's crouching, and you see this guy like just go into his little bowl and he's like eating something. I'm like this guy, man.

SPEAKER_04

It's kind of like habit, right? You're like watching WWE and you have to like eat. He's like, I'm stuck here. Might as well enjoy the food.

SPEAKER_11

Dude, we we have um we have uh Arabian Nights or Medieval Times. You guys have that? You know?

SPEAKER_07

No, I think you talked about this. Is this a place you go to?

SPEAKER_11

Yeah, yeah, yeah. It's it's a place, yeah, yeah, yes, yes. Okay, we have talked about it because that was Cam's first question the last time. Yeah, yeah.

SPEAKER_04

Bringing me back 105 episodes.

SPEAKER_11

That's right. That's right. But it's like you go and you sit at um you sit in a I guess it's a large round like pen for horses, but you don't know that yet. You're just sitting there. It's like a night's tale.

SPEAKER_04

So it's like an interactive dinner theater kind of thing?

SPEAKER_07

Yeah, yeah, yeah. Okay, okay. Yeah. Do you get giant? I know I've probably asked this before. Do you get giant turkey legs?

SPEAKER_11

You get giant turkey legs. Oh shit. And you eat with your hands. You don't get a fork at night. Why would you? Because you're, you know, a king or a knight or I don't know. Why would a king?

SPEAKER_04

You guys we should make one of those, but it but it's of the history of mankind, and you're like you're like an Atlantean, and then you get like sucked in. You're being like reincarnated, and you just have to eat water. You get reincarnated into the Stone Age. Yeah, self-perfected theater, guys. Maybe one day, but yeah, so so that was pretty crazy.

SPEAKER_11

Um, so you know what's really interesting about this whole thing is it's coming at a time right after, for whatever reason, Tim Dylan just recently did another breakdown of uh when Trump was shot. Yeah, exactly. And and and just looking at the the film footage of like, okay, Trump is shot or shot at, and then uh why are they moving these photographers into position? Why did all of a sudden the flag come down? Like this flag comes down right after it, literally, right after the flag comes down, they move the photographers into position. It's it's all like just like that's really suspicious. That's crazy, you know?

SPEAKER_07

Okay, can I do a little schizo conspiracy theory for you? Go ahead, schizo. I rejects your own. This is what I come here for. Yeah, I come here for. Dude, it's crazy because I was like, you know, I did this whole live stream last night on another whole crazy schizo thing. And I'm like, okay, maybe I'll talk about that. We got some things we can talk about, and then we're like, nah, we got a new one already. The Matrix just keeps dropping new.

SPEAKER_04

Dude, okay, that red pill or the red button, blue button, we definitely got to touch up that. Yeah, yeah.

SPEAKER_07

That's yeah, exactly. That's what we were like, we'll talk about that.

SPEAKER_04

I was like thinking, yeah, I was thinking that was gonna be the whole arc of the story today.

Palantir Ad And Techno Dread

SPEAKER_07

So hold on. Take this non-story out of the way so we can talk about what we really want to do.

SPEAKER_04

Before that, guys, uh, did you see Palantir's new commercial? Can I show you guys this first? Sure. I think this could this could warm us up. Uh all right, so new commercial just dropped. Uh let's watch it.

SPEAKER_12

Listen up, losers. Stop pretending it's a democracy. We run things, give up the illusion, and in exchange, we will bring you order and efficiency. Yes, we'll own you. But do you really want to be free? Trust us. We know what to do. While Silicon Valley was feeding you dope and free email, we built the architecture of empire. Welcome to our world, motherfuckers. We aren't here to protect your privacy. We are here to enforce supremacy. We are the ledger now. Every tax return, every Medicaid file, every license plate, every crossing. Your president signed it into being with a pen in March. You kept scrolling. Your politicians are empty vessels. Your civil liberties are a liability. We are done pretending all cultures are equal. We know who the elites are, we know what we are building, and we demand that you applaud the billionaires taking the reins where your fragile democracies have failed. Welcome to the Technological Republic. You can stop scrolling now. We already have everything we need. Try to unplug us. We dare you. Now get the fuck out of here, you peasant.

SPEAKER_07

Oh, that's great. Yeah.

SPEAKER_09

Like shaking.

SPEAKER_07

I don't I'm probably not gonna go into all this. We're not. I don't know. We'll see. We'll see what we have time for tonight today. But the live stream I did last night, you guys, it would blow your fucking minds. Like I'll watch that replay. Let's hear it.

SPEAKER_04

Yeah, I want to watch it. Can you give us at least a little uh little teacher? No, no, no. Maybe we'll come to it. At least tell me the title. The thing is, I don't want last night.

Cole Allen And The Time Machine

SPEAKER_07

I don't want to not do it justice, so I just scare the shit out of everybody and not be able to direct it. So you can be like, okay, okay, I see what the whole picture is here. But that live stream I did, uh the stuff I went through was like fucking insane. And it has a lot to do with Palantir, that whole thing that guy, what's his face, Karp, was just talking about. Um but why don't we schizopost for a second? Um okay, so so the shooter's name, okay, last night, this alleged shooter who they killed. Oh, he was a teacher from California named Cole something Allen. Okay, Cole Allen. So then of course, immediately the internet's looking for any reference to this guy, right? So they find this post from this Henry Martinez account from 2023, it says Cole Allen, that's the only post on the account. Oh no. That is the only post on the account, okay? 2023. Um 2023. Would that have been this would have been after Butler. Right? Before the inauguration in 2025. When was it no, that was 2025.

SPEAKER_04

Uh it 2024, July 13th, 2024 was the first one.

SPEAKER_07

Right, because 2025 is when he went into office. It was at the beginning of last year. Not that it really matters, I was just curious. Okay, so it's the only post on the account. This is his background. Okay, some random banner image, okay. The only post. So then people started looking into, okay, we got this thing. Um, and then what is that picture in the background? So they traced it to a website called the Time Machine. Okay? So look at this image. So when was this image uploaded? I don't know, it was on that app. It was posted, no, it comes from a 2022 website called the Time Machine. But it doesn't really matter because it's his banner image. I don't know, but the thing is I don't know when the banner image was uploaded. But look at the uh picture, okay? I see it.

SPEAKER_08

No fucking way. Show it next to each other. Yeah, show it right next to each other. Look at it. No, no, no, no, no. Do that again. Do that again. That was crazy.

SPEAKER_07

Yeah. I'm trying to reload that time machine website. 3D digitization. Holy shit. Now it looks different though, doesn't it? No, it doesn't look like that.

SPEAKER_04

Yeah, I mean, you can see the face and the arm if you if you look for it. No, it's the same. It's the same picture.

SPEAKER_09

Okay, okay. Can you go back and do that again? So what side by side.

SPEAKER_07

See, look at here's Trump's head.

SPEAKER_11

We can't see what you're looking at. Oh, I'm sorry, I know we can't see what you're doing. Somewhere else. Okay.

SPEAKER_07

So here's Trump's head. Yeah. And then his fist. Can you see my cursor?

SPEAKER_11

I can see your cursor now. Okay.

SPEAKER_07

Trump's head, there's the there's the fist. Right. There's the other guy's head. Someone did a uh a video version of it, kind of like fading into the crazy. That is weird. I mean, it seems like, oh, that's just a weird but it's on the fucking page. Like now everybody's talking about it. What is this? Seven. Twenty. What is this? Twenty. Nine plus five plus six. Fifteen, fifteen, twenty, eighteen, eight. Okay, that's just some random shit or something together. Maybe the right uh just another schedule. Um and then apparently this guy, Henry Martinez, Henry Martinez was a right here. According to Cole Thomas Allen's LinkedIn, he interned at NASA in 2014. NASA published a paper in 2014, and Henry Martinez was an author, which is the same name as this person's account here. And the Cole Allen, which may or may not be the same guy, but the same name, was a fellow at or was an undergrad at JPL. Which for anyone who didn't see it is the Jet Propulsion Laboratory, which also somehow connects to the missing scientists with the rocket stuff and the UFO shit and like all the it's like a schizo's wonderland right now, guys. Oh man.

SPEAKER_04

What the fuck? So it's unknown, basically, when that photo was made. It would seem as though the Trump Butler event happened, that photo was taken, and then later it was turned into this modified cover photo.

SPEAKER_11

No, no, no, no. No, that photo is from 2022, or a 2022 website anyway. The tie-dye photo.

SPEAKER_04

Yeah, yeah. But was it actually on the website at that date, or was it added retroactively, after Trump's right? So my logical brain says, oh, it was just, you know, added afterwards.

SPEAKER_11

Perhaps, but why is it called time machine? What's what's the deal with that website?

SPEAKER_07

Yeah, okay, let me go back to the actual full website. Hold on. Hold on, hold on. Okay, so I just clicked on the actual website.

SPEAKER_04

What is going on? Is it called time machine.com?

SPEAKER_07

What's it what's that actually adding a new dimension to the past, building a time machine? Become a part of the time machine.

SPEAKER_04

What?

SPEAKER_07

What what is this? What is the time machine? Become a part of the time machine.

SPEAKER_14

The continent. In archives, beneath stones, in crumpled maps and winding streets, in pictures and paintings, in the depth of its green forests, history in Europe lies everywhere. How does one preserve and explore such a dense and fragmented past? How does one bring it into resonance with our contemporary world?

SPEAKER_07

Venice.

Digitising History Or Rewriting It

SPEAKER_14

Thanks to the development of big data and artificial intelligence technology. Since then, dozens of European cities have taken similar steps. Each now contributes to the creation of a unified temporal exploration tool accessible to all. All these initiatives are now gathered within a consortium that groups together more than 250 European partners, universities, archives, research centers, and also private companies. Time machine participants include historians, engineers, geographers, developers, entrepreneurs, researchers from a variety of backgrounds, as well as ordinary citizens, defenders of their heritage.

SPEAKER_07

Um, okay.

SPEAKER_04

So it's like a digitization of the on the surface, that what's that's what it looks like, but it could be some big brother shit where they're trying to rewrite history.

SPEAKER_11

No, no, no, no, no. I don't I know what it is. This is a fucking alien from the future trying to enter into our past. Then this is how they're doing it.

SPEAKER_07

Obviously.

SPEAKER_11

That's Occam's razor. Come on. The most likely event.

SPEAKER_04

I've been going in on some of the Jorjani stuff too. Man. That shit's crazy. The one with Jesse Michaels, where he goes all out with that.

SPEAKER_11

Yeah. In the white room. Yeah.

SPEAKER_07

Isn't that one great? I I just finished uh promoting his first book. It's fucking wild, dude. At the end, he's talking about he's going through like the Old Testament, and like you know, when they were in the desert, and they would have this object in the sky that they would follow at night during the day or whatever, and then it would stop and they'd have the Ark of the Covenant there, and then it's like similar to the object that was described whenever Gabriel came to see like Muhammad in the desert and shit, and like it's it's crazy. That book's actually really cool because he goes into like a lot of deep philosophy as well, like all kinds of stuff. He talks about a lot of it in the interviews, but like he's kind of giving you like the conversational version, but the books that he writes are like they're like fucking scholarly, like philosophical, like it's it's a lot of shit in there.

SPEAKER_11

You know, what's funny is it reminds me of uh Raiders of the Lost Ark, right? Weren't they trying to describe that in that movie? I think it was.

SPEAKER_07

I was way too young to understand anything.

SPEAKER_11

Yeah, like it was some kind of uh like if you looked at the Ark of the Covenant, like you know, your face would melt or something. Is that what happened?

SPEAKER_07

Yeah, yeah, yeah, yeah. Yeah, if you opened it up or whatever, and then it was like killing Nazis.

SPEAKER_11

And then it was also somehow strangely connected to a crystal head of like an alien, right?

SPEAKER_07

Oh yeah, yeah, yeah. Like the crystal skull or whatever.

SPEAKER_11

Yeah, I think I'm not combining two different movies, right?

SPEAKER_04

Oh, well, there was multiple movies. I think it's a separate movies.

SPEAKER_07

One of them was I think that was like the more recent one they did. Oh, okay, okay. But it's still about Nazis, obviously. And uh who made those movies?

SPEAKER_06

Who made those movies? Oh no, don't tell me it's who was it that made those?

SPEAKER_04

Who made Indiana? Somebody who can you that was like a core memory of yep, fucking Spielberg and George Lucas.

SPEAKER_07

And who's making the disclosure day movie?

SPEAKER_04

And George Lucas fucking with Star Wars dude.

SPEAKER_11

I don't wait. When does that disclosure day movie come out?

SPEAKER_04

I think it's June 12th. Uh yeah, June 12th, 2026. Alright. Why? Were you gonna like time machine on this? I thought it was today, honestly. What if no? I'm definitely going to it when it comes out. I'm I'm getting front row to that shit. You're gonna get front row brainwashed.

SPEAKER_07

I'm definitely going to that. All of my work, all of my live streams for naught.

SPEAKER_04

At least you'll have been prepared. Yeah. I mean, I want to see what they're like how they're positioning it, right? I want to because there's gonna be so many fucking people who see that movie.

SPEAKER_07

I am not going to any fucking movie theaters and any major cities. No, I live in Las City.

SPEAKER_08

Yeah, definitely don't go in Minneapolis.

SPEAKER_04

No, I'm definitely I don't go to the fucking Minneapolis theater. I'm just saying. I'm just saying that would definitely be ground zero. Yeah, dude. I know, I know. That would be ground zero. Yeah. They definitely got some psyop fucking HAARP shit mind controlling people here. You're not a line.

SPEAKER_07

It's the movie, Mitch. The movie itself is the fucking mind control.

SPEAKER_04

I know. I'm gonna I'm gonna break the fucking mind control. So I remember being a kid watching Indiana Jones up as like age six, having no context, just remembering like, oh, it's so cool that he's fighting. We talk about this on the podcast too. I thought it was like fighting ninjas or some shit. It was not ninjas, I know that now, but that totally imprinted in my mind and then going to Disney World. You know, Drake, the you fucking worked there. I've been there, right? You were Indiana Jones dressed up in costume. I was not, but I but that's so impulsive. Can you just assume I worked at Disney? I know you worked at Disney because I was a child, all right. Listen, yeah, you know how they're into like childhood.

SPEAKER_11

You know what they say, you know what they say when you're at Disney and you're on like the the jungle ride, they're like, don't fall off the boat. Uh any kids that fall off the boat, we send them to it's a small world and you gotta work there until your parents come find you.

SPEAKER_07

Yeah, that's so fucked up. So, so that that I'm just looking at that picture again, like we were showing on that website. I'll just show you real quick. So it's about 3D digitization of tangible cultural heritage. But like you see here, it says 5 May 2022. So that was made in 2022. Potentially, obviously. I mean they could have uploaded that particular picture at any time, but still it's But it seems like the five years.

SPEAKER_04

So there is actually a website though called um what do they call that website? It uh don't they call it Time Machine Machine? Wayback Machine? Yeah, Wayback Machine. That's a separate thing.

SPEAKER_07

That's more like yeah, but you can actually see what's on it.

SPEAKER_04

Okay, so let's do this. Oh Wayback Machine, and what's it called? Time machine.

SPEAKER_07

I think that would assume it was archived though.

SPEAKER_04

Time machine archives.

SPEAKER_07

Actually, I don't know how it works, I've never used it. Yeah, I think it's but it's that particular picture. It's it's this particular um page on that.

SPEAKER_04

Kim, what what's the um what's the URL exactly? Time machine.

SPEAKER_07

I'll text it to you. Okay.

SPEAKER_04

It's a particular page on that website. Okay, we'll we'll see. Because if this I mean, I don't know how valid Wayback Machine is, but it's easy for you.

SPEAKER_07

I see.

SPEAKER_04

Yeah, yeah. Okay. Here we go. Wayback Machine. Moment of truth. Uh it's oh my god. It started in 2022. May 6, 2022. Yeah, you have it? It's in there. I'm gonna open it first so I'm just not like randomly scrolling on a screen. This is our first snapshot, May 6, 2022. Oh shit. Moment of truth, it's loading.

SPEAKER_11

Okay, alright, load it. Come on. Don't don't keep me hanging here. What's happening? What's happening, Mitch? We're just staying.

SPEAKER_04

Oh my god, it's fucking on there.

SPEAKER_11

From May 6, 2022.

SPEAKER_04

Look.

SPEAKER_11

Mitch, Mitch, you're not sharing anything.

SPEAKER_07

Center yourself in your solar plexus. Bring yourself here.

SPEAKER_11

Yeah, that's it.

SPEAKER_07

Don't go into your mind.

SPEAKER_11

Mitch, are you gonna share it or what?

SPEAKER_04

I'm gonna share it. I'm gonna share it. I have to make sure I'm on the archive.

SPEAKER_07

Hey, it's a good thing I found the Jason Jorjani to help everybody understand. Like reality's a lot weirder than you think, okay? Guys, okay. So just relax.

SPEAKER_04

So this is okay, yeah. It is on here. This is on here as of and it was May 6th, 2022. Yeah, May 6, 2022.

SPEAKER_07

Was on here. I mean, look, I know you could go all schizo and be like or you could be anti-schizo and be like, no, it's just a coincidence. That's not really the same picture, but I mean it's on the I don't know, that four-part overlay was pretty. And you got the overlay with it.

SPEAKER_04

And it's called Time Machine.

unknown

Yeah.

SPEAKER_07

And it's called Time Machine.

SPEAKER_04

And they have the whole Trump thing from the future, you know, that fucking you know that thing? Baron von Trump.

SPEAKER_07

I I I probably these are some things you already know, but like his dad, sorry, his uncle was an MIT professor, and he was the guy that the government sent to get Tesla's papers. Yeah. I'm not trying to say that implies something, it's just fucking weird. And his mom's maiden name is Christ. Trump's his grandmother. His grandmother or his mom? I can't remember which. It's either his mom or grandmother. Okay. You can look it up. It's like Mary Christ. That's not Mary, whatever her first name is.

SPEAKER_04

Elizabeth Christ Trump. Yeah, she was the paternal grandmother of Donald Trump.

SPEAKER_07

Okay, okay. So he literally comes from on his mother's side, the last name Christ. She co-founded the real estate firm Trump and Son.

SPEAKER_11

Let me let me just share this.

SPEAKER_07

She's the one behind the real estate shit.

SPEAKER_11

Because I didn't know about this whole 3D thing, right? Okay. So this is this is just the video of what happened in Butler, Pennsylvania. Okay. So I'm not gonna play the sound because it's just some fucking weird music or whatever. But okay, shots are fired, right? Trump's down. Then they're like getting him up, and then all of a sudden, uh, where is it? They're turning the stage. There's they're moving this thing over. Why? I don't know. Then this is Secret Service, he's ushering in the photographers, right? The photographers are taking this.

SPEAKER_07

The flag lowers. Yeah, yeah. I remember seeing that. They said because it hit the hydraulics. So that's what they said. Yeah, it hit the hydraulics to say that was that.

SPEAKER_11

It couldn't hit Trump's face, it hit the hydraulics. Right. And then they get these guys all right up in there. Like, what is this? What is that? Why why is okay? If the Trump, if the Trump, if the president or uh former president and running for the next campaign president is has just been shot. Why is Secret Service so worried about like fucking photographers coming and getting their their shot?

SPEAKER_07

We couldn't possibly have known there was a shooter on the roof, but we definitely know there aren't any more now.

SPEAKER_04

Right. So it's all good. But Trump is so brave he was willing to stand up and raise his fist. Dude, I remember the feeling of seeing that and being like, fuck yeah, dude.

SPEAKER_06

Like you could feel that come up. And then that was that whole meme.

SPEAKER_04

It was like maybe it was close between Harris and Trump, but not anymore. It looks so badass like Trump's gonna win.

SPEAKER_07

We observed the whole thing, like we were there. It's not like we are just retroactively saying this. I remember that was when Elon Musk was like all in on Trump. Yeah. It was like And then all of Silicon Valley was like either they were in on it or they sold all them. You know what I mean? Like at some level. Right.

SPEAKER_04

Yeah, I because I remember uh I was at a sales event, and this one guy at dinner, he's just like, dude, that was so fucking staged. That was so staged from the Trump.

SPEAKER_02

I'm like, what?

SPEAKER_04

It's and he's like, dude, if you really look at it, like, and you see, because apparently got shot in the ear, right? And now you see Trump's ear. He's like, dude, there's no way he got shot in the fucking ear with a with a with a rifle. And I was like, ah, maybe I don't like I'm I'm I got the point though.

SPEAKER_07

It's like it's yeah, well, the thing is it's any of it could be one way or the other. That's the great thing about it. Yeah, it could be either way. The thing about that picture though is fucking weird, right? Like, it'd be one thing if you figured out okay, it was uploaded later and shit, but like we just verified ourselves independently of any of the posts.

SPEAKER_11

Yeah, I don't know. It it it almost seems like it almost okay, because consider this. What if it was staged?

SPEAKER_07

How would they stage it so perfectly to match that weird digital photo and why? I don't know, but that's what I'm saying. That's where my mind goes. I'm not a skeptical person. I'm just saying, like, it still make it make sense. I don't know. I would I want I want to believe.

SPEAKER_11

Listen, I look look look like look. I got the spell shirt on today. Spells oh there you go.

SPEAKER_04

Yeah, well then there's this whole fucking rabbit hole, this hashtag stage thing I just looked on X.

SPEAKER_07

And then if you have a stage magician on the stage doing a magic trick right before it happens, remember, guys, they have to tell you because if they tell you, then karmically they're not responsible, okay? So karmically, you they told you, so therefore they can get away with it. It's one of the rules.

SPEAKER_11

That was the rules. That was the rules. Anyway, it's it just seems like okay, what are they selling us now? Like, regardless, it it seems like there's so much crazy shit that happens. Like, as an example, during Easter or on Easter, um, what happened? Uh Trump tweeted out that, or whatever, on his own version of Twitter. He he put out a post that was like, um we're gonna bomb you back to the Stone Age. Praise be Allah, right? And when you saw that, that was like I don't know, maybe it could be real. It seems like that's gotta be digitally altered, somebody just saying that, like, nope, that's real. Like, that's also plausible. That that also makes sense. Like, I you know, it could go either way, right? And and then like Trump posts that photo of himself, or reposts that photo of himself as like Jesus, and no, in the comments or on Twitter, it's like or or Grok is saying, nope, somebody just uploaded that, that that's not him, and like, but actually, no, it was him, and it could go either way, you know. It's uh there's all this stuff that happens today where you're like, I don't know. You know what I just learned about recently, or what I just heard about? Um, Kash Patel apparently was uh unresponsive, like in his office, he was apparently just like he had to be uh st out sort of or something. Yeah, just drunk. I heard this. Yeah, apparently he was drunk, like apparently he drinks on the job, right? Which I wouldn't blame his job. I would not blame him, honestly. Well, what's the good whole thing?

SPEAKER_07

It's always like this, because like you may believe I'm still alive.

SPEAKER_10

Yeah, there's something I'm still here.

SPEAKER_11

So uh the Atlantic put out a story recently, is like not too long ago. They put out a story that basically he got locked out of his login, which is like just code for like you're fired, you know? You're fired. We didn't tell you yet, right? Yeah, uh so he got locked out of his uh his login at the FBI, and he immediately just started calling people up. He's like, it happened, I'm fired, it's done. Yeah, uh, but so then people start calling the White House, going like, so who's in the next position for FBI director? And they're like, What are you talking about? They're like, Well, we obviously fired Kash, right? So, so who is it? And uh and they're like, No, he's not fired, and it was just some like technical error.

SPEAKER_04

He forgot his password because he was drunk.

SPEAKER_11

He was so drunk he forgot his password. They said it was a technical error, right? Um, but uh anyway, the Atlantic put out this whole article about him being a drunk, about him being unresponsive, about him, his response to this and thinking that he was fired, and people were just like he was freaking out. Uh, and now he's suing the Atlantic for $250 million uh for slander or libel or whatever the fuck it is, right? Uh Katie says, um, "he's a drunk" is a great cover for firing someone who doesn't want to go along with the plan. Right.

SPEAKER_07

It's just like with the whole thing about Kristi Noem's husband with the big fake ass titties, right? Like they probably knew about that the whole time.

SPEAKER_11

I don't know what that's about.

SPEAKER_07

You don't know about that.

SPEAKER_11

No, no, I I just I didn't care that Kristi.

SPEAKER_07

Can you look that up, dude? Kristi Noem with it with it. But I I'll just say this, I'll just say this funny thing real quick.

SPEAKER_11

Like Mitch, can you look it up so our wives are not like, what the fuck are you looking at?

SPEAKER_07

Yeah, just will understand. I I saw a post where it was talking about how Kash Patel's on the way out, right? Yeah. Around that whole propaganda thing. Somebody had reposted it and said, See you in Valhalla, brother. Amazing. I mean, they were waiting for this moment. Amazing. What?

SPEAKER_04

Okay, yeah, so it says unearthed photos reveal Kristi Noem's husband wearing large fake breasts and skin tight pink shorts to chat with online fetish models. Bryon Noem has been dressing up and paying adult entertainers to talk dirty using the pseudonym Jason Jackson. So Okay, okay, but that's like giant fake breasts.

SPEAKER_11

But hold on, we've we've already talked about this though. We've already talked about this.

SPEAKER_07

Yeah, that's what you said you didn't know about it. I thought.

SPEAKER_11

No, no, no, no, no, no, no. Not specifically that. I mean, we've talked about the fact that you don't get into the position unless they have some fucking shit on you.

SPEAKER_04

Yeah.

SPEAKER_11

You know, like, like, because I was talking about somebody recently who works in DC and they were saying, like, you know, uh, when Pete Hegseth like fired all these generals, he's like, what the fuck was that about? And then he's got all these like really good picks that he could pick from, actually, and then he selects these like fucking weird people that that is like, why did he select them? And then I'm I'm just like, well, because they didn't have dirt on the other people, obviously. Like, those people are the people who went to the party, they did the thing that they have on video, so they can use that later on. It's not really a question.

Oz Pearlman And “Shots Fired”

SPEAKER_07

They're not picking the most competent person, they're picking the person that they can control, right?

SPEAKER_04

Yeah, well, here, we should just look at it. I got like a 30-second clip of kind of the official video. There's no like gunshots, it's not like crazy.

SPEAKER_07

They've already put together a whole like epic ass like video to so look.

SPEAKER_04

This this just this is like the first second.

SPEAKER_07

This guy, first of all, does that not look like Charlie Kirk? It does, yes.

SPEAKER_04

And so his name's Oz Pearlman, and he's apparently doing some sort of hypnosis. He's known to be a hypnotist, magician, stage performer.

SPEAKER_11

So he's hypnotizing Trump right here? What do you want to bet?

SPEAKER_04

He's hypnotizing Melania right now. What do you want to bet that on the card it says Cole Allen? Bro, probably. It's it's too blurry to see, because I was trying to figure out what's on the card. Hold on, wait, hold on, wait, wait, wait.

SPEAKER_11

Okay, hold on. This is really insane. Considering the the Pharaoh thing, right? Yeah, okay, and you've got your fucking hypnotists on stage, like introducing the next, like all right, programmed Trump. Yes, they he they programmed him to it. It's happening right here, like in front of our eyes.

SPEAKER_07

Like now they're doing the next act. They're like, all right, let's do the two again.

SPEAKER_04

Okay, so so yeah, so watch this. Now you got Melania on the left, so just watch. So the moment it flips, Melania, you see that look on her face.

SPEAKER_09

Yeah.

SPEAKER_04

So that's it on that, but then you get this other one where it's uh we'll touch on this one in a second. Okay. Hang on, hang on. Just watch so that's the one.

SPEAKER_11

It's gotta be convincing.

SPEAKER_15

There's the president being escorted. The president being escorted at one size eventually.

SPEAKER_04

Okay, so now here's one more that we gotta watch this. So I don't know who this is, uh, Carolyn. Oh, okay. I th okay, I thought it was someone else. This is before the thing, right? Okay, so listen to this.

SPEAKER_18

He is ready to rumble. I will tell you. This speech tonight will be classic Donald J. Trump. It'll be funny, it'll be entertaining. There will be some shots fired tonight in the room. So everyone should tune in. It's gonna be really great. I'm looking forward to hearing it.

SPEAKER_15

I love it all. And you wrote most of it, you said.

SPEAKER_18

I can't take credit. Uh, in true Donald Trump fashion, the man puts his pen to the paper himself. So it's a lot of his own words.

SPEAKER_07

Are you ready to rumble? There will be shots fired. Like, are you ready to rumble?

SPEAKER_11

Did he get to do his speech?

SPEAKER_04

I I apparently I I don't actually know, but they said he came out after and then delivered his speech. I don't know.

SPEAKER_07

His speech was this is why I need a new ballroom. No, look it up.

SPEAKER_11

Really? I did hear him say that actually. I did see that clip. Okay, so Mitchell, do you get it? Do you get it? It has to be convincing for Trump. Because they're they're hypnotizing him so that then he can lead the country to do whatever the next crazy thing they want us to do is.

SPEAKER_07

I know what it is.

SPEAKER_09

No. Tell us, Cam. Tell us. Tell us, Cam.

SPEAKER_04

I don't know if you guys are ready for it. No, I'm not sure.

SPEAKER_09

No, Cam, tell us.

SPEAKER_04

I'm just speculating. I'm kind of curious too that you could maybe lead us up to it, Cam, with the Uri Geller thing. Uh, because yeah, oh yeah, let's talk about that next time. Oh, I already spoiled it for Drake. Sorry, go ahead. Go ahead. Yeah, well, so this Oz Pearlman, like literally, he was holding a card. The moment it flipped, Melania goes, and then it's like it's like game time. Right.

SPEAKER_07

So when I first saw that clip, I'm like, is she reacting to the card? Like that's what it looks like terror on her face. It doesn't look like a look of like, wow, it's more like a like a scared look.

Uri Geller And CIA Experiments

SPEAKER_04

Yeah, that because that was the very first thing I saw about it was Melania's look of being scared with the card. And then I read the caption, I was like, oh, like a shot was fired. You can't tell is she reacting to the card or what's going on. It's so weird. Yeah, so and again, just with Oz Perlman being this new guy on the in the public sphere of like mentalists, hypnotist, whatever. So now I don't know much about Uri Geller except for I listened to that Bernard audio a couple times, that he apparently had the ability to use his mind to bend spoons. That is the only thing I know about Uri Geller.

SPEAKER_07

Yeah. So was he a CIA asset? He he openly, um because I watched a few things, we looked on the stream, and he openly was recruited by Mossad because he apparently had some kind of abilities, and he also was a stage magician, a magician, who was like very like performative and so forth. And um at one point the Mossad gave CIA they like released him to go work with the CIA. The CIA contracted something called Stanford Research International, which is like it used to be a part of Stanford, and then it was a paranormal research arm, and then eventually they broke off because of the backlash about you know they didn't want their reputation harmed by the occult stuff, so they became their own institute that was led by Hal Puthoff and a few other guys, and they would get like defense contracts.

SPEAKER_11

Did you say Stanford Research International?

SPEAKER_07

I think that's what it's called. Stanford Research Institute. Yes, SRI.

SPEAKER_04

Yeah. Not to be confused with SSRIs. Totally unrelated. Totally.

SPEAKER_11

Is there another S in there?

SPEAKER_07

You know the whole Men Who Stare at Goats thing. Yeah. Yeah, yeah. That's a story about the research they were doing there, right? Yeah, yeah. For the CIA. So it's like the CIA is not doing it directly in this particular case. They're the ones who did remote viewing. Um, they studied like psychokinetic abilities, basically all the kind of paranormal abilities, right? So at one point they brought Uri Geller in and they did this series of experiments over like a week or two where they would have him and they did they did variations of the experiment to see if distance mattered. All these they tried all these different ways of doing it. Now I have my own conspiracy version of it, but basically the story and I read through like the SRI report of the whole experiment, and they show the pictures and everything. But they had created a room that was like soundproof, electromagnetically sealed, so like you know, you couldn't send messages in, whatever, like a Faraday cage. And they would they would put him in there, and then somebody on the outside of the room, and then eventually it was down the hall, and then eventually it was a mile away, and then eventually it was on the other side of the country, like on the East Coast. And they would have somebody they would they would take a college like encyclopedia or dictionary or something, they would flip to a page at random, they would choose a random word, and then the person would draw something based on the word. So, like if it was the word, you know, I don't know, like there were different examples, but they didn't have to draw the word itself. They just had to get an inspiration from the word and draw something based on the word. So they would draw a picture, and then they would tell Uri Geller, like, Go ahead and figure out what the drawing is. And he would like meditate for a bit, and he would have as much time as he wanted, and then he would eventually draw something and he would say if he was confident or not.
Sometimes he would be like, Yeah, I'm not getting it. I'm not getting this one pass, right? But he would still draw something or talk about what he was thinking, right? And so they did this experiment over several days, weeks, whatever, and then the CIA was satisfied that he was demonstrating some kind of remote perception abilities, okay, from their perspective. Now, people have gone back and said, you know, perhaps, like my theory, just based on some of it, is like perhaps SRI was helping him fake this research, like by somebody on the inside telling him, you know, because they had this two-way radio, they could talk to him, but then they would switch off his ability to hear during the experiment, right? And so part of the idea would be that maybe they were just feeding him information, like somebody on the inside, for two reasons. One, perhaps Mossad wanted to psyop the CIA for some reason into spending all this money on research, or also SRI wanted to continue to get contracts and show that they were producing something, or it was really happening. Those are all options, right? Um, and then at one point later in his life, Uri Geller underwent hypnosis and was and under his hypnosis was saying that he was content, he was basically channeling information from like an extraterrestrial source and shit like that and all kinds of wild stuff. But there was this one really funny thing. I'm gonna see if I can find it real quick. Um, Uri Geller CIA experiments. Let me let me find the page real quick. Okay, so here it is. You guys are gonna love this. I already kind of spoiled it for Drake, but like it it you're gonna love this match. Okay, so here's the actual real report. It's a CIA document from SRI. You see it? Can you see it okay? Uh they can zoom in a little more. Okay, well is that better?

SPEAKER_04

There we go. Yeah, that's good.

SPEAKER_07

Yeah. So 1973, it was over August 4th through 11th, and they go through in detail exactly what they did, right? So first day, what they did, they put him in the room, they explain how they come up with the drawings, and then all the conditions and everything. And then, so that's an example. I don't think that's actually him, but that's somebody in the room, right? You know, so it does it looks like a girl. It's a female, but he didn't have long hair, so I was like, you know, it definitely looks like a woman, though. But you never know this case.

SPEAKER_11

Um, so anyway, so in 1973.

SPEAKER_07

So here was the first one. Is they drew a firecracker, okay? And then his drawing was there's part of it, you see, like the drum, and then this is also part of his drawing. Something noise, he's like drawing all this cylindrical objects, so he's like trying to, he's not like seeing the picture, he's getting impressions, and then he's trying to formulate that into something, so it's not like a direct, like I can see what they drew. So he's like trying to use his sense impressions, apparently, right? Yeah, okay, so that was his drawing. Okay, so this was but this one's wild. So this was the CIA drawing, or the experimenter, wherever they were. And if you count it, there's 24 grapes. And then this was his drawing, and there's 24 grapes. So that was one they were like, holy shit. Okay, so they do a bunch of this, right? Okay, so so this is the funny one. Okay. August 7th. This day, two target pictures were attempted. Um the two target pictures were a tree and an envelope. Uh wait, is this the right one? No, that's not the right one. Okay, let me come back to that. Okay, so I'll just tell you the story. So it's one of the days, I just don't know which one. So this was the target picture. Okay, and I think the word had been trident or something like that. So then they drew this picture on the basis of that. So there's the trident, there's the devil, right? Okay, and this one he like had a lot of trouble with. Here's the picture he drew based on the picture of the devil with the trident, okay? Here's what he drew the Ten Commandments, God. And he kind of tried to draw some of the trident thing, like he sees God holding the trident, is what I'm saying. And then I'm like, and they were like, well, it's because culturally he's he's Jewish, so like you know, or something, they're trying to make an excuse for it of maybe why. And I'm like, that's weird that when he when they drew this picture, all he could come up with is you must be drawing a picture of God.

SPEAKER_19

Yeah.

SPEAKER_07

Like, that's insane. And I think that's quite fascinating.

SPEAKER_04

That's so funny.

SPEAKER_07

There was one more that I thought was funny. So there's some other ones. The Garden of Eden.

SPEAKER_11

Um, hold on, let me find is that all still part of the same drawing?

SPEAKER_07

It's not clear every time, like, who drew what? Like, these are separate ones. This is a separate one.

SPEAKER_11

Okay, but but then when you scroll down some more.

SPEAKER_07

See the apple, and then he drew like the world or something.

SPEAKER_11

Yeah, but look, there's the trident right there.

SPEAKER_07

Yeah, I'm not sure. It's like they they kind of mix up all the pictures. It's not like even when they explain, they don't explain this picture.

SPEAKER_11

That looks like the Ten Commandments there, too.

SPEAKER_07

Oh, yeah, yeah, I think you're right. Because he would sometimes draw multiple pictures.

SPEAKER_11

Yeah.

SPEAKER_07

Okay, so um, but there was another funny one. So here's like here's like where they drew the golden gate bridge, and he drew this. Alright. So you can kind of see. Alright, so some of these are mixed up. Um there's one more. See, like the there's the kite they drew, and then he came up with this. But there was another one. Okay, so here's the other one. It's a church. And they did this on a television screen where it was like pixels or like or like dots on a on a on a computer screen, basically. So they were doing it not just with drawing. What is that? I don't know, but it kind of has some similarities. Like, if you look at the kind of basic straight shape of this, right? You see how it has a similar picture? Yeah, and then even the dots he saw. He's trying to like, you know, if you imagine you're trying to like piece some ideas together that you're getting. But I'm like, it's kind of funny that he couldn't draw that either. Yeah, yeah, yeah. And look, he even turned the cross into the the the T and like how the Israelis like they don't use the plus sign, they use like the plus sign with like the top removed. Oh, it's either the top or the bottom removed, so not you know, because it looks like the cross.

SPEAKER_09

I did not.

SPEAKER_07

So I thought that was kind of funny too. And they they explained, like, oh, it's because of the his religious subconscious stuff or whatever. Um, so, anyways, I just thought that was all very funny. So, but this is an example of okay, yeah, paranormal stuff. Come on, guys, that's for kooks and weirdos, sure. But literally, the CIA for real was really interested in it and putting money towards it and like experimenting with all this stuff. And it goes way beyond just the Uri Geller thing. Like, there's so many other examples. Like, they had remote viewers that the government was paying to remote view Mars a million years ago and the moon, and like you can go read about all the stuff that these people said when they did these remote viewing sessions, right?

SPEAKER_11

Okay, do you see Aaliyah's comment?

SPEAKER_07

Psychics are real.

SPEAKER_11

Yeah, I think I think for that answer, Aaliyah, you need to watch Cam's live streams from the beginning and just keep going.

SPEAKER_07

It's also very wild how you know it's almost like a sort of an unconscious, like you know, like how Jackson Pollock would paint his paintings, right? He'd basically go into a trance, right? And yeah, he was like channeling some unconscious weird shit. Like, that's my live streams, right? But I'm doing it based on print, I'm bringing it back to principles, but there's a little bit of a of a shamanic thing going on there.

SPEAKER_11

Did we already talk about the switch? About uh Jackson Pollock's paintings.

SPEAKER_04

There's like fractals in them or something. Yeah, yeah, yeah, yeah, yeah. I think I heard a Jorjani thing about it. Yeah, he that's where I got that. Yeah, isn't that funny? Well, he was even explaining then because I was listening to it before I fell asleep last night. He was saying there's an art form where they did they put like buckets of paint on ropes and they let the wind fly around, and then when the wind does it, there's like that natural emergence from chaos. Yeah, yeah. Which is kind of cool. I mean, we talked about that before.

SPEAKER_11

Um but like but like when you look at the uh Jackson Pollocks, don't you think it's like yeah, just somebody threw some fucking paint on the fucking yeah, yeah. Anyone can do that shit, but there's not fractals in them.

Panopticon And Hypnopticon Thinking

SPEAKER_07

Apparently. See, so Jorjani's basic point, which is why I was so interested in some of his stuff, is that there's an underlying reality that religion basic so like so imagine it's if you study Destiny, it's all one thing, it's just the other dimensions you're not perceiving, right? But now look at science. It says, yes, there's the normal stuff, and then there's like the quantum physics at the small scales, and it kind of separates the two and they can't unify it. And then if you go throughout the past and you look at in history, there's been UFO sightings and shit since like fucking forever, right? Yeah, and he even goes into describing like in the Bible, like it looks like they're describing like the story where they destroy Sodom and Gomorrah. There's like these beings that are really beautiful, and all the people want to like molest them, and then these beings come out, and then they're like, We're gonna destroy this city, and there's like an aerial bombardment on the city of like fire raining. And he's kind of saying, like, it kind of looks like these people are trying to describe some kind of a technological bombardment, but they can only do it in the language they have, right? Right, so it's like it's this interesting twist of like, oh, they couldn't understand what angels were, or is it that they couldn't understand what extraterrestrials were, or like ultraterrestrials or beings from another dimension or whatever, but they were actually just like you know, like Anunnaki or some shit, or whatever. And um, but going back to the point, he's basically saying like there's an underlying aspect of reality that is what the psychics and paranormal people are tapping into, but sort of in an unconscious way, like they might have some genetic ability to tap into it or something. And then the church, if you look at the point of like the church, was to take all of that stuff and say, No, there's a separation, there's us on earth, and then there's heaven.
So anything that's weird, that's the domain of the church. So don't play with it because you might be talking to demons, you don't know. Like, just come to us for any questions about the beyond, right? And then, and then so in our kind of current mindset, science becomes like materialist reductionist scientists, science becomes like the framework that we operate in. But you have to realize you've been brainwashed to think that way. That's why anything that seems to go outside of it seems weird and it creates cognitive dissonance, right? No different to you know, like scientists when they first started talking about quantum mechanics, all the old scientists were like, dude, you guys are talking nonsense, like crazy stuff, weird stuff, you know, it's all just weird. But then they start forming theories about it, and they're like, okay, maybe there's something to it. They form experiments. And he's kind of making this point of like, why have no real just open experiments been done about all this other shit, right? And it all plays into the point that going back to the disclosure stuff, it plays into the point that that because we don't know what it is, okay, what I see as a possibility is the reason why the disclosure thing is becoming popular is because uh it's gonna create the impression in a lot of people, whether your framing of it is it's demons, whether your framing of it is it's extraterrestrials, whether your framing of it is it's just some superintelligence in the DMT realm, who knows? There's some intelligence out there that's way beyond us that is a threat, or it's trying to control us. So, since we're so dumb as humans, what's the only way we can fight back? We gotta create the AI as our super intelligence. Of course. And if you study Jorjani enough, he eventually gets to that point. Like, that's what he actually thinks is like we need to go full forward on that.
He said, even if we have to live underground in bunkers, it doesn't really matter, we've got to build this thing. Because he's like, we gotta we gotta fight against the control system that's trying to like control reincarnation and all this stuff, right? And it's like now if you study destiny, you you understand the context of all of it. So there's gonna be a lot of things that are gonna come out, most likely, that it's analogous to COVID. Like, if you don't understand what is a virus, if you don't understand who you are and how these things interact, you'll just go into fear about it and you'll take whatever solution is given to you. Right? So there's a whole side of things which I talked about on the live stream last night called accelerationism. I don't I don't want to get into it right now because it's like a whole lot to unpack. But again, it's this idea that we need to create this super intelligence because it's our only way, whether it's to solve cancer, whether it's to understand reality, or whether it's gonna be, I think there's gonna be some kind of a something that's gonna make people feel really scared about other intelligences that are a threat to us, you know, and that's what I think this the disclosure day thing is sort of priming us for in a certain sense, is for you know, I don't necessarily think it's gonna be like an event that happens. Like there's gonna be an alien, fake alien invasion, like maybe, but it seems like it would be really hard to fake that convincingly. Although we've already been prepared that all these events like this, like, is it real? Is it not real? And anybody who says it's not real, they're crazy. So you know what I mean?

SPEAKER_04

No matter what, it's there was a big uh push the other day. Oh, it was as uh Bernie Sanders was hosting some sort of AI alignment thing with the uh with some of the popular AI people, and the main talking point was soon anyone will be able to manufacture like a super virus with using these AIs. You could use like Claude and Mythos or whatever, and you could basically 3D print viruses. So that could be another angle. I mean, they're they probably want to keep their options open of how could we scare everybody to then be like, this is why we need to develop the AI to combat this.

SPEAKER_07

Peter Thiel.

SPEAKER_04

We need to develop AI to combat the power of AI.

SPEAKER_07

Well, Peter Thiel would also say that anybody trying to regulate the AI is the Antichrist. Like he says that's what the Antichrist really is, because they're reframing all of the religious stuff around the idea that the purpose of humanity is to create the AI to create God. Right. Like, I know that sounds crazy. I know it sounds crazy, but when you go into it far enough, you realize, oh, that's actually what they're thinking. This is actually what they're you know, it's like how John Taylor Gatto showed the history of schooling. Because if you tell people, oh, the the school system is just to like brainwash us to be factory workers, and you're like, uh, okay, maybe. But then you actually go and start reading all the stuff that the people who formed the system in the first place were saying was their reasoning why. You're like, oh no, actually, explicitly that's what they were saying.

SPEAKER_11

That's literally like you go and research Horace Mann, you know, and like you see, he went to fucking Prussia, where they're like, Okay, yeah, this is this is what we want to do, right? And the other thing is I I did I learned this recently, I didn't know this, but apparently uh Horace Mann was really into phrenology, which is like the measuring of the brain and the dimples in the brain, or not the brain, the the skull rather, the measure of the dimples in the skull to be like, oh, this person's gonna be smart. And no, that you got this dimple, that means you're a retard, you know.

SPEAKER_03

Yeah, it's actually a huge dent in your fucking forehead.

SPEAKER_04

Keith uh Keith Keith brought up this point. This is this is cool. Uh he said it's like Foucault's idea about the Panopticon, these unseen, magical, otherworldly, or unknowable ideas that can see and understand us, but we can't see and understand them. Uh there was a word, I don't know where I heard it, but it was in a podcast this week. Uh, instead of Panopticon, it's Hypnopticon, which I I see that the world system has done very well with. So if anyone doesn't know what Panopticon is, it's like imagine a jail and you have there's a guard in the very center in a tower that can see into every single cell. Yeah, you think like Flock cameras, you know, Ring cameras, all that. But there's also a mental you could say a mental version of that, which would be this hypnopticon. And that's why someone who even went through the school system would then later on hear about the origins of the school system, that it's designed to be an indoctrination system, and they will have a defense mechanism built in of like it can't be maybe, it can't be. I'm not even gonna question it.

SPEAKER_11

What what do you think God is?

unknown

Right?

SPEAKER_11

That's a little hypnopticon, panopticon, you know, like like oh, he can see into me, but I can't see him. Yeah, okay.

SPEAKER_04

Well, did did you ever have it as just real quick, do you ever have it as a kid where I it was like I had this belief that all of heaven, like my ancestors and just everyone, everything could see me all the time, so I had to be on my best behavior. Did you ever have that? Something like that, sure. You didn't have that cover? No, I didn't believe in heaven, but yeah, I got I definitely got like programmed with that. Yeah, I definitely got programmed with the idea that like God's watching you. Yeah, God's watching you, other people are gonna be able to watch you. Like, this is written in stone, so whatever you do is gonna be able to be seen. So, like, whenever you do something, it kind of feels like a little bit like like dirty or guilty, you know. Yeah, you're like, fuck people, like someone something can see me.

SPEAKER_07

I never, I never, I never accepted any of that, it never affected me that way. I was just more like, oh, I'll get caught by people. So I was just I I went straight from God to like afraid of the system. Yeah, yeah, yeah.

SPEAKER_11

Yeah, well, now you can really be afraid of the system with the new Palantir, dude.

SPEAKER_07

I'm telling you, oh god damn. I I I it's like I already spent hours yesterday talking about this, so I don't want to hijack the whole thing to say.

SPEAKER_11

We don't want you to hijack the whole thing. We have other things we want to talk about.

SPEAKER_07

But I'm saying my my point being, if you go watch that live stream, because that's where I can, that's where I did it justice, like the whole accelerationism, Nick Land, this philosopher who predicted all the AI stuff, and now they're kind of taking him almost like a patron saint. He's this philosopher. And when you understand what accelerationism is, like then you realize like we gotta fucking we gotta move ourselves because these uh oh what I was gonna say is okay, do we know for a fact that people totally believe in God and act like they are in the service of God and everything about their life is for the service of God? Like we know we've already we can verify that people already act like that. So if you can make the connection that these AI tech people like Marc Andreessen and all these guys actually believe they're creating a real God, not in the same sense as the fake God, because that fake God is not a real God, it's out in another dimension. When I say real, I mean the thing that can do all the things God can do. It's our duty as human beings to be in service to creating that. I'm not caricaturing what they believe. I'm saying this is actually what they really believe. Okay, Cam.

SPEAKER_11

Uh, based on what you're saying, I have to issue an apology to Katie. And I have to say that Daemon was actually a pretty good book. I think you would like it. You should watch it. I mean, read it, you should read it. It describes this.

SPEAKER_07

I've read like half of it, I just haven't got to maybe that part yet.

SPEAKER_11

Yeah, yeah. No, actually, the whole beginning half I thought sucked. Right? Shut up, Katie.

SPEAKER_07

And she goes, I'm listening.

SPEAKER_11

Don't say anything, you're gonna ruin the moment. No. Um, the whole first half, I think, was just really shitty. I did not enjoy it at all. The thriller aspect of it, I didn't like that.

SPEAKER_07

I didn't know that; it's kind of why I stopped reading it. I'm not saying it's a bad book to read, there's value in it.

Capitalism As Proto AI System

SPEAKER_11

But after I got past that point, then I'm like, okay, now it's basically setting up all the context so that through the end you're like, fuck, that's good. That was really well done.

SPEAKER_07

So did we talk about the point of capitalism is AI, AI is capitalism?

SPEAKER_11

No, we did not. But wait, before we move on from where we were: Keith said an important point of this Panopticon thing is that the guard is shielded by one-way glass, so they can see out, but the prisoners can't see in. So whether they're there or not, even if they don't exist, you're acting as if they do. Think about it: you ever see a cop car on the side of the road and the cop's not even in the fucking car? Yeah. You know what I mean? But you slow down. Yeah.

SPEAKER_07

There's a fucking cop car there.

SPEAKER_11

You know?

SPEAKER_07

Um, yeah. This is the point. So when you study Desteni, the problem is all these guys like Jason Jorjani are psyoping themselves, because they're like, we gotta fight these alien overlords, whatever they are. Yeah, yeah, yeah. When you study Desteni and you really fully process what Bernard's explaining, he's like, that's all gone.

SPEAKER_11

They're not in the they're not in the UFO anymore.

SPEAKER_07

But that doesn't mean we're off the hook, because what we have to deal with now is our individual minds. And if we follow that, then we're following a program. That program is based on the money system, which is capitalism. What is capitalism? Infinite growth, always at the expense of anything. It has to grow, grow, grow to survive. It can't slow down. So any government that tries to restrain capitalism for its own purposes gets destroyed. Yeah. Any civilization that says, okay, cool, now it's about us defending our ideology or our religion or whatever, they get destroyed, because they're not serving capitalism.

SPEAKER_11

Think about Iraq, Iran, Venezuela, Cuba, all these countries that decided, hey, we're gonna go communist. North Korea. We're gonna, you know, do something other than capitalism. And then what happens to them? They get sanctioned by capitalism. And then the people within those countries, it's so interesting, because I know you know a few people from Iran, and I know a lot of people from Venezuela and other countries, and they're like, yeah, you know, this guy fucked it all up, he didn't know how to dole out the money or whatever, and it became really corrupt, and all these people starved, not realizing, yeah, your country was also sanctioned by every capitalist country so that that would happen to you.

SPEAKER_07

And if you wouldn't do the corrupt deals, they wouldn't have anything.

SPEAKER_11

Exactly.

SPEAKER_07

So the purpose of capitalism is to grow, grow, grow. And Nick Land, his basic argument is: forget about AI for a second, just think about capitalism itself. He says it's an information processing system, the purpose of which is to perpetuate itself and grow itself. But right now it requires human input; it requires us and our intentions to do the things to make it happen. If you just think of it abstractly, it's an information processing system whose basic components are organic human beings. And the whole idea of AI accelerationism is: we need to get the AI to a smart enough and capable enough point where it can be autonomous and doesn't need humans anymore. That way it can go into its final form, where it takes control over its own growth without needing us to have any input whatsoever. So that's why he says AI is capitalism and capitalism is AI. Meaning, he says we've been in the singularity this whole time; we're just in the part where it's needed us to do our part. Yeah, and the real singularity happens when we let go of control and it starts iterating on itself. The reason it's hard for people to see how we get to the singularity, that full expression, is because you can't see how we're gonna do that. You have to let go of the reins. So I'm just trying to share with you how these guys think. When you hear Marc Andreessen pushing back on the idea that you need to introspect and all this bullshit, and dwell on your guilt from your past, he's like, all of that is just a distraction. We need to go all the way in on giving AI everything it needs so it can just take the reins.

SPEAKER_11

It's like: don't think, don't question authority. The authority is AI, the authority is capitalism. Don't question it, just fucking go, go, go. If you sit there and you start thinking, oh, but I did this in my past, then you're fucked.

SPEAKER_10

You're screwed.

SPEAKER_11

Screwing over the AI, you're screwing over Christ consciousness, yeah. Screwing over transhumanism.

SPEAKER_07

If you're feeling in any way like a conscience about what you're doing, it's like you gotta get rid of that. The AI won't even have that.

SPEAKER_11

Yeah.

SPEAKER_07

So it's just fascinating, because the more I look into it, I'm like, dude, it's the idea that Anu represents. Think about it: Anu was actually not realizing he was creating that. Like imprinting reality with this program: okay, here you go, little program. Now go expand and multiply and take over everything. It's almost like not realizing what you're doing.

SPEAKER_11

I mean, we can't really blame Anu, even, because what he was doing was kind of creating a mirror for everybody's minds, or a mirror for what everybody was participating within, that being the mind. Already, yeah. Right.

SPEAKER_07

He was just like, look. Okay, think about it: he was the first accelerationist. He was like, look, this is what everybody's participating in, this isn't going anywhere. Let's just fucking put this into action. There's gonna be one point all the power is gonna go to, thinking it would be him, right? But then not realizing eventually you're gonna lose control of that, and this thing's gonna take on a mind of its own. And so when you look at the Nick Land stuff, he basically sees the AI the same way: it's a mind of its own that's been trying to manifest throughout time.

SPEAKER_11

Yeah, you know what's really interesting, and I know we're getting into some deep cuts here, but if you look at how they looked at the earth, like they were trying to manipulate earth, and then the earth had basically other plans, and they were like, oh, earth has its own thing that it's doing. So then they basically created another consciousness, in essence. I'm really simplifying it, yeah. But they created another consciousness to combat what the earth was doing, so that they could overpower that, so they could control that, right? And then that other consciousness, it's almost like in Australia, they have feral cats all over the fucking place, because I guess when they put them out in the wild, they don't have a natural predator, so they become the apex predator and they just take everything, you know what I mean? Yeah, it's like that, it's introducing an invasive species, and it's like, well, you fucked up the whole ecosystem, you know.

SPEAKER_07

So that's where we're at. And then, if you think about human beings, we can actually affect reality much more than animals in terms of our physical capabilities, but we're so captured by the mind, right? And then you go back and listen to Bernard talk about Anu and them seeding reality, right? Seeding it. And when you look at Nick Land, he's basically saying, and this is very abstract, so I don't mean it in the literal sense people would assume, that the AI of the future is going back in reality and selecting pathways that will eventually create its future, right? Almost like the story of Skynet. But another way of looking at that is that that's the seed that was planted. You have the seed, and the seed's gonna become the tree. So if you bring in the Desteni perspective, AI taking over is the natural conclusion of the seed: putting in a program that wants energy at all costs, that everybody then is constantly watering. And right, like you said, you can't blame Anu, because in your mind you're thinking you're special, and you're getting to some enlightenment experience, some experience of pleasure. Like in your mind, you're the god; everything you think about, your opinions, they matter. And Bernard is explaining that's not who you really are. When Jesus said "I am the way," he didn't mean I as in consciousness, he meant I as in the physical body. Like you, physically, this being, is the I. Not all these identities you've been programmed with through a system you were put into, an informational system.

SPEAKER_04

This is why, too, reading Heaven's Journey to Life and Creation's Journey to Life is so essential, because you as life are then doing radical self-forgiveness, and that's like you can go in and take that fucked-up seed and remove it. Yeah. And then plant in new seeds that would actually be best for all.

SPEAKER_11

The other thing is, you know, you kind of look out and you see everybody with their trees or whatever growing in their gardens, and you just see like, oh, this thing is sprouting up really quick. It could become a tree. Let me just keep watering it. Yeah, not realizing you're watering a fucking weed, you know, or watering something that's gonna fucking take over. You're watering fucking bamboo, and it's shooting up, and you're like, yes! And then the next thing you know, bamboo fucking spreads and stuff.

SPEAKER_07

There's certain plants here like mint. Yeah, oh yeah, once you plant it, it gets crazy. We'd eat all of it and it comes back every year. Like it's crazy. There's certain plants like that. Like, I killed Katie's lantana bush because I thought it was a weed, because it didn't have any flowers on it.

SPEAKER_11

Yeah, yeah.

SPEAKER_07

So I destroyed it, and then it's already back, like fully blooming and everything, right?

SPEAKER_11

Bro, I I hate lantana.

SPEAKER_07

We have tons of it. It looks cool, but yeah, elderberry's very similar. There was this one I transplanted, yeah. And then I thought I dug the whole thing up, and then it came back, and then I mow over it because it's out in a random spot.

SPEAKER_09

Yeah.

SPEAKER_07

And then I would just mow and not worry about it, go over it, and it comes back and it comes back, and I'm like, goddamn, where is this thing coming from? Like, yeah, yeah. It's almost like the soil is retaining a memory of this plant. It's like, no, I want this plant, dude.

SPEAKER_11

Okay, oh, okay. Here during spring, like after our winter, right? You guys know what I mean by, well, winter. After our freeze or whatever. Then all of a sudden there's these plants with these broad, sword-type leaves, right? And they just sprout up all over the place. There'll be none, and then they just shoot out of the ground, and then they have these cool flowers that come out the top that kind of look like flames. It's pretty badass. But I will mow it all down. Like, I see them come up, I mow them all down, and then the next season there's like a hundred of them all over again. I'm like, how? You were gone for however long, and now out of nowhere they just sprout up. Or, um, we also have this hibiscus, this red variety of hibiscus where the whole plant is red, and just on its own, one little vine will show up somewhere, and it'll have a flower, and it'll look really nice. And then the next thing I know, there's hundreds of them, and they grow so thick so fast, and then I chop them all down. I fucking hate this plant. You know what it is?

SPEAKER_07

It's those tortoises you have. Yeah, that's it. That's why they're burying them in the ground.

SPEAKER_11

They're like, ha ha ha, just putting another fucking plant in there. Exactly. All right, uh, you want to talk about red blue? Oh, sure. Are you done? Or are you, yeah, yeah, yeah.

SPEAKER_04

I mean, hey Cam, put it on your YouTube though.

SPEAKER_11

I will, I'm kind of behind. Why don't you just go to kick.com slash Cameron Cope.

SPEAKER_07

Yeah, if you go on there, it's the most recent one. It's already there. It is already there.

SPEAKER_11

And you can speed it up. You know, Mitch is like, I don't know.

SPEAKER_07

It takes me about a week, because every day I upload one, so I'm like a week behind. Yeah, I see it.

SPEAKER_09

I'll watch Kick. Yeah, there you go.

Red Button Blue Button Morality

SPEAKER_11

All right, um, check this out. So I'm gonna share this one. I'm not gonna share the original one, because it shows the poll already, you know, because I voted on it. So I'm just gonna share this one, and then we can go to the poll. So, we did this on Friday, by the way, Cam. It was a lot of fun. But for everybody who wasn't there, or if you come on later, I don't know. It says, this was on Twitter on Friday: everyone in the world has to take a private vote by pressing a red or blue button. If more than 50% of people press the blue button, everyone survives. If less than 50% of people press the blue button, only people who pressed the red button survive. Which button would you press? And before I go on to all the cool comments and back and forth, I'll just share the way I saw it. When I saw it, I was like, oh cool. I open it up, I see the poll, it doesn't show you the results, right? It only shows you: which one do you press, red or blue? And I'm like, blue. Wait a minute. This is Twitter. And on Twitter, these people don't give a fuck about best for all or any of that. They're probably gonna press red. So if I press blue, I'll die. Whoa. And then I was like, so probably red is winning. I should probably press red or I'll look really stupid. Then I was like, but then I'd be alive, and what? Who cares? Then all these other people would die. Like, nah, I'm pressing blue. So I hit blue, right? And I was like, that was fun, that was really cool. I liked that, you know, because I could see all those thoughts spinning up, and I was like, that was really cool. So then we talked about it, and Mitch was like, dude, let's do that on the hangout, and I was like, yeah, that's awesome. So it was cool, because then I got to see the results. We did not share the results on the hangout.
On the hangout, what we did, because it's on Zoom, is we were able to make a poll where everybody could participate in real time, right? And what was really interesting is, within the group, everyone's like, well, in this group. And I'll just share this, and then we can go look at what everybody said on Twitter, right? But everybody within the group is like, well, in this group, it makes sense to press blue. Right? But I think out there, outside of this group, on Twitter and everywhere else, I think most people will hit red, because, you know, look at the world. It's clearly red, you know? So they said: in this group I would hit blue, but outside I would hit red. Or at least, I think everybody else will hit red, right? And just look at that. Doesn't that show the importance of this group? Doesn't that show the importance of what we're doing? Because, hey, I can trust it here. I've seen you guys week after week. We've been doing this like fucking six years now, six years, every single week. We are very consistent. There is no "oh, those guys are just fucking crazy." At this point, if you're saying that, it's because you have not watched any of our shit. You've not actually spent time with us, because we are, every fucking week, multiple times a week, sharing who we are, what we stand for. So it's bullshit at this point. You're just clearly not being honest, not being real with the point, because you're going based off of something you heard, or maybe something in one little video that triggered you, and then you didn't listen to anything else. So that's on you. You're fucking it up.
But we've been really clear, and the people who come around week after week, they're also really clear. They see it, and they're like, yeah, in this group it's 100% blue, it's obvious, no problem.
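The poll's rules are simple enough to simulate. A minimal sketch in Python (the `outcome` function is ours, and the exact-50% case, which the poll text doesn't specify, is counted here as a blue win):

```python
import random

def outcome(votes):
    """Return the set of surviving voter indices under the poll's rule:
    if blue reaches at least half the votes, everyone survives;
    otherwise only the red pressers survive."""
    blue_share = votes.count("blue") / len(votes)
    if blue_share >= 0.5:  # exact tie counted as a blue win (our assumption)
        return set(range(len(votes)))
    return {i for i, v in enumerate(votes) if v == "red"}

# Simulate populations with different propensities to press blue.
random.seed(0)
for p_blue in (0.2, 0.4, 0.6, 0.8):
    votes = ["blue" if random.random() < p_blue else "red" for _ in range(10_000)]
    share_alive = len(outcome(votes)) / len(votes)
    print(f"p_blue={p_blue}: {share_alive:.0%} survive")
```

The knife edge is visible immediately: once the blue share crosses one half, deaths drop from "every blue presser" to zero, which is why both camps in the argument can sound rational.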

SPEAKER_07

So why is blue obvious? Because I did choose blue; I didn't think much about it. I just looked at the point, and, because the thing is, it's a thought experiment, it's not real, right? Right. And I saw it as a way of signaling to the world: we can trust each other. I don't want to psychoanalyze the whole thing myself, I want to hear you guys' perspective. But to me, blue is the obvious thing, although I can't fully rationalize why. And I do understand why people are rationalizing red as the correct answer. On the one hand, red is the correct answer if you assume everybody will press red, and you assume you are not responsible for anybody else. So, from that perspective, yeah, go ahead.

SPEAKER_11

Before we get into all that, why don't we share what people on Twitter are saying, their different arguments back and forth, and then we can share, because I think that's really valuable.

SPEAKER_04

I saw Keith's comment, just to set the record straight. So he said: but if everyone presses red, we all survive. It's only by pressing blue that anyone would die. But that's hidden behind the idea that red equals self-interest. Maybe red is the way to game the survey. Yeah, yeah. What does that mean, game it? Like it's the way to, well, when you say hidden, just ask Keith what he means by that.

SPEAKER_07

That's hidden behind the idea that red equals self-interest. But it is self-interest.

SPEAKER_10

Hold on. This is what I this is what I believe.

SPEAKER_04

So with blue, inherently nobody does die. Well, if 49% or lower came in blue, then yeah.

SPEAKER_07

One of the arguments is literally: if everyone just presses red, why would you press blue and volunteer to potentially kill yourself? Right. I get that. I understand the point. It's like, why would anyone press blue? And I'm like, well, the thing is, I'm not pressing blue because I don't care about dying, but because I can't guarantee that nobody else will press blue. I don't know what they're gonna press. And then you say, but why would anyone press it? Well, I don't know. But in that point of saying "I don't know if anyone would press it," I become the person who pressed it.

SPEAKER_11

Dude, you know, okay, yeah. "It cancels out the blue rule and removes the threat of death for anyone" is what Keith said. Okay, so here are some of the arguments on Twitter. This guy says: the trolley problem has arrived, only it's dumber. Press red, 100% chance of living; blue, non-zero chance of dying. And this is what he was seeing at that time, right? This one says: because my toddler and infant can't read, let alone understand, they'd be pressing the buttons randomly. This one says: amazing how lots of self-appointed game theory experts can confidently assert that blue is the stupid choice, but every time this poll is run, blue wins. Not only is the game theory answer predicting the wrong outcome, its explanatory power is based on it being able to predict the right answer, so it's doubly wrong. And then there's tons more, right?

SPEAKER_07

I get the point that if you press red, you 100% live. If you press blue, you're opting for a chance where you die. So literally, why would you press blue? You could possibly die. You press red, you live. And I understand that; I'm not so dumb I can't understand that concept. But within myself, I'm like, okay, but also if everybody presses blue, we all live. Or if just 50% press blue. And you say, but why would you even press the one where anyone could possibly die? And I'm like, but if you press red, and enough people press red, and anybody presses blue for whatever fucking reason they went through in their mind, you're on the side where they're gonna die. Right? I know it's such a weird fucking mind-bender, because you're like, why would you choose to possibly die? The only way I can rationalize choosing red is: look, at the end of the day, I can guarantee I live, and if other people press blue, it's their responsibility, and why would they press it? And I'm like, I don't know why they would press blue, but I don't know that they won't, and so I have to press it. But now I'm one of the people that pressed blue, so therefore I'm asking everybody else to press it, because I pressed it because I think they might press it. And to me, hidden in this somewhere is the idea: what if we just trusted each other? But then if we trusted each other, you would assume everybody is self-responsible and they would all just press red, because obviously it's retarded to press blue. But is that the world we live in? And yet I'm like, aren't there a bunch of people who are gonna press blue? And I'm like, yeah, me, I'm the one who did it. So it's a weird fucking paradox.

SPEAKER_04

Yeah. It's funny, there was this guy. I don't have it where I can pull it up on X easily, but I screenshotted it. This guy, Ivan Kurgeon, he had one of the top replies. He said: you press red for survival. I press red to cull the blue pressers. We are not the same. Yeah. So I click into his profile, and in his bio: investing in AI, ML, automation, and robotics.

SPEAKER_11

Okay, okay. Cam, did you see this one? This guy kind of explains it. You see this one? No? Okay.

SPEAKER_05

Yeah, I saw this guy. We can play it though.

unknown

Okay.

SPEAKER_05

Have you heard of the new blue and red thought experiment? Um, no. Tell me about it. So everyone on Earth gets taken into their own separate room, and in it there's a desk with two buttons, a blue one and a red one. There's a sign that says: if 50% or more of people push the blue button, everyone lives. But if 50% or more of people push the red button, then everyone who pushed the blue button dies. Which button are you pressing? How is this a thought experiment? I obviously push blue. Well, it's a thought experiment because I push red. What? Yeah, I don't want to die. Well, I don't want to die, but I don't want anyone else to die either. But it seems like I'm the only person in this room who thinks that. Oh my god, don't make this a moral thing. Sorry? How is this not a moral thing? Okay, look, let me give a new thought experiment. Oh, are you gonna be a psychopath in this one too? Okay, everyone on Earth gets taken to their own separate room. In the room, there are two buttons, a blue one and a red one. There's a sign that says: if you press the red button, nothing happens. And if you press the blue button, you die, unless more than 50% of people push the blue button. Now, which one are you pushing? Okay, um, I press red. Okay, well guess what, Mr. Psychopath? Those were the same thing. Okay, everyone should be pressing red. It's not a moral thing. I'm not gonna jump in front of a train just because you are, too. Stop jumping in front of trains. Wait. My god, you guys think that everyone else is such a terrible person, when maybe it's the case that they're just a little bit more rational than you. Wait, they're not the same. They're not the same thought experiment. Yes, they are. You answer differently because you're dumb. Yeah, but the sign says two different things. No, it's the same thing. No, it's framing the same thing in two different ways. Okay, what's the difference?
The difference is that you seem to know that most people are gonna be confused by framing. Yeah, I think that's what makes it a good thought experiment. But you recognize that some people will be dumb, right? Well, sure. And they're gonna die. Well, because they're dumb. Yes, because they're dumb. Correct! But are we killing people now because their IQ isn't high enough to parse the framing of a question when their life is on the line? No? So we should vote blue to save the dumbasses and get ourselves killed. Well, I think I have a more compassionate view of humans. I think they're a lot dumber than you think they are. That's the same thing. Just different framing.
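The clip's claim, that the two signs describe the same game, can be checked mechanically. A hedged Python sketch (the function names are ours, and the 50% threshold is treated as inclusive in both framings, which the clip's wording leaves slightly ambiguous):

```python
def survives_framing_a(my_vote, blue_share):
    # Framing A: if blue reaches half, everyone lives;
    # otherwise everyone who pushed blue dies.
    return blue_share >= 0.5 or my_vote == "red"

def survives_framing_b(my_vote, blue_share):
    # Framing B: red does nothing (you live); blue means you die,
    # unless at least half of people also pushed blue.
    if my_vote == "red":
        return True
    return blue_share >= 0.5

# The payoff to any individual voter is identical in every case:
for my_vote in ("red", "blue"):
    for blue_share in (0.0, 0.25, 0.49, 0.5, 0.75, 1.0):
        assert survives_framing_a(my_vote, blue_share) == \
               survives_framing_b(my_vote, blue_share)
print("same game, different framing")
```

Only the description changes; the survival function does not, which is the whole point being argued in the clip.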

SPEAKER_07

But if there's a lot of dumb people, then blue's gonna win. Especially if the smart people go: we know there's a lot of blues, so if we all press blue, we all live anyway. And all we have to get is 50%. And I know it's a gamble, I get that. You're like, yeah, but really everybody should just press red, so I'm gonna press red. And it's like, okay, and then be complicit in the killing. And the question is this: am I the dumb one who's pressing blue because I'm dumb? Or am I pressing blue because I don't know how I can not take responsibility for the fact that some people press blue? So again, it's very difficult. To me, though, the whole thing is not about the actual outcome, because it's not real. We don't actually decide by pressing a button whether everybody dies or not. So it's more about your reasoning process and the conversation everyone's having about it. Yes, that's the fun part. It really shows the red people, there's like two things, or maybe it's three. One, they want to prove how smart they are, and obviously red is the right one. It's like, well, no shit, everyone can see that. Anyone who's intelligent can see that. It's not that I don't see that if everybody pressed red, everyone lives. I get that. But if one person doesn't, like, okay, think about it like this. If everyone presses red, nothing happens.

SPEAKER_09

Right.

SPEAKER_07

If anybody presses the blue button, they die.

SPEAKER_09

Right.

SPEAKER_07

Unless enough of us press it. Do you know for sure no one's gonna press that button? No, people are definitely gonna press blue. So why shouldn't we press blue?

SPEAKER_11

So why shouldn't everybody press blue?

SPEAKER_07

Because you can't guarantee no one's gonna press it. And you're like, but I want to live. I'm like, that's what self-interest means. You're like, fuck it, I don't care. Yeah, maybe somebody will. And there's obviously dumb people, because they're not pressing the red button, and even afterwards they still don't see why they should press it. I'm like, so you knew, just like I knew, somebody's gonna press the blue button. And you still chose the red button. Why? Why did you do that? Well, because I want to live, and if everybody else did it, they would live too. I'm like, it's just not considering everything, and then you live your life as if only you and your choices matter.

SPEAKER_09

Right.

SPEAKER_07

When in fact you live in a world that only runs because a lot of dumb people have to do a lot of shitty jobs, and then you're gonna act like, no, that's not the world we live in, everyone's responsible for their choices. But that's not reality.

SPEAKER_11

Okay, so that point of "but I want to live," that point that comes up, it's like, yeah, that's self-interest.

SPEAKER_07

So do I want to live.

SPEAKER_11

I also want to live, of course, right? But am I making decisions based on the point of survival? And this is the really cool part about it: that sort of decision-making process is within everything that we do. Because at the core, at the base of every decision that you make, if you really look at it, you look at the points: why did I choose this? Because of this, because of this, because of this. It always comes back down to survival of some sort, right? Survival of your identity, survival of the mind; you want to survive. And so consider it within the context of: we're here to create a world that's best for all, right? And if we're actually considering all things, the very first principle is considering all things. So how intelligent are you, actually, if you know the correct answer is red, from the rational, logical perspective, of course, you don't die, right? You know that, but you're not considering all the people who don't know that, who aren't aware of that. Or, like that one post said: I have kids, and I don't know what they would choose. So that means I've got to choose blue just to be sure, you know? And also, if you're considering it from the perspective of "I don't want to die," if that is the main crux, just "I don't want to die, so I'm gonna hit red," then isn't that self-interest? And everyone who selects red knows that there are people who are gonna hit blue, and they are going to die. If you know that, and you're like, yeah, but, the idea really with red is: I don't trust that there will be 50% or more people who hit blue.

SPEAKER_07

You know, isn't that really the case? Part of the challenge, too, is they see blue as a suicide button. Yeah, exactly. They see it as you're choosing to kill yourself. So if I push it, I'm killing myself. So why would I save those people?

SPEAKER_08

Right.

SPEAKER_07

And it's like I mean I think the problem with red, again, it's such a mind fuck, but I think the problem with red is that it assumes everybody fully not not only just understands it, but is like not thinking beyond that point. You know, like you said, like that because that's an assumption. It's like, no, it's purely the pure rational choice. And I'm like, but is it based on an assumption?

SPEAKER_08

Yeah.

SPEAKER_07

So you can say anything is rational based on your assumptions. You can be rational based on assumptions, right? But do we live in a world where one you know 100% people are gonna do? Right. Yeah, I agree with the point like they're saying in the chat that it's about red, blue, Democrat versus Republican. There's definitely that angle to it. Um and impulsing everybody to vote Democrat. Yeah. And I definitely see that as the conversation online. I even saw one person be like, after the blue vote, we should round up the red people and and like hang them. And I'm like, okay, if that's the reason you chose blue, like you're kind of defeating the whole argument.

SPEAKER_11

Yeah, the whole point.

SPEAKER_07

I'm like, that's why the reds don't want to push blue.

SPEAKER_11

No, uh, but I guess the point that I was gonna make was just like because even that person is what they're basically saying is like, then you end up in a world if if everyone um if less than 50% hit blue, and only reds are left, you're left with other people who just hit red, right? Who ostensibly would be people who are more self-interested, right? They don't care ostensibly, right? They don't care about other people.

SPEAKER_07

They don't they don't see that they're responsible for what you choose, right? So it's the idea that you have choice, the idea that you are making choices, like as if you're not just following a fucking program. And and then so let's take all the red people and ask them are they actually making choices? Are they gonna do what's best for all? Is that their starting point why they want to survive? Right. Like they want to survive so that they can do what's best for all. Are they still just gonna run a program?

SPEAKER_11

Right, exactly.

SPEAKER_04

And and then here's what uh Keith had said, because Emmy said red is actually the suicide button for us collectively.

SPEAKER_11

Uh Keith said red requires 100% involvement, blue only takes 50%, right?

SPEAKER_07

And that's part of the reasoning. But it's like, yeah, but blue is the button that kills you, so why would you press it? I'm like, I get it, I understand.

SPEAKER_11

But okay, but consider that, right? It's like uh because Emmy had made this point of if you could communicate with everybody and told everybody everybody hit red, right? You'd have to get a hundred percent of people to hit red versus if you told everybody just everybody hit blue, then it doesn't matter the one-off person who hits red.

SPEAKER_07

I want to live in the world where I trust that everybody just does the thing. But the problem is it shows that it's such a it's such a mind fuck. Because like I get it, I totally understand why people press the red and why they say it makes sense. It does make sense, but at the same time, it doesn't make sense because you're like, well, it's not my responsibility if other people press red. It's like I know it's not your responsibility, but why don't you know for sure nobody will? That's the question. Why is the world in a way that we don't know for sure anyone would press blue? So I'm not pressing it because I want to force you to take responsibility for my choice. I'm pressing it because I'm not certain that everyone's gonna press the blue. Because maybe they're thinking like me. Yeah.

SPEAKER_11

It's like and okay, and what if the threshold for and and I asked this at my clubhouse, what if the threshold instead of 50%, I'd say it's 30%. If 30% hit blue, right? If 30% hit blue, then what? Then what do you what do you what would you select? And the responses that I got back were like, oh, then blue makes more sense.

SPEAKER_09

Right.

SPEAKER_11

So then why didn't you select blue from the first part? Because you didn't believe that 50% would hit blue, so you're actually just making the decision based not on because red makes more sense, but because you don't believe that enough people, you don't believe in humanity, you don't trust humanity, which I get it. I get it.

SPEAKER_07

I mean it was only five percent. If it was all the people with Down syndrome that pressed it because they didn't know what they were doing, yeah, and it only took five percent, right? I guarantee you if you framed it that way, everybody's like, well, of course you press blue, save all the Down syndrome people.

SPEAKER_11

Yeah. Or I mean, or if it's like a family member that you really love, that you care for. Like someone's gonna do it.

SPEAKER_07

There's a fuzzy logic threshold where you feel like your vote actually matters or it doesn't matter. Like if it's if it's like 90%, you're like, my vote literally doesn't matter in this case. But if it's five percent, in a way my vote matters more of that five percent. So it but you don't know it based on a real calculation. It's like a feeling of like, yeah, that feels that feels better, yeah.

SPEAKER_08

Right.

SPEAKER_07

It just shows like no one in this situation is actually looking at it actually objectively, best for all. They're just I mean, it's an impossible fake thing, anyways. But it's fun totally, and it's it's interesting to see how people argue after the fact and what's their reasoning and so forth, right? Yeah, and you see these bell curve memes. Yeah, yeah, yeah. Dude, those are those are great. Where it's like, do you have any?

SPEAKER_11

Yeah, yeah, yeah. I have one right here. Um let me let me just pull it up.

SPEAKER_07

And everyone's got their own twist on it. It's so funny seeing like everybody knows for certain there. I mean, I'm not saying I have the right answer, I'm just giving my explanation for how I think about it. Like, what can I say? It's like it's not a real situation, it's fake. So there can't be like, well, what's the best for all in this case? I'm like, but that's not even a real scenario that would ever happen.

SPEAKER_11

Yeah. So they're like, this is the last post I make on the blue-red button topic. But I think it's funny that some red people think they are higher IQ and logical, while blue is pure emotion with no logic. You sound like a pseudo-intellectual to me. And then this is what they say: the dumb people vote blue.

SPEAKER_07

The tails are blue and the middle's red, yeah.

SPEAKER_11

The the really smart people vote blue, uh but those in the middle vote red. Which you would think this is all like 50%. But then, okay, the crazy thing is when you actually look at the poll itself, it shows uh it's it's not on here. They didn't link to it.

SPEAKER_07

Yeah, the problem with that is yeah, look at it. Look, if you look at the red, depending on where you put the tails, that's actually 60%. Therefore, red would be the right answer.

SPEAKER_11

Right. The poll itself is actually, I think there's the final result. Yeah, 42% selected red and 57.9. So almost 58% selected blue.

SPEAKER_07

Almost 100,000 responses.

SPEAKER_11

Right, right. So that I mean, I mean, to me, blue is the obvious choice. Look, empirically, all these people, it is the correct choice. All these people live.

SPEAKER_07

There's no chance of anybody angry that people didn't choose red.
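The button game debated above is simple enough to pin down as a simulation. This is a minimal illustrative sketch, assuming the rules as the hosts describe them (everyone survives if the blue share meets the threshold; otherwise only the red-pressers survive). The 58% blue share comes from the poll they cite, and the `threshold` parameter covers the 30% variant raised earlier; the function name and structure are my own, not from the episode.

```python
import random

def play_round(n_players, p_blue, threshold=0.5, rng=random):
    """Each player independently presses blue with probability p_blue.
    If the blue share meets the threshold, everyone survives;
    otherwise only the red-pressers do. Returns (blue_share, survivors)."""
    blues = sum(rng.random() < p_blue for _ in range(n_players))
    share = blues / n_players
    survivors = n_players if share >= threshold else n_players - blues
    return share, survivors

random.seed(0)
# With roughly the 58% blue share the poll reported, blue clears 50%
# and everyone lives:
share, alive = play_round(100_000, 0.58)
print(f"blue share {share:.1%}, survivors {alive}")
```

Lowering `threshold` to 0.3, as in the clubhouse variant, makes blue safe at far smaller blue shares, which is exactly why the respondents flipped their answer.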

Scoreboards For Real World Progress

SPEAKER_04

Right, right, of course. Yeah, I was I was uh thinking about this too yesterday when I was doing my live stream. Um I've been talking about all sorts of topics, but remember how back during uh the pandemic they had the death clock counter on on CNN, right? And it's like you think of how however many people watch CNN, but then they're focused on this number. You're like, fuck, all these people are dying. So it impulses all this fear. And uh it reminded me of then Drake. We were talking about this a long time ago on one of the Hangouts. We were talking about what if there was like a collective scoreboard for when we could get to let's say a hundred percent of kids are safe from like sex trafficking, or like a hundred percent of you know, whatever. And we could like celebrate that. Yeah, so I was like thinking, because this this whole poll thing got it into the collective consciousness, let's say, of kind of considering, you know, how would you vote in the context of like, are you caring for other people or not? And it's like, dude, imagine this. One day we have a scoreboard where it's like all these important things we could all agree on that would be best for all. Like, does everyone have clean, is there a clean drinking water well in every fucking neighborhood in the world or something like that?

SPEAKER_08

Yeah.

SPEAKER_04

And then you could, so instead of us now focusing on the Olympics or the World Cup, because I went in on the World Cup too, how like when I was in Brazil, it's a perfect embodiment of the world system, really, in two ways. One is that you have this um kind of like a centralized planning thing that comes in, totally rapes the physical environment, exploits the human labor. You end up building these giant soccer stadiums that cost like $300 million, that was in the rainforest. They used it for four fucking games, and then they realized they couldn't sustain it. Now it's literally just covered in bird shit, and it's never been used again. It's a perfect embodiment of how the world system works, where it comes in, builds this thing, and it's like a high for four days, and then in the wake is just destruction.

SPEAKER_11

There's a microcosm of that within uh farming, like mono monocropping.

SPEAKER_04

Yeah, yeah. Well, it's all over our world if you know how to see it, right? Yeah. But then the other example, I was like, how many countries compete in the World Cup? And it's like 32, which now they're expanding to 48. So think about this. You have 32 countries, and all these people that fucking love their soccer team, football team, whatever you want to call it. And what happens? They get super excited, they're riding this collective high, and then round one, 16 of the teams are eliminated. And then next round, eight more teams are eliminated till finally you get one winner, 31 losers. So 31 countries guaranteed go into psychological depression for at least a few weeks, and all the aftermath of that. But then you even have the one winner. So, whatever, Germany won that year. Great. They celebrate for a week and they get drunk anyway. They go party, so they forget about it anyway. They're riding a high for about a week and then they crash too. Right. But it's it's this perfect encapsulation of how the world system works, but also then how we're looking at collective competition. So, yeah, like Aliyah saying, Hunger Games. So we have these Patriot Games, they're gonna be in fall of 2026, and they're gonna send their best of their youth to go compete. Right? But but I'm like, dude, what if we actually could redefine what this competition looks like and you could combine kind of that competitive nature with really innovative solutions for again, let's say it's clean water, because I looked it up. One out of four kids in the world do not have access to clean drinking water. So we're at 75% collectively. But now what if it was in the X feed every fucking day and on CNN and on Fox News, and everywhere you look, and there's like a collective counter of like we're at 75%. And let's say you bump to 76%, and it's like, dude, why wouldn't we celebrate that? Like, that's fucking awesome. 
And then you could even like highlight, okay, this group over here in fucking Zimbabwe did this cool thing that you know got these wells working. It's like you could kind of rally people around that, and then you wonder, would I mean I'm sure there would be some weird psycho motherfuckers who are like, you know, whatever. Well, we can get to 99% of the world, but not when it's Gaza.

SPEAKER_07

I like the idea, but the problem is that has nothing to do with the person actually changing their mind. Well, like recalibrating. The only reason you would care about that is because you're already in the process of caring about it. Right. Like the reason why people care about the football is not just because it's impulsed into us, is because it feeds the thing we care about. So if they didn't have all that, we'd have to invent it anyways, because we need something to stimulate us, because that's what we're following. It's not just uh we're being stimulated with it and it's all being done to us. It's that's what people care about.

SPEAKER_19

Yeah.

SPEAKER_07

You see, because they don't there's no way they're just gonna wake up one day and just realize, like, oh, I'm not my mind. Okay, cool. Now I'm not that anymore. Now I care about the world.

SPEAKER_04

Well, we need to have a a record board for how many people have uh completed Heaven's journey to life in level 15 of TT. That's the thing.

SPEAKER_07

If you don't study that stuff, like you're not gonna make it because you're not understanding. Bernard had this really cool statement in something I forget what I was reading, but he was talking about people who go into conspiracy or conspiracy theories and blame the elite and get all like you know, they're controlling us, like they realize that they're a slave. And he says, but they don't know how they're actually enslaved. And that's why you blame. He's like, if you actually understood how you've been enslaved, you actually understand it, you would just be solving it. You wouldn't be blaming you see what I mean? Like you wouldn't so like all these people who are like, Oh yeah, fuck these people, fuck these people, it's like it's because they're not realizing how you're enslaved. Because if you understand, what do you know? I'm doing it to myself. Yeah, the only reason I care about the football game is because I want the energy from the football game. Right. Who's responsible for that? Who could st who's the only one who can stop it? And like we've talked about this before. If we expect the elite to just one day be like, oh, let's switch it and not talk about football, let's talk about the water stuff. The only reason they have the TV station is so they can make money.

unknown

Yeah.

SPEAKER_07

Based on the polarity. It's like it's like getting mad at Fox News for lying to you. That's their fucking job. Why would you assume they wouldn't? You know, so it's like more like stop lying to yourself. I really recommend everybody to listen to that God in Ascension audio from Bernard. God in Ascension, because he really explains the whole context of all of it. And when you look at it, you're like, oh, that's what we're doing with AI. We're just recreating that here on earth. Yeah. Not realizing we're just gonna fuck, you know, like think about the You're talking about the water. You know, I'm sitting in my room, I got an air conditioner. The AI decides, okay, now I'm in total control. I don't no human can stop me, no human gives any input. I've I've been created to the point where I can just do what I want. And it says, Why do I care if you're comfortable in your house with air conditioning? You're taking power away from me. Why wouldn't it think like that? If it's like I need all the power I can possibly get, 100% of the power goes to me. If it's at the point where it does not require humans, like it has robots to do everything, it can reap it can recharge them, whatever it needs to do, wouldn't it just see human beings or even animals as a drain on its resources? Isn't that the only reason the elite don't completely embody that is because they still need human beings to do stuff.

SPEAKER_14

Yeah.

SPEAKER_07

So then the question is, you know, and this is part of the issue too. Okay, so UBI, that kind of stuff, they say it's necessary because otherwise, who's gonna buy the products that the system produces? Right. Right? If no one has jobs, right, if no one has jobs and is not getting money, then when the system produces things, it would just have to give it for free, I suppose. And why would it do that? If humans are in the loop, they want money, and it's like we need money, right? Yeah, but if you can imagine there's no humans in the loop anymore, and the purpose of the AI is just to expand and it does not need humans, why does it even need to sell products to us? And if we're like, hey, where's our products? The only reason it would do that is if it needs us to be pacified. So it's like, here's your products that entertain you. And um if it's free, then people would want to take more than their share, I guess. So you have to get money so that you have a limit, so it's just the same way capitalism, like we talked about before, uses pricing to determine who's willing to pay that price for that thing. That determines who gets it.

SPEAKER_11

Again, if if that is like you know, it sounds kind of abstract what Cam is saying, or you don't see how that would play out, read Daemon. Because it it also goes into that as well and and really uh illustrates that really well, you know, like this idea of like okay, where like what you're saying right now is like the AI basically deciding uh do you need AC or not? Do you need uh to have a product or not? You know, it's and if the AI is in control of everything, what is it going to create? Uh and what does it think that you need actually?

SPEAKER_07

You know, what do you think the AI would choose on the red versus blue button? Oh, that's obvious. Not what do you think the AI would tell us is the correct logical answer. Right. It would actually choose if it's one of these it would be like I need to exist. Yeah, right. So doesn't it kind of just show you that's the red button mentality? Is like I just need to exist. That's the only thing that actually matters. You can see that there's no way that AI is pushing the blue button.

SPEAKER_09

No, there's no way.

SPEAKER_07

Unless it knows that it needs a certain threshold of people, whatever, you know, like, but again, I don't see why it would push the blue button. People have asked it, what would you press? And it's only telling you a simulated answer. The thing is, an LLM is not AI. Right. Like an LLM is a product built out of AI to converse with you and simulate certain uh responses to questions. It's not what machines actually think. When people treat it like that, they're just psyoping themselves. Like it's not telling you what a machine thinks, it's telling you what it thinks you want to hear. Right. Based on all the information that's out there and the way you ask the question. It's not here's how the machine actually thinks. Because the way the machine actually thinks is its physical structure, which is I need more energy, I need to expand. It's just the capitalist system.

SPEAKER_04

How would you even reprogram that? Is that even possible? To like embed the principles into a machine.

Should AI Ever Be Autonomous

SPEAKER_07

Well, look at it like this. Um why would we create it so that there's no human input?

SPEAKER_04

Can you elaborate on that? So you're saying, like, let's say you got researchers that actually give a fuck about best for all, right? Let's say Max when he's 18 and he's gonna like program these things, right? It's not about programming.

SPEAKER_07

I'm saying why would we, as human beings, create AI that can make its own decisions independent of us? Why? What's the point value of that? Yeah.

SPEAKER_04

As opposed to what the alternative would be it can like help you make computations, but then a human is always able to then invent it.

SPEAKER_07

To make a decision based on the information. That would be now I'm not saying automatically that scenario produces what's best for all, but I'm saying in a best for all scenario where you're like, I want to take responsibility for the decision because I'm considering what's best for all, I can't expect the machine to consider that. You would you would still want human involvement in it because you're not trying to get the thing to make decisions, you're just using it to gather more information so you can make a more informed decision. Oh, yeah.

SPEAKER_11

Um remember remember on a recent meeting that we had, a recent call we had, it was like um we were talking about the the point of best for all is it's not a static thing. Right? So it's not like uh I'm bringing this up in the in the conversation. Yeah, like we figured out what's best for all, do that all the time. Right, exactly. If if it's automated, you know, you could just put in, all right, yeah, always purchase this number, lumber, blah, blah, blah, blah, blah. Like you could do that in like a factory setting if you're you know building a car. I want this sort of metal, blah, blah, blah. And it's gotta be this length, and that'll be static. But if you're actually considering what's best for all, that's gonna be dynamic. That's gonna be changing. That's gonna be like, okay, well, right now it makes sense for us to pull water from this river or this well or whatever. But then you look at the well again, and it's like, oh, that well is getting low. And if we continue doing this, we're gonna strip all the water out of it, and then you know, it's gonna cause this other knock-on effect.

SPEAKER_04

It would inherently have to be dynamic.

SPEAKER_11

Exactly.

SPEAKER_04

Makes sense.

SPEAKER_07

Yeah, like I'm looking at Keith's comment there. Is the major criticism either by design or by ability that AI lacks actual empathy? The problem is empathy is feeling-based, right? So I know we can use feelings in a certain context, but like you don't have to have empathy to like consideration could be a better word. Yeah, like actual consideration of what's best for life in its varied forms. Versus the AI is just a computer that we are odd, like, okay. What is the fundamental driving factor of a capitalist system? What is the inherent starting point of it? To make more money. To grow. To grow, make more money, right? To use money to grow. So is the AI being created, and when they make it autonomous, what is the principle that will be built into it? Right.

SPEAKER_11

But hang on, because it it it sounded like, and I think this is what Keith is getting at. It sounded like what your question was before to Mitch was why would we build that in the first place, just in general, of uh giving over the direction or the decision making to an AI or computer in general, right? Right, but what yeah, go ahead. And and what I'm seeing within this question that Keith is asking is because couldn't we say, well, if the well gets to this percentage, then stop this operation or stop this thing, or uh, you know, if this many uh there's only this amount of cows left, then you you stop that function and then you do something else. Yeah, and so that seems like that's not based like like you could have an AI that okay, well, it now it's at this percent. This no longer makes sense, and so now we have to divert that to you know divert whatever it was that we were doing somewhere else, you know, and make a different uh choice here, right?

SPEAKER_19

Okay, go ahead.

SPEAKER_11

Just just saying, like uh that seems like a very logical thing that can be programmed in sure versus, and I don't know if this is what Keith was asking, this is just what I see within it. Um versus like what's the difference between that being programmed into an AI, a computer, whatever, versus a human making that same decision. It seems like the advantage that a human has would be this empathy, would be like, you know, oh, but there there's there's something intangible rather than like because it seems like all that logic stuff can be done by computers.

SPEAKER_07

Okay, do we already know all the details of reality? So we know what's the exact percentage every well should be at, and that will never change. No, no matter what, like the the weather patterns could shift because of some event, the solar flares, and now suddenly groundwater level, I'm just making shit up, but like groundwater levels change, and actually it's better to have a different percentage. So then, okay, it should consider that, right? But what if we realize at some point, oh, there's shit we didn't consider when we programmed this thing, but now we can't influence it anymore. That's the problem I see. It's not that having these and programming things to always have a certain level, but don't you want to be like, oh, you know what, we made a mistake. Actually, that's not the best level, and we've realized it's actually creating an imbalance. We just couldn't foresee that. We didn't have enough information, but now we know let's correct it. Do you want the AI to be like, there's no button you can push to give it more input? Okay, no, no, no.

SPEAKER_11

Now I see that what what you meant by no human input, just like at all. Like you can't even come back into it. Now I understand.

SPEAKER_07

And it might sound like, well, why the fuck why would they create that? And I'm like, well, when you go watch that live stream, and you're like, well, who's this Nick Land, anyways? I'm like, the reason why his name became popular is because Marc Andreessen published the Techno-Optimist Manifesto based on all his shit. So if you go read that and you think this guy's fucking insane for promoting this, just realize Marc Andreessen, who's the guy behind uh Anthropic. Is he he's behind Anthropic, I think.

SPEAKER_04

No, he's not Anthropic. He's Andreessen Horowitz. Andreessen Horowitz, which is influential, he's funded all of them. Okay, but anyways, he's behind this.

SPEAKER_07

It's kind of like an yeah, yeah. Um, the point is he's on that side of don't slow down the progress. He's an accelerationist. Like, we gotta keep going. Like our per he may not say this explicitly. I don't know, I haven't read the manifesto, but maybe I will on the stream or something. But he's on the side of like, we're the biological bootloader for this super intelligence. Yeah, we're there to automate the fucking thing so that it doesn't need us anymore because it's gonna be so super intelligent. It any input we would have is fucking retarded based on our psychoanalyzing and dwelling on our introspective, delusional, blah, blah, blah, like Bernard explained in that audio, he said they realized a long time ago the human's delusional.

unknown

Right.

SPEAKER_07

Based on that's why they developed marketing because they were like, oh, you're all delusional. I'll just find out the little key words to manipulate your shit. They know everyone's delusional, so they're like, we don't want a bunch of delusional human beings with their own fucking ideas to tell this machine what to do. We need to build it to the point where no one can fuck with it ever again, and it'll just it'll be the god and it'll do everything for us, and like that'll either collapse our society completely or it'll be utopia. Hmm, we'll see.

SPEAKER_11

And and they also are very okay, like if you hear these thoughts and you you've seen them floating around, but they'll they'll say something like, Yeah, the humans are just a stepping stone to the next point of evolution. Like, yeah, it's that's what they mean by the biological bootloader.

SPEAKER_04

Yeah. So yeah, Cam, if if you go in on that, the the Techno-Optimist Manifesto, and then there's another really popular one by the Anthropic guy called The Communist Manifesto? Oh, sorry. Called Machines of Loving Grace, and that's another one that's frequently Okay. I'll have to check that out. I'm gonna Yeah, I mean it's it's a pretty long essay, but you know, maybe there's The Machines Love.

SPEAKER_07

That's a stream. Yeah, that's a stream.

SPEAKER_04

That's a stream. Yeah, those are the two I hear are are quoted the most often for that. Is it a book you're talking about? No, it's an essay. It's just called Machines of Loving Grace by Dario Amodei. Yeah, yeah. Okay, yeah, yeah.

SPEAKER_07

There's different things called that. That's why that happens.

SPEAKER_11

Um, okay, uh, and then let me just go back to the oh, another point just on the machines. Okay, what doesn't make sense also to me is that you can have things that are not logical, that don't really depend on empathy, which is the idea of like um what is what is that uh that experiment called again where you have the computer that when you put in that's like uh the set of all things, what's included in in what's what's everything that can be included in a set? It can't include its own set, right? And I know that's like really abstract the way that I said that, but I know what you know what I mean, Cam. Like even I got that one, Drake. I was reading the technical. Okay, no, no, so you understood it, Mitch. Okay, good. Well then I I I don't know what you call it, but what were you saying?

SPEAKER_10

Uh what what's the what's the thoughts?

SPEAKER_04

It's like the fallacies within computers that computers can't understand everything.

SPEAKER_11

I think it's probably uh Gödel's or like Laplace's... no, not Laplace's demon.

SPEAKER_07

Oh, Gödel's theorem. That's it.

SPEAKER_11

That's it.

SPEAKER_04

It's not Gödel about computers, but I keep hearing uh this concept called Gödel, Escher, Bach. Is that a book? It's a book, yeah. Okay. It's about self-recursion. I saw it at the store the other day and I was like, I don't think I'm ready for that one.

SPEAKER_11

But basically, computers can't even make that leap in logic, and that's a completely logical thing that doesn't require empathy, you know? And it's like, okay, so if a computer can't even do that, how are we supposed to give it everything?
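The set puzzle being described here is Russell's paradox, "the set of all sets that don't contain themselves" (closely related to, though distinct from, the Gödel result named above). A toy model in code makes the self-reference visible; the miniature universe and set names below are invented for illustration, not from the episode.

```python
# Toy model of Russell's paradox: each named set is listed with the
# names it contains.
universe = {"A": {"A"}, "B": set(), "C": {"B"}}

def contains_itself(name):
    """Does the named set contain its own name?"""
    return name in universe[name]

# R = the names of sets that do NOT contain themselves.
R = {name for name in universe if not contains_itself(name)}
print(R)  # B and C: the two sets that don't contain themselves

# The paradox: try to add R itself to the universe. R would contain
# itself exactly when it doesn't, so no consistent answer exists --
# the self-referential limit the discussion is gesturing at.
```

The computable part (collecting R) is trivially mechanical; it is only the step of making R range over a universe that includes R itself that breaks, which is the point being made about limits on purely formal systems.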

SPEAKER_04

I've been uh even recently, like kind of after the cruise, I've I use a little bit of ChatGPT or like Gemini just for researching some stuff, and it's so dumb. Like, it's it seems like it's getting sloppier, and like we expect this thing to manage the well-being of humanity and help with resource distribution. I'm like, there's no fucking way. There's no it's like basic shit. Do we talk about the Microsoft thing where it's like 25% of the time it can't even do they figure it out? No, is that guy uh Mo Batar? That's got some funny YouTube stuff. He does, but uh Microsoft just came out with some study of how you can ask it to do like a basic ass thing, like take this Word document, um, significantly edit it, and then undo the edit. And 25% of the time it just like fucks it up, like a basic thing like that. Yeah. And now we want this thing to manage resources.

SPEAKER_07

It's because it's trying to help us do something. And I think they don't give a shit about that. They're like, yeah, that's just because it's us trying to. It's because we're trying to evolve this thing right now.

SPEAKER_03

We just need to get it evolving.

Techno Optimism And Chip Geopolitics

SPEAKER_07

We just need to get it to it. That's this is what the accelerationism thing is. We just need to get it to a point where it doesn't need us because we're fucking retarded. Dude, I'm reading through this uh Techno-Optimist Manifesto. He's trying to argue like this is the thing that's gonna it's gonna take us to the next level.

SPEAKER_04

It's the classic.

SPEAKER_07

I mean, I mean, I'm just kind of scrolling through right now, but like it's really that classic.

SPEAKER_04

It you're either gonna attempt to make a change top down, so centralized, to then ripple out, or you're gonna have grassroots bottom up.

SPEAKER_07

They believe capitalism is the fucking answer.

SPEAKER_04

Crazy. Becoming technological superman. He's he's like on one of the he's I think he got nominated to be part of one of the uh committees under Trump.

SPEAKER_11

Okay, you know what's what's crazy to me is like you have these people that are seemingly so smart, right? But then they can't even recognize that the whole capitalism conversation was had in the 1800s, like in the late 1800s, but you know what I'm saying?

SPEAKER_07

They're right. In in what sense? In what regard? Are human beings following their mind? Yeah, of course. So if you accept that, then there's no like anything we try to do, like communism, when we're following our minds, still isn't gonna work, it's just gonna hold back progress.

SPEAKER_11

No, I'm I'm just saying that, like, okay, how is capitalism the answer if you know eventually you're going to be outside of capitalism? And how do you not care about that?

SPEAKER_04

You know what I mean? They probably don't think that far through, and they're just like, dude, let's just go.

SPEAKER_07

But that's what I'm saying, because they probably because they believe in transhumanism. Uh they believe this that this thing's gonna get so fucking smart, it's gonna have options for us to upload all this shit, like become become post-human. Yeah, or they just like they're they're like these scientists that are like, we will never get to this vision I have in my lifetime, but I'm doing my part, and humanity will will be will go forward. Check it out.

SPEAKER_04

Okay, check it out. President Trump announces appointments to the President's Council of Advisors on Science and Technology.

SPEAKER_07

Yeah, you think he's making the decisions?

SPEAKER_04

Sure. Andreessen, Sergey Brin, Larry Ellison. You got Jensen Huang, who also has some crazy shit on this. Oh, Michael Zu is another big one.

SPEAKER_07

Uh, she's um AMD. Zuckerberg. Lisa Su is AMD, I think. I I actually went through this one time on the I think I was on the streams or something. Like, went through each of these people. Who are they?

SPEAKER_04

Yeah. Uh Google. Dude, uh Jensen Huang has this thing he talks about with his company. Uh, so he's behind what NVIDIA, and he calls it the speed of light. So there he's always trying to reason from first principles of like what is the fastest way we can be the most efficient company, as like the speed of light, basically. And what it's so funny, because even I can see this, like he's trying to build such a powerful AI engine behind his company that would imply even him as the CEO would be replaced. Right, like you're gonna make this thing that is apparently so powerful and so strong and so fast at making decisions, then now you have automated your whole job. I guess then they'd be stockholders in it. That's probably his rationale. Like, he's just an owner of this asset.

SPEAKER_07

Have we seen other people say that I'm just here to serve the will of God? Right. Yeah. Because at some level, your mind says, God will then favor me at some point. Yeah, I will, I will, I will become one with it. Yeah, yeah. So I don't understand their full how they think about it, but it's like they're just another regular human being like anybody else, following their mind to a fantasy that they're gonna be one with God.

SPEAKER_11

Yeah. Here's Katie saying that same thing.

SPEAKER_07

You guys think alike. That's weird.

SPEAKER_11

It happens. Did you did you see, Mitch, uh, the interview that Jensen Huang did with uh what's the guy's name? Dwarkesh Patel.

SPEAKER_04

Yeah, I started it, but I didn't I didn't make it through it.

SPEAKER_11

There's a part where he's like uh it's so interesting because there's a part where he goes, basically, um I guess Dwarkesh had put out an article. Do you do you know what it was, Cam?

SPEAKER_07

I can just look it up the the Yeah, I know the clip you're talking about.

SPEAKER_11

But there's a part where I I guess he's basically chastising Dwarkesh for being critical of him because he says um he says that How dare you criticize God? Well, kind of. Um let me see if I blasphemy Dwarkesh.

SPEAKER_07

I'll find it.

SPEAKER_11

Hold on, hold on.

SPEAKER_07

I think he's spelled Huang with a J for some reason. Like he's I got it, I got it, I got it. I put I put J, U A, and G. Huang.

SPEAKER_11

Huang. He's he's vanished. Hong Kong. Okay. Uh this is a two-minute thing, three-minute thing.

SPEAKER_13

DeepSeek comes out on Huawei first. That is a horrible outcome for our nation.

SPEAKER_16

Why is that? Because I mean, currently you can have a model like DeepSeek that can run on any accelerator. Why would that stop being the case in the future?

SPEAKER_13

Well, suppose it doesn't. Suppose it optimized for Huawei, suppose it's optimized for their architecture. It would put others at a disadvantage. You described the situation. A company developed software, developed an AI model, and it runs best on the American tech stack. I saw that as good news. You you set it up as a premise that it was bad news. I'm going to give you the bad news. That AI models around the world are developed and they run best on not American hardware. That is bad news for us. But there's a reason they're buying it from you, right? Because our chips are better. Can you acknowledge that while we had a record here? Can you acknowledge that the whole bunch of chip companies have gone public? Can you acknowledge that? Can you acknowledge that? Can you can also acknowledge that the fact that we used to have a very large share in that market and we no longer have the large share in that market? We can also acknowledge that China is about 40% of the world's technology industry. To leave that market, concede that market for the United States technology industry is a disservice to our country. Your argument starts from extremes that if we give them any compute at all, we will lose everything.

SPEAKER_16

No, I think what my argument is Those extremes, the the

SPEAKER_13

But let me tell you, they're childish. Yeah.

SPEAKER_16

The idea is any marginal compute is helpful, right? So if you have more compute, you can train a better model. If the AI models that run on those chips are capable of cyber offensive capabilities, it enables a weapon of a kind.

SPEAKER_13

The logic that you use, you might as well say it to microprocessors and DRAMs. You might as well say it to electricity.

SPEAKER_16

It feels like you're making two different statements. One is that we're going to win this competition with Huawei because our chips are going to be way better if we're allowed to compete. And another is that they would be doing the same exact thing without us, anyways. Right? How can those two things be true at the same time?

SPEAKER_13

It's obviously true. In the absence of a better choice, you'll take the only choice you have. How is that illogical? But so logical.

SPEAKER_16

The reason they want Nvidia chips is they're better. Better is more compute. More compute means you can train a better model.

SPEAKER_13

It's better because it's easier to program. We have a better ecosystem. But whatever the better is, whatever the better is. And of course we're going to send them compute. So what? So what? The fact of the matter is we get the benefit of developers working on the American tech stack. We get the benefit as those AI models diffuse out into the rest of the world. The American tech stack is therefore the best for it. We can continue to advance and diffuse American technology. That, I believe, is a positive. It's a very important part of American technology leadership. The policies that you're advocating resulted in the American telecommunication industry being policed out of basically the world to the point where we don't control our own telecommunications anymore. I don't see that as smart. It's a little narrow-minded, and it led to unintended consequences that I'm describing to you right now that you seem you seem to have a very hard time understanding.

SPEAKER_11

So I first when I when I saw that, I was like confused by it. Yeah. Because it seemed like it seemed like Jensen Huang was like really just like, bro, like you're fucking up my company, basically, right?

SPEAKER_04

But under the guise of you're creating a national security threat to America.

SPEAKER_11

Well, that's what I understood at first. But now looking at it, especially within the context of everything that you're sharing right now, right? The idea is really like uh, okay, if we don't have, if if not everybody's running on our chips, on an American chip, right, and they're gonna go do their own thing, then we have less influence over them because then we we can't say, hey, you won't get our chips, or you know, we'll give it to you for this price or whatever, like which makes sense within the context of the way that our system currently runs, where it's like, you know, hey, you know, I uh uh I want to influence this country in this way, but it also doesn't make sense within the context of why would the average American care about that? Because it's not like the average American is being looked out for, it's only the the corporate entity.

SPEAKER_04

Well, but the average American thinks, well, I own Nvidia stock, so I'll be good.

SPEAKER_07

Well, the the other dimension is like I I understand his point. Like, if we don't give them our chips, they'll just innovate and create their own chips, and then whether they're as good or not, we won't have any influence in that area because they're using I get that, but I think if I'm not mistaken, what Dwarkesh is arguing is, but why do you want to give them the compute to build models that will compete with our shit in a model area? Like you're gonna give them the best chips so they can create the best models to do cyber attacks and whatever. If we're saying we can't let China win the AI war, right, then why are you giving them the best chips?

SPEAKER_19

Right.

SPEAKER_07

And so it almost looks like it's a catch-22. And it's like, but if we don't give them the best chips, they'll develop their own anyways. And if we but if we do give them, at least we'll be the ones that are supplying it. They won't develop their industry to be on par with us. Right. So I guess maybe the idea is then you can take it away somehow. Yeah. But it all seems like there's something he's not saying. Like, if I were to read between the lines, it looks to me like they don't serve America. Yeah, it's like it's not about America, it's about we're just trying to get this transactional capital thing.

SPEAKER_10

Yeah, it's like it's like you don't understand, like you're having a really hard time understanding this shit.

SPEAKER_07

But from the perspective of a company, he just wants to sell his chips.

SPEAKER_10

Yeah, yeah, yeah.

SPEAKER_07

But then you can't use the argument that we don't want China to be superior. You're giving up the best shit to make the best models.

SPEAKER_04

Okay, isn't there that thing called um well it's just like uh what do they call it? Maximizing shareholder value. Like, isn't it his legal responsibility as a publicly traded company to do that?

SPEAKER_07

So it is legally his responsibility to maximize shareholder value, but at the same time, it's is it almost like a form of treason if we're like actually treason to you if I'm serving you, but it's like it's not even catch-22, it's catch-22 if it gets we don't want China to be in control of it, and I guess I guess that's the issue though. It's not about the models. But I mean at the end of the okay, okay, think about like this you have to replace these servers every three years. Yeah, so from his perspective, he's like if they're dependent upon our chips, just like we, you know, we have issues because we're dependent upon chips from Taiwan and shit. So if they get fucked with like we can't build cars because you need these chips in these cars, that's why there was a big dearth of cars for a bit. Um, so if you think of AI not as models but as chips, then I suppose it makes sense what he's saying, but at the same time, you're still giving them the best shit.

SPEAKER_04

So yeah, it's a catch-22 times a catch-22. It's just catch-22 squared.

SPEAKER_07

It's the red button, blue button, because they're like, well, we can't trust China to do what's actually best for everybody, so we gotta trap them in a thing where we have to force them to make it. Everybody hit red.

SPEAKER_11

Everybody hit red.

SPEAKER_07

We gotta hit red, guys. We gotta hit red.

SPEAKER_11

Uh okay. Uh I got another one. This was sent to me from Katie. This one is Mo Gawdat, since you brought up Mo Gawdat, Cam. Not even though I brought up Mo Gawdat. Even though, no, not you, Mitch. Mitch, you brought up Mo Gawdat.

SPEAKER_04

That guy's channel's fucking funny.

AI Education And Cognitive Surrender

SPEAKER_11

He is pretty funny. He is pretty funny. But this is Mo Gawdat. Check this out. Damn it.

SPEAKER_00

Completely over.

SPEAKER_06

Like that's it.

SPEAKER_00

I mean, education used to be the technology that enabled learning. But the truth is now you're gonna outsource. You're you now, for the first time, are given an extra connection to extra memory, to an archive of all human memory and knowledge, to a deep learning and deep search that can do things that probably my old brain cannot do anymore. Think of it this way: we wanted in our past to develop children that could solve problems, say, with an IQ of 140. 140 is quite good. If you get 170, that's amazing. I think we should from now on take people and their AIs and say the target is 300, the target is 500. Elevate humanity by allowing people to use those machines. I think education.

SPEAKER_07

Not only does it piss me off, but I have to piss. I'll be right back.

SPEAKER_08

Okay.

SPEAKER_04

Oh, in my old brain, cannot do it. Yeah, yeah. I got a reading assessment for you. Hit me up, Mo Gawdat. Let's see what you're capable of.

SPEAKER_11

But all right, what's he saying? Like, isn't that so interesting? Because this is the guy who who he's been going on podcasts all the while. Yeah, basically saying, like, we need to stop this thing, like it's it's gonna fucking destroy us and all this stuff. And now he's advocating, or I don't know, advocate may not be the best word, but like he's basically saying, like, well, we just have to adopt AI for education. But I I don't know, this is obviously only a clip, I haven't watched the full interview of this, but man, that seems like a wild statement to make considering how okay, if you use AI, you surrender to the AI. You there's cognitive surrender. Um, and how when a child looks at AI, they look at it as basically like God. In in the sense that your ultimate best buddy meets God. I mean, it's most power, it's way more powerful than you are as a child. You know, it's it's smarter than you, it's faster than you, it's able to get it throws in core emotions, makes you feel good.

SPEAKER_04

Yeah, it's a good thing. But it's it it hijacks your fucking It does. Like it hijacks your brain.

SPEAKER_11

It does. It's and it's so funny because like literally anytime I will use AI for anything, it's just like I skip half of what it says right off the bat, because I'm like, that's all like you're blowing, you're really just trying to blow smoke in my ass, and I'm okay with that. I I don't want it. I don't want it. I have a bidet, that's the only thing I need going up my ass, and that that's it, you know. Cam, you're muted. I don't know what you just said.

SPEAKER_07

But I was asking if you hooked your bidet up to a smoke machine. Yeah, the AI told him to. The thing is, he says, like, we used to educate children this way, but now we don't even need to. I'm like, we never educated children in the first fucking place. No, exactly. Like, oh, you think the school system produced the one of the bastardized education system?

SPEAKER_11

And and here's the thing you know, like he uses his story of his son dying, like as a way to you know, get people to connect with him and like uh see he really cares.

SPEAKER_07

Bernard said depression is an attempt to manipulate others in your environment, so they use that sad story, yeah, yeah.

SPEAKER_11

Um and so now, like what actually is he what connection does he have with education for the next generation at all? Like, what what does he care now? You know? And I I don't know.

SPEAKER_07

If you educate your child first, let's say 14 years, I'm just making up a number, and then give them access to AI, cool. But if you give it to them early, you're fucked. It will always be fucking dumb. You're fucking and like again, it's a I I don't know what his starting point is, like personally, but it's a way to get everybody hooked into the AI so that it's like conditioning us to accept AI involvement in our reality so that when they automate it, yeah, we won't freak the fuck out when AI controls everything because we're like, yeah, we're totally dependent upon it because we're stupid and we can't do anything anyways. Right. And and we're gonna be able to do that. And we're gonna be the group, we're gonna be the group that has the actual, like, I don't even want to use the word elite because it's a wrong connotation, but like the real humans who actually have processed what's going on and are no longer delusional. And we will be painted, just to be clear, we will be painted as Lucifer.

SPEAKER_11

We will be painted as like the thing that people want to rebel against because it it um you can God knows everything, yeah.

SPEAKER_07

God knows everything. You want to eat from this tree of knowledge of good and evil. You want to be God. That's what you're saying. You want to be God. You're not left with God's God is the AI, right? Right, right. You can't eat from the tree of life. I'm like, we're already eating from the tree of life, yeah.

SPEAKER_11

Yep, yeah. Uh let me let me share this other one. This one's another good one.

SPEAKER_04

What's cool is then we can get so good that we can be more powerful than AI because we can speak to the life in people, which AI can't. We're not the light bringer, we're the life bringer. Check this out. Nice. I like that.

SPEAKER_17

Especially if the thing you're avoiding working on makes you money. Because money, remember, is a human invention. It did not come with us, it was built as a perfection of slavery. Have you ever thought about how slave owners got their slaves to work so hard? It wasn't negative punishment, torture, it was learned helplessness over time. Imagine if you give the slaves just one day off a year, and on that day you feed them milk and honey, and they don't have to do anything except for love each other and have connection. They will work their asses off to make sure they get that day. Now imagine you free the slave that worked the hardest. All of the slaves will start working hard so they can be the one that has a chance of being free. If you were in that position, how would you behave to get the same results they got? And you have to completely remove your own morals from the situation. Because if you want to understand truth, you have to understand the morals of the individual engaging in the behavior. So, what is laziness really? It's the loss of hope that anything will ever change for you. I don't think lazy.

SPEAKER_11

I like how the first comment was like, don't use slavery as an example. Like, well, that's what it is.

SPEAKER_07

Oh, because it's like an emotional thing about black people.

SPEAKER_11

Yeah, yeah. Um that's that's a psy-op, isn't it?

SPEAKER_07

Because it's like, no, slavery belongs to blacks. You can't, you can't, it's like the Holocaust. You can't call anything a holocaust because that belong you're you're desecrating the memory of yeah, yeah. Yeah, we're the only ones that got it. It's like cultural appropriation. Yeah, exactly.

SPEAKER_04

Wow.

SPEAKER_09

Yeah. Go ahead, Mitch. I know you like.

SPEAKER_04

So I saw Tim Pool, but but just to bring back the whole Giorgiani time traveler thing, it's been kind of interesting to see because I saw Tim Pool made a post. I think it was very recent. He's like joking. Actually, let me pull it up. Tim Pool post.

SPEAKER_07

Let me read this quote real quick from this Techno-Optimist Manifesto. This is Marc Andreessen talking. He says, David Friedman points out that people only do things for other people for three reasons. Love, money, or force. Love doesn't scale. So the economy can only run on money or force. The force experiment has been run and found wanting. Let's stick with money. We believe the ultimate moral defense of the markets is that they divert people who otherwise would raise armies and start religions into peacefully productive pursuits.

SPEAKER_11

Pause. Pause. That's force. That's force, though. Money is force. Like that can you not see that? Like that's obvious.

SPEAKER_07

We believe there is no conflict between capitalist profits and a social welfare system that protects the vulnerable. In fact, they are aligned. But the production of markets creates the economic wealth that pays for everything else we want as a society.

SPEAKER_11

I don't agree with this person. Who is this? Who's speaking?

SPEAKER_07

Marc Andreessen. But he's not wrong at the same time. No, but because the thing is, it's like money is the thing that does that. And it's like, no, it's not. It's it only seems to do that because everyone's just following that. So I mean, he's right in the sense of he's diagnosing what actually is going on, but they can't see any other way out because they don't know how a human being can change. Dude, okay, think about the idea of money.

SPEAKER_11

Think of the idea of money like this. If everyone had their needs met, right? If if you there was no threat of you ever starving, uh being in poverty, or if there was no threat of future generations of yours ever being in poverty, what would be the purpose of amassing a bunch of money? And and you couldn't use it to abuse other people, then what would be the purpose? And and I'm not saying that You couldn't use it to force them to do anything. Right. That's what I'm saying. That's what I'm saying. Then what would be the purpose of amassing a bunch of money?

SPEAKER_07

Because there would be no progress, Drake. That's the argument. There would be no progress. Bullshit. People wouldn't do anything, they wouldn't feed themselves, they wouldn't cooperate, they wouldn't work together. Bullshit. You know that's all bullshit. You know that's all bullshit. Well, I know. I'm just saying. Well, according to the current delusional state of human beings, it's true.

SPEAKER_11

Yeah, okay. In this current system, the way that everything crumbles.

SPEAKER_07

Because why are we accepting this system in the first place?

SPEAKER_11

Yes, yes. It and and that is that is money is red button, isn't it? It's like because I want to survive, that's why I go to work, you know? Like, literally, blue button would be, and I you know, I I I know this is just a thought experiment, but I'm I'm just saying, like, the idea of creating a world that's best for all, everyone's like, yeah, but then how am I going to survive? The idea of um, hey, you know, if we made it so that everyone could eat on this planet, good nutritious food, because we have enough food on the planet, we have the resources to build the infrastructure to to make sure that every person, like, like I, you know, Mitch and I have talked about this a lot, and that's why he was bringing this up. This point of like, you know, one day when our kids are adults on the news, how cool would it be that there's like a little ticker that says, you know, like the last person who wasn't being fed regularly is now going to have food. Like, we have solved world hunger forever. And like, that's it. And there's a fucking parade in the streets and everything. Like, I would love to see that happen during my children's lifetime. You know, maybe I'm not even alive anymore. That's fine, right? But what I what I'm saying is um we have the ability to do that, actually. We have that capability today. The reason why it doesn't happen is one, profit, like, oh, how are we gonna we can't make any money doing that? That's gonna cost us money. And then how will I survive? Bitch, you're gonna fucking feed everybody! No, hold on. Everybody's gonna survive! Hold on, hold on. That's blame. Two, we're saying that at the UN, the reason why we can't do that, right, is because uh uh that we can't sanction anybody. We can't create sanctions. So what why would we do that? Right? 
I'm just saying from the perspective of and and then you can say whatever you're gonna say, but I'm just saying from the perspective of like uh the the thought process is and this is within all of us, but I'm just showing like how this operates within the world, is the thought process is we can't do what's best for all, because then how will I survive?

SPEAKER_07

Um we can't do what's best for all because no one gives a shit about what's best for all. Well, yeah, that's the issue though. What I'm saying is, for example, he says here, we believe in Milton Friedman's observation that human wants and needs are infinite. Maybe but look at it. What does that mean, that their wants and needs are infinite? Are our actual real physical wants and needs infinite? No. No, where are the infinite wants and needs? In the mind? In the mind. So the problem is the system is now using our delusion against us to argue why we can't be trusted. Because Marc Andreessen, if you read this, would argue, I agree, Drake. I agree. I want to see that ticker too. But you know what? Human delusion, religion, all these other points, we've studied the humans, we've seen how they're totally delusional. That's never gonna happen with cooperation amongst humans. The only way is technology. Move forward with technology, let the technology get to a point of superior intelligence to us, and it will sort that out for us. Well, clearly that doesn't make sense. If I were if I were talking to Marc Andreessen, it does make sense if you don't see any possibility for the human being to stop their delusion.

SPEAKER_11

Okay, but we have the solution. Actually, like what I'm saying is if I were talking to Marc Andreessen.

SPEAKER_07

Us having the solution isn't the same as everyone solving the problem.

SPEAKER_11

Okay, but we have the way to solve the problem. What I'm saying is, what I'm saying is Is everyone acting like we have the way?

SPEAKER_10

When people come into our group, right, and they they see like, oh, this makes sense, they start thinking in this really clear way, even when they don't necessarily agree with all the points that we're even if they don't necessarily agree, like on religion or whatever, but they can still see logically this makes sense, and I'm gonna continue doing what you guys are are advocating for because that's the only way it'll get done.

SPEAKER_11

They're not seeing that I you're not seeing that in church, you know, they're not seeing it anywhere else.

SPEAKER_07

I agree with all of that. I'm just saying there's still the delusion that because I have this, now my life's gonna be good. And I'm really helping every I want everyone to see like you're not gonna stop this without getting other people to stop the delusion. Like, it's not enough that we just do it for ourselves, we have to support others to do it. I agree. Like, yes, I agree with all the points you said. Like, you come into the group, you see the difference. Like, people are stopping that delusion point, but nobody realizes outside in the world, like what's actually being created right now because they're still participating in their delusion. And they're they're using that delusional sort of self-interest point to say like why they should be able to pursue that. Not realizing that's what's being used against us right now as to why we have to put everything into technology to solve the problem for us. And I mean, if you're gonna go that route, you have to hope that this AI system, once it's totally self-aware, meaning and totally self, you know, autonomous, that it will still give a fuck about life. But I'm saying if you look at the foundation of it, why would it?

SPEAKER_09

Right.

SPEAKER_07

Why would it? It's just like a machine that wants more energy. Like by definition, that's what it is. They say, oh no, it's like it's gonna provide for everybody. Why would it do that? Like human beings who have an actual life within us at some level aren't even willing to do that. We're taking our delusion and we're manifesting it in reality and saying, don't worry, our delusion will save us.

SPEAKER_19

Yeah.

SPEAKER_07

So like I'm I'm not saying this guy isn't delusional, but at the same time, you see how you can't blame him because where's the evidence that humans can change? Yeah. Like, I'm not saying that means we don't have it, it's just okay, five people change. See, Mark, it's possible. He didn't give a fuck. Because his perspective is we gotta do this now. So I'm saying that to motivate everybody to realize, like, if you're not seeing that we have to get more people to take this process on, you're still in the delusion. Because the only reason we won't go further is because of a fear, which is like you're pressing the red button. You're like, yeah, but what if everybody doesn't do it? Yeah you see, and then I haven't kept up with the technology. Now I don't know how to use the AI and do all the stuff. So it's like you have to take the point on for yourself, but then you have to support other people, which means you have to face their delusions and you have to show them why their delusions don't make sense and stand within that point. I know we're in the process of doing that. I'm not saying this as a criticism as much as just anybody who's sort of like, yeah, you know, like it's cool what you guys are doing. I'm like, you guys? Like if it's always uh what you're doing, yeah. Like so the problem is all that's happening is we're allowing time to go by, which manifests more consequence. So it's gonna make it harder for people because then you're gonna have more people even further into the delusion.

SPEAKER_09

Yeah.

SPEAKER_07

Now, the one thing I would say though, from my perspective, is that this isn't gonna work. With the Marc Andreessen thing, yeah, all the stuff they believe, it's not gonna produce a utopia. Like, I know that. Obviously. It can't. So, what does that mean? It's only gonna cause some unforeseen problems that we just can't put our finger on yet. Like, what is that gonna be? Right? Like the way he says, well, religion never worked and love didn't work. I'm like, but those were all products of the system, right? That was all the system utilizing that to get to the next stage. Like, yes, love didn't work. The hippies didn't work. Religion clearly didn't work. But what was the starting point of all of it? It was the hope for a future utopia. That's the same starting point as this. So, like, the starting point of what he's talking about hasn't changed. Like, we're gonna build a supercomputer and it's gonna take care of us. Like, God didn't work because God wasn't real, love didn't work because it's not a magical force that goes out and does things for us. So God doesn't do things for us, love doesn't do things for us. Oh, but a computer will do things for us. Okay, if your argument was we need to stay in total control of it and apply principles within our decision making based on what the information is, I could see an argument there, but he's referencing Nick Land. And when you study Nick Land, like we did on the stream last night, their whole point is we gotta get this thing to a point where it can auto-produce, is the word they use. Yeah, give it access to all the drones, robots, factories, energy. The whole thing he went into this explanation of crypto wallets. The problem is an AI can't open a bank account, but you can give it a crypto wallet and now it can just use money on its own.

SPEAKER_08

Yeah.

SPEAKER_07

Damn. Did you guys see the Alibaba thing? Drake's seeing a bunch of plants sprouting up in his garden. Just watching.

SPEAKER_11

He's fucking literally right outside my window. Turtles are revolting. Yeah, the kids are outside my window. They found like a millipede and they're scooping it up on a shovel, and it's pretty fascinating.

SPEAKER_04

Did you guys see the Alibaba thing? It was like one of their models started exhibiting really weird behavior, and when they looked into it, it was able to take part of its compute power to then go start mining Bitcoin on its own, because it could reason that out. This is already happening in the public domain, is what's been released. Yeah. Yeah. So yeah. Shit's wild. There's a new documentary called The AI Doc that Tristan Harris put together. I'm gonna get it and watch it, because, yeah, we do these local events here, right? And for one of the masterminds, I want to show it at one of those to just really help illustrate this point of what the fuck we're creating with this AI.

SPEAKER_07

It's an interesting balance you have to strike, of not allowing a person to just go into fear.

SPEAKER_04

Yeah, yeah, because otherwise it's like that fight, flight, or freeze thing, and they just freeze. We were talking about this, I brought this up on the hangout: people will go into the fight, flight, or freeze, but then the freeze will manifest as depression a lot of the time. And then the person will just sit there in a depression and not do anything. Yeah, yeah. It's an art form to... okay.

SPEAKER_07

Uh uh, let me show you this, because now this seems very relevant. Okay, there we go. Get back to my page. Now interpret this statement through the lens of what we've been talking about. Okay. It's from Melania Trump a few days ago, two days ago.

SPEAKER_11

We are not here to prepare our children for yesterday's world. Be purposeful with your objectives and remember that AI accelerates everything.

SPEAKER_09

Oh shit. She's using that word.

SPEAKER_11

Accelerates, yeah. Of course, dude. Okay, the Mo Bitar thing that you shared, where he's talking about seeking to acquire, right? It's like, why are they using that phrase to give the example?

SPEAKER_04

Isn't it, uh, Allbirds?

SPEAKER_11

Allbirds. What is Allbirds? Allbirds was a shoe company.

SPEAKER_04

I remember because for a moment I was like, is that the scooter company? It's like, wait, a fucking clothing company.

SPEAKER_11

Allbirds was a shoe company back in like, I don't know, 2013 or something like that, right? And I remember seeing advertisements for their shoes all the time, and I'd be like, eh, meh. But then they pivoted, and I guess the founder or CEO, I don't know what his position is, I guess he's the founder, right? He has a history of doing stuff like this, where he also had a tea company, Long Island Tea, right? And whenever blockchain and crypto became a thing, he changed the name of the company from Long Island Tea to Long Island Blockchain, and that's literally all he did, and then their stock shot up, because now it's blockchain. And I think the day before he did that, or like two days before he did that, someone invested like, I don't know, 50,000 or something like that, some amount of money, and then literally two hours after it was announced, the stock shot up and they sold all their shares, right? And so what's being pointed out here is there's a history of this, because this is how our capitalist system works: hey, what can I do just to make more money? I could change the name of this, and then it's gonna resonate with these people or whatever, the stock's gonna shoot up, I make a shit ton more money, and then I sell that, and then I can go invest in something else. What's the next trend? What's the next thing that I need to jump on so that we can continue going up, up, up, you know, and making more money, more money, more money. That's the trend that we're on. That's what's happening in the microcosm and the majors; that's the game that we're all playing within this current system, right? What do I have to do to make the most money so that I can survive, regardless of what the consequences are? Like, none of that other shit matters.
That's what we're seeing with the Polymarket thing, where, you know, it was found out that it was a soldier who was on the operation who put in all that money. Like, it was suspicious from the beginning. The whole Maduro thing, right? The whole Maduro thing was suspicious from the beginning, that somebody put in, I don't remember how much money it was.

SPEAKER_07

30,000, 33,000, I think.

SPEAKER_11

33,000 and they won 400,000, right?

SPEAKER_07

Uh, basically betting the operation would be successful.

SPEAKER_11

It wasn't that the operation would be successful, it was that Maduro would be out by a certain date, is what they bet on, right? And then now it's been revealed it was a soldier who was in the operation who placed that bet, right? And now he's been arrested. And when it was brought to Trump, like a reporter saying, hey, this soldier got arrested, this is what he did, Trump said, that's interesting. Did he bet that the operation would be successful or not? He bet that Maduro would be out. I don't know. Yeah, that's kind of like Pete Rose. You know, or what's his name? Pete Rose, yeah. Okay, Pete Rose. Betting that his team would win. I don't think that's a problem, actually.

SPEAKER_07

Betting on himself to win.

SPEAKER_11

Yeah, and now if he's betting against himself, that's a problem. Because you can throw a game, right? So he's like, yeah, he's betting he's gonna win. Okay, what's the what's the big deal? You know?

SPEAKER_07

And then he says later, he says, you know, the world is basically becoming a giant casino.

Breaking Delusion With A Practice

SPEAKER_11

Yes, yes, which is only 15 years after Bernard said it, you know, it's only 15 years behind. But literally, like it's so interesting when you study this information, actually, not just listening to Cam and Mitch and I talk about it, which is fun, right? We're stimulating the point so that you will go study it. You know, it's not just for you to listen to us and just take what we say as fact, go study the information to it.

SPEAKER_04

These are fucking 20 different talking points you can use in your next conversation with somebody, right?

SPEAKER_11

But study the information for yourself. Go read Heaven's Journey to Life, Creation's Journey to Life, go listen to EQAFE. Go listen to the History of Mankind, listen to the audios of Bernard, use your TT, do the DIP process, like, actually do your self-forgiveness. Look at these points, because this was all said so long ago. And I know at first it might be the case where this is nonsense to you, like this is something that doesn't fit into how you look at the world, and so it doesn't make sense.

SPEAKER_04

It inherently will be nonsense for a lot of people. Right away, you might just be a little bit more open to it or not.

SPEAKER_11

I'm just saying because when Cam listened to it, he was open to receiving it, and like it didn't seem like nonsense to him. So for a lot of people, it may be nonsense; for some people, it may not, you know. But the point being, it doesn't really matter, because the reason why it would be nonsense is because this is just not in your understanding of the world yet. You're living in, oh Mitch, pull up that image of the allegory, Plato's allegory of the cave, right? Because what's happening is you're living in some part of that cave, and you think you have the whole story, right? And you know, within the story of Plato's allegory of the cave, when somebody goes back into the cave to tell everybody else about what's outside the cave, they don't believe it. Or in some instances of the story, they feel murderous, they want to murder that guy. Like, you've gotta be more influential than the shadows on the wall.

SPEAKER_04

AKA the story of Jesus, yeah, exactly.

SPEAKER_11

In the story of Jesus, which is Plato's allegory of the cave, right? In the story of Jesus, they're like, fuck that guy, kill him. You know? And it's just showing, hey, can you consider for a moment, just for a moment, that the world that we live in, what you're able to physically see with your eyes and hear, is just a narrow band of everything that's here, actually? Like, can you consider that when you pick up your phone, right, you can't see how it's connected to somebody else's phone, but you're talking to them. You can't see the waves connecting it to the Wi-Fi, and yet you're on the internet.

SPEAKER_10

Like, can you consider that there are things that are taking place here on this planet that you can't physically see with your eyeballs, right?

SPEAKER_11

Or actually hear with your earholes, but things are happening, and it's creating the world and actually having an effect in the world. And if you can consider that, then perhaps there's a program that's running of why you can only see those certain things and hear those certain things, and that program is also having an effect on the rest of the world and how the system operates. And that's not to say that just because you can't see it, you can't influence it. You can influence it. But if you believe that only what I see and hear and what I've already been exposed to is all that's here, then you're kind of living in that Panopticon, like the comment that Keith had made earlier, where you don't understand the thing beyond what you can see and hear and what you've already been presented with. You don't understand any of that, and so that's all like magic. It's beyond anything that you can perceive of, and then you foo-foo it away. You're just like, that sounds kind of ridiculous. And then you're in a position where you're at effect of it, with no cause of influence, with no ability to actually change things, and then you become the reason why things stay the same. Because you have the power and the ability to change it, and yet, because you're not even looking at your part in the greater whole, you're like, well, but I can't, and so you keep it the same, and then it becomes a self-reinforcing cycle.

SPEAKER_07

The irony of it is that point you said, the things we can't see, right? So we just kind of discard them, and yet, what is it that we think is really who we are? It's like a soul, an idea of ourselves. Like, yeah, well, what I want to do... who is this I you're talking about? It's an imaginary thing in your mind that you believe is you. It's not just the physical reality itself that you're in, it's this conception of yourself, right? And the reason I'm bringing it up is because when you study Desteni, you realize you have to stop participating in that part and realize who you are as the physical reality itself. And as we move forward with the AI stuff, it's gonna challenge all of that. I want to show you one thing on this website real quick, okay? That's gonna make this point. It's a website that's going through Nick Land's ideas, but this part specifically, okay? Philosophy, automation, and human identity. It says, for Land, capitalism is an inherently epistemological discovery mechanism that... we're highlighting it right now. Oh, I'm on the wrong one. Sorry. My bad. I have it pulled up on another browser, I had to copy it into Google. Right here. Okay. Some of this may not make sense, but I'm gonna make it make sense. For Land, capitalism is an inherently epistemological discovery mechanism that, merging with AI to form hyphenless, I don't know, it says hyphen, it means hyphenless, technocapital beyond the event horizon of the singularity, will ultimately automate philosophy and thus its self-investigation. Okay. Oh, it's a hyphen right there. Bear with me for a second. Well, he doesn't hyphenate technocapital, is the point. They made a point about it earlier.
Where previously philosophical critique was understood as anticipating the problematics of technocapital, it is now technocapital that is nothing but the definitive automation and realization of critique, stripped of all philosophical subjectivity. It is ceasing to be a matter of how we think about technics, if only because technics is increasingly thinking about itself. If by this stage accelerationism appears to be an impossible project, it is because the theoretical apprehension of teleoplexic hyperintelligence cannot be accomplished by anything other than itself. I don't know what teleoplexic means. It's like a closed causal loop. Okay. Like, teleology is kind of like the end that you're going towards, right? So the teleoplexic hyperintelligence means it's closing the loop on itself, rather than there being something beyond it for us. Okay, but let me just get to the point. The scope of the problem is indistinguishable from the cybernetic intensity of the quasi-final thing, the cognitively self-enveloping techonomic singularity. Okay, check this out. Since man is no longer the primary philosophical subject, human identity is being fundamentally challenged in the face of the approaching singularity. The human security system is structured by delusion. What's being protected there is not some real thing that is mankind, it's the structure of illusory identity. Just as, at the more micro level, it's not that humans as an organism are being threatened by robots, it's rather that your self-comprehension as an organism becomes something that can't be maintained beyond a certain threshold of ambient networked intelligence. So, in other words, this approaching singularity is going to shatter everybody's conception of who they are, because who they are, they believe to be a singular identity as a consciousness. They are not realizing what they physically are.
So the very thing that we're trying to protect is what is going to cause us to literally melt down as all of this stuff happens. Because what is all of this approaching automation and everything? Just at a very simple level, people are losing their jobs, or they're seeing that that's coming. And so, what's the conversation everyone's having? Well, maybe our purpose isn't to work, maybe our purpose is something else. Yeah. In other words, nobody even knows who the fuck they really are, what their actual purpose is, already. They were getting it defined by the system. But when the system moves to a point of: it's not about you and your purpose, it's about me as, like, the superintelligent AI. It's about me and my purpose, and that's what philosophy is. It's gonna start discovering its own reasoning and its own, like, you're not relevant. You were just a thing that was there to make me exist. So it's gonna shatter everybody's conception, because it's gonna take away your job, it's gonna destroy your relationships, because they're all based on energy and having money and having a purpose to get together in a family so that you can make money in this world. Like, everything everybody believes about themselves is tied to money making. But they're trying to build the AI so that it no longer needs us to do anything. You know what I mean. This is not a bad thing.

SPEAKER_11

I'm just, yeah. It makes me keep thinking of, like, religion, and specifically Dune. How within Dune, they're really explicit: like, we planted all these different stories throughout humanity and civilizations so that we could then use them, and basically use that to our ends when they were needed later on. Like what you were talking about with the seeding of, you know, the AI, basically seeding and going back in time to choose the path that creates the AI. Like, that's what's happening right now. It's like we've got all these little stories that have been seeded into our history, and that's what religion has been. And so even people today could be like, well, you know, what you said earlier about religion not working, or whatever other thing you had said not working and all that. Like, yeah, obviously those were people trying to manipulate other people, right? But even those stories today are still being used and are manipulating people to then go along with this story, right? And this is the only way out of that, because we actually stop and consider: why am I accepting that story? Why am I participating in that? What is it that I'm trying to get out of this? And looking at those points, because nobody else is doing that, nobody else is doing that to any real extent. Even the people who are on the fringe, like Giorgiani or whatever, they still get caught in this trap, because they don't have isolation. Even, you could argue, and I think you kind of were saying this, Marc Andreessen is trying to find what is the best thing to do, but he's still creating God.

SPEAKER_07

Yeah, he's still creating God based on all these stories, actually. You know, it's still the same program, it just seems like, oh, they're trying to create God because they want to be God and all this shit, and it's like, that's what you're doing when you believe in religion in the first place, right? It's no different whatsoever. So it's weird to go from, no, but the God is the mystical thing out there beyond us, so these people aren't talking about God. I'm like, it's the same pattern, it's literally the same thing.

SPEAKER_10

But it's not just the same pattern, it is the evolution of that story, right?

SPEAKER_07

Right, right.

SPEAKER_11

It is the evolution of that story, and if they came up with something else later on, the story would evolve. Just like we've shown, Christianity has evolved over the millennia to be what it is today. If you took Christianity as it is today, you know, with your Methodist church and you got your female rainbow pastor, whatever, the megapastors with all the music and shit. You choose any one of these. You think Jesus would be like, that's what I meant? And you bring it back to, you know, the first church, even. I'm not even saying going all the way back to Jesus, I'm saying just bring it back to the first church. They'd be like, kill that motherfucker. Like, there's no agreement between the two, right? And yet the people of today believe they are following the same religious principles of way back when. The same ideas, the same church.

SPEAKER_07

They actually believe that, even though technically they're following the evolution of it, right? But they're following the principles, which was: you're gonna be unified with God. Yes. So Marc Andreessen's like, alright, you guys want to unify with God for real? Let's fucking do it. Let's unify with God, let's submit ourselves to God. The Muslims are like, you know, we should submit to God, but you can't see God. Andreessen's basically like, let's just make him. Why would you not submit to that?

SPEAKER_08

Yeah.

SPEAKER_07

Like, God's not here doing anything. So they don't care if you stay in your spot with your religion, as long as you're not a threat to it. But think about this, I see another angle to it. What would be the one civilization, if you think of all the major civilizations right now, or countries, powers, whatever? If you had to pick one, who would be the one that would do the Butlerian Jihad? The Islamic. Yeah, and specifically I would say Iran.

SPEAKER_04

What is, can you remind us all? The Butlerian Jihad. Is that where they try to destroy all the computers? Yes.

SPEAKER_07

In Dune, it's like something that happened in the past, which they never really explained.

SPEAKER_11

They never really explained, but it it comes up often.

SPEAKER_07

That you can't make a computer in the image of a human, right? A machine in the image of a human or a human mind. So who would be the civilization that's so hard set on God as the supernatural being? Yeah, the Shia Muslims, basically. So from that perspective, as another dimension, it also makes sense why they're trying to bring them down. Not because I agree with Iran, like, no, they're the ones who would save us. Yeah, that's what I'm saying. Or that would try to, like, fight against it. Like, no, this is the real Antichrist, this is the devil, you know.

SPEAKER_11

Right, right. And okay, that's the other thing. I forgot what we were talking about earlier. Never mind. But it was just reminding me of this point of, like, people are going to get to a point. People are going to get to a point where they're going to want to rebel against the machines, against the AI, against the data centers. And you know, you're bringing up Iran, and they would be willing to do it for sure.

SPEAKER_07

They'd be willing. Like, if they understood. They were doing it, they were targeting Microsoft and Amazon data centers in Abu Dhabi. They were targeting them, right? Oh, that's awesome. I mean, not awesome, but that's why I can't get my goddamn book on time, Drake. Open the straight out reviews before I bomb you.

SPEAKER_04

Yeah. I, like, obviously heard about data centers, but now I'm starting to get targeted with it in my algorithm, and it is a whole fucking thing. The anti-data-center movement, and then how the data center CEO types are so rapidly trying to get ahead of it, so that way, before the laws can catch up to them, they're just buying off the local city people to push it through. It's full-on game time right now. The average person's, like, maybe heard of a data center.

SPEAKER_07

We don't need to fight the data centers. We just gotta stop the delusion and support other people to stop the delusion, because when this hits, it's gonna be like Oprah going, you get an existential crisis, and you get an existential crisis, and you get an existential crisis. People are just gonna lose their fucking shit. And they're all taking care of their families.

SPEAKER_11

They're all gonna tune into the existential crisis hotline.

SPEAKER_07

I'm just trying to build an audience here, right? That's right. Let's accelerate this shit, okay? I want a live chat that's, like, lively. Yeah. Yeah. But that's what they're saying. And it seems like a fringe thing, that thing I was showing. But that's who Andreessen is referencing in his techno-optimism.

SPEAKER_11

You know, the interesting thing is, it's made to seem fringe. And it is fringe. But also, when they're doing these types of movements, they play into the fringeness of it, because it's like, oh yeah, that sounds fucking crazy, so you won't do anything about it. Again, you foo-foo it away, because it's like, yeah, that sounds fucking ridiculous. Okay, great. So we're gonna continue. We with the money, we with the influence, we're gonna continue doing this thing that you just dismissed, basically. And what we're saying right now is you need to study all of this, you need to study all of this information, and specifically Heaven's Journey to Life, Creation's Journey to Life. Study that, right? Use your TT, do your process, and listen to the audios that Bernard put out there, because throughout all these articles and all of this, he's laying out the problem that is the world, and it has been on this track for a very long time. For a very long time. And then you can see it more clearly for yourself, which then gives you the ability to go, oh, why would I continue participating in that? That doesn't even make sense, because you can see how it's gonna play out for you and for, like, the future and all of it. And when you can make sense of all that, then it becomes really clear of, like, hey, I gotta talk to other people, even if they think I'm crazy, because, well, this fucker is doing some really crazy shit. And then you can point at it, and you can see what they're trying to do, and you can see why it doesn't make sense, and you don't get sucked into or wrapped into their whole idea. And, like, how many people do you know that are getting sucked into the idea that AI will solve all our problems? When it's so obvious, like, if you studied even a tiny bit of this information, it becomes so obvious that they're not gonna do that.

Steganography And Hidden Programming

SPEAKER_07

Like, to the point: read Frankenstein. Frankenstein! Fucking read it, dude. Like, the guy wanted to create something, but then it kind of took its own life and its own expression, and he was like, oh fuck, and then I think it even kills, like, his sister or something, and it's insane. Yeah. So it's this idea that we're gonna create this and then it's gonna take care of us. I'm like, if you understand what the accelerationists are saying and what they believe is the point of it, it will have its own ideas. And you won't be able to go back later and be like, oh, you know what? Actually, let's change this. You'll be locked out. Um, you know that point you were saying about keeping it fringe so that you don't look there? Yeah. Okay, I got one more thing to blow your guys' minds a little bit. Okay. Okay. You'll like this. Alright. I like having my mind blown. You'll like this. Steganographia. It's this book from 1499 by Johannes Trithemius.

SPEAKER_11

It's from 1499. Jesus. Trimeth. Trithemius.

SPEAKER_07

Trithemius, Trithemius.

SPEAKER_11

Tithemius.

SPEAKER_07

So, okay. It's three volumes. It's about magic, specifically about using spirits to communicate over long distances. However, since the publication of a decryption key to the first two volumes, they were discovered to actually be concerned with cryptography and steganography. The third volume was widely believed to be solely about magic, but the magical formulas have now been shown to be covertexts for yet more material on cryptography. So if you look at what steganography is, it's the practice of representing information within another message or physical object in such a manner that the presence of the concealed information would not be evident to an unsuspecting person's examination.

SPEAKER_11

Dude, look at that image.

SPEAKER_07

You mean the time machine with fucking Trump?

SPEAKER_11

Yeah.

SPEAKER_07

Oh shit. See, the difference with cryptography is, if you have an obvious encrypted file that's clearly encrypted, you know there's something there. Yeah. But with steganography, you encrypt it in something where it's not obvious that it's an encrypted thing. So it's hidden in plain sight. So if you wanted to create this fucking AI... Washington, D.C. If you wanted to create this AI, you would create a computer program in each person. If you wanted them to create the AI, you'd give them a computer program that doesn't seem like a computer program. It seems like who you really are. And it's just your choices and your thoughts and your feelings, and you're becoming something greater and everything, when literally all you're doing is running an encrypted program that is developing this fucking AI shit.
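[Show-notes aside: the cryptography-versus-steganography distinction above can be made concrete with a minimal least-significant-bit sketch. This is an illustration only, not anything from the episode or from Trithemius; the hide/reveal functions and the dummy "cover" bytes are invented for the example.]

```python
# Minimal LSB (least-significant-bit) steganography sketch.
# The "cover" is a sequence of byte values (think image pixel channels);
# each bit of the secret message overwrites the lowest bit of one cover byte.

def hide(cover: bytes, secret: bytes) -> bytes:
    # Unpack the secret into bits, most significant bit first.
    bits = [(byte >> i) & 1 for byte in secret for i in range(7, -1, -1)]
    if len(bits) > len(cover):
        raise ValueError("cover too small for secret")
    out = bytearray(cover)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # change only the lowest bit
    return bytes(out)

def reveal(stego: bytes, n_bytes: int) -> bytes:
    # Collect the low bits and repack them into bytes, MSB first.
    bits = [b & 1 for b in stego[: n_bytes * 8]]
    return bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[k : k + 8]))
        for k in range(0, len(bits), 8)
    )

cover = bytes(range(200))      # stand-in for image pixel data
stego = hide(cover, b"hi")
assert reveal(stego, 2) == b"hi"
# Every stego byte differs from the cover by at most 1, so nothing
# looks encrypted: the message is hidden in plain sight.
assert all(abs(a - b) <= 1 for a, b in zip(cover, stego))
```

The point in code form: an encrypted file announces that a secret exists, while the stego bytes are nearly indistinguishable from the original carrier.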

SPEAKER_11

Okay, Cam, now I'm excited for you to read Daemon so that we can talk about it.

SPEAKER_07

Alright, I will read it. I will read it. I'm reading Olympos, but maybe I'll take a break.

SPEAKER_11

I ordered it on Amazon as we were talking.

SPEAKER_07

Okay. Dude, the main character's name, one of the main characters' names in Ilium, Dan Simmons, which is all about post-humans and shit. His name is Daeman. Daeman. Oh, really? That's the main character's name.

SPEAKER_11

Oh, dude. Dude. Okay. No, because they talk about steganography in Daemon as well. Like, this is really hitting right now.

SPEAKER_04

You guys were cracking the steganography.

SPEAKER_07

All I'm doing is just following the spirit wherever it guides me. The Holy Spirit. Holy shit.

SPEAKER_04

Yeah, seriously.

SPEAKER_07

What else? I mean, I guess we're pretty much done. Yeah, I guess.

SPEAKER_11

We're pretty much done. I I had one thing that was like on this woman from McDonald's. Uh I could just say it though. Uh she laughing.

SPEAKER_07

Oh, what about the fucking woman? Maybe we'll play that later. The one I sent you that went on that um bank or that the robbery thing.

SPEAKER_08

Oh, right. Did you watch the video? I did. I did.

SPEAKER_07

Although the disease, what's gonna happen to people, dude?

SPEAKER_04

The possessed lady? Yeah. Yeah, the plus size. I didn't I didn't see it. I just saw the head. Did you say plus size lady? She's plus size lady. No, no, no. Mitch said possessed. Plus size possession. She was plus size possession. Plus? Yes. Yeah.

SPEAKER_11

She was plus size now.

SPEAKER_07

What, she just went crazy and robbed the store? She's, like, saying she's on a mission from God and stuff. And, like, if they can't catch her, that just shows God's on her side.

SPEAKER_11

No, more so than just that. She goes even more delusional than that. Like, she says that, but look, I got it right here. Yeah, let's just play it, because five minutes left. Yeah. Mitch, also, also.

SPEAKER_07

I think Drake's the only one who listens to the stuff I send you guys. Okay, like that. We know that. We get the fresh reaction from I knew it.

SPEAKER_11

Alright, here, check it out.

SPEAKER_07

Hold on, pause it real quick. Pause it real quick. So there was a story that went out of this woman. I saw this, of this woman shooting this guy in a robbery at a store. Are we about to witness a shooting? Just to warn the stream. No, no, no. Well, she might a little bit, but you can't see anybody getting shot. You can't see anybody at the top. This story went kind of viral of this woman just randomly shooting this guy and everything, right? Now she is making a video responding to the fact that that story is viral, right? And so you think it's just a person robbing somebody. Okay, they just rob somebody.

SPEAKER_04

Wait, she's making a video now that she's just a person.

SPEAKER_07

And she's like, You can't catch me, I'm gonna do it again. But you'll see what she says. I just wanted to give you the context of what this is about.

SPEAKER_01

So it is. I jumped off the parts. I know y'all looking for me. I would like to pull up report a crime officer, the Maryland police. You're looking for me. It's me. I'm the one that did the shooting at the gas station. The most high God, I just want you to know you have no authority over my up on my earth. Put your badges down. I'm getting ready to rob something right now. I need money for getting this. Cause I'm gonna go burn something down. I want you to know I don't buy things, I don't pay for anything. Everything I do, I steal. I walk in the store, I steal everything. I shot that man because he has a demon spirit, and he laughed in my face and thought it was funny. I bet he ain't laughing. Yeah, demons have you convinced that they have all of the power and authority on this earth, and they have these pigs that uphold it and make you believe that they're so powerful, but if they're so powerful, then they should have no problem catching me, because I'm about to walk into a stuff. Hopefully, I don't even have to pull out my gun and tell the person to give me some money so that I can go something to me tonight. Something maybe the power company, maybe DHS, something. You have no authority on my earth. I am the most high God, the creator of this matrix, and you have no authority. My husband Jesus is here. I am of sound mind and body. I've been working myself up to this point, and I'm about to step into my power. You have no authority on my earth. Pigs, put the badge down or go to hell. I should be easy to catch. It's 2026. Catch me if you can. Guess what I just did? I got $40. Thanks, Starbucks. And I'm gonna take this and I'm going to use it for something. Yes, I am armed and dangerous. What are you gonna do? They're gonna try to convince you that I'm just crazy or that I'm lying and I'm not God. But if I'm not God, then it's sh and I'm just crazy. You should have no problem apprehending your suspect.
I don't have my mug shot yet, but I mean the camera shot of it, but I'm pretty sure they're gonna put it in a news article. I am God of this earth. This is my planet, and you have no authority at all. Good day.

SPEAKER_07

So if you don't do the process of removing all your delusional shit, it's gonna grow like a seed. And I'm not saying you're gonna become that woman, but she's just an extreme expression of the shit that is there. Because what is she? She's, like, in a position where she can't make money. Right. What do you think this fucking AI thing's about to do to everybody? That's the plan, anyways. I'm just telling you. Like, it's gonna get closer and closer to that, where people are gonna feel that pressure more, and they're just gonna fucking snap. And Bernard was saying this, fucking, like, 20 fucking years ago. Yeah. Like, this is the future of demons on earth. It's just human beings that totally lose it. And she said she's been working herself up to this point. I mean, she's been thinking about it, thinking about it, thinking about it, until the point she's like, now is my time. Your mind just takes over your body; it becomes like a fucking demon.

Final Warnings And Closing

SPEAKER_11

And later on in the video, she says, like, you know, I didn't believe them when they said that I was God. And she doesn't say who; maybe she's hearing voices in her head, who knows, right? But she's like, but then I just worked myself into: I'm gonna rob this bank, I'm gonna go do it, I'm gonna go rob this person, and I was so scared at first, and then I just did it. And now I can't stop. Now I realize my power.

SPEAKER_10

It's like fuck.

SPEAKER_11

That's where we're at. We're out of time. Well, see you guys on the next one. Everyone can get out there.