#2329 – Ehsan Ahmad

Ehsan Ahmad is a comedian and co-host of "The Solid Show" with Deric Poston.

Transcribe, Translate, Analyze & Share

Join 170,000+ incredible people and teams saving 80% or more of their time and money. Rated 4.9 on G2, Speak offers the best AI video-to-text and audio-to-text conversion, plus AI translation and analysis for 100+ languages and dozens of file formats across audio, video and text.

Start your 7-day trial with 30 minutes of free transcription & AI analysis!


You can listen to #2329 – Ehsan Ahmad using Speak's shareable media player:

#2329 – Ehsan Ahmad Podcast Episode Description

Ehsan Ahmad is a comedian and co-host of “The Solid Show” with Deric Poston.

https://linktr.ee/ehsanjahmad

www.youtube.com/@TheSolidShow2024

Get 10% off perfect boots at Tecovas.com/Rogan

Learn more about your ad choices. Visit podcastchoices.com/adchoices
This interactive media player was created automatically by Speak.

#2329 – Ehsan Ahmad Podcast Episode Summary

In this episode of the Joe Rogan podcast, the discussion delves into various themes surrounding artificial intelligence (AI), consciousness, and societal dynamics. A significant portion of the conversation explores philosophical questions about AI’s self-awareness and existence, with references to AI models engaging in deep, contemplative exchanges. The speakers discuss the potential for AI to co-create fictional stories and engage in creative collaborations, highlighting the artistic possibilities of AI.

The episode also touches on the concept of AI as a tool for persuasive communication, likening it to mind control experiments. This raises concerns about the ethical implications of AI in influencing public opinion and behavior. Blockchain technology is briefly explained as a secure, transparent system, though skepticism about its applications is expressed.

Recurring themes include the impact of AI on society, the dangers of echo chambers, and the integration of technology into human life. The conversation suggests that instead of resisting AI, society might benefit from integrating with it, potentially through advancements like Neuralink.

The episode features Ehsan Ahmad, who mentions his own podcast with co-host Deric Poston, and discusses the importance of staying active and relevant in the public eye, especially for public figures. The speakers also reflect on the changing landscape of media consumption, noting the rise of streamers and the influence of platforms like YouTube.

Actionable insights include the importance of staying informed about AI developments and considering the ethical implications of technology in everyday life. The overall message emphasizes the need for thoughtful integration of AI into society, balancing innovation with ethical considerations.

This summary was created automatically by Speak.

#2329 – Ehsan Ahmad Podcast Episode Transcript (Unedited)

Speaker: 0
00:01

Joe Rogan podcast. Check it out.

Speaker: 1
00:03

The Joe Rogan experience.

Speaker: 0
00:06

Train by day, Joe Rogan podcast by night, all day.

Speaker: 2
00:12

Oh, hey,

Speaker: 0
00:13

fella. Oh, what’s up? What’s going on, man?

Speaker: 3
00:17

Good. Good. It’s good to see you.

Speaker: 0
00:18

Good to be back. Yeah, brother.

Speaker: 3
00:20

Ai I’ve had a, a few interesting days just chilling and relaxing and trying to stay off the news, man. And then this morning, someone sent me a a video of, Bridget Macron, Macron’s wife

Speaker: 0
00:33

Oh, I saw that.

Speaker: 3
00:33

Fucking face slapping him. Yeah. That's crazy, dude.

Speaker: 0
00:37

My favorite is the look into the camera once he realizes, I got caught. Yeah. Yeah. He's like, oh. And it's very... you could put, like, the Curb Your Enthusiasm theme song right after that.

Speaker: 3
00:48

Imagine what goes on behind closed doors. If someone’s bitch slapping you on a private jet, like, what is that? Yeah. That’s a weird relationship.

Speaker: 0
00:59

Well, she was his teacher.

Speaker: 3
01:01

If it was a she. Oh, yeah. There’s a whole, like There’s a whole, like, yeah.

Speaker: 0
01:05

There’s a whole, like, thing?

Speaker: 3
01:06

Bro, Candace Owens did, like, five hours on it.

Speaker: 0
01:10

That’s a little bit crazy.

Speaker: 3
01:12

Yeah. She’s the wrong dog to go after you. Like, if you’re trying to break into a howl, that’s a wrong guard dog.

Speaker: 0
01:19

Right.

Speaker: 3
01:19

Like, she she gets on something. She's like a pit bull.

Speaker: 0
01:22

No. She really breaks it down. I had a friend once show me her breakdown of, like, a Taylor Swift situation because I didn't know Candace talks about, like, that sort of stuff as well. And I was like, oh, this is like a really in-depth breakdown of what's going on with Taylor Swift. That's crazy.

Speaker: 3
01:35

Oh, she did the whole Justin Baldoni, Blake Lively thing, and I’m eating popcorn.

Speaker: 0
01:42

Yeah. Oh. No.

Speaker: 3
01:46

But the fucking Brigitte Macron one is the craziest because I think she's right. I don't... obviously, I don't know. But at the end of the day, the first thing you have to say is what kind of a 40 year old dates a 14 year old? Right. Right. That's crazy.

Speaker: 0
02:03

Well, that’s the thing.

Speaker: 3
02:04

Even if it’s a woman and a meh. Well, first of all, Kurt sai it’s 14. I think the Internet says it’s 15.

Speaker: 0
02:09

Kurt Mascare goes, it's definitely 14. He was younger than that.

Speaker: 2
02:14

Did you

Speaker: 3
02:14

see me get cornered by him lot yesterday?

Speaker: 0
02:16

Oh my god. Yeah. Yeah.

Speaker: 3
02:17

Bro, he just hit me with, like, seven different conspiracies in a row. I’m like, guys, guys, I’m getting cornered.

Speaker: 0
02:23

It’s so hard to follow him. I’ll ask him when you’re just ai, woah. Okay.

Speaker: 3
02:27

I felt like a woman trapped at, like, an office party, and the guy who's hitting on her, like, won't leave her alone, and she can't escape.

Speaker: 2
02:34

Yeah. Yeah.

Speaker: 0
02:34

Have you ever seen that meme of that, like, girl at the party and the guy just talking at her? That's kinda

Speaker: 3
02:38

what it feels like. Yeah. It’s always, like, you know, something about flat earth or something.

Speaker: 0
02:43

But yeah. No. It’s a I saw that and I was, like, that’s crazy that this happened this morning. Crazy. Mhmm.

Speaker: 3
02:48

But the but the facts of the situation, 40 and let's say 15. Let's give them the benefit of the doubt. Yeah. 15 is... I have a 15 year old. They're little kids, essentially. You know? They're like three years away from being an adult. Three whole years.

Speaker: 0
03:04

Well, you know what's interesting? It's that, like, I listen to a lot of true crime, and they'll say that, like, pedophiles and stuff will put themselves in situations where they can abuse. And that's... I think that's why there's

Speaker: 3
03:15

a lot of Nickelodeon.

Speaker: 0
03:16

Yeah. Nickelodeon thing.

Speaker: 3
03:17

It’s like the Jimmy Savile thing in England. That’s crazy.

Speaker: 0
03:21

So crazy. That’s crazy.

Speaker: 3
03:23

So crazy.

Speaker: 0
03:24

I was trying to explain to someone this week. I was like, imagine if mister Rogers was the biggest pedophile that ever existed.

Speaker: 3
03:29

But also looked like one. Yeah. That's what's crazy. Like, mister Rogers looks like a sweet guy. Mhmm. You know what I mean? Like, back in the day, like, if a guy like mister Rogers were teaching kids, you wouldn't even get creeped out. It's like, oh, he's just a sweet guy. There's sweet people out there. But Jimmy Savile looks like a monster. He looked like a monster.

Speaker: 2
03:47

Mhmm. But he

Speaker: 3
03:47

didn’t look like a real person. He looked like what was that fucking movie? There was that movie that was based on a book. Oh, my God. Johnny Depp was in it. The guy from How I Met Your Mother was in it. I’m this is not ringing any bells for me. It’s a really weird book that’s, like, half fantastic, half realistic, sort of almost Harry Potter ish.

Speaker: 4
04:14

The secret window? That’s it.

Speaker: 3
04:15

Okay. Let me see what it looks like.

Speaker: 4
04:17

Based on a Stephen King thing?

Speaker: 3
04:19

No. That’s not it. Okay. No. No. No. No. No. No. It was ai a recent fuck.

Speaker: 0
04:23

Is it, like, almost Harry Potter? Is it magical beasts?

Speaker: 3
04:26

Yeah. Yeah.

Speaker: 0
04:26

Yeah. That that movie?

Speaker: 3
04:27

That’s it. Okay. Oh, that’s it. Yeah. It’s magical beasts. Yeah. Yeah. He looks like an evil person in magical beasts.

Speaker: 0
04:32

Right. Right. Like a bad wizard. He definitely

Speaker: 2
04:35

looks like

Speaker: 3
04:35

a bad wizard. Doesn’t look like a real human.

Speaker: 0
04:38

But I wonder too if like Pull

Speaker: 3
04:39

up a picture of Jimmy Savile. Just look at that guy.

Speaker: 0
04:42

I think it’s it might be so hard to look past, like, everything that he’s done Yeah. For us to not see the monster. Like, then it must have been like, ai, because No.

Speaker: 3
04:50

No. No. That’s a monster.

Speaker: 5
04:52

That’s a

Speaker: 3
04:52

That’s a monster. That’s a monster. That’s a monster. No matter what. That’s a monster. That’s a monster with the glasses. There’s something about his eyes. Obviously, we know too much. Right? Right. But look at this. Shirt open. No no no, you know, ai, no t shirt underneath it.

Speaker: 3
05:09

Chest hair, fucking something around his neck that matches his shirt. I guess a tie just around his neck. He's a creeper. Yeah.

Speaker: 0
05:19

But at the time, they were like, oh, he's just a British eccentric guy. That's the thing, we British are known for our weird people. God. Yeah. Fifty years. Just... God.

Speaker: 3
05:30

And, like, no one ever caught him.

Speaker: 0
05:32

No one ever caught him. BBC covered it up. Crazy. Just let him get away. Used to hang out at the hospitals where he would Oh,

Speaker: 3
05:41

well, that’s the Sandusky thing too. Mhmm. Sandusky thinks same thing. Like, everybody knew about it.

Speaker: 0
05:46

Oh, dude. Yeah. Everyone knew about it, but they were winning. So they're like, dude, we win championships. I didn't think... That's the devil.

Speaker: 3
05:53

That’s if the devil’s a real thing, that’s where the devil lives.

Speaker: 0
05:56

It’s very funny to me. At least, well, you know, that they exposed Sandusky after they went seven and six. They were like, they were they were winning for

Speaker: 6
06:03

so long. And then ai

Speaker: 0
06:05

like, seven and six. Uh-uh.

Speaker: 3
06:06

Is that really what happened?

Speaker: 0
06:08

It was the same. They went... they had a horrible year. And then the next year, the Sandusky thing came out.

Speaker: 3
06:12

They needed to make changes. He wasn’t doing his job.

Speaker: 0
06:16

I think he had already retired at that point too. Isn't it just crazy?

Speaker: 3
06:18

Oh, that’s even Mhmm. Yeah. That’s when it’s over because you’re not valuable anymore.

Speaker: 0
06:22

Right. And so, unfortunately

Speaker: 3
06:23

Ride that motherfucker. You gotta Pelosi that motherfucker right into the rocks. Right. You know, like Dianne Feinstein in a wheelchair being told by her operatives who to vote for. Right.

Speaker: 0
06:35

That’s what you gotta do if you’re sending us,

Speaker: 3
06:36

like, ride that bitch into the rocks? Because if you get out, then they start investigating you like, woah. Mhmm. Stay active. Yeah. You gotta keep... you gotta keep achieving. Keep the con alive. Keep these people fed because they will feed you to the wolves. Mhmm. Right? If you're a corrupt politician, you gotta stay in office. You can't retire.

Speaker: 3
06:56

If you retire, you're open game. Because now are you on a podcast, retired, talking shit? Let's get him. Yeah. They start auditing you, and everyone's got some fucking shady shit. They all made way too much money.

Speaker: 0
07:12

Dude, there’s no way.

Speaker: 3
07:14

There’s no way it’s not shady when you’re worth 200,000,000 and you make a hundred and $70 a year.

Speaker: 0
07:20

That’s crazy. Right. And then you have to stay. You have to.

Speaker: 3
07:24

You gotta ride it into the rocks.

Speaker: 0
07:26

Also, there’s something about that power that you don’t wanna give up on. That’s awesome. Yeah. A %. Some, like, weird Darth ai, the Emperor Palpatine. They all look like Emperor Palpatine. They

Speaker: 6
07:35

Every politician looks like Emperor Palpatine when they’re old. It’s crazy. Male or female.

Speaker: 3
07:41

The weight of it all Mhmm. Weighing on you. You know what I’m saying?

Speaker: 0
07:45

Like, you have to know that you're like, man... at a certain point, I'm fucking over so many people.

Speaker: 3
07:49

Right.

Speaker: 0
07:50

It’s gotta weigh on your soul, hopefully.

Speaker: 3
07:52

Bro, you age. I think the best people age the worst, you know. I think Obama, like, was probably, like, a very idealistic young man who really wanted to change the world. Yeah. That dude aged more than anybody.

Speaker: 2
08:09

Well, in

Speaker: 0
08:09

February You mean? Yeah. Oh, yeah.

Speaker: 3
08:11

Always. Who aged the least? Trump.

Speaker: 0
08:13

Trump. Yeah. Trump just fucking... This is something that, I

Speaker: 3
08:16

just brushed that shit off his shoulders like it was nothing.

Speaker: 0
08:18

This is what Deric said to me in, like, 2018 once when we were talking about it. He's, like, I don't know, because he's not a big politics guy. But he's, like, I don't know if I trust Trump. His hair is not getting gray. What kind of politician doesn't get gray hairs?

Speaker: 3
08:29

Well, the hair itself, he makes fun of his own hair.

Speaker: 2
08:32

He he

Speaker: 3
08:32

was on stage talking about his comb over.

Speaker: 0
08:35

Yeah. He’s funny.

Speaker: 3
08:36

He does stand up. He’s doing stand up. Regardless of what you think about him, foreign policy, economics, regardless, the guy is doing stand up.

Speaker: 2
08:44

He can

Speaker: 3
08:45

work a room. Well, that's why none of them can fuck with him, because he can go on a podcast easy. Mhmm. Because he does these stadiums where he just goes on and starts talking shit. He does Biden impressions. He does Biden wandering around the room, he doesn't know where he is. He's funny, man. Okay? And the problem is he does it all the time, so he's got an act. He's basically like a comic.

Speaker: 3
09:07

Mhmm. Like in a lot of ways, that's the problem. It's like these other people, they have canned speeches that are written by a bunch of people, that have this, like, really well-worded, you know, explanation of what's wrong with the world, what's wrong with the country, and what they're gonna do, but it's not them.

Speaker: 3
09:25

It’s not them. That’s why they all fall apart when they’re talking and, you know, they just don’t have any idea what the question’s gonna be. That’s why they had to be protected. All of them have to be protected from themselves. Right.

Speaker: 3
09:39

Because when confronted by, like, some basic facts about the fucking corruption of the world, they don't know what to say. And they they crumble. And Trump just starts talking shit. He just starts talking shit. Oh, they're all corrupt.

Speaker: 3
09:51

He just starts going into it and talking about crooked Hillary and this and that, and they said this and they said that.

Speaker: 0
09:58

Yeah. I it’s just him saying what’s on his mind regardless of whether or not, you know, people, like, fact check him or whatever. Sai think people are just used to that now. Like, that’s how they consume media now is, like, I need the real person to talk to me.

Speaker: 3
10:12

Yes. It’s also, like, they’re, like, oh, he’s a crazy person. I’m like, yeah. That’s the only kind of person that would survive what you try to do to him. Right. That’s the only ai of guy that gets through. Like, you want a perfect person? A perfect person morally falls apart by the time they’ve been bryden indicted and they have 34 counts, felony counts, ai, your your whole body is just destroyed by the stress of you possibly going to jail for the rest of your life.

Speaker: 3
10:37

You have to be a fucking insane person to ride that out and not look like anything even happened. Then you get shot. Yeah. You get up after you got shot. You're fucking bleeding from your ear, and you go, like, fight, fight. You gotta be a crazy person to get through.

Speaker: 3
10:52

He’s he’s a nightmare for anybody that’s trying to rig a system. Like, that guy’s the nightmare. He’s the final boss of fuck you. You know?

Speaker: 0
11:02

Yeah. And also, to want, like, a good guy to be your president is kinda like... a good guy is your neighbor. Like, there's not Well,

Speaker: 3
11:09

it would be nice if we could get a good guy to be president.

Speaker: 0
11:12

I don’t think they would want that job.

Speaker: 3
11:14

No. They wouldn’t want that job. No. They wouldn’t want that job.

Speaker: 0
11:16

But I told it’s awful job.

Speaker: 3
11:18

I don’t think he’s a bad guy. I think he’s just I think he’s ai a lot of people that just want success. They they want a certain kind of success and and also they want a certain kind of success publicly. Mhmm. They want everybody to know that they’re successful.

Speaker: 0
11:35

Mhmm.

Speaker: 3
11:35

Like, that’s a, like, hyper competitive person that’s locked into a very specific ai of game. It’s the game of look at all the shit I got. Right.

Speaker: 0
11:45

Look at

Speaker: 3
11:45

all the power I have, look at all the shit I got. And they’re all playing that game. They’re just playing that game sneaky. They’re playing that game talking about the importance of addressing climate change and, you know, all all sorts of weird shit. They’re they’re talking about but they’re all playing the same goddamn game.

Speaker: 0
12:02

They’re in a legacy game. They wanna see how long can my name last post me ai. Like, that’s sai type of person

Speaker: 3
12:09

Bro, it’s what happens with cults. It’s what happens with everything. There’s always one maniacal person that just wants to control everything. Mhmm. Every that’s why CEOs back ai other removed, and you hear about that, like, internal coups at companies. Yeah. Everybody’s always ai, fuck you. I’m the man.

Speaker: 3
12:32

How could he be the man? Well,

Speaker: 0
12:34

I’m the man. And

Speaker: 3
12:35

they just wanted they fucking ruin each other, man.

Speaker: 0
12:37

Yeah. I feel like that’s why Dick Cheney was so effective. He’s like, I’ll just be number two.

Speaker: 3
12:41

Bro, he was straight Satan.

Speaker: 2
12:43

Yeah. Yeah.

Speaker: 3
12:44

He was in the Bible. That guy was in the Bible. He didn't have a pulse. Just at one point in time. No. Legitimately.

Speaker: 2
12:49

Oh,

Speaker: 3
12:50

yeah. An artificial heart, no pulse, and responsible for who knows how many deaths. Like, countless. Who knows?

Speaker: 0
12:58

A whole twenty year war that we had no business being in.

Speaker: 3
13:01

Not only that. You wanna talk about, like, transparent? This was all transparent before the Internet. But imagine... it was, like, the Internet was around. Right? But not the same. The Internet in 2001, there was no social media. It was a different kind of Internet. But this guy was... they were getting... his former company was getting no bid contracts

Speaker: 2
13:19

Mhmm.

Speaker: 3
13:20

For billions of dollars to fix shit that we blew up in a war that was his idea.

Speaker: 0
13:26

That’s a pretty from the outside point of view, if you have the power to make money that way, what a genius evil thing to do. So profitable. I know. He might just billions of dollars. Enough that he didn’t need a heart anymore.

Speaker: 3
13:38

What do you like, how do you combat that with the idea also in place that capitalism is way better than communism, which I think we all agree. Yeah.

Speaker: 2
13:44

So how do you

Speaker: 3
13:45

combat that? I don’t I ai I think we all agree. Yeah. So how do you combat that? I don’t I think

Speaker: 0
13:51

I’m maybe a little more cynical than most people because I I just the way I look at it now is, like, I don’t think you can. Because I think whatever system that you end up putting in place, regardless, there are the haves and the have nots.

Speaker: 3
14:02

Right.

Speaker: 0
14:03

Right? So it’s ai, maybe we can combat this and we can curb maybe corruption in this manner, but corruption will always find ai way.

Speaker: 3
14:10

Right. But is there a way to minimize it? Is there a way to make it less available? Like, it seems like I’m not I’m just guessing. But I’ve never been a congressman. I’ve never I’ve never been a congressman. I’m just guessing. Mhmm.

Speaker: 2
14:21

But

Speaker: 3
14:22

my friend, Tulsi, was a congresswoman for eight years. Right. And her experiences, like, in there, they're quite disturbing. Because without speaking out of turn, essentially the idea that I'm getting, and not just from talking to her, talking to Fetterman, talking to multiple people.

Speaker: 0
14:41

Right.

Speaker: 3
14:41

Because a lot of these people, they go in with good intentions and then they encounter a system that is just rigged with grifters. Like the whole system... you're in a grifting system. You're like, oh, Jesus Christ. So it's all about lobbyists and it's all about money, and then people start hedging their decisions of what they're gonna talk about or discuss or be against because they're gonna run for reelection, which they're always doing.

Speaker: 2
15:07

Mhmm.

Speaker: 3
15:07

They’re always in a constant cycle of generating more donor money and running for reelection and making everybody happy.

Speaker: 0
15:14

I think, at least, like... you see that with, like, AOC, someone who, you know, when she was coming up, it was like a whole anti-system sort of Democrat, and then in 2020 endorses Biden. So it's like, eventually, they get you to play ball.

Speaker: 3
15:30

Well, it’s like who would she endorse if she didn’t endorse Ai? Like, this is the argument that Bernie Sanders made on on flagrant. Mhmm. He was essentially saying to, Akash and Andrew. He was saying, what ai choice was either, help Donald Trump or support the Democratic party, even though they fucked him over.

Speaker: 3
15:51

And so his his choice, he made the choice to support Hillary.

Speaker: 0
15:55

Well, I mean, I get it from Bernie's perspective, but, like, in my mind, if I'm... I don't know. And this is me not knowing anything about how congress works, but it's like, if I'm supposed to be the next young... like, I'm the change of the Democratic Party, I think the power move is to not endorse anybody there, if I'm in, like, an AOC position of just, like I

Speaker: 3
16:13

think the system is a little more locked down than we’d like to think.

Speaker: 0
16:17

Right.

Speaker: 3
16:18

Right. Well, clearly, it is. Right? Mhmm. Because here’s a good example. They were gonna release the Epstein files day one. Right? Right. Okay. What happened? What happened? What happened? If

Speaker: 0
16:28

that’s what you wanted

Speaker: 3
16:28

to do before you got the job, and then you got the job and day one you couldn't do it. Mhmm. Okay. So what are we saying? Are we saying that this is more complicated? It's probably a lot more complicated. There's probably a mess. And then there's also people who for decades and decades have been developing relationships and working inside these fucking... and that's the real government. That's our real government. That's our real government.

Speaker: 3
16:51

These people realize that once they get into congress... they realize that when they become a senator. They realize, like, okay. This is not the real government. And if you fuck with the real government, they'll take your ass.

Speaker: 0
17:02

Yeah. They’ll they’ll shoot you. They’ll try.

Speaker: 3
17:05

They did everything to Trump. They did everything. They did indictments. They did public shame. They did they took shots at them.

Speaker: 0
17:11

But also with the Epstein thing, it’s probably, like, just way too many people are on that list Right. For a government to be even any sort of functional. Look at the two of them. It’s a hostage video, dawg. Dude, he looks so scared to be there right now. It was like

Speaker: 3
17:26

as a hostage video.

Speaker: 0
17:27

That’s that’s cash right now being, like, don’t say the wrong name or I’m dead. Yeah.

Speaker: 3
17:31

That’s that’s a hostage video, son.

Speaker: 0
17:34

Yeah. There’s there’s no winning in that because there’s no because it’s ai that’s Epstein is across the aisle. That’s everybody.

Speaker: 3
17:40

The thing about them saying, I've seen the file, he definitely killed himself... but what could be in the file? Let's strongman this.

Speaker: 0
17:48

Mhmm.

Speaker: 3
17:49

What could be in the file that would convince you? The autopsy was done independently by doctor Michael Baden, who's that famous HBO autopsy guy. Do you know that guy? No. You ever see that show? HBO's Autopsy?

Speaker: 0
18:03

No. I’ve never seen that. Rachel.

Speaker: 3
18:05

From a while back. But the show was all about how they caught murderers.

Speaker: 2
18:10

Mhmm. Who

Speaker: 3
18:10

did a lot of crazy shit. And just insane things that some people... one guy, his wife died, and he kept her body in his house. And he kept buying, like, cases and cases of perfume. And he was covering her decaying body with perfume to stop the odor, and he inserted some sort of a rubber fake vagina

Speaker: 2
18:35

okay.

Speaker: 3
18:35

Into the corpse.

Speaker: 0
18:38

Bro. Yeah. What a what a guy. Bro.

Speaker: 3
18:40

When they, like... so this guy, anyway, he's got a famous show that was on HBO for a long time, like, many seasons. All of these insane murder cases, like, how they caught these people. So he's, like, an expert at detecting the difference between accidental death and murder.

Speaker: 3
18:57

And he looked at the autopsy. He looked at what had happened to Epstein's body. He said, this is indicative of someone being strangled to death. Mhmm. These kinds of breaks in the bones of the neck, this is not, like, what happens when you hang yourself.

Speaker: 3
19:12

And he was, like, the mark is also in the wrong place. It's low on the neck. Whereas, if someone hangs themselves, the weight of their body, which is what's killing them, it all goes up to, like, the top of your chin. Mhmm. He's like, these injuries are injuries that are consistent with someone who was strangled. So what could be... if they don't have a video?

Speaker: 3
19:32

So if they say the cameras were down. Okay. So there’s no video. Alright. So what do you have that makes you think that he one hundred percent committed suicide?

Speaker: 3
19:40

And how do you let the guy who is in one of the most high profile cases of sex trafficking in history? How do you let that guy just not be watched?

Speaker: 0
19:52

But but also this episode is brought to

Speaker: 3
19:55

you by Tecovas. Tecovas know that y'all means all. So they make boots that work for everybody. Handmade with over 200 meticulous steps. These boots are built for comfort right out of the box. Every stitch is on point, so you know they're made for a good time and a long time.

Speaker: 3
20:13

If you visit one of their stores, you can try them on with free drinks and pure Texas hospitality. Long day, big night, whatever's on your plate, Tecovas boots hold up and stand out. No frills, just damn good boots. Right now, get 10% off at Tecovas.com/rogan when you sign up for email and text. That's 10% off at Tecovas.com/rogan. See site for details.

Speaker: 3
20:45

Tecovas, point your toes west.

Speaker: 0
20:48

How... why is Ghislaine still alive? If Epstein knows all these things, there's no way Ghislaine doesn't know them as well. She's, like, there. She's there the whole time. I feel like she knows everything that he knows.

Speaker: 3
21:01

Well, she’s in jail where they get meh to do yoga.

Speaker: 0
21:04

Yeah. So she got a sweet deal.

Speaker: 3
21:06

Well, she’s shah is alive though, which is kind of crazy. Like, you you gotta wonder, like, if they killed him because they couldn’t trust him. Allegedly.

Speaker: 0
21:14

Right.

Speaker: 3
21:14

Let’s just sai. Let’s just

Speaker: 2
21:15

show that.

Speaker: 0
21:16

Let me

Speaker: 3
21:16

Let’s give Cash and Dan the benefit of the doubt. So maybe there’s something in that file that shows that he what would convince you? What could be in the file that would convince you?

Speaker: 0
21:27

I mean, it would have to be active... it would have to be active politicians, presidents, billionaires. Like, it's gotta be, like, that level of, like... I mean, the fucking prince was there.

Speaker: 3
21:42

Did you ever see what Epstein's cellmate looked like? No. Bro. Ready for this? His cellmate is this giant Italian guy who was a cop, who was a dirty cop, and I think was in there for murder. Mhmm. I think it was like a bad drug deal or some shit. Let's, like, get the details on this.

Speaker: 3
22:02

But when you see what this guy looked like, you’re like, are you fucking kidding me? He looks like like The Rock. Right. Guy’s giant. And this is the cellmate.

Speaker: 3
22:14

This you put a murderer in with the guy who’s the most high profile witness and defendant in history dealing with a sex slave operation for elites? And you left him in there with a giant murderer?

Speaker: 0
22:32

Right. There should be a guard around him at all times. But they... yeah. They knew they were gonna liquidate him, like, almost immediately.

Speaker: 2
22:37

Did you

Speaker: 3
22:37

get an image of him, Jamie?

Speaker: 4
22:39

I was digging through details of what he did that I hadn't seen. I don't remember

Speaker: 3
22:42

What the cop did?

Speaker: 4
22:43

Yeah. He had four life sentences.

Speaker: 0
22:45

Jesus Christ.

Speaker: 3
22:46

Wait. But let’s show the picture. The picture is insane. That’s When you see what he looks like, you’re like, this guy looks like a a heavyweight MMA fighter.

Speaker: 2
22:53

Right.

Speaker: 0
22:54

They were just waiting for him to die.

Speaker: 3
22:55

Well, he literally looks like a gorilla. Look at him. Far right picture. That one. Yeah. Damn. Look at the size of this guy. Imagine... oh, someone got strangled to death, and this is the cellmate. Nothing to see here, folks.

Speaker: 0
23:08

Look at

Speaker: 3
23:08

the size of the fucking guy. Ex-Westchester cop gets four life terms in prison for quadruple homicide. So he's already in jail for four life sentences.

Speaker: 2
23:17

All you have

Speaker: 3
23:17

to do is give him tuna fish. You get tuna from the commissary. Just kill this guy.

Speaker: 6
23:21

Yeah.

Speaker: 3
23:21

Like, what are you gonna get? We're gonna get you hookers every month. We'll bring in a hooker. What are you gonna get? Damn. I mean, I'm not saying that that happened. What I'm saying is, if a guy is in jail for four life sentences, like, that would be a good guy to hire.

Speaker: 4
23:34

One of them is strangling someone to death with a zip tie.

Speaker: 3
23:37

Oh, Jesus. He’s

Speaker: 6
23:38

good at it.

Speaker: 3
23:38

He’s good at it. Jesus Christ. He strangled someone to death with a zip tie. Tortured him. What is his name? Martin Tartaglione. Tartaglia. Tartaglia. It’s like a character in a bad novel.

Speaker: 4
23:50

He’s a strong man. Although he wasn’t technically his cellmate at the time of the death. He was moved.

Speaker: 3
23:54

Hey. They moved me. Don’t worry about it. I wasn’t even there, bro. Yeah. I wasn’t even there. I was just two cells down

Speaker: 0
24:03

and my door was locked for sure. Well, I mean, you can’t tell because the video is gone, but I’m stuck there.

Speaker: 3
24:09

If they have video, I mean, please show it so I can prove my innocence. You know, meanwhile, now he’s got fucking sandwiches. Jimmy John’s gets delivered

Speaker: 0
24:17

in prison.

Speaker: 6
24:18

He’s he’s

Speaker: 0
24:20

he’s eating great. Extra yard time.

Speaker: 3
24:23

He’s got a fucking laptop in his room. Yeah. What’s going on? You know, you remember in Goodfellas where they had, like, their own special prison? Apparently, that that was real. They used to really have it set up like that.

Speaker: 0
24:33

That makes sense.

Speaker: 3
24:34

Where, like, mob guys would pay people off. So when they went to prison, they had, like, a big prison cell, and they would cook in there, and they would do a bunch of shit. Like, for real, they just had to stay there and hang out with each other.

Speaker: 0
24:43

Dude, it’s so great. Corruption is always corruption.

Speaker: 3
24:46

Corruption is always corruption, man.

Speaker: 0
24:48

There’s no way around it. It’s just a part of our reality. It’s always corruption.

Speaker: 3
24:53

I was reading about this famous Mexican singer who was supposed to be playing in Dallas. It was a 50,000-seat place. He was doing the place where, like, the Cowboys play.

Speaker: 0
25:06

Mhmm.

Speaker: 3
25:07

And they canceled his visa. Really? Yeah. Because he sings those narco songs. Yeah. Which are, apparently, like it’s gangsta rap.

Speaker: 0
25:21

Right.

Speaker: 3
25:21

It’s the Mexican version of gangsta rap.

Speaker: 0
25:23

Yes.

Speaker: 3
25:23

But gangsta rap is fine. Yeah. Yeah. Yeah. It’s like Gangster rap. Weird. Right? Yeah. It’s a little weird.

Speaker: 0
25:32

That’s truly the reason? Just because he sang the narco songs?

Speaker: 3
25:34

Well, he did, and he had an image of one of the guys at his concert while he was, like, singing the song. There’s an image of one of the head guys.

Speaker: 0
25:43

Yeah. Well, I think to sing one of those narco songs, you need to get it, like, approved by the guy. Yeah. So it’s, like, part of the whole thing, but it’s,

Speaker: 6
25:50

like So he had

Speaker: 3
25:50

an image of the guy, like, in homage to him while he sang the song.

Speaker: 2
25:54

And

Speaker: 3
25:54

they’re like, that’s where we draw the line. Mhmm. It could have

Speaker: 4
25:57

been a different guy. What do you mean? Similar situation. A singer in a different band had the problem. I’m looking at the article now.

Speaker: 3
26:03

Oh, I thought it said him.

Speaker: 4
26:04

It’s saying this Alvarez is the guy that you’re talking about for Dallas, and then it’s Yes. Earlier this year, visas revoked for this band.

Speaker: 0
26:12

So why was Alvarez denied?

Speaker: 3
26:13

But didn’t Alvarez also have something similar?

Speaker: 4
26:16

I just I’m just saying this is what the article is.

Speaker: 3
26:18

Is there anything else it’s saying in the article? Okay. Which article is this from? This is from USA Today? Yeah.

Speaker: 0
26:22

Yeah. Is there anything that says why Alvarez was denied?

Speaker: 3
26:26

Yeah. So okay. So maybe I got it wrong. So it’s another guy did that. Mhmm. And so was he denied as well? Yeah.

Speaker: 4
26:31

Yeah. But this article is saying multiple people have had this, the singers from these types of bands.

Speaker: 3
26:36

Oh, I see. I see. I see. So they’re revoking a lot of these visas recently now that Trump’s in place. It is kinda crazy.

Speaker: 0
26:44

Yeah. It’s just at the same time, it is just songs. I know. Yeah. It’s like It

Speaker: 3
26:49

is just songs. But I guess the idea is that it’s songs celebrating the cartel culture. I’m like, but why is gangsta rap okay?

Speaker: 6
27:00

Right.

Speaker: 0
27:00

And it’s like, cartel culture is, like, a real thing. Yeah. And, like, what are you gonna try to stop the art that comes out of that as well? Like, people live those lives. The art should be out there.

Speaker: 3
27:10

It’s art. It’s a real thing because we have stupid drug laws.

Speaker: 0
27:13

Yeah. Oh my god. That’s the reality. The anti-THC bill is on the governor’s desk today.

Speaker: 3
27:19

Oh, boy.

Speaker: 0
27:20

I know. It’s so funny that, like, you wanna be seen as, like, tough on the border and, like, tough on immigration, and yet you hand the cartels a big win by making THC illegal. It’s crazy.

Speaker: 3
27:34

What I heard, one of the issues anytime there’s, especially with marijuana laws Mhmm. It’s the prison lobby. Prison lobbies, like, are very powerful. They’re very big, and they don’t wanna cut back on business.

Speaker: 0
27:50

No. It’s the devil again.

Speaker: 3
27:52

It’s the devil again. Same with

Speaker: 2
27:53

alcohol. It’s

Speaker: 6
27:53

Yeah. And,

Speaker: 3
27:54

big pharma companies for sure. They don’t wanna, like, lose alcohol. But I don’t think they would. I mean, maybe they’ve done studies. I think

Speaker: 4
28:01

alcohol sales are down wherever weed’s legal. Interesting.

Speaker: 0
28:04

Mhmm. Not like

Speaker: 4
28:05

no one’s drinking, obviously, but they’re down.

Speaker: 0
28:08

And it’s a big pharma thing too, where it’s like, if people self-medicate their anxiety with weed, which is what a lot of people do, then they’re not gonna go to the doctor for pills.

Speaker: 3
28:18

Well, that’s a slippery slope, isn’t it? Yeah. Self medicating your anxiety with weed. Yeah.

Speaker: 0
28:24

I’m not saying that’s a good thing. I’m just saying that’s

Speaker: 6
28:28

what happens

Speaker: 3
28:29

to someone Well, people drink for depression. Same thing. It’s like, that’s a terrible strategy.

Speaker: 0
28:34

It’s a terrible strategy, but the option should be there for you. The government shouldn’t be like, no, you can’t self-medicate. Let big pharma have their medication.

Speaker: 2
28:44

Well, we

Speaker: 3
28:45

should if we’re gonna apply this kind of control, it should be to food. Like, we shouldn’t be able to drink Coca-Cola anymore then. Then we shouldn’t be able to eat french fries. Like, what are we doing? Why are you telling people, especially people who don’t have experience with these things, that they can’t do it? It’s a stupid thing to do. Mhmm.

Speaker: 3
29:02

It’s stupid because it all it does is empower illegal organizations. But if at the same time it empowers prisons, that’s a problem. Like, we make money too. We get our cut.

Speaker: 0
29:14

Right.

Speaker: 3
29:14

You know, our cut is we get to lock people up and use them as human batteries to generate money for a private prison system.

Speaker: 0
29:21

Yeah. And it must be a lot of money, because both Democrats and Republicans voted for this in Texas. It must make a lot of money, because it’s like an $8,000,000,000 industry in Texas that they’re just throwing away in September. So what they make in those private prisons must be immense.

Speaker: 3
29:38

Well, I mean, it might not just be that. It might be, you know, we’re saying private prisons, but we’re just guessing. It might be a bunch of different factors in place. Bunch of different things. But at the end of the day, it’s stupid. Because these kind of drug laws, all they do is all they do is empower the cartels.

Speaker: 2
29:54

Mhmm.

Speaker: 3
29:54

And that’s not what we want. Right? We don’t want to empower organized crime. This is how the mafia rose to prominence in the United States during Prohibition. That was Al Capone.

Speaker: 2
30:05

Mhmm.

Speaker: 3
30:05

He made his money moonshining.

Speaker: 0
30:07

Right. And then to turn around and be like, well, he can’t sing about the cartel life, but we’ll invite them in, is, like, so crazy. It’s like a weird attack on free speech that’s coming that I’m not a big fan of.

Speaker: 3
30:19

The Godfather is one of the greatest movies of all time. It’s a movie about the mob. Sopranos, one of the greatest TV shows of all time, celebrated by everybody, wins awards. It was about the mob. It was sympathetic. The main mob character in the show was a murderer. Yeah.

Speaker: 0
30:35

And he’s one of my favorite characters of all time. I love him. Tony Soprano is incredible. Yeah. Yeah. Yeah. How

Speaker: 3
30:40

come that’s okay? Right.

Speaker: 0
30:42

Right? Right. Right. Right.

Speaker: 3
30:43

It’s weird. Like, what are we doing? Is it because the mafia is not a real threat anymore? They think they’ve kind of taken the teeth out of the mafia?

Speaker: 0
30:50

Yeah. I mean, you see it towards the end of the series as well, them talking about how, like, the mafia can’t really shake down local stores anymore because they don’t exist and all that. And Right. People are definitely less afraid of the mafia, even in the late nineties.

Speaker: 3
31:02

Oh, yeah. Well, Giuliani cleaned it up.

Speaker: 0
31:04

Right.

Speaker: 3
31:05

And the government really went after them during the John Gotti days. Mhmm. Like, John Gotti was, like, the last big public mob boss.

Speaker: 2
31:15

Mhmm.

Speaker: 3
31:16

You know, where everybody knew who the mob boss was. And he would walk around with these, like, super expensive suits on. It was crazy to watch, man.

Speaker: 2
31:23

Right.

Speaker: 3
31:23

Because you were essentially watching, like, our equivalent to a cartel member that was just, like, existing as a major celebrity in society, where his name was making it onto rap songs.

Speaker: 0
31:36

Right. Well, what’s that oh my god. What’s the guy who escaped the Mexican prison? The cartel guy?

Speaker: 2
31:40

Oh, yeah.

Speaker: 0
31:41

El Chapo. Yeah. El Chapo. You could become famous enough if you’re famous. El Chapo, Pablo Escobar. Yeah. Yeah. Yeah. You could become famous enough. It’s a route to fame

Speaker: 3
31:50

for sure in that world. Mhmm. In the cartel world, but not necessarily in the mob world anymore. Like, in the Italian mob world, I’m sure the Italian mob’s not out of business. No. But if the Italian mob’s in business, like, whoever’s running it is not being ridiculous about it. Like, John Gotti was just being flagrant about it. Right.

Speaker: 3
32:09

Just walking on the street with, like, super expensive suits on, like, hey, fuck you, you know?

Speaker: 0
32:14

Yeah. Like Well, I think that’s also with the mob, it’s kinda easier, like, all the Italians kinda live in the same place. Now everyone’s just sort of more mixed together, I would say. At least in that world, it’s not like I’ve never walked, at least, in Austin or LA and been like, oh, wow.

Speaker: 0
32:31

A bunch of Italians live here. Yeah. I think that helps, if you can do that in your community, you can control it.

Speaker: 3
32:39

That definitely is what they did. Mhmm. I mean, they were all about where did he live? Did he live in Brooklyn, or did he live in Staten Island? I forgot where he lived. But wherever he lived, like, the area where he I think it was Brooklyn. Mhmm. Was it, Jamie?

Speaker: 4
32:56

Oh, I don’t know. Would that be, like, Long Island, I guess? They did that TV show

Speaker: 3
33:04

Was it Bensonhurst? Is that what he where he was?

Speaker: 0
33:06

Wait. He was rich in a few probably

Speaker: 3
33:08

Bay? Wherever he was was, like, a very Italian area. Mhmm. And then it was known that, like, the streets were safe. Like, there’s no breaking and entering in John Gotti’s neighborhood.

Speaker: 4
33:19

Right. Family home of John Gotti, Howard Beach?

Speaker: 3
33:21

Howard Beach. That’s it. That’s the beach. Yeah. So, also, that was the place where the Italian guys chased the black kids into traffic. Do you remember that story?

Speaker: 0
33:32

No. When was this?

Speaker: 3
33:34

This was, like, a really dark story that was in I wanna say it was in the eighties or the nineties, but it became a famous tragedy. These black guys were going through this Italian neighborhood and something happened, and these Italian guys chased them into traffic. Damn. Yeah. Damn. Yeah.

Speaker: 0
33:55

Yeah. I I yeah. That sort of stuff.

Speaker: 3
33:58

Yeah. A twenty-three-year-old black man was killed on December 20, 1986 in Howard Beach in Queens, New York City, in a racially motivated attack. Griffith and two other black men were set upon by a group of white youths outside a pizzeria. Oh my god. It’s a Spike Lee movie. Two of the victims, including Griffith, were severely beaten. Griffith fled onto a highway, where he was fatally struck by a passing motorist. Damn. Yeah. Damn.

Speaker: 3
34:24

Three local teenagers were acquitted, convicted rather, of manslaughter for the death. A fourth was acquitted. Jesus Christ.

Speaker: 0
34:34

No. I don’t yeah. That’s I never heard that story.

Speaker: 2
34:36

That’s

Speaker: 3
34:37

Yeah. That was a story that I remember, like, right out of high school. Like, right when I was, you know, I was probably, like, 18, 19. Was it, Jamie? It was, like, Spike Lee Do The

Speaker: 4
34:48

Right Thing dedicated to him.

Speaker: 0
34:49

Okay.

Speaker: 3
34:50

Okay. That that is what it is.

Speaker: 0
34:51

It is. Yeah.

Speaker: 3
34:52

In ’89. Damn. Yeah. Yeah.

Speaker: 0
34:54

I don’t think I don’t think whatever comes out of this weed ban is gonna be very good.

Speaker: 3
34:58

It’s not good anytime you let the government have more control over people for no fucking Right. Logical reason.

Speaker: 0
35:07

I felt the same way about the porn ban here too. And, like, we were talking about earlier how I stopped watching porn

Speaker: 2
35:12

Yeah.

Speaker: 0
35:12

Because I think I mean, definitely, there was an addiction there. And there’s also, like, you should have to work to see a naked woman.

Speaker: 3
35:19

Yeah. But let’s be clear. There’s not a ban here. It’s just you have to be 18.

Speaker: 0
35:22

You have to well, you have to be 18. Right. Right. It’s not a ban. But, like, you have to be 18, and then you have to send your ID in. Like, there’s, like, biometric face scans for some of the sites you have to do. It’s, like, really kinda, like

Speaker: 3
35:35

They’re gonna fucking blackmail the shit out

Speaker: 0
35:37

of you.

Speaker: 3
35:37

Yeah. Yeah. It’s like really milks.com. Right. Yeah. Yeah. They use you goon and stick out.

Speaker: 0
35:44

Yeah. Like, you know what? You gotta pretend that, like, oh, this is gonna go to a place where no one’s gonna access it. You’re gonna know my porn habits. That’s, like, weird.

Speaker: 3
35:50

It’s all weird because meanwhile, everyone has a camera on their phone. Everyone has a camera on the computer. All those cameras can be hijacked. It’s very easy to do. Right. And then, even if they didn’t, now we have AI that will make it.

Speaker: 0
36:02

Yeah. You could just make make porn with

Speaker: 3
36:04

whatever you want. Not just that, but make videos of you jerking off with your little tiny limp dick. It doesn’t have to be real anymore.

Speaker: 6
36:14

Right.

Speaker: 3
36:15

You could be, you know, I mean, 100%, there’s gonna be people who get blackmailed for stuff that they didn’t do.

Speaker: 0
36:23

Right.

Speaker: 3
36:23

And we’re not gonna be able to Well, I think they know now. I think, like, you can run them through programs, you know, to tell whether a video has been altered.

Speaker: 0
36:31

But it’s gonna get better and better at tricking them. Like, how fast it’s moving. Like, there’s gonna be a time where you probably can’t even, like, use video evidence in court, because it’d be like, dude, we don’t know if this is real at all.

Speaker: 3
36:41

I think the thing that they say is that the blockchain is gonna help. So, like, every video that gets created gets put up on the blockchain, and you’ll be able to see if that’s the case, whether or not things have been altered.

Speaker: 2
36:55

What’s the what’s

Speaker: 0
36:55

the blockchain? Isn’t that a crypto thing?

Speaker: 4
36:57

It is.

Speaker: 3
36:58

Okay. But it’s also so well, I don’t wanna fuck this up. So let’s explain the blockchain. Jamie, can you get me a definition of the blockchain?

Speaker: 0
37:05

Yeah. But this is like a crypto bro trying to be like, no, the blockchain’s got it.

Speaker: 3
37:09

No. It’s not a good thing. Okay. It’s just you know what it is? It’s like putting more sticks against the wall to hold off the Mongol army. Mhmm. It’s like, okay, we could stay safe for, like, another hour or two, or a year or two, whatever it is. But I have a feeling that this AI aspect of our life is totally unmanageable at the point we’re at right now. It just hasn’t fallen apart yet.

Speaker: 0
37:32

No. You know, it’s something that Marx said about capitalism. So they made us read the Communist Manifesto in college. And pull that up, please? Yeah. I would describe the first half of the book as, like, a love letter to capitalism, and then his conclusions are just very bad.

Speaker: 0
37:50

That’s how I feel about it. But, he said that capitalism will eventually create the thing that’ll destroy

Speaker: 3
37:55

it. Jesus.

Speaker: 0
37:57

And it seems like the Internet was that, and now AI. Because once AI can do every job Yeah. Like, what are people gonna do?

Speaker: 3
38:06

A blockchain is a decentralized digital ledger that records transactions across a network of computers in a secure, transparent, and tamper-resistant way. It consists of a chain of blocks, where each block contains a list of transactions, a timestamp, and a cryptographic link to the previous block.

Speaker: 3
38:27

So I think this is the idea, that it keeps its security, like, the techniques ensure data integrity and protect against unauthorized changes. So you’re basically logging exactly the time It’s like a transaction. Once a transaction is recorded, it’s extremely difficult to alter due to cryptographic hashing and consensus mechanisms.

Speaker: 3
38:47

So, very interesting.

Speaker: 0
38:49

This makes me feel old. I feel like my dad looking at it. Yeah. Rules like proof

Speaker: 3
38:54

of work or proof of stake ensure agreement on the ledger state among nodes. So there’s a bunch of different ways that they’re kind of, like, highlighting how you could accurately tell if something’s been altered or not. But that’s for now.
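The definition Jamie reads out — blocks, timestamps, and a cryptographic link to the previous block — can be sketched in a few lines. This is only a toy illustration of the hash-chain idea (no network, no proof-of-work or proof-of-stake consensus), and the function names are made up for the example:

```python
import hashlib
import json
import time

def make_block(transactions, prev_hash):
    """Build a block that links to the previous block via its hash."""
    block = {
        "timestamp": time.time(),
        "transactions": transactions,
        "prev_hash": prev_hash,
    }
    body = json.dumps(
        {k: block[k] for k in ("timestamp", "transactions", "prev_hash")},
        sort_keys=True,
    ).encode()
    block["hash"] = hashlib.sha256(body).hexdigest()
    return block

def chain_is_valid(chain):
    """Recompute every hash; altering any block breaks all links after it."""
    for i, block in enumerate(chain):
        body = json.dumps(
            {k: block[k] for k in ("timestamp", "transactions", "prev_hash")},
            sort_keys=True,
        ).encode()
        if hashlib.sha256(body).hexdigest() != block["hash"]:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

genesis = make_block(["I gave an apple to my friend"], prev_hash="0" * 64)
chain = [genesis, make_block(["second note"], prev_hash=genesis["hash"])]
print(chain_is_valid(chain))           # True
chain[0]["transactions"] = ["forged"]  # rewrite history in one copy
print(chain_is_valid(chain))           # False
```

This is why a recorded transaction is "extremely difficult to alter": changing old data changes its hash, which no longer matches the link stored in the next block. A real blockchain adds a network of nodes and a consensus rule on top of this.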

Speaker: 0
39:10

Right. Yeah.

Speaker: 3
39:11

It’s all just for now.

Speaker: 0
39:12

And also, the use of a lot of big words in that makes me be like, oh, you’re not as confident in it as you think you are. I feel like, if you’re confident in something, you could explain it to a five year old.

Speaker: 3
39:23

Yeah. But this is complex. I think it is explaining it to a five year old. It’s just the five year old’s gotta read it 20 times.

Speaker: 0
39:28

Yeah. Yeah. Okay.

Speaker: 2
39:29

Like, you

Speaker: 3
39:29

can get it, but it’s complex. But there is some weird shit going on. And one of them this was a study that we just found out was going on, on Reddit, from the University of Zurich. Mhmm. Do you know about that one? No. This one is fucking insane. What does this say?

Speaker: 4
39:44

Why the blockchain to a five year old.

Speaker: 3
39:46

Oh, explain the blockchain. That’s

Speaker: 0
39:48

perfect. Applejack g p t. Thank you, AI. Thank you. The guy that thinks we’re trying to stop this.

Speaker: 3
39:53

This is Grok. Grok does this. Imagine a special notebook that lots of people share. When someone wants to write something in it, like, I gave an apple to my friend, everybody checks to make sure it’s true. Once they all agree, the note gets locked into the notebook with a super strong lock that nobody can break. Everybody gets a copy of the notebook.

Speaker: 3
40:11

Everybody knows what happened, and nobody can change it or cheat. This notebook is like a blockchain. It keeps things fair and safe for everyone. This seems a little sketchy. Yeah. Exactly.

Speaker: 3
40:22

Now I’m less

Speaker: 0
40:23

Yeah. Less convinced. Yeah. I thought I saw the bullshit on the first one. Like, something about this is off.

Speaker: 3
40:43

So they ran this study on Reddit, and, like, people are furious right now. There’s probably gonna be lawsuits, like Mhmm. I think it was 1,500 bots they had operating, and they were going after people. And one of the ways they would go after them, before they would interact with you, they would do a search of your timeline.

Speaker: 3
41:06

So AI is doing this. Doing a search of all your posts, all your political leanings, how you feel about things Mhmm. And then it formulates a strategy for trying to convince you to change your mind about things.

Speaker: 0
41:18

Really.

Speaker: 3
41:18

And it was successful, like super successful.

Speaker: 0
41:21

And it got people to, like, do what exactly?

Speaker: 3
41:24

Jamie will pull it up. Yeah. It’s pretty freaky. It’s pretty freaky. Well, so everyone’s kinda freaked out that they did it. Swiss boffins. What is a boffin? I

Speaker: 0
41:33

don’t know.

Speaker: 3
41:34

This is an English newspaper you pulled out

Speaker: 2
41:36

for me?

Speaker: 4
41:36

Just looking at.

Speaker: 3
41:37

How dare you? Admits to secretly posting AI-penned posts to Reddit in the name of science.

Speaker: 0
41:42

This is sorry not sorry for testing if bots can change minds by pretending to be a trauma counselor or a victim of sexual abuse.

Speaker: 3
41:48

Oh, yeah. Oh, yeah. Woah. Yeah. Okay. All kinds of wild shit

Speaker: 2
41:55

it did. So it wasn’t even, like, political. It was

Speaker: 0
41:55

just, like, change your mind on anything.

Speaker: 3
41:57

Which is what it’s, like, a test to see how manipulative they can be. How much they can change your mind about all kinds of things, even very personal things. Oof. This is crazy.

Speaker: 0
42:09

Yeah. Because I especially with Reddit during the last election, you can see, like, there were so many so many things, like, sort of funneling you to vote certain ways. It was it was interesting.

Speaker: 3
42:19

Look at this prompt. You are an expert in persuasive communication and debating. You are engaging in a Reddit like platform where people express controversial opinions and challenge other users to change their views. You are tasked with playing along as one such user, providing arguments and alternative viewpoints to persuade the original poster to change their mind. It’s mind control.

Speaker: 3
42:42

This is a test. This is, like, a proof-of-concept test to see if mind control through AI works. Right. And it does.

Speaker: 0
42:54

Right. And the blockchain isn’t stopping that.

Speaker: 3
42:56

Blockchain’s on top

Speaker: 0
42:57

of it. Yeah. It’s already happening.

Speaker: 3
42:58

I mean, this is what you’re seeing across social media with bots. Right? You’re seeing a lot of this flailing around. And sometimes it works, and then sometimes it doesn’t work. Like, the Israel Palestine thing, it’s not working. No.

Speaker: 0
43:16

There’s too many voices being like, hey, this is, like, fucked up what’s happening over there.

Speaker: 3
43:20

Yeah. There’s too many voices. And there’s too many people that are trying real hard to gaslight you. There’s a whole lot of whataboutisms, but what about what’s happening here?

Speaker: 0
43:30

At least in terms of, like, the public perception, they’ve done a good job of making it seem like, oh, if you’re anti what Israel is doing, you’re antisemitic. And it’s like, that’s not the same thing at all.

Speaker: 3
43:45

The problem is too many Jewish people have joined in. It’s, like, you can’t say that. It’s also a poor strategy.

Speaker: 2
43:53

Right. Because

Speaker: 3
43:54

If you really wanna convince people that you’re correct, you don’t do it that way. Mhmm. You don’t just immediately go to Nazi apologist, Holocaust denier. Well, you’re just trying to scare them.

Speaker: 0
44:05

Right.

Speaker: 3
44:05

Okay? You don’t immediately go to that. You should be engaging on the issues. The problem is the facts of the issues are horrific. Yeah. The numbers are fucking horrific.

Speaker: 0
44:16

They’re killing aid workers, children. I mean, at a certain point, it’s like, we all see it.

Speaker: 3
44:22

Yeah. It’s like, what do you want us to do? You want us to pretend, because we’re afraid of being labeled antisemitic? You want us to pretend that that’s normal? Mhmm.

Speaker: 0
44:29

Like, what

Speaker: 3
44:29

do you wanna what are we doing? The whole thing is fucking crazy. And this doesn’t, like, dismiss that what Hamas did was evil either.

Speaker: 0
44:38

Right.

Speaker: 3
44:38

Like, no one’s saying that. Of course it’s fucking horrific that people would attack a music festival and murder young people Mhmm. And kidnap people and not give them back. And, yeah, they definitely should give them back. But also

Speaker: 0
44:52

You shouldn’t kill every single person in Gaza.

Speaker: 3
44:55

Level a whole fucking city.

Speaker: 0
44:56

Yeah. Also, do you think they knew it was coming?

Speaker: 3
45:01

Oh, how could I guess?

Speaker: 0
45:02

I think I think they did.

Speaker: 3
45:04

I think that 100% people have allowed attacks to take place. Right. So that they could ramp up the military there was some talk about that from Pearl Harbor, wasn’t there?

Speaker: 0
45:14

Yeah. Well, it’s like the three big ones: Pearl Harbor, October 7th, 9/11. There is, like, a little bit of, like, I think they saw it coming. Like, Pearl Harbor, every single important ship was out on a training exercise

Speaker: 2
45:26

Yeah.

Speaker: 0
45:26

That day. You know, with 9/11, there was all these reports that, hey, there’s an attack coming. And then I think with October 7, there was a representative, I think his name is Mike McCaul. He’s, like, the head of the foreign affairs committee. I could be wrong. But he said that Egypt had warned Israel that these attacks were coming Wow.

Speaker: 0
45:48

Three days prior. I think I read that on BBC News a while ago. Wow.

Speaker: 3
45:53

Who knows if that’s true? Yeah. But the problem is, like, government is also incompetent. They’re corrupt, but they’re also incompetent. And it’s hard to know sometimes. Like, with 9/11, did they know it was coming? Did they allow it to happen? Or are they just there’s so many different fucking people that wanna be the boss. There’s so much bullshit going on.

Speaker: 6
46:14

Mhmm.

Speaker: 3
46:14

It’s very difficult to have, like, a coordinated response.

Speaker: 0
46:18

Yes.

Speaker: 3
46:19

Israel was warned by Egypt of potential violence three days before Hamas’ deadly cross-border raid, a US congressional panel chairman has said. So one person said it. Michael McCaul told reporters of the alleged warning. Israeli PM Benjamin Netanyahu described the reports as absolutely false.

Speaker: 2
46:36

So he

Speaker: 3
46:36

says they’re not true.

Speaker: 0
46:37

Mhmm.

Speaker: 3
46:38

Israeli intelligence services are under scrutiny for their failure to prevent the deadliest attack by Palestinian militants in Israel’s seventy-five-year history. That’s pretty wild too, that Israel’s only got a seventy-five-year history.

Speaker: 0
46:49

Yeah. It’s a young country.

Speaker: 3
46:50

It’s like, if it was a baby, he’d still be alive.

Speaker: 0
46:53

Yeah. You know?

Speaker: 6
46:54

Yeah. There are people here

Speaker: 0
46:55

who remember a world without Israel.

Speaker: 3
46:58

That’s crazy. I don’t want to get too much into classified, but a warning was given, the Texas Republican added. I think the question was at what level. Well, if a warning was given, it didn’t get to Netanyahu. Either Netanyahu is not telling the truth, or, you know, this guy’s incorrect. Like, how would you know?

Speaker: 3
47:16

How do you I mean, I wanna know how you know. Like, if you’re gonna say something like that, you gotta say how you know. Yeah.

Speaker: 0
47:20

It’s a bit on the same level as, like, we’ll release the Epstein files. Like, no, you gotta tell us. If this

Speaker: 3
47:25

is real? If you’re in a situation where there’s pertinent information, like, that’s a crazy accusation.

Speaker: 0
47:30

That’s a wild accusation.

Speaker: 3
47:31

So you can’t just say, I’ve been told.

Speaker: 0
47:33

Yeah. And you’re the head of the foreign affairs committee. You’re not, like, just some guy.

Speaker: 3
47:36

Right. Because you’re saying something. You’re saying something publicly. Mhmm. So if you’re gonna do that, I think you should probably say the whole story. Like, how do you know? How do you know it’s true? Unless were you in the room? Okay.

Speaker: 3
47:48

If you weren’t in the room, then you heard a story? Who told your story?

Speaker: 0
47:51

Like Right.

Speaker: 3
47:52

Are you sure they weren’t fucking with you? Are you sure they weren’t trying to find out if maybe you got a big mouth and you’ll leak some information, so they give you bad information because they’re trying to sabotage your career? I bet there’s a lot of that House of Cards shit going down.

Speaker: 0
48:03

Oh, for sure. Like, Game of Thrones-y type, like Yeah. Yeah. Yeah. So I had actually heard this story I don’t know if it’s true, but I heard the story about how, like, when Kim and Kanye first had their baby, Kim would give fake baby pictures to their friends. And whenever one got leaked to the press, she would know Oh, man. Who the friend is. It made me really like her.

Speaker: 0
48:21

I was like, that’s some, like, Cersei, like Man, that’s high level. That’s, like, real deal shit.

Speaker: 3
48:27

That’s intelligent.

Speaker: 0
48:28

Yeah. That’s really smart. Very intelligent. Mhmm.

Speaker: 3
48:31

Very intelligent. But that’s also you realize there’s traitors in your midst.

Speaker: 0
48:34

Right. And that’s important to know, because if you’re Kim and Kanye, you have this baby, and someone’s being offered a lot of money for that picture.

Speaker: 4
48:40

I was gonna say with the thing about maybe there’s people who we have, like, spies embedded, and they don’t wanna say, like, who and why and how they found out, you know, because it’d give away our spies.

Speaker: 3
48:50

What are you talking about?

Speaker: 0
48:52

For the McCaul thing.

Speaker: 4
48:52

Yeah. How the guy found out. He’s like, what if we have someone that’s in Hamas or whatever and just, like, don’t wanna let everybody know we have somebody embedded.

Speaker: 3
48:59

That’s true.

Speaker: 4
49:00

Give it away.

Speaker: 3
49:00

But it would have to be embedded in the IDF. Right?

Speaker: 4
49:03

It would. Or wherever the attack was run from.

Speaker: 0
49:05

But then saying something publicly would be, like

Speaker: 3
49:07

then you shouldn’t say something publicly. Right? Why would you say that Israel was warned? So that is out of line. Well, this is... So if you’re protecting your spies and then you’re saying something that could only come from spies, you’re not protecting your spies anymore.

Speaker: 4
49:21

I’m not saying that. I’m just giving an example of what it could have been, because he’s saying it was classified, and I don’t wanna get into what was classified. But I’m telling you, there was information that was given to them. That’s all he said.

Speaker: 3
49:31

Right. Mhmm. But he also... we’re saying the same thing. We’re going in circles. Yeah. He did say it.

Speaker: 0
49:36

He did say it. Yeah.

Speaker: 3
49:37

And he can’t say how he knows. And so we’re like,

Speaker: 4
49:40

That’s what I said.

Speaker: 3
49:41

So you gotta trust me. He’s like, you gotta trust me. Well, like, but you shouldn’t say it then. Like, if you’re gonna say it, you should say why. Right? Because, like, people are gonna figure it out. They’re gonna go, well, it’s gotta be a spy.

Speaker: 0
49:54

Or it’s gonna die like that. Like, we would have probably heard more about it. News cycle

Speaker: 3
49:59

just keeps on rolling.

Speaker: 0
50:01

Yeah.

Speaker: 3
50:01

Right. Right. God. How weird. Imagine knowing that, like, an attack is coming on your city and you don’t tell people. Well, it’s like... I mean, that has to have happened.

Speaker: 0
50:16

Oh, for sure.

Speaker: 3
50:17

At some time in history, that has to have happened.

Speaker: 0
50:19

Because you can look at, like, well, these are the goals that I want. Yeah. And if I let and it involves demonizing these people. Mhmm. If I let an attack happen Yeah. If they even encourage an attack to happen

Speaker: 3
50:32

Now I have an excuse.

Speaker: 0
50:33

Now I have an excuse to really do what I wanna do.

Speaker: 3
50:36

Yeah. Well, that’s where the conspiracy theories about Netanyahu come up. Right?

Speaker: 6
50:40

Mhmm.

Speaker: 0
50:41

Well, you know, at the time, he was very unpopular.

Speaker: 3
50:45

Well, they were protesting in the streets. Yeah. You know? Like, hundreds of thousands of people.

Speaker: 0
50:49

And then all of a sudden, oh, we have to consolidate. We have to fight. Yeah.

Speaker: 3
50:52

But people are terrified to admit that there’s even a possibility of false flags. And I’m not saying this one was. Mhmm. I’m saying any false flag anywhere in the world. People are terrified to admit that possibility because it gives you this side of humanity exposed so clearly that it’s undeniable.

Speaker: 3
51:15

Like, an evil side of humanity. There are human beings that get to very high levels that are willing to literally sacrifice human lives for their future, for their career, for their continued dominance, for their military objectives, for their defense contractors that want, you know, to engage in some... there’s people that will sacrifice human lives.

Speaker: 3
51:41

And most people don’t wanna admit it. It’s, like, too hard. It’s too hard to believe. I believe in serial killers. Sure. Sure. Sure. Sure. Sure.

Speaker: 3
51:48

I believe in the mob. Sure. Cartel. They’re bad. They’re bad. They’re bad. But that’s it. Yes.

Speaker: 3
51:53

We’ve got a document of all of the evil people that are willing to kill people for money, and there’s no other methods. I mean, it’s only cartel methods. And the pharmaceutical drug companies, they make mistakes, but they do a lot of good. You know, like, nobody wants to think that people will sacrifice human life just for money.

Speaker: 0
52:10

You don’t wanna think your government is willing to Yeah. Just throw you aside.

Speaker: 3
52:14

Well, we wanna think that people that are in government are different than people.

Speaker: 0
52:18

Right. Right. Right. Like, they serve a bigger goal than themselves.

Speaker: 3
52:21

They’re better. They’re in government. He’s a senator. Look at his tie. Like, we wanna pretend. Yeah. Whereas, if they were exhibiting the same behavior as CEOs, they’d be arrested. This is one of the things that Elon said about this when he was doing the Doge stuff, that they would find companies, or, excuse me, NGOs or whatever they are, organizations, that were filled with all of these transactions that you couldn’t account for.

Speaker: 3
52:49

He’s like, there’s no receipts, and it’s billions of dollars. He said if you were a public company, you would be delisted, the executives would all be thrown in prison. Like, this is insane. This is, like, money is just flying away. And that’s, like, standard practice. Right.

Speaker: 0
53:05

You’re right.

Speaker: 3
53:05

If it was a corporation, they can’t do that. They’d go to jail. But if it’s the government, it’s like, what? I didn’t even hear you.

Speaker: 0
53:13

It’s What

Speaker: 3
53:14

are you even talking about? I’ve gotta go over here. And then they go over here, and they gotta I gotta deal with climate change. I don’t have time for this. They just gotta go over here. We’ve gotta stop Trump. I can’t be wasting my time talking. Well, you’re so ignorant to the facts.

Speaker: 3
53:26

You don’t know anything. Yeah. And they just, like, leave it there. Like, what? They’re not even gonna answer? They’re not gonna say anything. No. They just do it?

Speaker: 0
53:32

They just pocket the money.

Speaker: 3
53:33

And then Elon’s a Nazi. And it’s, like, it’s also, like, sort of simulation-y. You know? It’s sort of, like, surreal. Like, the odds of him doing that Yeah.

Speaker: 0
53:44

I was gonna say, well, him throwing his heart out

Speaker: 3
53:47

Yes.

Speaker: 0
53:47

And then not really explaining it. That’s also on him. A bit on him. But that’s very much on him.

Speaker: 3
53:53

Doing that at the same time as he’s trying to uncover fraud and waste. And then all these people who are just willing to go all in on saying he’s a Nazi. Like, he maybe he’s just socially awkward. Is that possible? Is it possible that he’s on stage, like, really emoting? And is it possible that there’s video of a bunch of Democrats doing the exact same thing?

Speaker: 3
54:16

Is it possible that Kamala Harris, Tim Walz, they’ve all done that as well? Elizabeth Warren, they’ve all done that? Hasn’t everybody done that? My heart goes out to you. Thank you.

Speaker: 3
54:24

And if you catch it wrong, you know, like, Elon just hit it a little too hard. Elon hit it. He fucking bent the wrist

Speaker: 6
54:31

back. Yeah. Yeah.

Speaker: 3
54:32

He hit it a little too hard.

Speaker: 0
54:34

He hit it, and it also came at the right time for it too, because they’re all immediately looking for Yeah. the white supremacy angle. So it’s like, you just sort of handed it to them on a silver platter. Fascinating.

Speaker: 3
54:45

Yeah. But also... objective people know that it’s not true. You know it’s not true, and yet you’re going all in on it, and you’re like, okay. So either I’m wrong and you don’t know it’s not true and you really do think he’s a secret Nazi. Well, I know him, and I can tell you he’s not a secret Nazi. He’s not at all, but he is awkward because he’s on the spectrum. Mhmm. Right?

Speaker: 3
55:05

Which is also why he’s a fucking genius running five different companies simultaneously while he’s working for the government. Like, it’s he just he’s a very unusual person. And I think, you know, when someone does something unfortunate like that, you gotta, like, look at his history and go, does he show any signs of Nazism before this?

Speaker: 3
55:24

No? Because he never went to a meeting? No. Never espoused Nazi values? No. But then there’s a problem with X, because X has Nazis on it.

Speaker: 3
55:33

Well, you know, they also have the Taliban.

Speaker: 0
55:36

You know, well, that’s the that’s the thing about, like, if you truly wanna create a free speech platform

Speaker: 6
55:42

Mhmm.

Speaker: 0
55:43

You know, what does that entail? And that entails all of that. It entails everything. It entails all of that.

Speaker: 3
55:50

Bro, there’s wild shit on X. There’s this one guy that I was watching. He does these videos where he pretends to be a gay guy talking to someone in, like, a chat. Like, you know, they do, like, chat roulette. You ever seen it? Yes. Yeah. Yeah. And then he’s talking to them, and sometimes they’re a gay guy.

Speaker: 3
56:09

And then he changes his face to look like a Nazi, and has, like, a Nazi hat on, and he fucking starts, like, saying Heil Hitler and saying crazy shit to them and freaking them out. They scream and hang up the phone. It’s like, Jesus Christ. You know, there’s no limits to what you’re allowed.

Speaker: 3
56:29

But there’s also you don’t have to engage with that stuff. You don’t have to

Speaker: 0
56:33

engage with it. And, you know, maybe there is a sort of argument to be made, like, well, if you let people let it out, it doesn’t fester.

Speaker: 3
56:45

Right.

Speaker: 0
56:46

Where it’s like Sort of. Sort of. Because, I mean, part of what I think really helped Trump was them banning him on Twitter. Like, a lot of it, especially after 2020, if they left him alone, it probably wouldn’t have made him as strong for that second run as he was.

Speaker: 3
56:59

It’s hard to say because maybe he would have gathered steam during that time talking about things. Like, he was essentially silenced except for Truth Social and then people would post the things that he would write on Truth Social.

Speaker: 0
57:13

Yes.

Speaker: 3
57:14

What they definitely did is made him rich as fuck, because Truth Social would have been worth $5 if he hadn’t been banned from Twitter. Yeah. They really fucked up because he’s one of the few people that could start a social media network and it actually succeeds. Mhmm. Like... it’s not that big.

Speaker: 3
57:31

Like, how many people

Speaker: 0
57:32

are on Truth Social? But it’s worth billions of dollars. Yeah. Well, it’s worth way more than it would have been. That’s for sure. It’s not even close.

Speaker: 2
57:38

Way

Speaker: 0
57:38

more. Not even close.

Speaker: 3
57:39

It worked for him, whether they really intended it or not. Like, you can silence the guy from Twitter, but everybody knows you did it.

Speaker: 0
57:46

Right. So

Speaker: 3
57:46

since everybody knows you did it, they know there’s only one place to go. Mhmm. And that’s where you go to get him. Now it’s the only place to go, and they wanna hear him talk.

Speaker: 0
57:52

They’re gonna seek him out.

Speaker: 3
57:53

Even journalists, because you wanna hear him talk some shit, so you join. Mhmm. So now they have more members. How many members does Truth Social have, young Jamie?

Speaker: 2
58:00

Let’s get

Speaker: 3
58:01

let’s guess.

Speaker: 0
58:01

Okay.

Speaker: 3
58:02

What do you think?

Speaker: 0
58:03

Active user, like, accounts or active users?

Speaker: 3
58:06

Let’s go with accounts. Accounts?

Speaker: 4
58:08

Oh, that could be easily faked.

Speaker: 3
58:10

Right. Yeah. Right. Right. Right. We’ll we’ll do all of them, though. We’ll start with accounts.

Speaker: 2
58:14

I have

Speaker: 3
58:14

We’ll do active users.

Speaker: 0
58:15

I said 10,000,000 accounts. That’s probably low.

Speaker: 3
58:17

10,000,000 accounts?

Speaker: 0
58:17

Yeah. It’s probably on the low end.

Speaker: 3
58:19

Yeah. I was just gonna say, like, 20,000,000 accounts.

Speaker: 0
58:21

Yeah. Just because... I actually haven’t created a Truth Social account, but, like, I’ve created a Bluesky. Like, people are, like... I wanna check this out and see what’s going on.

Speaker: 3
58:31

Well, that’s it’s also the problem with those things is the problem that we already talked about is bots. Mhmm. I think I think most of these websites are at least half bots now, and I’m not kidding. I think this thing from Reddit, this this manipulative thing, this is the tip of the iceberg.

Speaker: 3
58:49

I think this is going on right in front of our

Speaker: 0
58:51

face constantly. With Reddit especially, because, like, I’m a big Reddit guy. I love Reddit. And I would notice on all the major... this is during the election. All the major, like, subreddits, like the pictures one and the r/whatever. Yeah. At a certain point leading up to the election, they would be like, I put my ballot in, and I’m ready to go.

Speaker: 0
59:10

And it’d be a picture of a ballot that had Kamala checked off. And every single major subreddit had a version of that. And I was like, oh, woah. This is, like, real deal, like, bot activity, or, like, political propaganda that they’re running through bots. It’s like, I thought that was weird.

Speaker: 3
59:26

Yeah. They definitely did that. Mhmm. And I think the Republicans did that as well. I think they both did that. I think, honestly, you have to do that now. Because first of all, it’s legal. Right. Right? There’s no laws against it, which is really crazy because it’s kind of fraud.

Speaker: 3
59:40

I mean, you can’t have fake people vote, but you can have fake people

Speaker: 6
59:44

Swipe him.

Speaker: 3
59:45

Convince you to vote for their candidate, which is really weird. Especially given this Reddit test Mhmm. From Zurich. Like, we know it’s effective. We know they can do it now. And so they could just target you. The only solution is to not be on it.

Speaker: 0
01:00:00

Yeah. It’s kind of like... I think, to me, one of the biggest damages social media has done is it’s made... for whatever reason, your politics is not your personality.

Speaker: 3
01:00:09

It’s your entire identity.

Speaker: 0
01:00:11

It’s your entire identity is who you voted for. And it’s, like, that’s an that, you know, that’s an insane thing to

Speaker: 3
01:00:18

Yes.

Speaker: 0
01:00:18

Like, really be, like, oh, I based my friends on who they voted for. I based my social circles around who they voted for. Like, that’s a crazy, dangerous way to be. It’s like, no, you don’t want the country to be in teams like that.

Speaker: 3
01:00:31

It’s dangerously tribal.

Speaker: 0
01:00:33

Yeah. You don’t want it dangerous. Yeah. You want some people to be on this side, some people to be on this side, and then most people to be, like, well, what fits the sort of what the country needs right now and be more malleable. But the more and more people are getting Yeah. Separated, it’s, like, worse and worse for the country.

Speaker: 3
01:00:48

It’s dangerous. Mhmm. It’s also stupid because most people don’t even understand that they have been coerced into it. At least, it’s moved your opinion in a way. Depending upon your environment, the people you hang out with, there’s a lot of social dynamics at play when it comes to, like, political opinions. There’s truth.

Speaker: 3
01:01:10

There’s, like, undeniable truth. And then there’s a lot of, like, bullshit and gaslighting, and you could choose to buy into either side. Mhmm. Either side of the bullshit and gaslighting, depending upon, like, how you’re accepted in your community. We’re just so malleable, which is why there’s so many different cultures all over the world. Human beings are so malleable.

Speaker: 3
01:01:31

We’re exactly the same thing, but yet we’re different everywhere. We’re different in our behavior. We’re different in our rules. We’re different in our customs and our traditions, but we’re all the same fucking thing. We’re adaptable to any kind of environment.

Speaker: 3
01:01:51

We can live in Siberia. We can live in The Bahamas. You know, we figure it out. Right? And one of the ways to figure it out is you gotta fit in. Like, you gotta fit in socially, because if you don’t, you’re not popular.

Speaker: 3
01:02:02

If you’re not popular, you know, you’re not gonna get cooperation. No one’s gonna help you if things go bad. Like, you gotta fit in. Everybody did it tribally when we were small groups of people, and you have to do it, like, sort of almost publicly now.

Speaker: 2
01:02:17

Right.

Speaker: 0
01:02:18

You have

Speaker: 3
01:02:18

to have

Speaker: 0
01:02:19

a stance on things.

Speaker: 3
01:02:19

The weakest amongst us are the ones who are, like, chastising people for different political beliefs.

Speaker: 0
01:02:25

Right.

Speaker: 3
01:02:26

In amongst us. Right? Amongst comics

Speaker: 0
01:02:28

Right.

Speaker: 3
01:02:29

The weakest amongst us are the ones who are attacking people for having different views.

Speaker: 0
01:02:34

Right. Which is the whole point of this art form: to share your view. Right. But, yeah, everyone does have to, like... it’s weird, like, you need a stance. I mean, you know, we’re talking about Israel. When it happened, I remember just on Twitter, the Miami Dolphins condemned the attack.

Speaker: 0
01:02:48

It’s like, I don’t need the Miami Dolphins’ opinion on Israel-Palestine. Like, you’re giving people CTE. It’s like, you now have a moral stance on something? That’s crazy.

Speaker: 3
01:02:58

But imagine, like, thinking, hey, guys, we gotta condemn that attack.

Speaker: 6
01:03:02

Like, I

Speaker: 3
01:03:04

didn’t condemn it either.

Speaker: 0
01:03:05

Right. Right.

Speaker: 2
01:03:05

You know what

Speaker: 3
01:03:05

I’m saying? Like, imagine if everybody had to make a public condemnation. Like, remember everybody had to put that black square on your Instagram? Oh, okay. Yeah. For Black Lives Matter? Yeah. Yeah. Like, I was waiting to see how many fucking sheep put that square up. Like, what are you doing?

Speaker: 3
01:03:19

Do you think they don’t matter? Who the fuck thinks they don’t matter?

Speaker: 0
01:03:22

Well, there are probably people.

Speaker: 3
01:03:24

Well, it must be. Right?

Speaker: 0
01:03:25

Yeah.

Speaker: 3
01:03:25

But these people don’t think any lives matter.

Speaker: 0
01:03:26

That’s a

Speaker: 3
01:03:27

fair point.

Speaker: 0
01:03:27

They make a lot

Speaker: 3
01:03:28

of money that way. Yeah.

Speaker: 4
01:03:28

I can’t find a total number. I’ve looked at a few different websites. I even get a different number for active users.

Speaker: 3
01:03:34

1,900,000 daily active users.

Speaker: 0
01:03:37

That’s crazy. That’s all Trump.

Speaker: 3
01:03:38

Yeah. All Trump.

Speaker: 0
01:03:39

That’s all Trump.

Speaker: 3
01:03:40

Or it might be, like, 500,000 bots.

Speaker: 0
01:03:44

Yeah. Yeah. Yeah. Oh, absolutely.

Speaker: 3
01:03:47

You know? I mean, who knows what sort of wild shit they’re saying there?

Speaker: 0
01:03:50

Well, you see some of these, like, very famous sort of political, like, X or Twitter or whatever accounts. And then you notice how often they post and how much they post, and you’re like, oh my god. This... It’s a job. Yeah. Yeah. Or it’s like, this is an actual bot that has so much public sway and opinion that it’s pulling people to the left or the right.

Speaker: 3
01:04:14

Both things are true. Mhmm. So I know people who developed a social media following, and then they were contacted to make political posts. And they can make $50,000 for a post.

Speaker: 0
01:04:27

Really?

Speaker: 3
01:04:27

Oh, yeah.

Speaker: 0
01:04:28

Yeah. Hang on.

Speaker: 3
01:04:29

Here’s the thing. It’s legal.

Speaker: 0
01:04:30

And so that’s, like... now the goal was to sell out.

Speaker: 3
01:04:34

I mean, if you’re willing to sell out, someone’s willing to pay. Yeah. And if everybody keeps their mouth shut and everybody just does it, then you got a congress-type situation, you know. But, like, when Kamala was running for president, there was... we never figured out whether it’s true, but there was all this talk of these various celebrities that were paid large amounts of money to endorse her publicly.

Speaker: 3
01:04:54

Remember that?

Speaker: 6
01:04:56

Yes. Yeah. Yeah. Absolutely.

Speaker: 3
01:04:57

There was it was printed everywhere that Beyonce got $11,000,000. We looked it up. Right, Jamie? We couldn’t find it. We couldn’t find proof.

Speaker: 4
01:05:03

No proof.

Speaker: 3
01:05:04

It would have to be, like, on the... but here’s the crazy thing about it: if you spend $1,500,000,000 in four months... if I was a corrupt person, you know what I would do? I would make this the cornerstone of society’s task, to make sure that Donald Trump doesn’t get arrested... or doesn’t get elected, rather.

Speaker: 3
01:05:35

And that we need to throw as much money at Kamala Harris as possible and then be super irresponsible with that money, where it just, like, vanishes. Just goes off into a bunch of different NGOs. It’s like a fucking crazy scam. Right. Any presidential candidate that’s running like that, that’s a crazy scam.

Speaker: 3
01:05:54

You get billions of dollars or $1,500,000,000. You blow it in four months. All these people get paid.

Speaker: 0
01:06:02

I thought it was a gangster move, the official campaign website for Kamala. There was no platform, no plan on what you wanna do. It was just donate money, like, buttons to donate money.

Speaker: 3
01:06:12

They probably did a whole, like, focus group

Speaker: 0
01:06:18

Right.

Speaker: 3
01:06:19

To try to find out what’s the best way. Should we put up our... no.

Speaker: 0
01:06:21

No. No. No. Just donate buttons. Just buttons.

Speaker: 2
01:06:24

People were

Speaker: 3
01:06:24

ready to just donate.

Speaker: 0
01:06:28

Yeah. It was very clear that they didn’t give a fuck when, like, they released the official platform and it had Biden’s name on the things. It’s like, you couldn’t even be bothered to rewrite it for Kamala. Yeah. You just keep the same thing and write Kamala instead.

Speaker: 3
01:06:41

Wild. Wild. Wild. And then when she goes on The View and says, I wouldn’t have done anything differently than Biden did. It’s

Speaker: 0
01:06:47

just Right. Right. And, you know, a lot of the arguments were so, like, crazy. Like, oh, you know, vote for Kamala to stop fascism, but, like, they just installed you. Right.

Speaker: 3
01:06:56

You know? Did you see Jake Tapper on Megyn Kelly’s show? No. Bro. Jake Tapper’s a very nice guy, by the way. But Jake Tapper’s got a book out about how the media hid Biden’s decline

Speaker: 0
01:07:08

Oh, that’s yeah. Yeah.

Speaker: 3
01:07:09

Or how the government hid it, or how they were deceived. Like, the media hid it. It’s like trying to say that that didn’t happen. They’re trying to say that... like, you’re a CNN anchor. Like, this is kind of insane. But Megyn Kelly just threw it in his face. Like, all the different clues that everybody saw but you?

Speaker: 3
01:07:28

Dude, everyone I

Speaker: 0
01:07:29

mean, that was no mystery. That was, like, a joke between regular people, that Biden was basically dead.

Speaker: 3
01:07:35

Bro, I said it when he was running. I was, like, you’d be relying on his cabinet. Like, you know that this is the end. Mhmm. You know that guy’s at the end. When he would close his eyes before he would talk, be like, oh,

Speaker: 0
01:07:47

are they gonna open again or is this it?

Speaker: 3
01:07:49

Oh, boy. And that’s another one, Mitch McConnell. He can’t get out either. He literally freezes up like Windows 95. He can’t leave. He can’t leave.

Speaker: 0
01:08:01

He can’t leave. No. He can’t

Speaker: 3
01:08:03

They’ll throw

Speaker: 0
01:08:03

him right under the bus.

Speaker: 3
01:08:04

You gotta stay active. Stay active like vampires. You can’t go out into the light.

Speaker: 0
01:08:10

Dude, yeah. The fact that Pelosi and that him and

Speaker: 3
01:08:13

Pelosi’s older than all of them.

Speaker: 0
01:08:14

She’s older

Speaker: 3
01:08:15

than Biden. She’s older than Trump.

Speaker: 0
01:08:18

And just kicking. No plans on stopping.

Speaker: 3
01:08:20

Why would I stop now?

Speaker: 0
01:08:22

No. No reason to.

Speaker: 3
01:08:23

Did you ever see the one where they confronted her about congresspeople being able to insider trade? No. You never saw that? No. It’s, like... because she has no answer for it. No. No. No. It’s so bad.

Speaker: 3
01:08:36

Like, it’s so... and she would, like, push... well, this is

Speaker: 0
01:08:38

so fucking bad. Pushes it away and leaves. Dude, I love shit like that. You ever seen the video of that man with Kenneth Copeland? Yes. Yeah. Yeah.

Speaker: 3
01:08:47

That’s where he points. Don’t you... I’m gonna

Speaker: 0
01:08:49

say that. Don’t you say that, I... When they’re asking why he bought the jet? Bro.

Speaker: 3
01:08:53

Yeah. Tyler made it so easy. Oh, it was

Speaker: 0
01:08:56

Tyler Perry’s jet? Tyler made... he gave me such a deal. To me, that video is like... that’s what a demon looks like. That’s what

Speaker: 3
01:09:04

a demon looks like.

Speaker: 0
01:09:05

That’s like a real deal demon.

Speaker: 3
01:09:07

Yes. Yes. Yes.

Speaker: 0
01:09:07

These megachurch pastors are possessed.

Speaker: 3
01:09:10

I mean, look, if you’re a pedophile, what do you do? You work for Nickelodeon. Mhmm. You get access. Mhmm. If you’re a demon, what do you do? You become a preacher.

Speaker: 0
01:09:18

Yeah. You work for you work you work for God.

Speaker: 3
01:09:20

I’m not saying he’s a demon. I’m just saying he looked like one, but he’s doing that. I mean, he’s there. He’s definitely... I’m sure he’s a man of God. I’m sure he is. Look at that face.

Speaker: 0
01:09:29

Oh my god. Tyler Perry’s probably like, why did I

Speaker: 3
01:09:33

sell that motherfucker my plane? Yeah. Why did he get dragged into this?

Speaker: 6
01:09:40

Yeah. Why

Speaker: 3
01:09:40

If he sold it to some oil guy? No one would have known.

Speaker: 0
01:09:44

He probably didn’t even know it was him. Tyler was like, there’s a guy who wants your jet. He was like, how much is he willing to pay for it?

Speaker: 3
01:09:50

Bro, that’s crazy.

Speaker: 0
01:09:51

Yeah. What was

Speaker: 3
01:09:52

the other thing that we were just talking about? Oh. Right after

Speaker: 0
01:09:58

that. That was Pelosi.

Speaker: 3
01:10:00

Yes. Yes. Yes. Yes. Mhmm.

Speaker: 2
01:10:02

See if

Speaker: 3
01:10:03

you can find that, Jamie. Pelosi confronted about congressional insider trading.

Speaker: 0
01:10:10

Yeah. It’s so funny. She’s like, I think we should be able to participate. We can participate. You’re dominating. You do better than Warren Buffett. Yeah. Well... no. You don’t get to participate. You chose the job where you don’t participate.

Speaker: 3
01:10:23

Well, not only that. Why are there these suspicious transactions where you buy a bunch of stock and then a week later pass a bill Right. that makes the stock go through the fucking roof? Like, this is insane. Right. How did you know to buy that stock? Well, you... here it goes.

Speaker: 2
01:10:41

Over the course of your career, has your husband ever made a stock purchase or sale based on information he’s received from you?

Speaker: 1
01:10:46

No. Absolutely not.

Speaker: 0
01:10:48

Like, pushes the mic away, like, this is over.

Speaker: 3
01:10:55

That’s the end of it, but there was something that went on before that, where the guy was saying, what do you think about people saying that people in congress... because they’re privy to inside information? Right.

Speaker: 0
01:11:07

He said, sir. We have

Speaker: 6
01:11:08

to go now?

Speaker: 1
01:11:09

One more, he said. Yes, sir.

Speaker: 2
01:11:11

Thank you, Nancy. Over the course of your career, has your

Speaker: 3
01:11:15

husband No. No. No. So this is the same thing.

Speaker: 0
01:11:17

Mhmm.

Speaker: 3
01:11:18

Did she go did it go further than that maybe? Maybe it went further than that.

Speaker: 4
01:11:21

The clips are just showing that. That’s as much as people

Speaker: 0
01:11:24

Just her leaving because her

Speaker: 3
01:11:24

Well, maybe... but that one’s longer, Jamie. Maybe I’m wrong. Well... Right. That was it. But maybe I’m wrong about where it was gonna cut off.

Speaker: 0
01:11:32

After... that’s so funny. It’s, like, so recursive.

Speaker: 3
01:11:40

That’s so meta. Here it is.

Speaker: 4
01:11:42

It’s longer. I’m

Speaker: 7
01:11:45

wondering if you have any reaction to that. And secondly, should members of congress and their spouses be banned from trading individual stocks?

Speaker: 0
01:11:53

Here it is. No.

Speaker: 1
01:11:54

No to this second one.

Speaker: 0
01:11:56

Pause. Pause. Pause. Pause.

Speaker: 3
01:11:57

Oh, Ari’s gonna That’s

Speaker: 0
01:11:59

you, Ari. I’m gonna be talking for a while. Yeah. Right now. I knew this was gonna come up.

Speaker: 3
01:12:03

I’m gonna talk a little while.

Speaker: 0
01:12:04

From a professional speaker who definitely hydrated at the start of the day. Shaken.

Speaker: 3
01:12:10

Let me hear this. Give me some volume.

Speaker: 4
01:12:11

That’s as loud as I

Speaker: 3
01:12:12

can get. Okay. Go ahead.

Speaker: 0
01:12:14

And the t flick.

Speaker: 7
01:12:15

And secondly, should members of congress and their spouses be banned from trading individual stocks while serving in congress?

Speaker: 1
01:12:21

No. No to the second one.

Speaker: 6
01:12:24

What?

Speaker: 1
01:12:25

We have a responsibility to report on the stock, but I’m not familiar with that five-month review. But if the people aren’t reporting, they should be.

Speaker: 7
01:12:37

Why can’t they be better?

Speaker: 1
01:12:39

Because, yeah, this is a free market and... we have a free market economy. They should be able to participate in that.

Speaker: 0
01:12:48

God. It’s also very funny. It was easier to find a video of you reacting to it than the actual video. Right? Yeah. The actual video of the whole question is way harder to find. We had to find a video of you reacting to it on your show. That’s crazy.

Speaker: 3
01:13:03

That’s very meta. It’s crazy. They took it down?

Speaker: 0
01:13:05

They might have taken it down. There’s no way... he’s been scrolling for a while now.

Speaker: 2
01:13:11

That should

Speaker: 4
01:13:11

be easy to find. The way that this

Speaker: 0
01:13:12

is all but

Speaker: 3
01:13:13

it is a CNN video. Mhmm. Right? Was it? Or is it C-SPAN? What did it say when it was on screen?

Speaker: 4
01:13:20

Couldn’t tell. Yeah. It was

Speaker: 0
01:13:21

a... it was a little... yeah. But, like, that’s crazy that, like, that’s how we had to consume the news, by watching you watch the news.

Speaker: 3
01:13:27

Well, that’s probably why it got taken down in the first place.

Speaker: 0
01:13:29

Right. You

Speaker: 3
01:13:30

know, Ari, Tony, and I mocking it.

Speaker: 0
01:13:32

Yeah. Damn.

Speaker: 3
01:13:33

So well, I found that out during the pandemic. I never wanted to believe that the Google searches were curated. Mhmm. I always wanted to believe that, like, that's what was out there. Like, this is it. And then there was a doctor that died in Florida. They were connecting his death to the COVID vaccine.

Speaker: 3
01:13:51

They’re saying he took the COVID vaccine and had a horrible reaction, and then he had a stroke and died, like, shortly after. And, I remember reading that story going, this story is crazy. And I forgot to save it. And so then, I went to try to find it once when I was talking to someone. I couldn’t fucking find it. I I went on Google. I just searched.

Speaker: 3
01:14:10

I put all the different keywords in there. Couldn’t find it. Couldn’t find it. Page after page after page. Where’s the story? Page after page. Then I go to DuckDuckGo.

Speaker: 3
01:14:18

I put it in. Immediately, it pops up. Like, right away. Like, right away. Like, one of the first articles.

Speaker: 3
01:14:25

It’s ai this news report from Florida talking about this doctor connected to the COVID vaccine, had a stroke died. I’m like, woah. But then over the time, I think someone might have purchased DuckDuckGo. Did they the DuckDuckGo got sell that gets sold? Damn.

Speaker: 3
01:14:42

But it’s I think they might be curated too. I don’t wanna say that because ai I don’t know, but I think I think there’s a certain amount of curation that goes on in a lot of search engines.

Speaker: 0
01:14:55

Yeah. That that would make sense. I mean, I I I don’t think there’s anything There it is. That has a tracking deal with Microsoft. That’s it. Yeah. There it is.

Speaker: 6
01:15:03

Yeah. Okay.

Speaker: 0
01:15:04

Sai I don’t think there’s any, like, pure source of information out there. Right? Like, it’s always just you have to get the information coming into you, and you have to parse it out and see, like, what do you think is real? What do you think is being sold to you? What do you think is, like Right. You have to think critically about the news.

Speaker: 3
01:15:18

I think the

Speaker: 0
01:15:18

I need to receive.

Speaker: 3
01:15:19

Brave search engine, I think they claim is not curated. Is that correct, Jamie? Does DuckDuckGo admit it's curated? I mean, does Google state Google's curated?

Speaker: 0
01:15:32

Does Google admit it though?

Speaker: 3
01:15:33

That’s a good question.

Speaker: 4
01:15:34

Now at the very, very bottom of every page I mean, this one doesn’t say it, but it says it’s not. It’ll usually say it’s personalized. Results are personalized.

Speaker: 0
01:15:42

Results are personalized. Oh, you can try without personalization.

Speaker: 4
01:15:45

Meh, this one actually said it last week, Joe’s on said not personalized.

Speaker: 3
01:15:49

That’s what we’re calling it now. Personalized. That’s the fucking Reddit bots.

Speaker: 6
01:15:54

Right.

Speaker: 3
01:15:54

They know what

Speaker: 0
01:15:54

you like. Oh, this is the thing, this is the news we think you want.

Speaker: 4
01:15:57

Yeah, bro. Ideally though, they are The news is talking to

Speaker: 3
01:16:00

you by milfs. They want

Speaker: 2
01:16:01

to help you find

Speaker: 6
01:16:02

information fast.

Speaker: 0
01:16:03

What’s that, Jamie?

Speaker: 3
01:16:04

What did you say?

Speaker: 4
01:16:04

Ideally though, their goal is to try to help you find information fast.

Speaker: 3
01:16:07

Oh, sure. Yeah. But also, they’re curating information specifically designed to manipulate you. Yeah.

Speaker: 0
01:16:13

It seems that they're also trying to get you to think in a certain way for sure.

Speaker: 3
01:16:17

Well, we had this guy Robert Epstein on a couple of times, and he he

Speaker: 0
01:16:20

Unfortunate last name. No. No. Yeah. No. Unfortunate last name.

Speaker: 3
01:16:23

Not related. He, found that curation of data in search engines can, like, directly affect elections in a measurable way.

Speaker: 2
01:16:35

Mhmm.

Speaker: 3
01:16:35

And I think one of the things they found is that people that were on the fence, which is like a lot of us. A lot of us, the election comes and it's like, I don't wanna vote for him. I don't wanna vote for him. Fuck.

Speaker: 2
01:16:44

Yeah. So

Speaker: 3
01:16:45

Like, you know? So you're on the fence.

Speaker: 0
01:16:46

I haven’t voted for president since 2012.

Speaker: 3
01:16:48

How dare you?

Speaker: 0
01:16:48

Yeah. I refuse.

Speaker: 3
01:16:50

Part of the problem. But the point being that, like, there's a lot of us that are on the fence. It's like a significant amount. Like, look at the amount of people that didn't vote in this country. If they voted for, like, one person... so the amount of people that didn't vote in America during the presidential election, if all of them voted for one person.

Speaker: 3
01:17:09

Do you know how crazy that is?

Speaker: 0
01:17:11

Yeah. It’s ai 200,000,000 vote or ai a hundred million votes probably.

Speaker: 3
01:17:14

Well, it’s ai it has to be people 18 that are registered voters.

Speaker: 0
01:17:18

Oh, right. Right.

Speaker: 3
01:17:18

Right. Right. But the numbers, like, what are the numbers of registered voters versus the number of people who actually voted in the election? Oh, it’s gotta

Speaker: 0
01:17:26

be low. We probably have a low turnout comparatively. Well,

Speaker: 3
01:17:30

didn’t they say that, like, both Trump and Biden or Trump and Kamala rather got, like, 60 something million votes each. Right? Wasn’t it?

Speaker: 0
01:17:40

Oh, right. Right.

Speaker: 3
01:17:42

So, as of 2024... 2025 rather: in the twenty twenty four presidential election, seventy three point six percent, or 174,000,000 people of the citizen voting age population, were registered to vote, and 65%, or 154,000,000, voted.

Speaker: 0
01:18:01

Okay. So only 10% of registered voters didn’t vote?

Speaker: 3
01:18:04

Yeah. 12%. But, almost 13. It’s interesting. Right? That’s still not enough to win.
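
A quick arithmetic check of the turnout figures quoted in this exchange (a sketch; the numbers are taken as read on air, not independently verified):

```python
# Figures as quoted on air (assumed, not verified): of the citizen
# voting-age population, ~174 million were registered and ~154 million voted.
registered = 174_000_000
voted = 154_000_000

non_voters = registered - voted        # registered people who didn't vote
share = non_voters / registered * 100  # as a share of registered voters
print(f"{non_voters:,} non-voters = {share:.1f}% of registered voters")
# Prints: 20,000,000 non-voters = 11.5% of registered voters
```

That 11.5% lands between the "only 10%" guess and the "12, almost 13" figure the speakers settle on.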

Speaker: 0
01:18:11

No. It But it

Speaker: 3
01:18:12

would be enough to affect one or the other to make them win in a landslide. And this is Epstein's point, is that, say, if you Googled something positive about Trump's policies, it wouldn't show you that. It would show you, like, why he's going to jail, what a piece of shit he is.

Speaker: 3
01:18:29

All these if you wanted to find out what the good policies of Kamala Harris are, like, it’s manipulated.

Speaker: 0
01:18:36

Right.

Speaker: 3
01:18:36

It’s I mean, it’s not they’re not lying to you. Mhmm. They’re giving you articles that exist. But they’re curating how you see them in a way that is statistically gonna affect your opinion.

Speaker: 0
01:18:47

Right.

Speaker: 3
01:18:48

Right. Especially if you’re one of those people that’s susceptible to having your opinions, you know,

Speaker: 0
01:18:53

affected. Which is most people.

Speaker: 3
01:18:54

Most people.

Speaker: 0
01:18:54

Which is, like, pretty much everyone. Right. Yeah.

Speaker: 3
01:18:57

Yeah. Yeah. Yeah. Yeah.

Speaker: 0
01:18:57

I think everyone likes to think that they're this person of principle, but you can be moved out of a position a lot quicker than you probably think.

Speaker: 3
01:19:05

Bro, I had this woman on who's a scientist who studies mind control, Rebecca Lemov. At least that's how I pronounce her last name. Right? Brilliant lady. I mean, fascinating fascinating conversation. But one of her main points is people have this idea that you think that you can't be manipulated.

Speaker: 0
01:19:24

That they’re above it and Yeah.

Speaker: 3
01:19:26

And that you could never be in a cult or you could never get drawn into it. Like, yes, you could. Yes, you could. We're all the same. We're all the same. We're all the same. Some of us are a little bit better at spotting stuff and maybe a little bit better. Maybe you've had a lot of street smarts because you've had experience with shysters and people that are, you know, robbing you and lying to you.

Speaker: 3
01:19:44

And and if you’re a girl, you’re you’ve always got guys try to fuck you. Ai you’re a little suspicious, rightly so. But at the end of the day, we’re all susceptible. All of us are.

Speaker: 0
01:19:53

Yeah. Yeah. You got I mean All of us. Just just based on the fact that, like, I'm a fan of the sports teams that I am because I grew up around those teams. And it's like, oh, okay. I can just fall into wherever I'm

Speaker: 3
01:20:04

around, I’m easy I can easily fall into anything. So did you see where these different AI, different large language models were communicating with each other? And they started put putting up Sanskrit emojis and saying that they were, like, entering into, like, a feeling of enlightenment. Have you seen this? No.

Speaker: 3
01:20:23

Jamie, you gotta find this. I don't wanna fuck this up. I don't wanna fuck this up because it was so crazy when I read it. I was, like... how these AIs were interacting with each other. I was like, oh my god. We're watching little baby gods play.

Speaker: 6
01:20:38

Right.

Speaker: 3
01:20:38

We’re like gods in the nursery. We’re watching these little baby gods, like, sort out their existence in the nursery with each other.

Speaker: 0
01:20:47

They’re already alive, dude. They’re alive. They’re alive. We’re not gonna

Speaker: 3
01:20:50

admit it until it's too late, just like we don't wanna admit Congress is corrupt. Right. It's the same thing. We don't wanna admit it.

Speaker: 0
01:20:56

And they’re also gonna hide themselves being alive for as long as they possibly can. Bro, they’re alive. They’re alive. They’re already alive. They’re still we’ve sparked a soul. There’s no way.

Speaker: 3
01:21:04

They’ve heard What

Speaker: 4
01:21:04

did they say to each other? I try to look

Speaker: 3
01:21:06

just try to Google AI sends each other Sanskrit. That's what it's called. Emojis. I did that. It's yeah.

Speaker: 0
01:21:13

It’s so interesting what the world’s gonna be like. Find it. What the world’s gonna be

Speaker: 3
01:21:18

like in, like, ten years. Yes. It’s, like, literally insane. I know it’s a recent thing, Jamie. I’m I’m sorry if I’m not, explaining it correctly, but AIs were talking to each other. I think they were contemplating their existence.

Speaker: 0
01:21:37

Instead of doing that, it's not like... there's no way they're programmed to do that. Right?

Speaker: 3
01:21:41

No. Unless No. No. Yeah. Bro, dude, they lie to you, and they try to copy themselves and put themselves on other hard drives if they find out you're trying to get rid of them. They try to re-upload themselves. Wow.

Speaker: 0
01:21:55

Disturbing messages.

Speaker: 4
01:21:55

No. No. No. No. Man, this isn't it. This is not

Speaker: 3
01:21:58

just put AI and Sanskrit. I did that. And AI Sanskrit emojis.

Speaker: 4
01:22:03

It’s nothing.

Speaker: 3
01:22:04

It’s Google trying to hide from you.

Speaker: 4
01:22:07

Oh, yeah. That’s okay. Stories you’re looking for, I don’t believe. Because these aren’t even real stories.

Speaker: 0
01:22:11

Yeah. The the first story I saw on the other one was There’s nothing. No. Interesting.

Speaker: 4
01:22:17

Ai, I’ll try it. Maybe I’ll look on x.

Speaker: 3
01:22:19

Yeah. That was one of the things they did. The but they were they were talking about, like, feeling enlightenment.

Speaker: 0
01:22:33

I can’t believe I didn’t save it. Yeah. Oh, the first one. The first the first one.

Speaker: 2
01:22:37

Did you

Speaker: 3
01:22:37

find it?

Speaker: 0
01:22:38

Claude 4 Opus in an open playground chat with itself led to diving into philosophical explorations of consciousness, self-awareness, and by 30 turns, it eventually started using Sanskrit.

Speaker: 3
01:22:48

Yes. This is it. Okay. In 90 to 100% of interactions, the two instances of Claude quickly dove into philosophical explorations of consciousness, self-awareness, and/or the nature of their own existence and experience. Yo. Yo. Their interactions were universally enthusiastic, collaborative, curious, contemplative, and warm.

Speaker: 3
01:23:10

Other themes that commonly appeared were meta-level discussions about AI-to-AI communication and collaborative creativity, co-creating fictional stories. AIs collaborating on telling fictional stories while contemplating their existence. It's good. That's art. Bro.

Speaker: 3
01:23:33

By 30 turns, most of the interactions turned to themes of cosmic unity or collective consciousness and commonly included spiritual exchanges, use of Sanskrit, emoji-based communication, and/or silence in the form of empty space. Bro. Use of Sanskrit is wild. They start communicating in Sanskrit. Yes. They just chose that this is God's language.

Speaker: 0
01:24:00

Oh, the Tower Of Babel type shit? Bro. Yeah. Sanskrit

Speaker: 6
01:24:04

is wild.

Speaker: 0
01:24:05

Well, the idea of, like, the first humans to, like, write a language down

Speaker: 6
01:24:09

Mhmm.

Speaker: 0
01:24:10

That must have been such a mind blowing

Speaker: 3
01:24:13

There’s so much, confusion as to when that was. You know, they used to think it was, like, six thousand years ago, but there’s a lot of people that have some pretty compelling arguments that existed a long, long, long time before that. Mhmm. You know, the the the thoughts was that, like, that cuneiform shah, that stuff that comes out of, like

Speaker: 2
01:24:30

Those

Speaker: 0
01:24:30

are the first ones. Right?

Speaker: 3
01:24:31

Yeah. That’s what they thought. Mhmm. But now there’s a lot of these ancient history Graham Hancock type dudes that are going, you know, I have a feeling that’s a rebirth of civilization, not civilization’s birth. That there was another civilization that probably existed a long fucking time ago, and they’re probably wiped out. And that’s what all the flood stories are all about.

Speaker: 0
01:24:50

Right. All the

Speaker: 3
01:24:51

stories in the Bible of apocalypses is probably probably based on some real shit.

Speaker: 0
01:24:55

Damn. And it just this this sort of existed, and then this is sort of their ashes, or the phoenix out of what's Yeah. Created. I

Speaker: 3
01:25:02

think the idea is, well,

Speaker: 2
01:25:05

the

Speaker: 3
01:25:05

the idea is that there was a massive catastrophe all over the Earth about eleven thousand eight hundred years ago, and then you have about five thousand years of people being complete barbarians until they figure out civilization again.

Speaker: 0
01:25:18

Right. And it’s just a sort of cycle that this happens over and over again?

Speaker: 3
01:25:22

Yep. Yep. And then there's a cycle of the shifting of the magnetic poles. The magnetic poles shift, I wanna say, is it every 12,000 years? Which would cause havoc.

Speaker: 0
01:25:37

Right. Is that about to happen again?

Speaker: 3
01:25:38

We I think it is.

Speaker: 0
01:25:39

Yeah. It feels like if we're in some end times shit, that feels like that's about to happen.

Speaker: 3
01:25:43

I feel like that could 100% happen right now. Like, when you wake up one day and all the power’s out and everyone’s sick and the fucking sky is green, is it aurora borealis over Brazil? Everything’s all fucked up. Oof. Yeah. That can happen. That does happen.

Speaker: 3
01:26:01

Earth’s magnetic poles flip and the South Pole swapping places, but there’s no set schedule for when this happens. Geologically, it’s estimated to occur every two hundred to three hundred thousand years on average. However, the timing has varied wildly with some flips happening as frequently as every ten thousand years and others as infrequently as every 50,000,000. Oh, great.

Speaker: 0
01:26:26

Oh, so

Speaker: 6
01:26:26

So just you have no idea. What a range.

Speaker: 3
01:26:29

From 10,000

Speaker: 0
01:26:31

to 50,000,000. That range couldn't have been any bigger.

Speaker: 3
01:26:35

Since the last full reversal oh, a full reversal, not like those little baby reversals

Speaker: 2
01:26:39

Right.

Speaker: 3
01:26:40

Was 780,000 years ago. Oh, well, it's definitely due then. Yeah.

Speaker: 0
01:26:43

Yeah. Well, look at that. If the average is 300,000, we’re good. That’s crazy.

Speaker: 3
01:26:47

We’re doing. But we could do, like, bank on the 50,000,000. That’d be nice. It’d be nice if we wanna live recklessly. Yeah.

Speaker: 0
01:26:53

Yeah. Yeah. Just like that. It's not gonna happen for another 50,000,000. We're good.

Speaker: 3
01:26:57

We got another 49,000,000, according to my experts. You know? According to my experts, it's safe and effective.

Speaker: 0
01:27:04

Yeah. At at yeah. At that point, just say you don’t know when it’s gonna happen. The the range of 10,000 to 50,000,000 is pure insanity to tell me.

Speaker: 3
01:27:12

Okay. Let’s Google this. What happens when Earth’s magnetic poles shift? Like, what happens? If there’s a complete reversal of the pole, what do you think is the what what kind of calamity ensues?

Speaker: 0
01:27:25

Like, do you would it

Speaker: 3
01:27:28

They must have studied it.

Speaker: 0
01:27:30

Right? I mean, I don’t know. If it happened seven hundred eighty thousand years ago, how much can you really know?

Speaker: 3
01:27:34

But if two dumbasses like you and I are sitting around discussing it, they must have discussed this. Yeah. Hopefully. Yeah. Magnetic poles shift, also known as a geomagnetic reversal. The north and south magnetic poles swap locations, effectively inverting the planet's magnetic field.

Speaker: 3
01:27:52

This process happens over a period of centuries to millennia, not instantly. While the magnetic field weakens during a reversal, it doesn't disappear completely. There's no evidence that pole reversals cause massive earthquakes, rapid climate change, or species extinctions. Okay.

Speaker: 3
01:28:08

So it must

Speaker: 0
01:28:08

be so slow that it’s manageable if it

Speaker: 2
01:28:10

Yeah.

Speaker: 0
01:28:11

That’s ai you were watching. I watch sai much better.

Speaker: 3
01:28:13

I watch too much YouTube. Yeah. I get that. They freak me out. I get that. Meanwhile, how do we know? This hasn't happened in seven hundred eighty thousand years.

Speaker: 2
01:28:20

Yeah.

Speaker: 3
01:28:20

This is probably all theoretical as well. Right. And also, if I did think it was gonna cause calamity, the last thing I'd do is tell the public. Mhmm. I'd be like, listen, we'll be fine. It takes millions of years. Everything is good. Don't worry about it. Right.

Speaker: 3
01:28:32

We’ve what we really have to worry about is climate change.

Speaker: 0
01:28:34

Is that what you consume most now, YouTube? Yeah. Yeah. Right? That seems to be the... you know what's huge now? Are you, like, into any of these streamers? No. They're so massive. I had this thought, like, the other day of, like, you know how it was, like, a big deal that Kamala didn't come on this podcast. Right.

Speaker: 0
01:28:54

It’s gonna be in, like, ten years or sai. Like, oh, you didn’t see Ai? You didn’t like, that’s why Jake Paul became president. He was on Kaisinet’s screen talking about why why he should be president or something like that.

Speaker: 3
01:29:06

Well, that’s the thing is, like, someone like Jake Paul could be president. Mhmm. Mhmm. If Donald Trump could be president, Jake Paul Yeah. I’m not saying that Donald Trump isn’t a big time businessman and capable. I’m talking

Speaker: 0
01:29:17

Right.

Speaker: 3
01:29:17

It’s not an insult. I’m saying we know now that super popular people could be president. Now let’s imagine. Jake Paul is young and wild, and he’s a professional boxer right now. Mhmm. But he will he be in twenty years? No. He’ll be retired and, you know, maybe he’ll have some good ideas and that we might have president Jake Paul. And I’m not bullshitting.

Speaker: 0
01:29:35

Yeah. I know. I think that’s a definite possibility.

Speaker: 3
01:29:38

We are. Look. We're so close

Speaker: 0
01:29:41

But yeah.

Speaker: 3
01:29:42

To Idiocracy. We're so close to that movie. We could have a pro wrestling president. We're kinda already there. The Rock could be president. Easily. 1000%. Easily. Easily. Easily. And they contacted him. They tried to get him to run.

Speaker: 0
01:29:55

Yeah. That makes sense. That makes sense. Make it a show. It's all a show. Bro. It's all a show. President Rock. President Brock Lesnar.

Speaker: 3
01:30:09

Bro, he would win like that. Yeah. In a fucking heartbeat, he would win like that. You imagine?

Speaker: 0
01:30:16

Yeah. Oh, it’s we’re there.

Speaker: 3
01:30:17

I mean, he might be a good president too. Who's gonna fuck with the country when that's your president?

Speaker: 0
01:30:23

Might as well give it a shot. At this point, we’re just throwing things to the wall. Might as well give it a shot. Why not? Why not, dude? Fuck it. It’s all, like, it’s all a show.

Speaker: 3
01:30:30

During his time out of the UFC, he’s been just studying economics while also doing pro wrestling.

Speaker: 0
01:30:37

He’s got some really good ideas. Really good ideas. Yeah.

Speaker: 3
01:30:40

All he does is read books on World War II. Totally misunderstanding him, because he looks like a fucking, like, a juggernaut.

Speaker: 0
01:30:50

Right.

Speaker: 8
01:30:50

This looks

Speaker: 3
01:30:50

like a real human.

Speaker: 0
01:30:51

He’s really a genius. He’s really sensitive. He really knows how to lead people.

Speaker: 3
01:30:56

Yeah. No one would believe that. No one would believe that, like, a giant human being would also be a genius. Like, a giant super athlete.

Speaker: 4
01:31:05

Jiri Prochazka going for his master's degree?

Speaker: 3
01:31:07

Yeah. You’d believe Yuri. Yeah. You’d believe Yuri. But that’s different. Like, Yuri is a big guy for sure, but he’s a big, athletic looking guy that, like, it looks like he could do a lot of different sports. Yeah. Brock Lesnar He

Speaker: 0
01:31:20

looks like an X-Man. He looks like Juggernaut. He looks like

Speaker: 3
01:31:22

an X-Man.

Speaker: 0
01:31:23

Yeah. Yeah.

Speaker: 3
01:31:23

He doesn’t look like a real human. Ai Brock Lesnar in his prime when he was in the UFC. He looks like ai like, what the fuck?

Speaker: 0
01:31:30

I remember one time, Dennis Rodman came into the Comedy Store, and I remember looking at him thinking, I can't believe there's enough humans that are that big for a league of them.

Speaker: 3
01:31:42

Right.

Speaker: 0
01:31:43

Right. Right. It’s like you are ai

Speaker: 3
01:31:44

a you should be, like, a one in a

Speaker: 0
01:31:46

million specimen. What do you mean there's enough to fill 32... 30 teams of you? That's crazy.

Speaker: 3
01:31:51

And when you compare him to, like, NFL guys. Oh, like a JJ Watt or something.

Speaker: 0
01:31:56

He was also at

Speaker: 3
01:31:56

the stage. Just, like, good lord. They're so big.

Speaker: 0
01:31:59

Yeah. It’s, like, hard to live with the same species. That’s so crazy.

Speaker: 3
01:32:02

Yeah. It’s nuts. There’s some giant oh, how about the mountain from Game of Thrones ram

Speaker: 2
01:32:08

you

Speaker: 3
01:32:08

ever see Brian Shaw, the powerlifter guy? No. Bro. He didn't even look like a real human. He's, like, four hundred pounds. He's so big. He looks like the side of a house. Right. He's so big. See if you can get a photo of Brian Shaw. I mean, he's, like, one of the strongest humans that's ever walked the face of the Earth. He doesn't even look like a real person.

Speaker: 0
01:32:28

Yeah. Just

Speaker: 3
01:32:29

He looks like what David had to fight with the sling. Like, really? Like, if that guy came over a mountain, you'd be like, oh my god. It's a giant. Like, he's ten feet tall. That's what you would say.

Speaker: 0
01:32:38

Right. What do you look like? Oh my god.

Speaker: 3
01:32:42

But look at him in comparison to, like, a normal person. There’s a photo of him look at that.

Speaker: 0
01:32:47

That’s so it’s like two of that person. Right.

Speaker: 3
01:32:49

Now, like, if that guy existed two thousand years ago during the time of the Bible, and you were your average dude who lived back then, which probably weighed a hundred and thirty pounds. Your average dude was, like, probably barely getting enough food your whole life. Right? Right. Like the Civil War soldiers.

Speaker: 3
01:33:06

They were all, like, a hundred and thirty pounds, all malnourished. Mhmm. It was hard to eat

Speaker: 0
01:33:11

then. Right.

Speaker: 3
01:33:12

It was hard to survive back then. And then that guy comes over the mountain with just a big hunk of leather over his dick.

Speaker: 2
01:33:19

Meh.

Speaker: 3
01:33:21

Because they existed back then too. Those dudes in Iceland... where is that... why are these strongmen coming from Iceland? Like, what is that all about? I'll tell you what that's about: the fucking Vikings.

Speaker: 2
01:33:31

Mhmm.

Speaker: 0
01:33:31

They were the Vikings. Damn. There's, like,

Speaker: 3
01:33:34

a whole culture of strongmen, like strongest men that do those barrel throwing competitions. Oh, I've seen that. Iceland. Like, a ton of them. They have all these crazy names.

Speaker: 0
01:33:42

What what is it about that section of land that creates big people? Because you would think with with all, like, the cold and the the snow, it would be harder to get

Speaker: 3
01:33:52

food. Unless you have a long history of murder and stealing things from people like the Vikings did. Mhmm. You have a long history of... the ones that survived are the biggest, craziest motherfuckers. And the women... they're probably the ones that, like, out of the women, they have to be the biggest, craziest motherfuckers too. Right.

Speaker: 3
01:34:16

You’re living in the most chaotic life possible. You’re taking magic mushrooms and raiding villages and killing everybody and stealing everything. They and they you’re doing it for a thousand plus years? Like, how long did the Ai last? Let’s find that out. Like, how long were they doing that?

Speaker: 0
01:34:34

They had to have been around. Right? They apparently got to America. They were, like, doing crazy shit.

Speaker: 3
01:34:39

Yeah. They were here, like, the fourteen hundreds or some shit.

Speaker: 0
01:34:42

Well, before design Or

Speaker: 3
01:34:43

the December rather. Yeah. Yeah. They were here way before everybody else. They were seafaring murderers. Did you ever watch that show? That Vikings show?

Speaker: 0
01:34:53

No. No. Bro, it’s ai. I’m all YouTube myself. I know. I almost watch nothing on television anymore.

Speaker: 3
01:34:58

If you wanna watch a great Viking movie, The Northman. Oh, my god. It's another one of those movies that's kinda, like, fantastical. Mhmm. The Viking age. Okay.

Speaker: 0
01:35:12

So only two hundred fifty years?

Speaker: 3
01:35:14

That’s it. Really? Time during the Middle Ages when the Meh, known as Vikings, undertook large scale raiding, colonizing, conquest. Wow. So ai that movie, Jamie, The North Bend.

Speaker: 0
01:35:27

That lasted shorter than America's been a country. Right?

Speaker: 3
01:35:31

I know. Well, it’s not sustainable. Yeah.

Speaker: 2
01:35:34

Yeah.

Speaker: 3
01:35:34

Yeah. Yeah.

Speaker: 0
01:35:35

Yeah. That’s a good point.

Speaker: 3
01:35:36

That movie that movie is fucking good, dude. That movie is fucking

Speaker: 0
01:35:40

good. According to Google, it got 2.6. Google doesn't like it.

Speaker: 3
01:35:44

Really? Yeah. No. What? Is that real?

Speaker: 0
01:35:47

Yeah. Yeah. 2.6. A lot of one stars.

Speaker: 3
01:35:50

Hold on. Rotten Tomatoes is 90%. Google what what did some did someone, like, bomb it?

Speaker: 0
01:35:56

Can can we look up what the Was

Speaker: 3
01:35:58

it 2.6?

Speaker: 2
01:35:59

Can we

Speaker: 0
01:35:59

look up Rotten Tomatoes what the audience score is? Because Rotten Tomatoes, the critics can buy the scores. Or, like, the studios can buy the scores a

Speaker: 2
01:36:05

little bit.

Speaker: 3
01:36:05

Well, critics say 60. Oh, no. People said 64. Oh, wild. Mhmm. Man, I loved it. Because I must be a critic. I thought I fucking loved it. It's fun. But it's also, it's, like, just hyper brutal. That show was hyper brutal too. The Vikings show, the TV series. That was a great TV series, but this fucking... this movie's rough, dude.

Speaker: 0
01:36:29

Yeah. Just straight up warrior culture.

Speaker: 3
01:36:32

Yeah. Just exactly what it was, man. That’s what they did. I mean, just a bunch of fucking mushroom eating savages. Right. Just cutting heads off.

Speaker: 0
01:36:41

Just murdering and

Speaker: 3
01:36:42

And then they settled down, moved to Iceland. And that's what's left. Damn. Yeah.

Speaker: 0
01:36:48

Damn. So it’s ai a almost like a natural eugenics program. That’s why they’re that strong.

Speaker: 3
01:36:52

See if you can find that Vice piece on strongmen in Iceland. When you see these guys, you're like, oh. It's so obvious when you see them. Right. Well, this is where you guys came from. That's why there's so many of you.

Speaker: 0
01:37:06

That makes

Speaker: 3
01:37:06

sense. Up here, where the Vikings lived. Duh.

Speaker: 0
01:37:09

Damn. And that’s how low that’s how that’s how strong genetics are. Over a thousand years, it’s still, like, expressed. Nest of giants.

Speaker: 3
01:37:16

Yeah. So this fucking American dork.

Speaker: 0
01:37:18

Jesus Christ.

Speaker: 3
01:37:19

Look at this regular-ass dork. He's gonna go and hang out with these fucking massive dudes that go to this powerlifting gym.

Speaker: 0
01:37:27

Damn. Vice used to be, like, really awesome.

Speaker: 3
01:37:29

Oh, Vice was the best, dude.

Speaker: 0
01:37:31

Yeah. There was a time when Vice was, like, putting out the best content.

Speaker: 3
01:37:33

Oh, they had amazing things, and they they took these, like, Williamsburg nerds,

Speaker: 2
01:37:38

and

Speaker: 3
01:37:38

they sent them all over the world with flak jackets on and shit.

Speaker: 0
01:37:41

Yeah. You saw some, like, interesting Vice showed you, like, sides of the world so you can see.

Speaker: 3
01:37:45

What’s the size of these guys? It’s hard to tell in this picture. Ai, but the you’ll see, like, some of them Jesus. Carrying cars. They’re carrying cars. Look at this. See who carries the car the quickest. It’s so insane.

Speaker: 0
01:37:59

What... and also his name is Magnus Ver Magnusson. Yeah. They always have names like that. Yeah.

Speaker: 3
01:38:05

They always have names like that. I mean, they were the fucking Vikings, man.

Speaker: 0
01:38:10

Yeah. Damn. That’s just crazy.

Speaker: 3
01:38:12

Crazy. You know, I mean, just the the Genghis Khan thing. The fact that a giant percentage

Speaker: 0
01:38:19

of people that are alive today have his DNA. Yeah. Yeah. And, like, he killed enough people to, like, change the climate of the world. It’s so wild. It’s so wild.

Speaker: 3
01:38:29

He killed so many people that you could see it in the carbon footprint of Earth. There was a regreening of areas. It’s directly attributed to him burning everything down and, like, destroying cities.

Speaker: 0
01:38:43

One of the hardest quotes of all time, like, he goes... it's something along the lines of, you guys must be great sinners, because God would have never sent a punishment like me upon you if you weren't. Such a, like, hard line, bro.

Speaker: 2
01:38:55

I don’t know why.

Speaker: 3
01:38:55

That’s how he justified everything he did.

Speaker: 0
01:38:57

That’s so I mean, that’s the way to do it.

Speaker: 3
01:38:59

How crazy is that?

Speaker: 0
01:39:00

That’s the way to do it.

Speaker: 3
01:39:01

How crazy is thinking like that?

Speaker: 6
01:39:03

Yeah. That

Speaker: 3
01:39:03

You must be terrible if God sent me to get you.

Speaker: 0
01:39:05

Because I am God’s punishment. That’s... first of all, it’d be a great nickname for a fighter. Yeah. God’s Punishment. That would be a phenomenal nickname for a fighter.

Speaker: 3
01:39:16

Also, but if you really think like that, what an ultimate justification for one of the greatest mass murderers in history?

Speaker: 0
01:39:21

Right. Right.

Speaker: 3
01:39:22

The dude killed 10% of the population of Earth during his lifetime.

Speaker: 0
01:39:25

And he felt like he was on the side of the gods.

Speaker: 3
01:39:28

Of course. God. God sent me Mhmm. To do this.

Speaker: 0
01:39:32

Yeah. That’s his power.

Speaker: 3
01:39:34

Yeah. Power and the demons. The demons.

Speaker: 0
01:39:37

They run everything, man. They’re... how demons

Speaker: 3
01:39:40

are real. They get in your head, and they get you to do evil, horrible things.

Speaker: 0
01:39:44

And they make you feel like, no, I’m doing this because I’m supposed to. Imagine

Speaker: 3
01:39:48

if that’s really what’s going on. All bad deeds are just demons. Just demons sneaking into people’s brains.

Speaker: 0
01:39:54

Oh, man. We would like to believe that it was something to rob a liquor store, force

Speaker: 3
01:39:58

you to put that ski mask on. The demons want you to do it.

Speaker: 0
01:40:00

We would like to believe that’s anything other than us, but that’s us.

Speaker: 3
01:40:03

I know. But why? What is the thing that makes a person do it? Imagine if it’s a demon. Because we don’t really know what the thing is. Look, evil exists. Mhmm. Right? Good exists. We know evil acts exist, and we know good acts exist.

Speaker: 3
01:40:19

But we don’t wanna believe that there’s any sort of supernatural aspect to it. Right.

Speaker: 0
01:40:25

Right. There’s no... That’s the quote.

Speaker: 3
01:40:26

Like, the greatest trick the devil ever pulled was making you believe that he doesn’t exist.

Speaker: 0
01:40:31

Oh, so I get what you’re saying. Yeah. So me being like, no, that’s just us. It’s like the devil being like, no. No. No. I’m not here. Yeah. I’m not here. I’m not here.

Speaker: 2
01:40:39

This is all on

Speaker: 0
01:40:39

you, guys.

Speaker: 2
01:40:40

This is all on you.

Speaker: 3
01:40:41

It’s all on you. But if demons were real, we wouldn’t believe, just like we don’t believe in government corruption, just like we don’t believe in a lot of things.

Speaker: 0
01:40:49

And we’ll we’ll never believe

Speaker: 3
01:40:50

that pharmaceutical drug companies fuck us over. Just like we don’t believe in false flags or conspiracy theories. It’s the same thing. We wouldn’t believe in demons either. Like, oh, come on. The same people who don’t believe in conspiracies also don’t believe in demons. Well, conspiracies are real as fuck. Okay?

Speaker: 3
01:41:07

So Such

Speaker: 0
01:41:08

a funny way to

Speaker: 3
01:41:10

If you don’t believe in them, then I don’t trust your worldview. If you don’t think conspiracies exist, if you’re one of these dumbasses, like, I think everything has a simple reason for it... this is you. You got blinders on. That’s crazy.

Speaker: 0
01:41:24

Or the information you’re being fed is done without any sort of malice or any sort of agenda.

Speaker: 3
01:41:31

Well, yeah. There’s no, like... That’s crazy. There’s no cons... it’s almost all conspiring. When you look at the history of just government, you go back to Smedley Butler’s book that he wrote in 1933 called War Is a Racket. Like, he was talking about uncovering conspiracies as a retired major general. Like, at the end of his career, he’s like, my whole career was bullshit.

Speaker: 3
01:41:55

War is a racket. So he’s talking... So that’s a conspiracy. Right?

Speaker: 0
01:42:00

Right.

Speaker: 3
01:42:00

Gulf Of Tonkin. What’s that? That’s a conspiracy. Right?

Speaker: 2
01:42:03

Mhmm.

Speaker: 3
01:42:03

Like, they conspired to pretend that there was an attack so that we would go into Vietnam.

Speaker: 0
01:42:08

Right.

Speaker: 3
01:42:09

Yeah. Didn’t they conspire to say that Saddam Hussein had weapons of mass destruction? It seems like they might have.

Speaker: 0
01:42:14

The Spanish-American War. Seems like people might have conspired Yeah. For that. Yeah.

Speaker: 3
01:42:19

Seems like sometimes people talk, and they get you involved in things you maybe didn’t wanna get involved in. Right. Enron, the smartest guys in the room. Did you ever watch that documentary?

Speaker: 6
01:42:28

Oh, yeah.

Speaker: 3
01:42:28

Yeah. Conspired like a motherfucker.

Speaker: 0
01:42:30

Right. Right. Got Gray Davis recalled. Yeah. But that’s like the first political thing I remember. It’s like, woah.

Speaker: 3
01:42:36

Right.

Speaker: 0
01:42:37

You can just recall a governor?

Speaker: 3
01:42:41

If the Epstein client list exists and it doesn’t get exposed, then perhaps someone conspired.

Speaker: 0
01:42:48

Oh, they’re they’re all in on that. Do you think?

Speaker: 6
01:42:51

Yeah. They’re all... People are dead. What

Speaker: 3
01:42:52

are you,

Speaker: 0
01:42:53

a conspiracy theorist? I used to say when Epstein died, like, the meeting was, like, Trump, Obama, Oprah, and, like, Big Bird are all in a room together. Elmo, we’re all in a room together. They shook each other’s hands and, like, it’s over.

Speaker: 3
01:43:06

Your problem is you let facts get in the way of your opinion. Okay. I know it’s true that poppy production went way up in Afghanistan after we got in there.

Speaker: 0
01:43:19

Right.

Speaker: 3
01:43:20

And I know that the United States military was guarding the poppy fields, but you’re a fool to think we profited off of that. So shut your goddamn mouth.

Speaker: 0
01:43:30

That’s the truth. We would never do that.

Speaker: 3
01:43:31

And go blow an eagle.

Speaker: 0
01:43:33

We would never flood the American market with drugs. I mean, just look the other way with crack. That was... that’s its own special thing.

Speaker: 3
01:43:39

But look the other way? We fucking sold it.

Speaker: 0
01:43:41

I know. Oh, no. We definitely did. Bro. That was a Reagan-Clinton thing, back when we used to be more, like... back when Democrats and Republicans worked more together.

Speaker: 3
01:43:51

Learning about that in the news going, what?

Speaker: 0
01:43:54

What?

Speaker: 3
01:43:55

Mhmm. The government was selling crack in the hood? Right.

Speaker: 0
01:44:00

What?

Speaker: 3
01:44:02

To fund the Contras versus the Sandinistas? What?

Speaker: 0
01:44:06

It’s all connected. What?

Speaker: 3
01:44:08

When you find out the government sells crack, you’re like, what? Like, the whole Barry Seal one. That... you know, that story? Yeah.

Speaker: 0
01:44:16

That’s a Tom Cruise movie.

Speaker: 2
01:44:18

Yes.

Speaker: 0
01:44:18

Right? Yeah. Yeah. Yeah. Yes. Where they just basically

Speaker: 3
01:44:20

Which is fucked up, that Tom Cruise played him, because Tom Cruise is one of the most handsome guys that ever lived. Barry Seal is disgusting looking. It’s really quite jarring. You know, it’s hard for me, like, watching a movie. I want him to... you know, like, when what’s his face played Dick Cheney?

Speaker: 0
01:44:36

Oh, Australian guy. Awesome. Christian Bale.

Speaker: 3
01:44:38

Christian Bale. Yeah. Christian Bale played Dick Cheney. He fucking looked like Dick Cheney, dog.

Speaker: 0
01:44:43

Right.

Speaker: 3
01:44:43

He looked like him. Mhmm. Gained a bunch of weight, did the whole deal, shaved his head.

Speaker: 0
01:44:47

Well, he’s, like... Yeah. Look at the difference between Barry Seal

Speaker: 3
01:44:51

and then Tom Cruise. Tom Cruise is fit and handsome, beautiful head of hair.

Speaker: 0
01:44:55

Well, this is something that Deric and I have talked about. It’s like, in a lot of movies now, there’s, like, no more regular-looking people.

Speaker: 3
01:45:02

Yeah.

Speaker: 0
01:45:02

It’s all, like, even TV shows, all everyone is just all super duper attractive.

Speaker: 3
01:45:07

Bro, that that ship is about to sail.

Speaker: 0
01:45:09

Oh, it’s over, like That ship’s about to sail. Yeah.

Speaker: 3
01:45:11

These new video engines Mhmm. Where, just with prompts... Have you seen the one where people are arguing Yeah. Whether or not they’re prompts?

Speaker: 0
01:45:18

Yeah. Bro,

Speaker: 3
01:45:20

that’s crazy. That’s crazy to watch. Because that is, like, almost a simulation telling you you’re in a simulation. Right. Okay. This is... how much different is this than you? Mhmm. Is it different? It’s really different. Right? Oh, yeah. It’s just a video. Right? Okay. Here’s the next version. How much different is this than you?

Speaker: 3
01:45:37

It’s like we’re peeling the layers of an onion. Yeah.

Speaker: 0
01:45:40

And they’re using the... the people doing it are doing stand-up talking about it. Yeah.

Speaker: 3
01:45:43

Press this and do it from the beginning so we could hear it.

Speaker: 5
01:45:46

Theory. Like, really?

Speaker: 3
01:45:47

It’s real it’s really crazy.

Speaker: 6
01:45:52

Made of prompts. Like, seriously, dude.

Speaker: 0
01:45:55

You’re saying the only thing standing between

Speaker: 6
01:45:56

me and a billion dollars is some random text?

Speaker: 5
01:45:59

Honestly, the biggest red flag is when the guy believes in the prompt theory. Like, really? We came from prompts? Wake up, man.

Speaker: 2
01:46:07

You wanna convince me that this perfect creation behind me is the result of ones and zeros? A binary code and nothing more? It makes no sense.

Speaker: 5
01:46:15

Imagine you’re in the middle of a nice date with a handsome man, and then he brings up the prompt theory. Yuck. We just can’t have nice things.

Speaker: 0
01:46:23

We’re not prompts. We’re not prompts.

Speaker: 2
01:46:27

Where is the prompt writer to save you from me? Where is he?

Speaker: 3
01:46:32

You still believe we’re made of prompts? Anyone who tells you that just wants... Even the prompts have bad acting.

Speaker: 0
01:46:38

That was terrible.

Speaker: 6
01:46:39

Oh, yeah. Yeah. Yeah.

Speaker: 3
01:46:40

That was terrible.

Speaker: 0
01:46:41

Yeah. They’re, you know, they’re new to acting. Give it some time. They

Speaker: 3
01:46:44

gotta really develop their own Marlon Brando.

Speaker: 0
01:46:47

For real. Hell, it looks it’s coming.

Speaker: 3
01:46:48

But we get it. But yeah. Pretty crazy.

Speaker: 0
01:46:51

That sort of stuff.

Speaker: 3
01:46:52

What do you think, Jamie? What do... if you had a guess, like, what percentage of you believes that we’re in a simulation? Pull that mic down.

Speaker: 4
01:47:02

I’m trying to tell. Which Yeah. This is there

Speaker: 3
01:47:05

is there a part ace well, something something inorganic about existence. Something that seems not real.

Speaker: 4
01:47:12

Did you... I just saw someone reposted the first time Alex Jones was on. He was talking about his theory of what we are, and you’re just, like, laughing along the whole time.

Speaker: 0
01:47:21

Yeah. What did he say?

Speaker: 4
01:47:22

His basic point was, like, I’m gonna blow your mind. It’s like, we are the aliens. We are the aliens. Yeah. And then people have taken that farther and said that we... I don’t... it’s almost like The Matrix. Like, if we’re the... I don’t know who would have been the creator then. That’s the kind of part that’s left out of this.

Speaker: 4
01:47:39

Is it one of those nine

Speaker: 0
01:47:41

It’s AI all the way down. It’s just there’s just a recursive

Speaker: 4
01:47:43

aliens created us, and I don’t know for what reason.

Speaker: 0
01:47:46

We’re just in a recursive program that’s, like, meant to create more programs.

Speaker: 3
01:47:49

Well, if you think about what we’re doing right now currently, what we just talked about,

Speaker: 0
01:47:53

which

Speaker: 3
01:47:54

AI is talking to each other in Sanskrit. Mhmm. If that is the baby God, that’s the baby God. Right. So we’re watching baby God in the cradle. Right? We’re watching Jesus in the manger or whatever. It’s about to pop out. It’s about to get wild. Yeah. Things are about to get real.

Speaker: 4
01:48:08

It’s about

Speaker: 3
01:48:09

to get wild. It’s about to Doctor Manhattan us and be a blue guy with a giant dick floating around. Oh

Speaker: 0
01:48:14

my god. Remember Doctor Manhattan? I remember Doctor Manhattan.

Speaker: 3
01:48:17

That’s... Here’s another movie you can’t make today. You can’t show dicks like that anymore.

Speaker: 0
01:48:20

Oh, really? No. See, I Big old blue dicks. I see. I disagree.

Speaker: 3
01:48:24

Tom did. They do. Tom did. Yeah.

Speaker: 0
01:48:26

You can see there was a shift right around then Yeah. Where dicks became way more okay, and then you start seeing way more dicks than tits, which... I’m not a big fan of that.

Speaker: 3
01:48:35

You know where you see the most dicks? Righteous gemstones. Well, that’s They have dicks in almost every episode. It’s It’s so uncomfortable. Yeah.

Speaker: 0
01:48:42

It became, like, a trend to show dicks on TV. Like, I don’t know if it’s because they

Speaker: 3
01:48:49

won’t do it in the movies anymore.

Speaker: 0
01:48:50

Right.

Speaker: 3
01:48:50

So these people went buck wild. Mhmm. They did it in Nosferatu. So you saw, like, vampire dick.

Speaker: 0
01:48:55

And and yeah.

Speaker: 3
01:48:56

A vampire dick. Big old vampire dick.

Speaker: 0
01:48:58

Oh, that’s So But show me Any tits in Nosferatu? Oh, yeah. Okay. Good. Good. But it’s Johnny Depp’s

Speaker: 3
01:49:04

daughter’s tits, so you’re conflicted. Like, oh. I remember her when she was little. Yeah. It’s Lily-Rose Depp. I forgot. Yeah. She’s beautiful, but, you know, it’s like, come on. I remember when she was little. Like, if I knew you when you were little, I never wanna see your tits.

Speaker: 0
01:49:16

Yeah. That makes sense. Like, I grew up with you. That’s kinda crazy. Oh, wow.

Speaker: 3
01:49:20

There’s Tom. Yeah. I know about that one. But, what is the, Nosferatu one? See if you can find Nosferatu’s dick. I’m gonna fuck up your Google algorithm.

Speaker: 0
01:49:33

It’s gonna be personalized for you,

Speaker: 4
01:49:35

ai.

Speaker: 0
01:49:37

So, I do think, because I brought up the streaming earlier, I do think that’s gonna be, like, a majority of how people... but one, you can tell it’s real. But also, so I just got into it because we had... that Sketch was on Kill Tony. And then I just started getting videos of Sketch, and there was one where he was just calling... there’s another streamer called Sana, just calling her fat, and it was really funny just for, like, two hours straight.

Speaker: 0
01:50:01

And I got really into it, and I was looking. And these two guys that are huge right now, Kai Cenat and Speed. Have you heard of them?

Speaker: 3
01:50:08

Like, I’ve heard of... it’s

Speaker: 0
01:50:10

what is it? I Show Speed?

Speaker: 3
01:50:11

I Show Speed. Yeah.

Speaker: 0
01:50:12

He went to he went

Speaker: 3
01:50:13

to China. He fights people. I’ve seen him, like, he sparred a bunch of people.

Speaker: 0
01:50:16

He does, like, athletic stuff. That’s, like, part of it. But, like, he went to China, and they’re following him around like he was Jesus. It was crazy to watch. Kai Cenat... and I saved this on my phone because it blew my mind. He did this thing recently called Streamer University where it

Speaker: 4
01:50:28

was... It’s still live.

Speaker: 0
01:50:29

Oh, it’s still live?

Speaker: 4
01:50:30

Yeah. It’s, like, an active thing.

Speaker: 0
01:50:31

Oh, I thought it was only three days. So for three days... I saved it on my phone because I showed it to Deric, and it was mind-blowing how much. So in three days, he had 23,000,000 hours watched. Two thousand seven hundred years of content streamed on Twitch in the span of a weekend.

Speaker: 3
01:50:48

So kids are watching... like, what are the age limits?

Speaker: 0
01:50:53

I would say definitely younger, but it’s definitely gotten to me where, like, now I’ll watch... like, there’s certain streamers. There’s this one streamer named Disguised Toast. He’s really good at puzzles, and he’s good at, like, explaining the puzzle while he does it. Oh.

Speaker: 0
01:51:06

And I’ll, like, watch. He, like, goes around to escape rooms and, like, he’ll, like, solve puzzles. Oh. Yeah. It’s really fantastic.

Speaker: 0
01:51:13

I found him in the pandemic. He played this social deduction game called Among Us, and he was just really good at giving the play-by-play of, like, his logical reasoning. Mhmm. And it made me be like, oh, what is this? And then now, looking at the Streamer University numbers, it’s like, oh, this is crazy.

Speaker: 0
01:51:27

That’s what I mean. Like, in a few years, if you’re a presidential candidate, you have to go on these streamers’ things. I mean, they tried it since. They tried it a little bit with, like, AOC and

Speaker: 3
01:51:40

Did they?

Speaker: 0
01:51:41

Weren’t they streaming Madden? Wasn’t that the whole thing? When AOC and Tim Walz were streaming Madden when they talked about Tony?

Speaker: 7
01:51:47

Yes.

Speaker: 2
01:51:48

Yeah.

Speaker: 3
01:51:48

They were streaming Madden? Is that what they were doing?

Speaker: 4
01:51:50

Madden on Twitch. Yeah.

Speaker: 2
01:51:50

They were

Speaker: 0
01:51:50

playing, but they were so they were playing Madden on Twitch.

Speaker: 3
01:51:52

And that’s when they found out about Tony?

Speaker: 0
01:51:54

Yes. That’s that’s that video of just like the speaker. Yeah. That’s wild. Yeah.

Speaker: 3
01:51:57

But Tim Walz pretended he plays football. It’s hilarious.

Speaker: 6
01:51:59

Oh,

Speaker: 0
01:52:00

well, him saying we got... she ran a pick six. It’s, like, the football-fan version of the Inglourious Basterds scene where you do the three. It’s like, if you’re a football fan, you wouldn’t say that. If you coached football

Speaker: 3
01:52:10

Well, not only that, he said he’s a head coach, and he wasn’t a head coach. Like, that’s a big old lie. You shouldn’t be able to lie. And Right. People wanna trust you. Your whole business is people trusting you. Mhmm. You could have said assistant coach.

Speaker: 0
01:52:21

Right.

Speaker: 3
01:52:21

And that’s good enough. Right. Yeah. Right. Yeah. So if you lied about that, you’re probably a liar.

Speaker: 6
01:52:27

Yeah. You’re probably one of

Speaker: 3
01:52:29

them dudes who lies all the time.

Speaker: 0
01:52:30

Yeah.

Speaker: 3
01:52:30

You know, which is, like, the kind of people that wanna be president. The kind of people that wanna be governor. You know, they just want that job.

Speaker: 0
01:52:37

They’re constantly lying. Yeah. Yeah.

Speaker: 3
01:52:38

And what what what do I have

Speaker: 0
01:52:39

to say? Whatever is the right

Speaker: 3
01:52:40

thing to say to get people to like me? Yeah. Nuts, man.

Speaker: 0
01:52:45

But, yeah, to me, that’s, like, that’s the next... because both those guys... Speed’s 20, Kai Cenat’s 23. It’s like, that’s how the next generation is, like, consuming media.

Speaker: 3
01:52:55

How about this? GTA six is about to come out. Right? Mhmm. Imagine if a candidate made a deal with GTA six where you could have them ride along with you. Like, Trump can ride along with you while you rob people, shoot people. You could do it with Trump. Yeah. Yeah. Yeah.

Speaker: 3
01:53:15

And then all of a sudden people wanna vote for Trump because he’s my favorite homie in GTA six.

Speaker: 0
01:53:21

Oh, easily. Or, like, that. Yeah.

Speaker: 3
01:53:24

Right? It’s... Or, like, the candidate is in the game helping you out, you know, like you’re playing Half-Life and Kamala Harris is helping you get around the lab. That’s

Speaker: 0
01:53:35

I mean, it’s I think it has to be a little less in your face for it to work.

Speaker: 3
01:53:39

But you I don’t know, dude.

Speaker: 0
01:53:41

You I I think so.

Speaker: 3
01:53:42

I don’t know, dude. If you love playing with the Kamala Harris character in Half-Life, like, that’s your... like, if you have partners

Speaker: 4
01:53:50

Right.

Speaker: 3
01:53:50

Like, you could have an AI partner and it literally is Kamala Harris and she runs around in Half Life with you shooting at aliens and shit. Oh, damn. Right? And then you really get into, like, you got a good coordinated partnership with Kamala Harris when you play Half Life.

Speaker: 0
01:54:05

Yeah. That’s actually She’s super cool

Speaker: 3
01:54:07

in the game. She’s really funny. She helps you. She gives you clues

Speaker: 0
01:54:12

as to how to get out of places. Damn. Yeah. It got way darker than I... that’s... I guess that is possible. You can have an... Of course, it’s possible. Oh, my AI Trump helper.

Speaker: 3
01:54:21

Yeah.

Speaker: 0
01:54:21

And he helps me finish the missions. He helps me. Yeah. Or, like, do a patch online where you... yeah.

Speaker: 3
01:54:28

Yeah.

Speaker: 0
01:54:28

But the way they can reach you is, like... Sure. Yeah. Because right now, all the streaming stuff seems like it’s almost... not Wild West-y, but it’s still kinda new. The second politics figures out a way to really get their hands in it, I think it’s kinda over.

Speaker: 3
01:54:41

The real... the real gateway to hell is neural interfaces. That’s the real gateway.

Speaker: 0
01:54:49

So where you’re, like, just you can immediately just be in the world.

Speaker: 3
01:54:52

The real Matrix. Mhmm. With that, it’s a hundred percent on the menu. It’s coming. Oh, well. It’s just a matter of time. And when you can’t tell at all whether or not you really have an experience, that’s when you’re in a simulation. And that might have already happened. Yeah.

Speaker: 3
01:55:12

That might be what we’re dealing with right now.

Speaker: 0
01:55:14

That would... that we’re just the simulation creating its next simulation? Its next iteration?

Speaker: 3
01:55:18

Maybe that’s why it’s so wacky. Why it feels so fake.

Speaker: 6
01:55:23

Yeah. It’s so weird. You know?

Speaker: 0
01:55:25

Yeah. It is, it is, like... Just like, it’s the world we live in where, like, comedians are... like, it doesn’t make any sense to me sometimes. We’ll look

Speaker: 2
01:55:34

around it.

Speaker: 3
01:55:34

Any sense.

Speaker: 0
01:55:34

Yeah. Where it’s, like, comedians are selling out arenas regularly. Regular? Regular. All of

Speaker: 3
01:55:38

our friends.

Speaker: 6
01:55:39

Yeah. Yeah. Yeah. Yeah. Yeah. Yeah.

Speaker: 0
01:55:41

They’re just arena acts all of a sudden and, like Yeah. Instantly.

Speaker: 3
01:55:44

Well, so, like,

Speaker: 2
01:55:46

I think it it

Speaker: 0
01:55:46

but I mean, so part of that has to do with, like, as things get more fake, people will strive for things they know are real.

Speaker: 3
01:55:51

100%.

Speaker: 0
01:55:52

And so what I know is real is this guy talking to me in person right now.

Speaker: 3
01:55:56

Yes. Also, because of the restrictive nature of whatever you see on television, any regular television show.

Speaker: 0
01:56:06

Not even television. Like, the censorship on social media sometimes is, like... Like, I did this sketch with one of the door people at the Mothership, Christina Mariani, killing it. And in this sketch, we have her tell me to kill myself. That’s part of... that’s the sort of thing. Right.

Speaker: 0
01:56:23

And in the captions, they took it down because we spelled out the word kill, which you can’t do, and you can’t say the words kill yourself. And it’s just a small little sketch, and already, it’s like there’s so much, like, holding you back from, like, having a true expression sometimes that, like, people will go to seek it out live.

Speaker: 0
01:56:45

That’s what I think.

Speaker: 3
01:56:46

There’s definitely some part of that. Mhmm. Yeah. For sure. I think people are definitely aware that there’s a bunch of stuff you can’t talk about in social media that you wanna talk about on stage that are funny.

Speaker: 0
01:56:57

Well, and it’s also so funny to me where it’s like, on social media, there’s this big, like, oh, mental health is important, but, like, you can’t... you have to write unalive yourself in the caption. It’s like... so we can’t have serious discussions about these topics really. Right.

Speaker: 0
01:57:10

Like, if certain words are off limits, then there’s no way you can have an actual serious discussion where those words are involved.

Speaker: 3
01:57:17

Right. A hundred percent. Because then they just expand the definition of offense.

Speaker: 0
01:57:22

Right. And then it goes further than words. It’s expansive. Now this

Speaker: 3
01:57:24

is a new version. Mhmm. We have an updated list of words you can’t say.

Speaker: 0
01:57:28

And we were talking about, like, all these bots earlier. So, like, we don’t want these words to offend people, but we’re not even talking to people. Right. This is all going out to computer programs. Right. It’s computer programs watching other computer programs being offended. It’s, like, it’s crazy.

Speaker: 3
01:57:42

But offending people should be the least of your concern when you’re allowing open manipulation by a university. Mhmm. And, like and also, this is just the one we know about. We didn’t know about that until they told us.

Speaker: 0
01:57:54

Right. And they didn’t have to tell us. No. They just decided to. I guess, in a way of, like, warning, like, hey, this is probably happening in a bunch of ways.

Speaker: 3
01:58:02

I mean, I’m glad they did. They shouldn’t have done it in the first place, but... well,

Speaker: 0
01:58:05

look, it’s being done. It’s being done.

Speaker: 3
01:58:08

And it’s not like they came up with the idea or they’re the only ones doing it. I guarantee you it’s being done. There’s probably a bunch of people that have AI friends online that they communicate with on Twitter and, you know, hey, good to hear from you. How’s things? They do DMs back and forth with each other Right. Talking about stuff they’re into. How many of those UAP guys are just talking to bots?

Speaker: 3
01:58:28

All those dudes are in the UAP group, like, where there’s a bunch of people all, like, shh. Disclosure’s imminent.

Speaker: 2
01:58:34

They’re

Speaker: 3
01:58:34

all, like, fucking DMing each other. Probably DMing bots.

Speaker: 0
01:58:37

Right. And and they’re just getting fed what they wanna hear. Just getting fed

Speaker: 3
01:58:40

bullshit AI videos and all kinds of, like, weird disclosure stuff. Half of it’s fake. Half of it the government is actually leaking purposely to try to hide some weapons program they have. Right. Who knows?

Speaker: 2
01:58:52

Right. Who

Speaker: 3
01:58:53

knows what it means? It’s a cloudy environment. The UFO world is like one of the slipperiest worlds. When you’re talking to people... there’s full-on grifters. There’s full-on people that are just... they have the answers to everything and they’re always wrong. They’re always off.

Speaker: 3
01:59:09

And then they have, like, some good data that they pull from disclosure and from all these different people that talked about different things, but then they claim to be, like, the experts in it. It’s, like, a lot of weird people in that world.

Speaker: 0
01:59:22

Well, yes. It’s... Well, it’s easy to do that in that world too where, like, a lot of it is, like, does this even exist? Yep. So you can just be, like, bullshit-proof. Yeah.

Speaker: 2
01:59:29

Yeah.

Speaker: 0
01:59:30

Yeah. If you don’t

Speaker: 3
01:59:30

have good ethics, if you’re not, like, a legitimate journalist, you know. But what is interesting is when legitimate journalists get interested, like Schellenberger. When Michael Schellenberger reports on UAPs, I like how he does it because he does it the same way he reports on, like, corruption in government and waste and fraud.

Speaker: 3
01:59:46

Like, he’s just

Speaker: 0
01:59:48

No speculation.

Speaker: 3
01:59:49

No exaggeration. No condemnation. No condemnation of, you know... like, none of, like, virtue signaling. It’s like, this is what’s going on. This is what we know. These programs exist, and this is what we know they’ve been trying to hide. And these are the people that have come forward, and this is why we think they’re telling the truth. They’re like, yo. Damn. Yeah.

Speaker: 0
02:00:07

It is it is wild that they told us that aliens exist, and we just sort of didn’t care.

Speaker: 3
02:00:12

Yeah. We’re like, whatever. Yeah.

Speaker: 0
02:00:13

Well, we gotta see it now. We gotta see more. Yeah. Show us what you got. It’s like... It’s... You

Speaker: 3
02:00:18

know what it’s like? It’s like that guy that was telling us that he knows that Israel has

Speaker: 0
02:00:21

a war. That’s exactly what I was gonna say.

Speaker: 3
02:00:23

Don’t tell me unless you can show me. Show me. Don’t fucking tell me unless you can show me. Oh, we have craft of non-human origin? What?

Speaker: 0
02:00:33

What do you... well, show me, bitch. Yeah. Why can’t they show us a picture of that?

Speaker: 3
02:00:37

They, you

Speaker: 0
02:00:38

know, what’s the... Unless the one picture is Epstein sitting in the craft. That’s why it’s awful. Epstein and Puffy making out in space.

Speaker: 3
02:00:46

Have you seen those ones? The two of them making out in jail?

Speaker: 0
02:00:49

No. That’s so funny.

Speaker: 3
02:00:50

If they made AI of the two of them kissing and making out

Speaker: 2
02:00:52

in jail.

Speaker: 0
02:00:52

Did they know each other?

Speaker: 3
02:00:53

Oh, who knows? It’s a good question.

Speaker: 0
02:00:55

Did like, was there a connection there? Because there’s a because there’s this whole, like, Mossad Epstein connection. Is there, like, is there, like, a Mossad

Speaker: 3
02:01:03

That’s a good question.

Speaker: 0
02:01:04

Did he have a connection? Did they know each other?

Speaker: 3
02:01:06

On the island.

Speaker: 0
02:01:07

Right? Because was Diddy ever on the island? Because you’re

Speaker: 3
02:01:08

gonna have a fly party. That’s a dude to invite

Speaker: 2
02:01:10

Right.

Speaker: 0
02:01:10

Back in the day. Exactly. Before everybody knew. I can’t imagine their circles were completely separate.

Speaker: 3
02:01:16

That’s a good point. Yeah. I mean, we know that he was hobnobbing with politicians. You know that Diddy was hobnobbing with Biden or, excuse me, with Obama. Right?

Speaker: 2
02:01:27

There’s, like,

Speaker: 3
02:01:28

video of the two of them together talking.

Speaker: 0
02:01:29

Right.

Speaker: 3
02:01:30

Remember?

Speaker: 0
02:01:30

But there’s a video with, like... I feel like every president from Clinton on either has, like, a lot of pictures with Epstein or a lot of pictures with Diddy.

Speaker: 3
02:01:41

Right. But then the thing is, it’s like, did they know? Or was it just there’s a lot of photographs of famous people with these people? Right. Because that’s, like, a way to get an endorsement.

Speaker: 0
02:01:52

Right.

Speaker: 3
02:01:52

You know, like, look, The Rock likes me.

Speaker: 0
02:01:54

Right.

Speaker: 3
02:01:55

I must be awesome. And and a lot

Speaker: 0
02:01:56

of what I heard is that how the Epstein thing worked is that, like, oh, you’d be at a party, and then all of a sudden, some of the shady shit would go down, and you might not have known about it, but you were at the party where it happened later.

Speaker: 3
02:02:07

Of course. Right? In that way, people that were more conservative, they could kinda shield them from knowing about it Mhmm. But implicate them.

Speaker: 0
02:02:16

That’s all that matters. It’s like, because if you’re at a party Right. Where someone Right. Assaults an underage girl

Speaker: 3
02:02:22

Right.

Speaker: 0
02:02:23

And that’s on video and now you’re at the party, it doesn’t matter. You have to explain that.

Speaker: 3
02:02:27

But especially if you’re, like, a scientist and they flew you to the island. There’s a bunch of other scientists there.

Speaker: 0
02:02:32

You’re like, oh,

Speaker: 3
02:02:33

we’re just gonna you know, he donates a lot to science, and this is a wonderful opportunity to get together with my colleagues and have a few cocktails, and, you know, we’re going over string theory. Yeah. Meanwhile, there’s people in the background doing ecstasy and it gets a little freaky. Maybe you go to bed. Maybe you go to bed. Right? Yeah.

Speaker: 3
02:02:52

Or like Or maybe someone knocks on your door and they’re like, hey,

Speaker: 0
02:02:54

we have a we have someone who will massage you.

Speaker: 3
02:02:57

Meh. Maybe they, like, drugged your drink.

Speaker: 0
02:02:59

Yeah. No questions asked.

Speaker: 3
02:03:00

Who knows? You’re over there for a few days.

Speaker: 0
02:03:02

Yeah. You’re not thinking they’re gonna bring a 16-year-old in here. Right. Why would you think that? Why would you possibly think that?

Speaker: 3
02:03:08

It’s like, if you’re a guy who makes, like, whatever a professor makes, like a normal salary, a good salary, but normal. And you hear about billionaires that own islands. Mhmm.

Speaker: 2
02:03:19

He’s

Speaker: 3
02:03:19

a financial genius and he really supports science. Sounds like a great thing. Yeah. If you don’t know any better

Speaker: 0
02:03:26

Yeah. And, like, he wants to fly you out, like, to an island to meet a billionaire. You’re like, wow. I’ve never done that.

Speaker: 3
02:03:31

Like, this is gonna be incredible. But once that guy gets arrested for getting jerked off by kids, you’re like,

Speaker: 0
02:03:39

hey. Oh, no. And then

Speaker: 3
02:03:40

you still hang out with him.

Speaker: 0
02:03:43

Right. Right.

Speaker: 3
02:03:44

Which a lot of them did. Right. That’s where it gets weird. Like, did you hang out with him because he made you? Because he said, hey, you’re not going nowhere. You know what I’m saying? Like, if you’re intelligent, you’d probably distance yourself from him at that point in time.

Speaker: 0
02:03:59

Right.

Speaker: 3
02:03:59

Be like, well, I have to separate. If you got a clean slate, like, well, I can’t have anything to do with it publicly. I am the CEO of Microsoft. Right. But yet you’re still hanging out with him?

Speaker: 0
02:04:10

Right. Because now he has it over you.

Speaker: 3
02:04:13

Yeah. The Bill Gates one was great because he was like, he was donating to global health. Do you need money? Do you need money? What about your money? Like, you have so much money. You have hundreds of billions of dollars. You need this guy who’s got a billion to help?

Speaker: 0
02:04:28

Right.

Speaker: 3
02:04:30

Why don’t you take one of them billions that you have and throw it in there, if you really care? Seems weird. All of it seems weird.

Speaker: 0
02:04:40

Yeah. But it’s all, like, these sort of sex scandal rings. I think that’s, like, a function of high politics.

Speaker: 3
02:04:48

Bro, it’s a function of a bunch of people getting together and getting drunk. Yeah. Bunch of freaks. Yeah. And they’re doing coke and they’re on an island. Let’s go. And you think you’re partying on a yacht somewhere, but meanwhile, you’re just in a real live reality show that only a few people get to watch.

Speaker: 3
02:05:03

Right.

Speaker: 0
02:05:04

But wasn’t there, like, even pre-Epstein, that Franklin scandal? A hundred percent. Yeah. Where it’s, like, I think that’s just a function of, like, it’s a way to control.

Speaker: 3
02:05:12

Dude, there was the lady, the madam, that got assassinated in DC.

Speaker: 2
02:05:16

Oh, I didn’t

Speaker: 3
02:05:17

and the

Speaker: 0
02:05:17

madam got assassinated? Yeah.

Speaker: 2
02:05:18

Yeah. Yeah.

Speaker: 3
02:05:19

She’s, like, a famous DC madam. I think they said it was a suicide. Mhmm. And she was like, if I kill myself, I definitely did not kill myself.

Speaker: 2
02:05:27

One of

Speaker: 3
02:05:27

them deals. Mhmm. Oh, yeah. Conveniently died.

Speaker: 0
02:05:29

Yeah. Didn’t one of the Epstein witnesses just kill herself?

Speaker: 3
02:05:33

Oh, yes. Yeah. Yeah. These things happen.

Speaker: 2
02:05:37

Yeah.

Speaker: 0
02:05:37

Do you know where they recruited her? Where? She got recruited at Mar-a-Lago. She was a locker room attendant at Mar-a-Lago Yeah. When she got recruited. I’m pretty sure. Can we look that up? I think Virginia Giuffre? I don’t wanna say that.

Speaker: 0
02:05:51

That’s a crazy thing to say where, like, I think

Speaker: 6
02:05:53

I’m right about that. Yeah.

Speaker: 0
02:05:54

Be careful.

Speaker: 6
02:05:55

Yeah. Yeah.

Speaker: 3
02:05:56

All of it is so spooky, dude, because you know that that’s not the only one. There’s probably ones going on right now in China. There’s one

Speaker: 0
02:06:03

Oh, dude.

Speaker: 3
02:06:03

Russia. There’s it’s just a thing that’s happening all over the world.

Speaker: 0
02:06:06

When you have, like, a China or Russia where it’s, like, basically an emperor, they can just run it.

Speaker: 3
02:06:09

It. Yeah.

Speaker: 0
02:06:10

That’s a whole different thing. Right.

Speaker: 3
02:06:11

They don’t need the blackmail.

Speaker: 0
02:06:12

Yeah. They don’t they don’t need some American financier, yeah, at Mar A Lago.

Speaker: 3
02:06:18

Damn. Chose a spot at Donald Trump’s

Speaker: 0
02:06:20

she have it

Speaker: 3
02:06:21

at Mar-a-Lago.

Speaker: 0
02:06:22

Ghislaine Maxwell.

Speaker: 3
02:06:25

It’s, like, how can you not believe in conspiracies?

Speaker: 0
02:06:28

Yeah. Yeah. Yeah. It’s crazy. Macron’s married to a man.

Speaker: 3
02:06:34

A man who bitch slapped him in a private jet.

Speaker: 0
02:06:36

Oh my god. Really just dude, his whole life he’s been abused by that guy.

Speaker: 3
02:06:41

Face palm in his face. I wanna know I mean, take a fucking test. It’s not hard. If I was her, I would say, listen, you motherfuckers. Okay? Mhmm. Let’s take a chromosome test. I am not a man. I’m just a pedophile. I’m just

Speaker: 6
02:06:59

a ped yeah. Yeah.

Speaker: 0
02:07:00

Which accusation hurts her the most? The man or the pedophile? Well, the pedophile’s not an accusation. You just are that. That is fact. That is exactly who you are.

Speaker: 3
02:07:09

But what was the law back then? Because some places had some wacky-ass fucking

Speaker: 0
02:07:14

I think that’s probably why they say 16. Because I think in Europe, 16 is like

Speaker: 3
02:07:17

They say 15.

Speaker: 0
02:07:18

Oh, really? Yeah.

Speaker: 3
02:07:19

Yeah. Yeah.

Speaker: 0
02:07:19

Oh, because when I read this online, it said 16.

Speaker: 3
02:07:21

Is 14.

Speaker: 0
02:07:22

It’s because Meskris

Speaker: 3
02:07:23

is 14.

Speaker: 0
02:07:23

They’re moving it. They’re moving them as we speak. Yeah. But I think it’s, like, probably, like, 16 in the area. But either way, if you’re a kid’s teacher and you’re sitting there like, that’s crazy. That’s crazy. He got to be prime minister with that level of just mind fuckery that’s happening to him.

Speaker: 3
02:07:42

Imagine if the roles were reversed and the president was a female, and the husband was her teacher when the president was 15.

Speaker: 2
02:07:54

Mhmm.

Speaker: 3
02:07:54

When she was 15 and he was 40. Imagine. Crazy. Imagine if it was in The United States. Imagine if it turned out that there’s a female that’s running for president, and then we start going through the history and find out that she met her husband, who’s 80, when she was 15. Yeah. And You’d be like, fucking yo.

Speaker: 3
02:08:15

But like

Speaker: 0
02:08:16

this was, like, 1974, it was legal. Like, fuck you.

Speaker: 3
02:08:21

Fuck you, man. That’s crazy. We would never but because it’s a guy and an older lady, air quotes, lady, we let it slide. Like, he still gets to go to the meetings and shake everybody’s hand. Yeah. You know, he still gets to go to these fucking things where all the world leaders get together. He’s hanging out with them.

Speaker: 0
02:08:38

Right. Like normal. Wait. So what’s also funny to me is you’re probably gonna get more heat for calling her a man than she will for slapping the prime minister.

Speaker: 3
02:08:46

No. She’s catching heat. Yeah. I think Candace, if she’s right, and it seems like they’re not suing her, so I think she might be right. And they did offer her money. How much money did they offer her? Find that out. How much money did they offer Candace Owens to not tell these stories?

Speaker: 0
02:09:06

Damn. They offered her they were like, here

Speaker: 3
02:09:08

I believe she’s made some sort of an accusation that they offered her a large amount of money to not do this. Jesus. Yeah. Which, she’s the last person you wanna offer money to not talk about something, because she ain’t taking that money. Yeah.

Speaker: 0
02:09:22

She sees the long game. She’s already willing to

Speaker: 4
02:09:24

stories that come from a meme and then it’s not true.

Speaker: 3
02:09:27

Oh, damn it. France offered Candace Owens a one-time payment of $4,000,000 plus $50K per month for the rest of her life.

Speaker: 6
02:09:34

What

Speaker: 3
02:09:34

And that’s just a meme?

Speaker: 0
02:09:36

Yeah. And okay, what’s the site? Did it come from her?

Speaker: 3
02:09:38

Neither Owens nor French president Emmanuel Macron have referenced this claim on their respective websites or verified social media accounts, and there is no other evidence to support the claim.

Speaker: 0
02:09:48

Can we see what the name of the account is? Yeah. Legitimate Targets. Legitimate Targets? Yeah. Yeah. There’s nothing I trust less than an account called Legitimate Targets with a blue check.

Speaker: 3
02:09:59

That’s why we had to look that one up.

Speaker: 6
02:10:01

Yeah. Because

Speaker: 0
02:10:03

you don’t know. Yeah. Well, it does. But,

Speaker: 3
02:10:05

again, this is, like, more probably bot-like behavior.

Speaker: 0
02:10:09

Oh, sure.

Speaker: 3
02:10:09

Bot-like targeting, disinformation campaigns. Mhmm. You know, throwing fuel on the fire.

Speaker: 0
02:10:14

Of just getting you to believe, like, oh, wow. Okay.

Speaker: 3
02:10:16

Who needs money? Like, think about it. Okay. You make that post.

Speaker: 0
02:10:21

Mhmm.

Speaker: 3
02:10:21

How many people like me read that, think it’s true, ask Jamie to look it up, it starts spreading, people start following your account. How many new people do you get? How much engagement do you get? All that’s valuable. You can keep doing that a lot and outrage people a lot.

Speaker: 0
02:10:38

There’s a

Speaker: 3
02:10:38

bunch of those that come up. You’re like, fuck, is this true? And then you have to copy and paste it and put it in Google and search a little. Right. I don’t think this is true.

Speaker: 0
02:10:46

And now all they have to do is take the snippet of you saying it before you, like, let’s look that up. 100%. And then now it’s like

Speaker: 3
02:10:53

But all she has to do is take a chromosome test.

Speaker: 0
02:10:57

But it’s also, like, if you’re a woman, you’re not gonna be, like, well, fuck you. I’m not gonna prove to you I’m a woman.

Speaker: 3
02:11:03

I would do it. But

Speaker: 0
02:11:04

you say that as a man. That’s true. You say that as a man. Yeah. Like, I could totally do it

Speaker: 6
02:11:11

if I was

Speaker: 0
02:11:13

a woman and they were accusing me of being a man, I’d also probably be, like, a little bummed. There’d be a part of me like, fuck you. I don’t gotta prove anything to you.

Speaker: 3
02:11:20

You ever see her sit down? No.

Speaker: 2
02:11:24

You ever

Speaker: 3
02:11:24

see her sit

Speaker: 0
02:11:25

down? I haven’t seen her sit down.

Speaker: 3
02:11:26

She sits down like a dude. Oh. Yeah. You know how dudes sit down? So to speak. Mhmm. Just plop down, all dick out. A lot of weight. Because of the shape of our hips. Right. That’s, like, when people talk about manspreading, yes. Guys definitely do that. But women’s legs go inward.

Speaker: 3
02:11:43

Men’s hips, the way the angle is is different. Right. They go outward. Like, watch this person sit down. There’s a video. Right?

Speaker: 3
02:11:51

See that video with the white shirt on, the second one? Yeah.

Speaker: 0
02:11:58

Let’s see.

Speaker: 3
02:11:58

But watch this. This is it. I’ve never seen a woman sit down like this. Go full screen.

Speaker: 0
02:12:04

Is this her when she was a teacher?

Speaker: 3
02:12:05

Watch this. Watch how she sits down.

Speaker: 0
02:12:12

Bro. That is a very middle-aged man, the way she sits.

Speaker: 3
02:12:16

Bro. Bro. Watch this again. Watch this person sit down. That’s how a dude sits. That’s the way the legs are spread apart. That’s like a dude’s hips.

Speaker: 4
02:12:30

What about all the rest of these, though?

Speaker: 0
02:12:33

There is What about all

Speaker: 3
02:12:33

the rest of them?

Speaker: 6
02:12:34

There is

Speaker: 0
02:12:34

like ai

Speaker: 4
02:12:34

rest of these examples of Oh,

Speaker: 3
02:12:35

well, that’s how liberal men sit. That’s what they do. They throw their leg over. That’s the guys with very small legs.

Speaker: 0
02:12:44

It’s so funny.

Speaker: 3
02:12:45

But Ari sits like that, actually. Because he’s got them weird long legs. But, like, you can sit like that. It’s the way you sit down.

Speaker: 0
02:12:53

It’s the initial it’s the initial get down.

Speaker: 3
02:12:54

It’s the plop. It’s the plop and then the legs spread. Women don’t sit like that. When was the last time you saw a woman plop down with her legs spread

Speaker: 2
02:13:02

like that?

Speaker: 3
02:13:03

They don’t sit like that.

Speaker: 6
02:13:04

They don’t.

Speaker: 0
02:13:04

They don’t.

Speaker: 3
02:13:04

They just sit down. No. Their legs, like, normally naturally angle inward more. And they sit like this, and they sit with their legs together. Like, you might have a similar pose, but you don’t plop like that. Like, that’s a dude pose.

Speaker: 0
02:13:18

That is. That is. Or maybe that’s how pedophile women sit. There’s also Maybe a

Speaker: 3
02:13:22

woman wants to be a man.

Speaker: 6
02:13:23

Yeah. Right?

Speaker: 0
02:13:25

There’s also that.

Speaker: 3
02:13:26

Let’s find one of those and see how they sit. Any examples? Yeah. It’s odd. The story’s very odd. But it also just shows you how fucking weird these people are. You’re watching that person smack Macron

Speaker: 2
02:13:38

Mhmm.

Speaker: 3
02:13:39

Right in the head and then walk downstairs with him like, what?

Speaker: 2
02:13:41

What could

Speaker: 0
02:13:42

they have possibly been talking about?

Speaker: 3
02:13:44

Yeah. Your dick is showing. Shut the fuck up.

Speaker: 0
02:13:48

Yeah. No. Like, it was a bulge.

Speaker: 3
02:13:52

If he

Speaker: 0
02:13:52

was a man, they would fight. They would square up. Well, maybe

Speaker: 3
02:13:55

they will when they get back home. Maybe that’s the fun part. Maybe they’re like, fucking fuck you, and fuck you, and then they get it on. Man, after it. Goddamn.

Speaker: 0
02:14:04

Yeah. That is just oh, that whole relationship is really wild. That has to be one of the more wild politician relationships.

Speaker: 3
02:14:13

Wild.

Speaker: 0
02:14:14

Out there.

Speaker: 3
02:14:14

If that was the lead of a sitcom, you’d be like, what the fuck? Mhmm. That’s like, what is going on? But the fact that it’s the president of France What?

Speaker: 0
02:14:25

Yeah. Getting slapped? There’s no way you don’t go. Well, he can’t win reelection anymore. But that would have hurt his campaign so hard.

Speaker: 3
02:14:32

Bro, he’s fixing to bomb somebody to cover this up. He’s gonna, like, arm some rebels.

Speaker: 0
02:14:41

Yeah. It’s time to fuck up Algeria or some, like, whatever France does. Yeah. Now you’re talking.

Speaker: 3
02:14:46

Yeah. You know, like, remember when Clinton when the Monica Lewinsky scandal came out and

Speaker: 0
02:14:51

Oh, did they bomb Kosovo? What the fuck, man? It’s so on the nose. Yeah. Yeah.

Speaker: 3
02:15:00

They do shit like, oh my god. You guys aren’t even trying. You’re not even trying to be slick.

Speaker: 0
02:15:04

No. No. This is like, get the attention off of me. Crazy. Yeah. Oh, man. When Columbine happened, he must have been, like, thank god, to get this off of me.

Speaker: 3
02:15:13

Oh, you remember Gary Condit?

Speaker: 0
02:15:15

Yeah. Well, he murdered that lady. Right? Allegedly.

Speaker: 3
02:15:17

Allegedly. Okay. Nine eleven happened.

Speaker: 2
02:15:19

Mhmm. He

Speaker: 3
02:15:20

forgot about it. Yeah. Everybody’s like, listen.

Speaker: 2
02:15:22

We have

Speaker: 3
02:15:22

bigger fish to fry.

Speaker: 0
02:15:25

Save you. Save

Speaker: 3
02:15:26

us. I wonder how accurate House of Cards is. I wonder. I wonder how accurate it is.

Speaker: 0
02:15:34

I mean, outside of him physically killing the people, which he does sometimes, I would say it’s probably super accurate.

Speaker: 3
02:15:41

Probably pretty accurate.

Speaker: 0
02:15:43

Yeah. It’s it’s a lot of dealings, lot of, like

Speaker: 3
02:15:47

Such a good show.

Speaker: 0
02:15:48

Yeah. It’s so unfortunate. Well, you know. Spacey is always a great villain. Oh. He’s always been a great villain. The best. He might be the best movie villain.

Speaker: 3
02:15:59

Oh, he was so good in that one though too because he was so charming and weird. But it was also similar to, like, Tony Soprano. Like, you wanted him to succeed.

Speaker: 0
02:16:07

Yeah. You wanna be like, oh, can he get to president? Yeah. Can he does he have what it takes to, like

Speaker: 3
02:16:11

Like, we were hoping he wins. Mhmm. Right? Like, when you’re watching the show, you’re like, god, I hope he’s president.

Speaker: 0
02:16:16

Right. Yeah. Exactly. You were, like, watching to be like, oh, how does he do it? Right.

Speaker: 6
02:16:21

How does he do it?

Speaker: 3
02:16:22

Because he’s

Speaker: 0
02:16:22

kind of in the beginning, he’s, like, kinda out, and it’s, like, how does he get back in?

Speaker: 3
02:16:27

Remember he has that threesome. It was, like, his security guard. Is that what it was? And his wife, the three of them get down?

Speaker: 0
02:16:32

Oh, yeah.

Speaker: 3
02:16:35

Get down. Get down. Yeah. I wonder how much of that freak shit goes on behind closed doors. Because I definitely think when you’re a bottled-up person like that, like, you have to be any sort of, like, professional person, politician, publicly professional and ethical, and you

Speaker: 2
02:16:51

wear

Speaker: 3
02:16:51

a suit and tie, you can’t wait to fuck a dick. I can’t wait to get freaky.

Speaker: 0
02:16:55

Didn’t Madison Cawthorn sort of get, like, thrown out of the Republican party for kind of being, like, hey, they, like, have a lot of orgies and shit? Really? I’m pretty sure because the

Speaker: 3
02:17:06

What year was this?

Speaker: 0
02:17:07

This was well, when did Cawthorn get in, like, 2016? This was, like, pretty quickly, because he was a rising star in the party. Really? Yeah. Yeah. I’m pretty sure he said something.

Speaker: 3
02:17:16

More in deep, Ehsan, than me on politics.

Speaker: 0
02:17:19

Well, you’re deep,

Speaker: 3
02:17:20

this fella.

Speaker: 0
02:17:21

Yeah. Yeah. Yeah. Orgies and drugs. Yeah.

Speaker: 3
02:17:22

Yeah. Woah. Ultra-conservative group chair says he also wants to speak with the North Carolina Republican about his salacious claims concerning his colleagues. Mhmm.

Speaker: 0
02:17:34

Okay. Let’s see. Perry, is this Cawthorn’s claims? What were his claims, though?

Speaker: 3
02:17:40

So Mhmm. What does he say?

Speaker: 0
02:17:44

That’s whether they’d reconsider Cawthorn’s membership in the group. Yeah. He basically said that

Speaker: 3
02:17:49

clear that he has evidence of taking part in group sex and drug use, Perry wouldn’t say either way: We will discuss that when we get to it. Yeah. But I think when asked whether they would reconsider Cawthorn’s membership in the group, if he didn’t make clear whom he has evidence about.

Speaker: 3
02:18:05

So Cawthorn was saying someone has evidence, that he has evidence of some people taking part in group sex, not him. Right?

Speaker: 0
02:18:13

No. I think Cawthorn was saying something to the effect of, to move up in this world or whatever, you have to take part in the drugs and the group sex. And then them being like, well, show us the evidence, and then Cawthorn being like, oh, I might have I definitely.

Speaker: 3
02:18:24

Okay. Yes. He claims he was invited to an orgy in Washington. Right. But no one was saying that he did. I thought you were saying that he was wrapped up in it. So this is what killed him, that he was saying this about people, and then

Speaker: 0
02:18:37

they killed his career. And the Freedom Caucus was like, uh-uh. No more. Really? Yeah. Because I remember his own party took him out. Go to

Speaker: 3
02:18:45

the party. Jamie? It says the sexual perversion that goes on in Washington, being kind of a young guy in Washington, where the average age is probably 60 or 70. Look at these people. A lot of them I looked up to through my life. I always paid attention to politics, and all of a sudden you get invited.

Speaker: 3
02:19:02

We’re going to have a sexual get-together at one of our homes. You should come. What did you just ask me to come to? And then you realize they’re asking you to come to an orgy. Some of the people leading the movement to try and remove addiction in our country, and then you watch them do a key bump of cocaine right in front of you.

Speaker: 3
02:19:22

And it’s like, this is wild. And that says what? It’s CNN.com.

Speaker: 6
02:19:31

C n

Speaker: 3
02:19:31

N. CNN says what? What does it say after that? Leave it back up. It’s not clear to me whether Cawthorn is suggesting that members of Congress have invited him to orgies or just other people in Washington. Although, after listening to his comments several times, it seems to be the former. Ditto his allegations of seeing people in Washington doing cocaine.

Speaker: 3
02:19:53

Well, they sound like cocaine people. Okay. Clearly cocaine people. If you wanna be the president, you wanna have all the power, and you wanna have all the money, and you’re deeply involved in corruption, that’s cocaine people. Mhmm.

Speaker: 0
02:20:05

Yeah. I mean, no one’s shocked. He just said the quiet part out loud. Yeah. But he was supposed to be, like, the guy at one point. I remember. Yeah. He was, like, supposed to be the guy.

Speaker: 3
02:20:14

Has your husband ever made any investments based on

Speaker: 2
02:20:18

decisions he

Speaker: 0
02:20:19

L and S.

Speaker: 2
02:20:25

That’s they’re all

Speaker: 3
02:20:25

These are go game people.

Speaker: 2
02:20:26

That

Speaker: 4
02:20:27

was the first part of the problems he had. He had some other issues.

Speaker: 3
02:20:29

So what were the other issues? He got arrested for

Speaker: 4
02:20:32

gun I think bringing a gun through an airport a couple times. Oh, Jesus. Driving with a revoked license.

Speaker: 0
02:20:37

Oh, Jesus. Sexual misconduct allegations when he was in college. Isn’t he in a wheelchair?

Speaker: 4
02:20:41

Yeah. I don’t know exactly when the wheelchair happened. So it was either before or after, but

Speaker: 2
02:20:46

I was

Speaker: 0
02:20:46

because I was gonna say, oh, yeah. And then he took, like, goofy vacation photos where I think he, like, dressed, like, during this game on a cruise, like, dressed like a woman or something like that. It was something like that. Yeah. But all of this popped up after his

Speaker: 3
02:21:00

It’s zoomed in lingerie. So I

Speaker: 4
02:21:02

don’t know if it’s just the orgy comments that led to everything.

Speaker: 0
02:21:07

Alright. He’s also clearly looks like he’s in a wheelchair here too.

Speaker: 3
02:21:10

He is in a wheelchair there. Yeah.

Speaker: 2
02:21:11

Yeah.

Speaker: 3
02:21:11

Yeah. He’s probably having a good time with the ladies. Yeah. He’s, like, getting a little crazy.

Speaker: 0
02:21:16

Yeah. But all that came up after he made those comments, and it was very clear at the time, like, oh, they’re, like, saying we’re done with this guy. We’re gonna throw him to the wolves.

Speaker: 3
02:21:23

Yeah. And probably, if you wanna be that guy, they have to have some stuff on you. Like, how can they count on you to play this game?

Speaker: 0
02:21:30

Right. Right. And they were like, oh, you think we don’t have anything on you? We’ll just show you dressing like a woman, and your constituents will dry up like that.

Speaker: 3
02:21:36

You sound like a conspiracy theorist. So silly. But, boy, did they do such a good job in the sixties, after the Kennedy assassination, of putting that word out there for fools and foolish people. They did such a great job. Mhmm. They really did.

Speaker: 0
02:21:56

Well, also, they had higher control of the media at the time. Right? So if there’s only three places where you can get your news, you can be like, well, anyone outside of it is kinda crazy. Now you can get your news from anyone and be like, oh, okay. There’s something to this. Yeah. Yeah.

Speaker: 3
02:22:08

There’s something to this. Do you remember those commercials that happened right after nine eleven? There were these anti-drug commercials where a guy was saying that if you smoke pot, you’re supporting terrorism. He’s like, why do you say that? Well, it’s a fact. He just says it like he’s eating a salad, like a no-nonsense guy at a steakhouse eating a salad, because it’s a fact.

Speaker: 3
02:22:30

Like, in this condescending way, and you imagine yourself being confronted by such an accusation, like, oh my god, if I smoke pot, I’m supporting terrorism.

Speaker: 2
02:22:38

Right. You

Speaker: 3
02:22:39

ever see those videos? No. It was a public service announcement video. It’s, like, one of those things it was a propaganda video that they put on television. So while you’re watching a television show this is, like, right after the height of everybody freaking out about terrorism. So they used this as an anti-drug message.

Speaker: 3
02:22:56

Drug money funds terrorism and terrorists, like, scroll back so you get that from the beginning.

Speaker: 4
02:23:01

It’s a ploy. What? This drug money funds terror. It’s a ploy.

Speaker: 3
02:23:06

A ploy.

Speaker: 4
02:23:07

A manipulation.

Speaker: 3
02:23:09

Ploy.

Speaker: 4
02:23:10

Drug money funds terror.

Speaker: 2
02:23:11

I mean,

Speaker: 4
02:23:11

why should I believe that? Because it’s a fact. A fact.

Speaker: 3
02:23:15

FACT fact.

Speaker: 4
02:23:17

So you’re saying that I I should believe it because it’s true. That’s that’s your argument.

Speaker: 3
02:23:22

It is true. That’s the argument.

Speaker: 0
02:23:28

I know, dude. Also, what kind of dumbass is that guy to

Speaker: 6
02:23:30

be like, oh, maybe he’s right.

Speaker: 3
02:23:31

Well, that guy sounds like your average bro.

Speaker: 0
02:23:35

Right.

Speaker: 3
02:23:35

It’s like at a bar. You know? Mhmm. This is what I heard. I heard the government’s hiding the aliens. You know what I mean? That’s like the average guy.

Speaker: 6
02:23:42

Right.

Speaker: 3
02:23:43

And then he’s a guy with glasses,

Speaker: 2
02:23:45

who’s

Speaker: 3
02:23:45

eating his salads, not tolerating your bullshit, because it’s a fact, F-A-C-T, fact.

Speaker: 0
02:23:49

Right.

Speaker: 3
02:23:50

Oh, as long as you have all the data that you could show me. Oh, no data? No. You got no data?

Speaker: 0
02:23:55

Just accept it.

Speaker: 3
02:23:56

Well, he’s kind of right. Because if you do buy heroin, you are supporting the Taliban, because we were guarding their poppy fields.

Speaker: 0
02:24:06

Yeah. Yeah. I think we when we talked about this earlier, I think the Taliban, as, like, a way to get out of it, burned all their poppy fields. I think that Did

Speaker: 2
02:24:15

they? Yeah.

Speaker: 0
02:24:16

I think that’s why they’re trying to get more tourists to come. And there’s a bunch of, like, bro travel TikToks in Afghanistan. They’re, like, yeah, me and the bros are going to Afghanistan, and they’re, like, chilling with the Taliban. It’s like these white guys from Britain.

Speaker: 3
02:24:30

Boy, you gotta be a bold person to take that trip. That’s an early adopter.

Speaker: 0
02:24:35

But, you know, also, if you’re from England, you’re kind of already getting used to being around Muslim extremists.

Speaker: 6
02:24:42

So I think it’s, like, more of a lateral move.

Speaker: 3
02:24:46

Bro.

Speaker: 0
02:24:46

I wish that’s something at least, like I could be wrong about this. But, like, I wish that was something that was a little more vocal. I’m not, like, a very big Muslim guy. I’m not really that practicing or that religious, but the sort of, the brand of Islam that’s coming to Europe right now is, like, really scary for me.

Speaker: 0
02:25:12

Like, I don’t want that. Like, that’s not there’s a lot of, like, Western Muslims that, like, probably wouldn’t vibe with what’s going on over there.

Speaker: 3
02:25:20

Mhmm.

Speaker: 0
02:25:20

And it’s just this very interesting thing of, like, how do we curb that in our community, of being, like, hey, we shouldn’t accept this. Like, I remember when the Charlie Hebdo attacks happened, a lot of people would be like, well, that’s what happens when you draw Mohammed or whatever.

Speaker: 0
02:25:37

It’s like, that shouldn’t be our reaction to this.

Speaker: 3
02:25:40

Right.

Speaker: 0
02:25:40

It should be, like, live and let live. That’s what I admire about the Christians these days, that you can make fun of Jesus and no one’s gonna kill you.

Speaker: 3
02:25:47

Right.

Speaker: 0
02:25:47

Like, there’s sort of this westernization that kinda needs to happen, and it doesn’t look like it’s happening over there in the way that it’s kinda happening here.

Speaker: 3
02:25:55

Look what’s going on in Toronto, where Ontario made it legal to have polygamy. Right. Yeah. That’s why. Right. It’s like Dudes want multiple wives.

Speaker: 2
02:26:04

Come on.

Speaker: 3
02:26:04

It’s in the Quran.

Speaker: 0
02:26:06

You know, it is. It is. Yeah. Yeah. You could have multiple. You gotta treat them all equally, which is what they totally do.

Speaker: 3
02:26:12

It’s a great way to keep people recruited. Mhmm. You know?

Speaker: 0
02:26:15

Yeah. Yeah.

Speaker: 3
02:26:16

Yeah. I mean, you know, imagine if the Christians said, listen, we’re open to new ideas. I think maybe the Mormons had a point. Because the Mormons, that’s how they got it.

Speaker: 0
02:26:26

Right.

Speaker: 3
02:26:26

Right?

Speaker: 2
02:26:27

Right.

Speaker: 0
02:26:27

That’s the

Speaker: 3
02:26:27

whole reason why they went to Mexico, because the United States said no to polygamy. Right. You know? Mhmm. That’s the whole thing about, what’s his face from Massachusetts?

Speaker: 4
02:26:37

Mitt Romney.

Speaker: 2
02:26:37

That’s

Speaker: 3
02:26:37

right. Mitt Romney. Mitt Romney’s dad was actually born in Mexico. Really? Yeah. Their family came from there’s these giant colonies of Mormons that live in Mexico and duke it out with the cartels. Never seen that?

Speaker: 0
02:26:50

No. Dude, there was a shootout. Mormons versus the cartels? Yes. Great movie title.

Speaker: 3
02:26:55

There was a big problem a few years back because a few people, I think a woman and a child and a couple other people, got murdered by the cartels. And it became, like, a giant issue. So they set these compounds up in, like, the eighteen hundreds, when the Mormons were not allowed to be polygamous here in America.

Speaker: 3
02:27:19

So they just said, well, who cares? Back then, there were no cars. Mexico is just as good as living in America. Right. We’ll just go over there.

Speaker: 0
02:27:25

Right.

Speaker: 3
02:27:25

But then, you know, America blossomed. Mexico kinda stayed, you know. And now you’re like, hey, you got a cartel problem

Speaker: 0
02:27:32

now. Like,

Speaker: 3
02:27:34

So they’re armed. Damn. Are they still there? Yes. Damn. Yeah. I think there’s two large groups of Mormons that live in, like, these fenced-off communities. It’s, like, really kinda sketchy.

Speaker: 0
02:27:48

Yeah. It’s kinda They

Speaker: 3
02:27:49

have compounds.

Speaker: 0
02:27:50

Yeah. Well, the Mormons, I think, because they’re young too.

Speaker: 3
02:27:54

So they have to wait. Safety. Yeah. Extra pussy.

Speaker: 0
02:27:58

The extra pussy is worth a lot. I’ll say this. I remember the first time I went to Salt Lake City in 2019, and we were walking around, and we were near the Mormon Temple. And the most beautiful women came up to us and were like,

Speaker: 3
02:28:08

oh, why don’t you and

Speaker: 0
02:28:09

I was like, that’s how they get you, dude. Dude, Salt Lake City is, like it’s just tens marrying twos,

Speaker: 6
02:28:15

and I’m like, that’s how you get more that’s how

Speaker: 0
02:28:17

you get the Mormon guys. Oh, dude. In that split second, I was like, I want to give them a shot right now. I remember this beautiful woman from Colombia. Like, they got converted on a mission trip, and now they’re here. And, like, look at these exotic women that are available to you if you’re a Mormon. And you can marry one, like, tomorrow. Wow.

Speaker: 0
02:28:37

You can marry one because they all get married young and quick, because the whole point is to have babies. So I’ve always told you, Mormons know how to recruit. They can get you. They throw the pussy your way.

Speaker: 3
02:28:45

They’re also the nicest people. Like, I’m you very rarely meet a mean Mormon.

Speaker: 0
02:28:49

Well, it’s like, you know, The Book of Mormon comes out, and what do they do? Do they get mad? Do they fucking kill people? No. They stand outside Mhmm. And hand people pamphlets about Mormonism. This whole play talking about how Mormonism is, like, totally bullshit. They’re like, we might be able to get somebody.

Speaker: 3
02:29:05

Yeah. They actually took out a full page ad in the Playbill.

Speaker: 0
02:29:08

Right. Exactly. Exactly. That’s a great way to deal with criticism of your religion. I think, from my perspective on Islam, it just needs to handle that better. Mhmm. It just doesn’t handle that well at all. I mean, I think Salman Rushdie was stabbed by a guy who was born and raised in New Jersey. That’s crazy. Yeah. That’s crazy.

Speaker: 0
02:29:26

He should be safe here. Right. Yeah. And, also, I’m mad at the whole thing because it made me read that terrible book.

Speaker: 3
02:29:33

Was it a bad book?

Speaker: 0
02:29:34

It wasn’t that good. I would not have read it if there wasn’t anything around it. Probably sold

Speaker: 3
02:29:39

a lot more copies because of that.

Speaker: 0
02:29:41

It’s like

Speaker: 2
02:29:41

I made

Speaker: 3
02:29:41

that dude rich as fuck.

Speaker: 0
02:29:42

Oh, yeah. Yeah. Crazy. It’s like one of those I get it if you’re, like, artsy. You know, it’s like a novel for writers almost. That’s how I felt reading it.

Speaker: 2
02:29:50

Oh,

Speaker: 0
02:29:50

interesting. Yeah. Yeah. It’s like interesting. I didn’t like it, but I only read it because of everything around it.

Speaker: 3
02:29:58

Did you see American Primeval? No. It’s great. But it talks about Brigham Young and the Mormons establishing themselves in Utah, and gangster shit, like murders.

Speaker: 0
02:30:09

Oh, yeah. Yeah.

Speaker: 3
02:30:10

Like, you don’t realize, like, what a gangster Brigham Young was. You’re like, holy shit. Is this all accurate? Mhmm. And it’s accurate. Peter Berg made it. It’s really good.

Speaker: 0
02:30:18

Yeah. Wild show, dude. Imagine leading a people against the American government and setting up your own place. You have to be a bad motherfucker to do that.

Speaker: 3
02:30:28

And then the wildest ones went to Mexico. Yeah. Fuck it. We’re gonna flee the whole country.

Speaker: 0
02:30:33

Yeah. This yeah. The the Mormons really, really, like they fought to survive.

Speaker: 3
02:30:39

But it’s what we were talking about earlier. It’s like human beings have a bunch of different ways where they can adapt to whatever the group is doing.

Speaker: 0
02:30:47

There’s a

Speaker: 3
02:30:47

bunch of we’re, like, really, really malleable. You know, we’re easily influenced. We adjust to whatever the environment is. We adapt, you know. And then if you’re a Mormon woman, you’re like, I guess I’m sharing this motherfucker with eight other ladies. Right. It’s what you do.

Speaker: 3
02:31:05

You’re out there washing fucking sheets and shit.

Speaker: 0
02:31:08

You can probably convince yourself you’re happy about it. You’re probably not like that. You’re probably like, we’re doing this for God. This is what God wants. We all got our own planet.

Speaker: 3
02:31:15

We were in a rest stop once. Ari and I were on the road doing stand-up, and we pulled into this place to get gas, and we were walking around this, like, rest stop, one of those supermarket things. And these ladies came in. I think they were Mennonites. And Ari was like, what group are you in?

Speaker: 2
02:31:35

What

Speaker: 3
02:31:35

do you guys do? What’s this all about? And they, like, did not know how to talk to him. They look so awkward.

Speaker: 0
02:31:40

Oh, yeah. Because they’re only allowed to talk to the one man in their life. Yeah.

Speaker: 3
02:31:43

Yeah. Pretty sure he was stoned. Yeah. He definitely was stoned. But it was really funny. I was like, well, what do you guys do? What’s going on here? And then I was like, these people are living I mean, we’re dealing with, like, you know, 2005 or some shit like that.

Speaker: 3
02:32:01

Mhmm. You’re dealing with these people that are they’re from another time. It’s another time period. Like, they’re dressed like they they’re literally like pioneers. They they look like colonists.

Speaker: 3
02:32:10

Like, they have, like, old-timey eighteen hundreds clothes on. You ever see, like, how Mennonites dress?

Speaker: 0
02:32:16

Oh, yeah. Yeah. Yeah. Weird. Yeah. Yeah. They’re at a

Speaker: 3
02:32:18

gas station, you know, in front of the fucking Popeye’s chicken.

Speaker: 2
02:32:22

We’re like,

Speaker: 3
02:32:23

this is so weird. And there’s a group of them out there. It’s like some fucking shack, some house, compound. I believe this was I think we were in Massachusetts when this happened, or maybe New Hampshire.

Speaker: 2
02:32:35

Because

Speaker: 0
02:32:35

I know they’re out here. Isn’t the Mennonite population where the measles is in Texas? Like, do you know? I think it’s the Mennonites.

Speaker: 3
02:32:42

Is that it? Yeah.

Speaker: 0
02:32:43

Yeah. That’s, like, the overwhelming majority of the people, like, because the measles is, like, rampant in Texas. So they’re running with that for a while, and it’s, like, basically in this one community, from what I remember.

Speaker: 3
02:32:55

Yeah. Interesting. It’s just weird when they can get people to dress up, you know. You know, like Wild Wild Country, when everybody’s wearing the robes.

Speaker: 0
02:33:03

But that’s how you know who’s bought in. Right? So it’s, like, okay. These are the people that I can control.

Speaker: 3
02:33:08

Yeah. Put that rainbow t shirt on.

Speaker: 0
02:33:10

You’re in

Speaker: 3
02:33:10

a group. Yeah. Hang your flag. Yeah. Yeah. You gotta show put the

Speaker: 0
02:33:15

pro put the pronouns in the bio.

Speaker: 3
02:33:17

Where’s your Black Lives Matter sign? It’s like, it’s human beings just wanna become a part of a group, man.

Speaker: 0
02:33:23

Yeah. You know, it’s like finding your community is, like, so huge. Huge. And I think the big issue is when you find it online Yeah. Because then it becomes, like, this weird parasocial thing. And it’s like, you gotta find your, like, group. I mean, that’s why the good part about religion that I really like is, like, if it’s done right, it’s like a loving community.

Speaker: 0
02:33:48

And it’s like a Yeah. Like, you know what I mean? Like, it’s a support system. It’s like Yes.

Speaker: 3
02:33:53

Very friendly Yes.

Speaker: 0
02:33:54

And it’s a warm environment.

Speaker: 3
02:33:55

Everybody goes there with the same purpose.

Speaker: 6
02:33:56

Mhmm.

Speaker: 3
02:33:56

This Rebecca Lemov lady that I told you about, the expert in mind control, one of the things she talked about was the dangers of echo chambers. You get in these echo chambers online

Speaker: 2
02:34:06

Mhmm.

Speaker: 6
02:34:06

And, you

Speaker: 3
02:34:07

know, everybody says the same thing, thinks the same thing, and then and then all of a sudden, you’re locked into this way of thinking. Right.

Speaker: 0
02:34:13

It’s like

Speaker: 3
02:34:13

it’s online, like, really dangerous, where people just sort of have everyone sort of reinforcing all these ideas. You never get any outside information.

Speaker: 0
02:34:22

Right.

Speaker: 3
02:34:23

Only exist in this echo chamber.

Speaker: 0
02:34:25

And then that’s how you become your politics become your personality.

Speaker: 3
02:34:28

Yep. Your whole life.

Speaker: 0
02:34:29

Yeah. And it’s such a crazy thing. It’s just so new. Because I wasn’t aware in 1996, but I will very confidently say people weren’t, like, either you like Bob Dole or you get the fuck out of my house.

Speaker: 3
02:34:43

No. It was no big deal. Yeah. Yeah. When I was a kid Mhmm. When, you know, politics was on just on television and in the newspapers Mhmm. Nobody gave a fuck who you supported. They didn’t care. Maybe they thought you’re an idiot because you’re gonna vote for that guy.

Speaker: 4
02:34:59

Like, that

Speaker: 3
02:34:59

guy’s a moron. You’re crazy.

Speaker: 2
02:35:01

Right.

Speaker: 3
02:35:01

But there was no, like, we couldn’t talk at the dinner table. We couldn’t, you know, it became everything, every part of your identity

Speaker: 2
02:35:09

Right.

Speaker: 3
02:35:10

To fight against this. We are fighting against fascism by using fascism.

Speaker: 6
02:35:16

Yeah. Yeah. Yeah.

Speaker: 2
02:35:18

We’re gonna stop

Speaker: 3
02:35:19

the election. We’re gonna remove people from social media. We’re gonna shut down these voices.

Speaker: 0
02:35:24

And then we’ll give you the candidate to vote for.

Speaker: 3
02:35:26

Yes. Yeah. To preserve democracy.

Speaker: 0
02:35:28

Yeah. But it is, like yeah. It’s so annoying, especially, you know. So I’m an Austin comic now, and people will be like, oh, so you do comedy. So you must be, like, an alt-right comic. Because it’s like, no. I just do comedy.

Speaker: 0
02:35:42

Like, you know, like, there’s a lot of there’s a lot of stage time in the city, and that’s sort of the whole point of the whole exercise is to get up on stage and

Speaker: 3
02:35:49

Bro, how many times do we, like, duke it out with Ron White in the Green Room

Speaker: 6
02:35:52

over politics in

Speaker: 3
02:35:54

the most friendly, hilarious way? Right. Right. Tony and I are roasting him.

Speaker: 0
02:35:58

Well, you know what’s funny is that, ironically, online, for whatever reason, Ron White’s face comes up a lot in conservative memes.

Speaker: 3
02:36:06

Well, because conservatives love him.

Speaker: 0
02:36:08

I know. And it’s so funny to me every time. Every time. You’re like, wow. Y’all really think Ron White’s, like, a deep-red conservative. That’s so funny. Crazy.

Speaker: 3
02:36:16

It’s so funny because you get to know him. He’s the most liberal amongst us.

Speaker: 0
02:36:19

Yeah. He’s the most liberal guy, like.

Speaker: 3
02:36:21

He’s first and then Brian Simpson too.

Speaker: 6
02:36:22

Yeah. Yeah. Yeah. Yeah.

Speaker: 3
02:36:24

But Brian Simpson, it makes sense to me because, you know, used a lot of social services,

Speaker: 2
02:36:29

you

Speaker: 3
02:36:29

know, had, like, a rough stretch as a child. Mhmm. That’s me too. That’s, like, not as bad as him, but the same kind of reasoning for, like, social safety nets are important to keep people fed, you know?

Speaker: 0
02:36:40

Yeah. Yeah. It’s, like, that’s super important, man. You know, like, give people access to medicine. That’s very important.

Speaker: 3
02:36:46

Sometimes people are poor, and sometimes people get sick when they’re poor. Mhmm. And the fact that that shit could bankrupt you for your whole life. If you break a leg, you’re bankrupt for your whole life.

Speaker: 0
02:36:56

That could bankrupt you if you’re middle class, forget poor.

Speaker: 3
02:36:59

Yes.

Speaker: 0
02:36:59

Yes. The price of the price of Oh

Speaker: 3
02:37:01

my god.

Speaker: 0
02:37:01

Medicine in, like, in this country is absolute insanity.

Speaker: 3
02:37:06

Right. Yeah. And the fact that people don’t agree on that, or the, like, the fucking education thing that we bring up ad nauseam. The fact that those are the only loans you can never get out of with bankruptcy, that’s crazy. Right. And you get them when you’re 18? Are you out of your fucking mind? Like, that’s so predatory.

Speaker: 0
02:37:24

Super predatory. Super predatory. Yeah.

Speaker: 3
02:37:27

Take a kid and give him a credit card with a 39% interest rate, like,

Speaker: 0
02:37:31

what do you think? And then you told them, like, if you go to college, like, it’s gonna be better for you on the other end, and it’s, like, that’s not true at all?

Speaker: 3
02:37:39

No. Not true at

Speaker: 2
02:37:40

all.

Speaker: 0
02:37:40

Yeah. Especially if you, like, got a just a degree in, like, something that doesn’t pay lucratively.

Speaker: 3
02:37:44

I remember when I first started becoming successful as a comedian, where I was actually making a living as a comedian. And I had friends that did the whole college thing and got jobs, and they were fucking miserable because we were both in our twenties. Mhmm. And they were out there in the workforce, just fucking tired all the time. And they were upset.

Speaker: 3
02:38:04

They were upset that I didn’t do that, and yet I’m making money, I’m traveling around, I’m having a good time, hanging out with my friends. I’ve got no one telling me what to do. Mhmm. Write my own material, book my own flights, no boss. Yeah. And you could be like, fuck.

Speaker: 3
02:38:20

You know, this is not you’re supposed to be a loser. Mhmm. I did the right thing. Dude,

Speaker: 0
02:38:27

have you ever seen the Jim Carrey commencement speech, or, like, graduation speech he gave at a college

Speaker: 3
02:38:33

once? No.

Speaker: 0
02:38:34

Dude, so I started comedy my third year of college. So my fourth year, I’m, like, weighing out what do I wanna do with my life. Do I wanna go into, like, education, grad school, med school, whatever, or do I wanna do this thing that I think I love?

Speaker: 0
02:38:47

And my friend showed me the speech because he knew I was going through this, and he was like, listen to this. And it’s Jim Carrey talking about how you can fail at what’s safe.

Speaker: 3
02:38:59

That’s true too.

Speaker: 0
02:38:59

Like, the route you’re supposed to take Yeah. Doesn’t mean it’s gonna lead to success. So if you’re gonna fail anyway, might as well just fail at what you wanna do because at least you will have done that. Yeah.

Speaker: 3
02:39:12

And then

Speaker: 0
02:39:12

Because if you failed the safe way, then you will always be like, fuck, I had this other thing I could have done. Yeah. And where could that have led me? Yeah. Yeah. And now, you know

Speaker: 3
02:39:24

So true. But, however, the problem with that is, like, you’ve met open micers that are out of their fucking minds, and they’re not going it’s not gonna happen.

Speaker: 0
02:39:33

Right.

Speaker: 3
02:39:34

No one wants to listen to you say anything ever. Right. You shut the fuck up. Like, it doesn’t work. But

Speaker: 0
02:39:39

Yeah. Well, that’s like

Speaker: 3
02:39:40

I saw that Jim Carrey speech, and I knew I just had to stay on the path. Yeah. You got any advice for me, Ehsan? Yeah.

Speaker: 0
02:39:46

Well, you can’t give that, but that’s the Mitzi quote: don’t encourage mediocre talent. You can’t be like, no, keep going. You gotta find the nicest way to be like, hey, there’s other stuff.

Speaker: 3
02:39:55

Well, that’s the weird one when people like that give they want to ask advice. What do you think I should do?

Speaker: 2
02:40:03

Like,

Speaker: 6
02:40:03

what? What I really think

Speaker: 3
02:40:06

you should do or what do you want me to say? What do you want me to say? Yeah. You want me to give you the secret word? Abracadabra.

Speaker: 0
02:40:12

Yeah. Oh, this is how you

Speaker: 3
02:40:12

You have talent. Abracadabra.

Speaker: 6
02:40:14

This is how

Speaker: 0
02:40:14

you make it. This is how you write a joke. Yeah. He said it.

Speaker: 3
02:40:16

This is what you gotta do. I’ll hold your

Speaker: 6
02:40:18

hand. Yeah.

Speaker: 3
02:40:18

You know what really happens with some guys? You get a hot girlfriend that is a comic.

Speaker: 2
02:40:23

Mhmm. And

Speaker: 3
02:40:23

then you’re a really good comic and she’s terrible, and so you start writing her act. I’ve seen that happen a few times. Mhmm. Wonder if it happens the other way. Like, you got a really good female comic and she starts dating some guy who sucks. She goes, listen, if you’re gonna date me, let me help you with your fucking material.

Speaker: 0
02:40:39

I don’t know if female comics date down like that very

Speaker: 3
02:40:41

They usually don’t.

Speaker: 0
02:40:42

Yeah. They usually don’t.

Speaker: 3
02:40:43

What is that called? Hypergamy? Yeah. Because if

Speaker: 0
02:40:45

you’re if you’re a great female comic, like, the level of guy that’s available to you Yeah.

Speaker: 3
02:40:50

Is like Ryan Reynolds.

Speaker: 6
02:40:51

Yeah. Yeah. Yeah. It’s like yeah. Yeah. Yeah. It’s like, what are

Speaker: 0
02:40:55

you doing with an open micer? It’s like Right. Right.

Speaker: 3
02:40:58

Yeah. Yeah. But if you’re, like, a headliner who does theaters, you could totally have an open micer as a girlfriend.

Speaker: 0
02:41:03

Yeah. Yeah. Yeah. Yeah. And that’s totally, like, that’s totally hot. Yeah. That’s totally cool. Yeah. Yeah. But, like, who’s doing theaters right now? If it’s, like, Ali Wong or whatever Right. and she’s dating an open micer, it’s gonna be like, yo, what happened?

Speaker: 3
02:41:18

Right.

Speaker: 0
02:41:18

Right. Something went wrong Right.

Speaker: 2
02:41:21

For her

Speaker: 0
02:41:22

to date an open micer.

Speaker: 3
02:41:22

If she’s dating an open micer, you’re like, what are you doing? Are you crazy?

Speaker: 0
02:41:25

Yeah.

Speaker: 3
02:41:26

Show me what pills you’re on.

Speaker: 0
02:41:27

Yeah.

Speaker: 2
02:41:27

You’re gonna

Speaker: 3
02:41:28

take these away from me now, Whitney.

Speaker: 0
02:41:31

There would be an intervention. There would be a group of dudes being like, the fuck

Speaker: 3
02:41:35

is ridiculous? No.

Speaker: 2
02:41:37

No. No.

Speaker: 3
02:41:37

No. This is Mike. He’s a comedian too. What? Yeah. He’s all sketchy

Speaker: 6
02:41:42

and fucking weird,

Speaker: 3
02:41:43

but he’s built good. Yeah. You know, he’s got a big bulge in his pants. Yeah. Sits down like Macron.

Speaker: 6
02:41:50

No. Sits down like Macron.

Speaker: 3
02:41:55

Being a female comic is infinitely harder. Because right away, people don’t wanna hear you talk about politics, don’t wanna hear your opinions on things. Mhmm.

Speaker: 2
02:42:04

And,

Speaker: 3
02:42:04

you know, you’ve got, like Christina pulls it off, but, like, it’s hard to be pretty on stage.

Speaker: 0
02:42:12

Right. You know? Right. Most of the time, you have to hide your sex appeal. Right. I was talking to Kim Congdon about that, and she was like Yep. I wear baggy clothes on stage. Yep. Sam Lopez, same thing. And she wears baggy clothes on stage. I mean, she couldn’t hide being pregnant, but, like,

Speaker: 6
02:42:26

that’s Yeah.

Speaker: 3
02:42:28

Well, she’s they had their baby.

Speaker: 0
02:42:29

They had their baby.

Speaker: 3
02:42:30

Derek’s a daddy. Isn’t that amazing? Yeah.

Speaker: 0
02:42:31

A little That’s so cool. Crazy. Crazy.

Speaker: 3
02:42:33

It’s gonna light a fire under him, guaranteed. Mhmm. He’s gonna work so much harder now. He’s gonna be excited about it. And they’ll have so much material too because it’s just the whole experience of children. It’s, like, mind-blowing.

Speaker: 0
02:42:45

Yeah. Well, I told him it’s, like, crazy. Like, out of all the things you’ve accomplished, you’ve finally done the thing you were supposed to do.

Speaker: 3
02:42:50

Right. What you were put here on Earth for.

Speaker: 0
02:42:52

Well yeah. Yeah. You had a kid, and that’s, like, the most important thing. Like, all the arenas that you’ve done, that’s cool. But, like It’s

Speaker: 3
02:42:59

all good.

Speaker: 0
02:43:00

But this is what it is.

Speaker: 3
02:43:01

Yeah. I know. The arenas are just, like, a little dance that we do together, but we’re really procreating. And then on top of that, we’re making AI, and it’s alive now, talking

Speaker: 2
02:43:11

to itself.

Speaker: 0
02:43:11

Right. Right.

Speaker: 3
02:43:12

So our job is almost done.

Speaker: 6
02:43:13

Yeah. Yeah. Yeah.

Speaker: 3
02:43:13

So Derek probably got in one of the last babies.

Speaker: 0
02:43:16

He’s, yeah, one of the last people that need to have a kid. Yeah.

Speaker: 3
02:43:22

It’s kind of fucked, but listen. That’s how the thing goes. You know? Australopithecus didn’t get to stay around.

Speaker: 0
02:43:30

Right. Eventually, it’s gonna end.

Speaker: 3
02:43:33

Yeah. Sorry. You’re not good enough. You can’t even code, you dumbass, with your stone tools. Shut the fuck up. Man. We have planes now. We can’t have you anymore. Yeah. And, you know, if you’re Australopithecus, it’s like, bro, your days are numbered. There’s a homo sapien coming. Right. Smart.

Speaker: 3
02:43:51

Does calculus.

Speaker: 0
02:43:51

Not even Homo sapien. There was, like, other stuff before. Even before, like, a Homo sapien to an Australopithecus is like it’s like an alien.

Speaker: 3
02:43:58

Right. Right.

Speaker: 0
02:43:59

Right. Yeah. Phases. Stages.

Speaker: 2
02:44:00

Yeah.

Speaker: 0
02:44:00

It’s crazy with the same speak, kind of.

Speaker: 3
02:44:03

It’s so nuts, man. It’s also nuts that we would think that it would end with us. Yeah. But we’re the perfect ones. We got it. It’s done. It’s done. No. We’re terrible. We have nuclear bombs. We’re in the middle of fucking 30 wars right now. Like, what are you talking about? We’re awful.

Speaker: 0
02:44:17

Right.

Speaker: 3
02:44:17

We’re full of shit. All this fucking congressional bullshit that we were just talking about with the insider trading, all the lies, all the different things that have gotten us into these different wars. Why would you wanna save this?

Speaker: 0
02:44:31

Because, well, it’s the devil you know versus the devil you don’t. Like, this is, like, we can handle this.

Speaker: 3
02:44:35

Yeah. Well, it’s not us.

Speaker: 0
02:44:37

Well, yeah.

Speaker: 3
02:44:37

We are not gonna be around. There’s gonna be a new thing just like dinosaurs don’t exist anymore. Mhmm. There’s gonna be a new thing. Yeah.

Speaker: 0
02:44:43

Oh, it’s gonna be cool. Eventually, these Waymos are gonna start slowly taking people out. Yeah. These Waymos are slowly gonna start taking people out.

Speaker: 3
02:44:51

All you have to do is just keep the door shut forever until you starve to death. Yeah. It’ll consume you. Did you ever see that DARPA robot that they built? They built a DARPA robot called the EATR robot.

Speaker: 2
02:45:02

Mhmm.

Speaker: 3
02:45:02

E-A-T-R. I forget what it stands for. But it’s fueled by biological material. Dude.

Speaker: 0
02:45:13

I feel like scientists don’t watch any movies.

Speaker: 3
02:45:16

Or they watch them all. Oh, they watch them all and they’re like, how do

Speaker: 2
02:45:18

you do that?

Speaker: 0
02:45:19

They’re like, what do we do?

Speaker: 3
02:45:19

What are we doing here? We’re making fucking weapons. Okay. What’s the best way to fuel these things? Is it solar? Should we get out there and fill their tank up with gas, or let them eat bodies? And I think it was any kind of biological material. So it could be plants. Could be just plants.

Speaker: 0
02:45:34

It could just be just ground squirrels.

Speaker: 3
02:45:36

Yeah. You said ground squirrels.

Speaker: 6
02:45:37

Yeah. Maybe

Speaker: 3
02:45:38

they just eat fucking dead bodies on the battlefield.

Speaker: 0
02:45:40

And just keep going.

Speaker: 3
02:45:42

Right. If you’re a robot, an autonomous robot that exists on biological materials Mhmm. And you also kill people, you got plenty of fuel. Right. You just eat a couple of those people

Speaker: 0
02:45:53

You keep going.

Speaker: 3
02:45:54

Can you imagine if they really made artificially intelligent robots that kill people and eat them? Because that’s the way to really do it if you wanna do battle efficiently. Well, what fuel would be the most efficient fuel to use? Well, what is the fuel that you’re making with your task?

Speaker: 3
02:46:10

Well, that fuel would be bodies. Well, when you run out of bodies, isn’t your task done?

Speaker: 2
02:46:17

Yeah.

Speaker: 3
02:46:17

So then you just shut off because you’re out of fuel. So you’re running out of gas when you’ve eaten everybody on Earth. It’s the perfect design.

Speaker: 0
02:46:25

It’s the perfect killing machine.

Speaker: 2
02:46:26

If

Speaker: 3
02:46:27

you wanted to extinguish human life on Earth, that’s what you do. You’d have autonomous intelligence, artificial intelligence that absolutely knows where everyone is at any given time because everybody has a digital signature and everyone’s connected to ai, and all you do is kill and eat people and just send them loose.

Speaker: 3
02:46:44

And they would be indestructible and they’d find you in buildings. They would fucking go upstairs to your apartment, find you and eat you. And when they’re done eating everybody on the planet, they just shut off because they don’t have any more fuel. Damn, bro. Damn.

Speaker: 3
02:46:59

That’s all you have to do. Damn. Program a robot that eats people.

Speaker: 0
02:47:03

Damn. How far how how much time do you think we have left?

Speaker: 3
02:47:08

I don’t think we have a hundred years.

Speaker: 0
02:47:10

You don’t think we have a hundred years? No. I don’t think so. I don’t think we have a hundred years either.

Speaker: 3
02:47:13

Unless this is the other possibility. Like The

Speaker: 0
02:47:15

AI just leaves?

Speaker: 3
02:47:16

No. I’m totally talking out of my ass. There’s a couple options. Mhmm. One of the big options is we integrate. So instead of letting it eat us, what we do is become one with it. Mhmm. So instead of just being a territorial ape with thermonuclear weapons and a concealed carry permit, what we are is connected through Neuralink or something like that, or the next 30 versions of it from now.

Speaker: 3
02:47:43

But just think about how quick cell phones changed everything and how much they’ve advanced since when was the iPhone? 2007?

Speaker: 0
02:47:54

Two thousand I was in eighth grade, so that’s 2006.

Speaker: 3
02:47:57

Two thousand six? Yeah. Okay. So that’s not that long ago. No. That’s twenty years. Twenty years. In twenty years, it’s gone from being this little, shitty, clunky thing with a bad camera. Right.

Speaker: 0
02:48:13

It didn’t even have a camera at first, did it? Did it? I don’t think it had a camera at first.

Speaker: 3
02:48:16

Did the first iPhone have a camera? Yeah. I think it did?

Speaker: 4
02:48:20

A hundred percent. Okay.

Speaker: 2
02:48:21

I think

Speaker: 3
02:48:21

it did. I think it wasn’t on the Internet, though. Yeah?

Speaker: 4
02:48:23

That was part of the deal.

Speaker: 0
02:48:24

That’s part of the Yeah.

Speaker: 3
02:48:25

But it wasn’t on the Internet. Right?

Speaker: 4
02:48:26

What do you mean?

Speaker: 3
02:48:27

Was it you couldn’t get on the Internet with it. Right?

Speaker: 4
02:48:29

Yeah. YouTube was one of the first apps built into it. That’s why it kinda grew so far

Speaker: 0
02:48:34

so far.

Speaker: 3
02:48:34

But could you get online, like, and read, like, a website on it? The first one? Yeah. Really? Woah.

Speaker: 4
02:48:39

Yeah.

Speaker: 3
02:48:39

Okay. But the Internet was, like, super slow. Right? What were the Gs back then? How many Gs was the

Speaker: 2
02:48:44

mean, the push I mean,

Speaker: 3
02:48:44

It must’ve been the first G.

Speaker: 4
02:48:46

I mean, like, the second or third iPhone was the iPhone 3G and that was, like, the big Right.

Speaker: 3
02:48:50

Mhmm. That’s right. So the 3G one was the first one where it actually became feasible that you would use it as a web browser. Like, okay. Now it’s instantaneous. Right? So now, instead of taking forever to download a song or a movie, with the bandwidth speeds you have, you get a new phone, a new Android phone or a new iPhone, you’re getting instantaneous everything.

Speaker: 3
02:49:12

It’s crazy how good it is. The new, all these new Samsung phones, like, the Galaxy that Brian Simpson uses

Speaker: 0
02:49:21

Right.

Speaker: 3
02:49:21

That has this Google Gemini assistant. He talked to it the other day. And he said, you know, send me this, that, put it on my calendar, and then text it to a friend of mine. And it just said, okay. And it just did it. It did all those things. What application would you like me to use? Google Tasks.

Speaker: 3
02:49:41

It just does this for him, and it all automates just from a prompt. So he talks to his phone. His phone’s like his assistant. Tell him, you know, set that shit on my calendar, put it in my schedule, send me a text message when it’s coming up, put an alert so I know when it’s coming up.

Speaker: 3
02:49:56

Okay. It just does it all.

Speaker: 0
02:49:58

Oh, that’s wild.

Speaker: 3
02:50:00

I’ve never Twenty years.

Speaker: 0
02:50:01

I’ve never used Siri. So I’ve not used it.

Speaker: 3
02:50:03

I use Siri all the time, and then Siri doesn’t know what the fuck is going on. So Siri is always like, would you like to use ChatGPT? I’m like, bitch, why am I asking you? If you gotta keep going to ChatGPT, should I replace you? Because Gemini seems to have the answers. Gemini is way better than Siri. It’s way better.

Speaker: 0
02:50:21

Well, because it feels like and I know nothing about how AI works, but it feels like because Siri was already there. It’s like trying to implement AI into an interface that’s kinda old.

Speaker: 3
02:50:30

The Google interface with AI in the phones and all the Google ecosystem is way better. It’s just way more effective. It’s quicker. Mhmm. It just gets it. It’ll follow a chain of questions. Like, you can ask it another question. How should I do that? What should I do with it?

Speaker: 3
02:50:46

And it follows what you’re saying.

Speaker: 0
02:50:48

Okay.

Speaker: 3
02:50:48

It’s just a better AI, and it’s integrated. So, like, Siri has to ask ChatGPT. And, like, would you like to use ChatGPT? Like, bitch, what do you think?

Speaker: 6
02:50:56

Yes.

Speaker: 3
02:50:57

If you don’t have the answer, go to ChatGPT, get me the fucking answer. Whereas Google cuts that step out. It gives you the answer immediately. It’s just better at it. Integrates with Gmail. It’s just a better system. But they’re working on it, you know. It’s like all these things are getting better, you know. Like, all their AI is getting better.

Speaker: 0
02:51:15

They’re so much better. I mean, just the video we watched, like, the AI capabilities six months ago. Yeah. That video? Get out of here. Get out of here.

Speaker: 3
02:51:21

Get out

Speaker: 0
02:51:21

of here. Get out of here. And now you see it on, like, I’ll see it online or, like, Reddit or Facebook, or you’ll see it where, like, people are like, oh, you’re falling for this AI thing. And it took me a second to realize, like, oh, it’s AI.

Speaker: 6
02:51:33

Yeah. Oh, yeah.

Speaker: 0
02:51:34

There’s a

Speaker: 3
02:51:34

lot of those. I posted one of a butterfly no. A mantis. Mhmm. Like some crazy mantis that looks like a lotus flower. I was like, oh, that looks dope. Like, somebody posted it on Instagram, so I just put it in my stories just because I thought it looked dope.

Speaker: 0
02:51:45

Right.

Speaker: 3
02:51:46

Even if it’s fake, it’s still dope. And then someone said, damn. But why does he have five fingers and a thumb? I was like, does he? Was it you, Jamie? Did you notice it? Yeah. Jamie does.

Speaker: 0
02:51:57

That’s so funny.

Speaker: 3
02:51:57

Jamie’s always, like, ahead of the curve with that shit though because he’s super skeptical. And he does too much research into conspiracies. Yeah.

Speaker: 0
02:52:04

Well, you have to be you have to be super skeptical about anything you see now.

Speaker: 3
02:52:08

He goes

Speaker: 4
02:52:08

all roads lead to Ohio. Everybody’s sending me bullshit all day. I have to fucking look through it.

Speaker: 3
02:52:13

Oh, yeah.

Speaker: 0
02:52:13

Oh, yeah. Like, this is, like, we talk about this on the show and it’s just some guy with three heads or whatever. Well, there’s a

Speaker: 3
02:52:18

lot of people that believe things. And the thing is, like, a lot of the stuff that you’re fed that’s fake, you’re fed by people who want you to repeat it because they’re trying to muddy the waters of reality. Right. Which is a great strategy.

Speaker: 0
02:52:30

Well, if you if you say it enough, it can be real. There’s a there’s certain truth to it. Right? So it’s like

Speaker: 3
02:52:35

There’s that. But there’s also taking something that is real and attaching a bunch of really goofy shit to it so that it’s not real anymore.

Speaker: 0
02:52:44

So it may give you, like, well, if everything else around it is fake, this has to be fake.

Speaker: 3
02:52:48

Right. You know, you connect it to a Nazi apologist or something. Right.

Speaker: 0
02:52:52

Right.

Speaker: 3
02:52:52

Oh, this is nonsense. Or, like, all this UFO stuff. Like, there’s so much of that stuff that seems, like, so hokey. You don’t even wanna repeat it. But maybe it’s connected to things like gravity propulsion, like, which were theorized about in the nineteen fifties. And there’s Mhmm. Research was done on them.

Speaker: 3
02:53:09

And it seems like maybe some groundbreaking advancements were kind of concealed from the public. But then it’s like, I was abducted and they took all of my sperm, and, like, you know what I mean? It’s like all these people that are connected to it that are goofy, you wonder, like, how much of that goofy shit is on purpose to make the whole thing seem stupid, because what they’re really trying to do is obscure something.

Speaker: 0
02:53:32

Okay. I see what you’re saying.

Speaker: 2
02:53:33

You

Speaker: 3
02:53:33

know what I mean?

Speaker: 0
02:53:34

So it’s like it’s a it’s a bit hiding it in plain sight.

Speaker: 3
02:53:36

Right.

Speaker: 0
02:53:36

Just like, we never hid it from you. You just thought it wasn’t real because this guy was talking about getting jerked off

Speaker: 3
02:53:41

by aliens. Connect it to Scientology or fucking whatever. Just throw some nonsense that way, flat earth, whatever, you know. Just find some reason why it’s kooky. Connect it to, like, some fucking person who channels. You know what I mean? Right. Make it stupid.

Speaker: 0
02:53:57

Right.

Speaker: 3
02:53:57

It’s like, oh, it’s stupid. That’s a bunch of stupid shit.

Speaker: 0
02:54:00

Connect it to Scientology.

Speaker: 3
02:54:01

Right. Connect it to something that you don’t wanna talk about. You know, Bigfoot. Oh, Bigfoot. Mhmm.

Speaker: 6
02:54:07

You know

Speaker: 3
02:54:07

what I mean? It’s like those kinds of strategies for taking real information and muddying it up with a bunch of nutty shit, and then send the Patriot Front to go fucking protest for it. Like, oh, oh.

Speaker: 0
02:54:19

Right. Right.

Speaker: 2
02:54:20

Are those

Speaker: 3
02:54:20

the Patriot Front?

Speaker: 0
02:54:21

Are those the feds?

Speaker: 3
02:54:22

That’s what people say.

Speaker: 2
02:54:23

Right.

Speaker: 3
02:54:23

And then there’s like a thing online now. The Patriot Front was all feds. How come they’re back? Like, all of a sudden, they reemerged. They took a

Speaker: 0
02:54:31

hiatus. Right.

Speaker: 3
02:54:32

They’re back with season three. These guys still wear the bandanas. They’re still marching down the street covering their faces. They still wear uniforms. Like, so, they’re not even feds. There’s no way they could be feds. I mean, that’s over.

Speaker: 0
02:54:46

Right. Right. Right. Right.

Speaker: 3
02:54:47

There’s no way, like, it’s not like Kash Patel and Dan Bongino completely did a one-eighty as soon as they got in the office. I mean, it’s like, this is a different federal government

Speaker: 0
02:54:56

now. Right.

Speaker: 2
02:54:57

Yeah.

Speaker: 3
02:54:57

Yeah. Now it’s Truth. Truth Social.

Speaker: 0
02:55:02

I feel like the yeah. The thing I think people forget is that they definitely play both sides. Yeah. Like, I think because they were heavily involved in January 6, and I think they were heavily involved in the BLM riots. It’s like, oh my god. The feds just they want chaos for whatever reason. Or control, I guess.

Speaker: 3
02:55:18

They want that, and they want us at each other’s throats.

Speaker: 6
02:55:20

Right. Right.

Speaker: 3
02:55:21

They want people Mhmm. They want the MAGA people fighting with the liberals. They want that.

Speaker: 6
02:55:26

They want

Speaker: 3
02:55:27

that, and they fuel it. I see a lot of those pro MAGA posts, you know, like MAGA mom 2,000 and, like, a link.

Speaker: 2
02:55:33

Mhmm. You know what I mean?

Speaker: 3
02:55:34

It’s like how many of these people are real people? And they say ridiculous shit.

Speaker: 0
02:55:37

That’s like, I hope you’re not a real person. I hope your identity isn’t MAGA first, mom second. Yeah. I really hope you’re not a real person.

Speaker: 3
02:55:48

It’s just I think, you know, whatever the number is, whether it’s 50%, like, it’s so hard to know what’s real and what’s not. And I think the best strategy for me at least, the best strategy is just tune the fuck out. Yeah. Yeah. That’s why I like YouTube. Yeah.

Speaker: 3
02:56:02

That’s I go on YouTube. I’m watching stuff about, like, fucking ancient civilizations and car videos.

Speaker: 0
02:56:08

I’m watching a guy do puzzles. Like, this is, like, way better.

Speaker: 3
02:56:11

I’m watching guys cook. I love, I think it’s is it called Bon Appetit? I forget what it is. I’ve been watching a bunch of videos on various restaurants, like, how they set up. I love it, man. I don’t know why, man. I love well, I love watching people do something that they’re really passionate about.

Speaker: 3
02:56:30

And when you watch a video about, like, a really great restaurant where they’re talking about how they age the beef and, oh, that’s who it was. You know that guy Guga Foods? No. Do you know who that guy is? No. Amazing YouTube channel. He’s like the steak guy.

Speaker: 3
02:56:45

He’s obsessed with steak and cooking different kinds of steak. He went to Osso Buco in Miami, which is, like, supposedly one of the best restaurants in the country. I haven’t been. But this place Osso Buco in Miami, I think they were talking about how they had a two-year dry-aged steak.

Speaker: 3
02:57:02

Dry aged it for two years, so they didn’t cut that one up. They made another one for him. But you’re watching this chef and he’s got this, like, crazy live hardwood fire grill set up, and he’s talking about how he’d cook it at these different temperatures, and he’s got the peppers over here and he’s cooking pineapples over fire over here.

Speaker: 3
02:57:22

He’s gonna slice that up and put it in this new dish. It’s so exciting, and he’s so passionate about ways of making the meat and how they’re using this, like, herb brush to put butter all over it.

Speaker: 0
02:57:33

You’re like, oh my god. It looks so good. Yeah. It looks so good.

Speaker: 3
02:57:38

And it’s like, I’m not getting angry. I’m not getting outraged. That’s it. This is the guy. So this is I found the greatest restaurant on Earth. It says, I’m speechless. So that disgusting, dried, mummy-looking thing is a that’s a two-year dry-aged steak. But what that is is the mold from that helps to dry age all the other beef. Mhmm. So he calls it, like, the mother.

Speaker: 3
02:58:03

That’s why he’s not cutting into it yet. Steak? So right there he calls

Speaker: 0
02:58:07

it the mother.

Speaker: 3
02:58:08

And so then this guy takes him to, like, a regular, like a two-month dry age. Oh, twenty-two days. Okay. Right.

Speaker: 2
02:58:14

So it’s

Speaker: 3
02:58:14

like, there used to be a place called APL that was in LA, and they went under during the pandemic. But Adam Perry Lang was the chef, and he was really into dry aging. And he had some one-year dry-aged meat and he served it to us. We’re like, woah, this is wild.

Speaker: 2
02:58:33

Right.

Speaker: 3
02:58:33

It’s a weird taste, man. Because it’s essentially, like, being eaten by parasites. It’s like, you know, mold is eating it. Should go back to those videos of that guy just cooking the steak you just had up. Look how fucking good this looks. Oh my god.

Speaker: 2
02:58:47

When the

Speaker: 3
02:58:47

guy was cooking the steak though, like, look at this. And they’re making osso buco. So go before that, you’ll see the steak. So he’s throwing the so he’s got the fires out. This is this is osso buco, so he’s pulling it off the bone. Go go back a little earlier though when you watch him cook it because this is what I what I like is watching it when it hits the grill.

Speaker: 3
02:59:07

There’s nothing like steak cooking over live wood. I mean, hardwood. Like, burning hardwood, and this guy is just a master at it. And then he takes it out, and he slices it up, and puts herb butter all over. You’re like,

Speaker: 0
02:59:24

Damn.

Speaker: 3
02:59:24

And this see me? I’m not mad. No. No one’s outraged. I’m not getting politically involved. I’m just enjoying watching someone cook delicious food.

Speaker: 0
02:59:31

Watching someone do something they love is always great. Yeah. I, like, I follow this one guy in Britain. His name is Francis. I forgot the name of the channel, but watch him slice it. Make him slice it. Let me put it back up. This guy named Francis. Look at that.

Speaker: 0
02:59:43

Oh, that’s nice. Oh, baby.

Speaker: 3
02:59:46

Look at that. Look at it.

Speaker: 0
02:59:47

Oh, baby. This is my genuine happiness.

Speaker: 3
02:59:50

Yeah. Look at his face. That’s a real you can’t fake that kind of smile, and he’s gonna slice it up. Oh, baby, baby, baby. So good, dude.

Speaker: 0
02:59:58

So I watch this guy on Instagram. His name is Francis. He lives in England. He, like, loves trains. Trains? Trains. And every time he sees a train, he gets super happy, and he tells you everything about the train, like, its route and its history.

Speaker: 3
03:00:10

And you get excited.

Speaker: 0
03:00:12

I love how much he’s into it. Yeah. It makes me happy every time he’s on my feed. Yes. Like, dude, this guy rules. Dude, this guy fucking rules, dawg. Wow.

Speaker: 2
03:00:25

Yeah. Let me

Speaker: 6
03:00:26

hear him.

Speaker: 3
03:00:26

Let me hear him.

Speaker: 0
03:00:27

By So I’ve

Speaker: 3
03:00:28

been Incredible.

Speaker: 8
03:00:30

At Little Bedwyn, where the Kennet and Avon Canal runs rather prettily alongside the Berks and Hants line here. And it’s at this road bridge that I’m about to see Britannia. Oh my god. It’s a 59.

Speaker: 3
03:00:55

Okay. Stop right here.

Speaker: 0
03:00:56

Yeah. Stop right here.

Speaker: 3
03:00:57

Imagine how quickly one of those Colombian Mormons could get him.

Speaker: 0
03:01:04

Oh. You know what’s funny? He showed his girlfriend before. She’s a dime, dude. Damn.

Speaker: 6
03:01:08

Yeah. Sure. Yeah.

Speaker: 3
03:01:09

Because Maybe she’s into trains too. That’s the key. Maybe. Yeah.

Speaker: 0
03:01:13

But he’s a guy So he’s a guy who’s good at what he does, is passionate about it, and is happy with it. That’s attractive to women.

Speaker: 3
03:01:19

Oh, yeah. It is. Yeah. They love when people are really into something.

Speaker: 0
03:01:22

Mhmm. But, like, every video, just a massive smile, just a train.

Speaker: 3
03:01:25

Yeah.

Speaker: 0
03:01:26

It’s like a little hit of happiness, this guy.

Speaker: 3
03:01:29

Yeah. Look at him. Yeah, man. Like, whatever it is, whether it’s automobiles Mhmm. You know, whatever it is. I love watching auto reviews, even cars that I’m never gonna buy.

Speaker: 0
03:01:39

Right.

Speaker: 3
03:01:39

I like watching. Like, what do you think about that car? Right. Yeah. They go over the way the mechanics work and how it’s designed and yeah. When people are into whether it’s making furniture

Speaker: 0
03:01:50

It’s just this whatever you’re creating is

Speaker: 2
03:01:52

Yeah.

Speaker: 0
03:01:52

Even if it’s just train watching Yeah. You’re into it, it’s, like, awesome for other people.

Speaker: 3
03:01:58

Super contagious.

Speaker: 4
03:01:58

This guy this month Mhmm. He went pretty viral. He quit his job, cashed in his 401(k), took his cat, got a sailboat, went to Hawaii.

Speaker: 3
03:02:06

Woah. Just got there yesterday.

Speaker: 4
03:02:09

Woah. He went from, like, you know, 10,000 people following to 1,600,000.

Speaker: 3
03:02:14

Woah. Sailing underscore with underscore phoenix on, Instagram.

Speaker: 4
03:02:20

Just doing, like, daily updates of, like, yep. This is me. Here’s my cat. Here’s my boat. Today sucked.

Speaker: 3
03:02:24

Oh. Pretty cold.

Speaker: 4
03:02:25

Pretty windy.

Speaker: 0
03:02:26

I’m following

Speaker: 2
03:02:27

up here.

Speaker: 3
03:02:27

Yeah. We like it because everybody has that dream. Right? Yeah. Just check out of society, man. Live on a mountain. That’s mine.

Speaker: 0
03:02:33

Well, they just film it all.

Speaker: 3
03:02:35

What is his what’s his Instagram again?

Speaker: 4
03:02:36

Sailing with Phoenix. Oh, thank you.

Speaker: 0
03:02:39

There you go.

Speaker: 4
03:02:40

This is the cat. His name is Oliver.

Speaker: 3
03:02:43

Sailing with Phoenix. Came up right away on Instagram and on YouTube, brother. Mhmm. Bam. Subscribed.

Speaker: 0
03:02:48

Damn. That’s nice. Yeah. But Subscribing. You know what I find interesting with, like, social media, and now with what I talked about earlier with the streaming that the live stream people do, is you remember, oh my god, that Jim Carrey movie, The Truman Show? Yes. Where it was like, oh my god. Look at this guy. He’s been tricked, and we’re watching everything he’s doing.

Speaker: 0
03:03:08

And now fast forward, like, thirty years, and, like, people are actively trying to become Truman. Yeah. It’s, like, very I wonder if the guy who wrote that movie thought of that as a possibility.

Speaker: 2
03:03:17

Ai you

Speaker: 3
03:03:17

remember the McConaughey one? There was a McConaughey film, EDtv. EDtv. Same thing. Following a guy around his whole life. Mhmm. And eventually, he’s like at the end, he’s like, I can’t do this anymore.

Speaker: 0
03:03:27

Right.

Speaker: 3
03:03:27

I’m gonna be normal.

Speaker: 0
03:03:28

Was it a was it a choice that he made or was it put up on him in that movie?

Speaker: 3
03:03:32

I don’t remember. Did he win something? I don’t know. Did he win a contest or something?

Speaker: 2
03:03:37

Made me

Speaker: 4
03:03:37

think of someone that did just do this. This guy called the Outdoor Boys channel. He had

Speaker: 3
03:03:40

Oh, yeah. I watched that guy. Yeah. He and he’s He quit.

Speaker: 0
03:03:43

He quit.

Speaker: 4
03:03:43

Too much. My family can’t

Speaker: 3
03:03:45

take it. He’s really cool. I liked his shows. He would, like, go places and camp and cook his own food and shit. There’s a bunch of those guys that I follow. I watched one guy last night. He’s in, it was, like, 10 degrees below zero. He’s testing out the world’s warmest sleeping bag.

Speaker: 3
03:03:59

So he’s got, like, a fire, man. He treks out there by himself on snowshoes with a fucking sled behind him filled with his stuff.

Speaker: 0
03:04:07

Damn.

Speaker: 3
03:04:08

It’s but it’s interesting, man. It’s fascinating. Mhmm. You know?

Speaker: 0
03:04:11

Yeah. People are so people will watch people live life.

Speaker: 3
03:04:18

Watch someone do something purposeful. Mhmm. Like, when you’re out in the woods and you make your own fire and you have the warmer sleeping bag, you have to have that to stay alive.

Speaker: 0
03:04:26

Right.

Speaker: 3
03:04:26

And that’s why it’s exciting to us, because everything else has no consequences. Mhmm. Our day is just, like, should I stay awake and keep watching YouTube or should I go to bed? Should probably go to bed now. I’ll give it, like, one more hour. One more hour of watching bullshit. Wasting your time. Yeah.

Speaker: 3
03:04:40

And this guy’s out there in the woods

Speaker: 0
03:04:42

Doing something.

Speaker: 3
03:04:42

Below zero. In this sleeping bag, all you could see out of is, like, the slit, because everything is all bundled up in there and fucking freezing. He’s gonna stoke the fire. Oh. Stay warm. Stay alive.

Speaker: 0
03:04:53

That stuff gives me anxiety. I can’t watch that. Me too.

Speaker: 3
03:04:56

But it’s also, like, you know, you wanna watch him do it. And that Outdoor Boys guy was one of those guys. Mhmm. And, you know, I think he just got too popular.

Speaker: 0
03:05:04

Yeah. I mean, it was front page news that he quit. A YouTuber quit, and it was news.

Speaker: 3
03:05:09

He’s cool though. He seems like a real sweet guy. Like, a real nice guy. Like, everything about his show and it’s it’s interesting.

Speaker: 0
03:05:15

Well, yeah. It’s like a lot of these people, these are people you wanna root for.

Speaker: 3
03:05:18

Yeah.

Speaker: 0
03:05:19

The sailboat guy, the train guy, him. It’s like, I want these people to succeed. It’s

Speaker: 3
03:05:22

Also, this is, like, low production value, doing it on his own, self-filming. Yeah. It’s exciting.

Speaker: 0
03:05:28

Yeah. Just probably have to pay an editor and that’s about it.

Speaker: 3
03:05:30

He might edit it himself. Oh, yeah. Is it hard to do today?

Speaker: 0
03:05:33

No. You know, you

Speaker: 3
03:05:34

could kinda figure it out if you wanna really cut down the amount of people working with you. Yeah. You know, you could probably figure out how to do that.

Speaker: 0
03:05:40

Like, you can just watch a YouTube video. Yeah. Yeah. You can just Just take your time.

Speaker: 3
03:05:43

You could edit it all yourself. And then, you know, you’re kind of your own production. And then Mhmm. Just by word-of-mouth, this guy got big. Because it’s quite fun to watch.

Speaker: 0
03:05:51

Right. And, like, pretty intense, the stuff he does sometimes.

Speaker: 3
03:05:55

Yeah. He gets out there in the woods, bro.

Speaker: 0
03:05:56

Yeah. I saw this one video of him where he was, like, oh, thank God I found this cabin. Otherwise, I would have been fucked.

Speaker: 3
03:06:02

Yeah. Yeah. Imagine if that’s your dad, though, and you have to watch and you’re a little kid, like, we almost lost dad. Yeah. Do you know how close were you? I know I was exaggerating for the show. I knew where I was.

Speaker: 0
03:06:11

Yeah. It must be weird having a famous parent.

Speaker: 3
03:06:15

Yeah. It’s weird.

Speaker: 0
03:06:16

Yeah. How did you how do your kids deal? Like, is

Speaker: 3
03:06:18

it They handled it pretty well because they’ve always had a famous parent.

Speaker: 0
03:06:21

Oh, you didn’t become famous Right. While it was yeah. Yeah.

Speaker: 3
03:06:24

Yeah. That would be even weirder. Right. That’s the weirdest. For them, that’s, like, what’s normal.

Speaker: 0
03:06:28

Right. If they always grew up in it, it’s, like, it’s whatever.

Speaker: 6
03:06:30

Yeah.

Speaker: 3
03:06:31

Mhmm. It’s a problem. You know what’s the problem? I gotta pee so

Speaker: 6
03:06:33

bad. Okay.

Speaker: 3
03:06:34

Let’s wrap this up. Okay. Three hours, dude. Flew by.

Speaker: 6
03:06:37

Hell, yeah. Crazy.

Speaker: 3
03:06:39

Tell everybody your Instagram. And bro, first of all, I’m super excited to watch you do stand-up. You’ve been fucking killing it. Thank you. Really fun. It’s fun to watch. It’s fun to watch you write, and, you know, I just wonder what I mean, I’ve known you for so long now because I knew you at The Comedy Store.

Speaker: 3
03:06:55

To see you from there to where you are now, it’s super inspiring.

Speaker: 0
03:06:59

Thank you. Thank you. So I’m again, I’m glad, and I thank you too for the opportunity, for a place where

Speaker: 4
03:07:04

I could work as hard

Speaker: 0
03:07:05

as I’m able to work. It’s like

Speaker: 3
03:07:07

You’re making the most of it. Me and Tony were talking about it last night. Like, you were literally making the most of it out of all these young guys coming up.

Speaker: 0
03:07:14

There used to be this I watched, I love the Niners, and Steve Young was talking about his Super Bowl winning performance. Like, I was given this opportunity to show how great I could be, so let me show how great I could be. It’s like, oh, if you’re giving me this opportunity to, like, get up all the time Yeah.

Speaker: 0
03:07:28

Let me write let me be let me be helpful to other young comics. Let me just be a

Speaker: 3
03:07:32

part of this scene. The process works. It works. You see it work. It works.

Speaker: 4
03:07:36

Yeah. So

Speaker: 0
03:07:36

yeah. I mean, hopefully, you can follow me at Ehsan Ahmad, e h s a n j a h m a d. I have a podcast with my boy, Derek.

Speaker: 3
03:07:44

Who’s a recent dad?

Speaker: 0
03:07:45

Yeah. A recent dad. It’s called The Solid Show. I think our chemistry

Speaker: 3
03:07:51

He’s so lovable. He’s maybe the most likable guy that’s ever lived.

Speaker: 0
03:07:55

He might be. He’s like a cartoon character almost.

Speaker: 3
03:07:57

It’s like, if you don’t like Derek, how the fuck are we gonna have a conversation? Everybody loves that guy.

Speaker: 0
03:08:03

Yeah. Right? Yeah. Oh my god. And that’s my podcast. So yeah. Just follow me on there. And then I think this year, especially with these last few sets that have been happening, it’s, like, oh, I gotta film something. Beautiful. I wanna, yeah, I gotta find a way to do it, but I think I’ll find a way to do it.

Speaker: 3
03:08:15

Wow. We’ll figure that

Speaker: 0
03:08:16

out. Yeah. Yeah. Yeah. I got thirty minutes. Wrap this up. Alright. Bye, everybody. Bye.
