#2270 – Bridget Phetasy

Bridget Phetasy is a writer and stand-up comedian. She hosts the show “Dumpster Fire” and also the podcast “Walk-Ins Welcome.” www.phetasy.com Learn more about your ad choices. Visit podcastchoices.com/adchoices



#2270 – Bridget Phetasy Podcast Episode Summary

In this episode of the Joe Rogan Experience, the discussion revolves around the evolving landscape of podcasting, the influence of artificial intelligence, and the impact of social media on mental health. Joe Rogan and his guest, Bridget Phetasy, explore how podcast clips often gain more popularity than full episodes, highlighting the changing consumption patterns of audiences. They delve into the potential of AI to create virtual podcasts with historical figures like Albert Einstein and Steve Jobs, although the technology is still in its early stages and somewhat crude.

A significant portion of the conversation touches on the transactional nature of interactions in the podcasting world, especially with wealthy individuals who are eager to appear on popular shows. This is contrasted with the genuine connections that can be formed through podcasting. The speakers also discuss the mental health challenges faced by podcasters, particularly those who are not traditional performers and are deeply affected by negative social media comments.

Recurring themes include the commodification of podcasting, the role of AI in shaping future content, and the mental health implications of being in the public eye. The episode underscores the importance of authenticity in podcasting and the need to navigate the pressures of fame and audience expectations.


Actionable insights from the episode include the value of maintaining authenticity in content creation and being mindful of the mental health impacts of engaging with social media. The overall message suggests a cautious optimism about the future of podcasting, with technology offering new possibilities while also presenting challenges.


#2270 – Bridget Phetasy Podcast Episode Transcript (Unedited)

Speaker: 0
00:00

This episode is brought to you by Squarespace. When it came time to make a website, there was no question that we would power it with Squarespace. From the intuitive design intelligence that helps to create a bespoke digital identity, to the seamless payment options that can help give your customers more ways to pay, or the fact that you can measure your end-to-end online performance with powerful website and seller analytics.

Speaker: 0
00:22

The reasons to power your website with Squarespace are endless. So if you’re looking to build or even upgrade your current website, check out squarespace.com for a trial or go to squarespace.com/rogan to save 10% off your first website or domain purchase. Joe Rogan podcast. Check it out.

Speaker: 1
00:43

The Joe Rogan experience.

Speaker: 0
00:46

Train by day, Joe Rogan podcast by night, all day.

Speaker: 1
00:52

Ai horse.

Speaker: 0
00:53

This thing is no one’s happy with just being, like, a little successful. You get a little successful and then they wanna get more.

Speaker: 1
00:59

Is that everyone though?

Speaker: 0
01:00

I don’t know. Was that you, Bridget?

Speaker: 1
01:02

I’m I’m I’m a little successful.

Speaker: 0
01:05

That’s it.

Speaker: 1
01:06

And I’m happy.

Speaker: 0
01:07

Yeah. Well, it’s like, I don’t know. Just gotta find why you’re doing it. You don’t wanna just be on a hamster wheel.

Speaker: 1
01:14

Well, I think it’s easy to get lost in chasing more. You know, like, I’m an addict. So it’s very — I try to stay away from analytics and all that stuff because I can become hyper-focused and obsessed with them.

Speaker: 0
01:28

Well, right.

Speaker: 1
01:29

And one of the reasons — after I did, who was it? I was opening for Landau, and we would go out and just talk to the people after the show. And they were like, oh my gosh. I love Walk-Ins Welcome. I love Dumpster Fire. And it was, like, such a good reminder, because you get, like, chasing numbers. And it was like, oh, no. These are not just numbers.

Speaker: 1
01:51

They’re people, unless you’re, like, buying bots. But I think it can be easy to just get on that hamster wheel and start being like, we need more. We need more downloads. We need more and

Speaker: 0
02:01

Yeah.

Speaker: 1
02:02

More and more and more, and then you forget. And I never wanna take the audience we have for granted.

Speaker: 0
02:08

Yeah.

Speaker: 1
02:08

You know, like, they’re amazing. Some of these people have been with me forever.

Speaker: 0
02:14

That’s kind of the key to it all. Right? So I always have — I mean, it sounds so corny because it’s such a, like, new wellness way of looking at things. Have gratitude. Do some gratitude. But gratitude is, like, very important. It’s really important to be thankful for what you have.

Speaker: 1
02:29

Yeah. And it was one of the key things in getting sober. I think when I’ve dealt with, like, anxiety, depression, other things in my life — gratitude is a powerful mechanism for, like, shifting your perspective, because you can get into that feeling of, like, not being enough, not having enough, not

Speaker: 0
02:50

Yeah.

Speaker: 1
02:51

It never being enough.

Speaker: 0
02:52

And Cowan was telling me about his buddy who’s a billionaire. His buddy’s worth, like, $3,000,000,000, and he feels like he’s poor because he’s friends with people who have a hundred billion dollars. Yeah. Like, imagine.

Speaker: 1
03:04

No. I mean, when I was dating this very wealthy guy who is, like, probably half of a billionaire — you know, like, a 500-millionaire — we were in Saint-Tropez, and he felt poor. I remember it so vividly. I was in the shower, and he loved me because I was, like, this, like, poor backpacker that was, like, entertaining. Yeah.

Speaker: 1
03:31

He was like, look at this entertaining artist. He wanted me to be like his pet monkey that came around and just, like, made him laugh. And I gave him shit, and I think guys like that are used to — you get surrounded by yes-men too at a certain

Speaker: 0
03:44

Oh, yeah. For sure.

Speaker: 1
03:44

And you don’t have people, like, taking the piss out of you. And so I would make fun of him for his boring stories about his mattresses that people would sit through. I’m like, why are you guys listening to this guy talk about a mattress for an hour? And he was — I was in the shower, and he was talking about how he and his friends got together.

Speaker: 1
04:02

And he’s like, you know, we sat around, and we were talking, and there’s a certain level at which you can be happy no matter what. And I thought I had been rubbing off on him. Like, oh, my yogi spirituality is rubbing off on him.

Speaker: 2
04:18

What did

Speaker: 0
04:18

he get to, like, 10,000,000,000?

Speaker: 1
04:19

Now he’s like, and it’s $250,000,000. I was crying. I’m like, where does that leave the rest of us, by the way?

Speaker: 0
04:34

Like, I’m seeing

Speaker: 1
04:35

this Carrara marble freaking shower.

Speaker: 0
04:38

Imagine the thought that the only way you could ever be happy is with $250,000,000.

Speaker: 1
04:44

That’s the minimum.

Speaker: 0
04:45

I know some people worth $250,000,000 are miserable as fuck.

Speaker: 1
04:49

Yeah. No. It’s just

Speaker: 0
04:50

It’s not gonna do it. It’s not gonna do it at all. So

Speaker: 3
04:54

It’s like, I’m sorry. Where does that leave people like me?

Speaker: 0
04:56

No. Don’t. I think you need a few things. You need your health above all.

Speaker: 1
05:02

Yeah. It doesn’t matter how

Speaker: 0
05:04

much you have. That’s number one.

Speaker: 1
05:05

Mhmm.

Speaker: 0
05:06

Number two is you have to have friends. If you’re just, like, the man, and everybody’s kissing your ass, and you’re, you know, the head of this giant business, and you live in a bubble — you’re not happy. That’s not happy. Happy is you have to have colleagues. You have to have companions, comrades. You have people that — you’re like them.

Speaker: 0
05:27

You get to hang together, go to dinner, and laugh, and hug each other.

Speaker: 1
05:31

Yeah. Have

Speaker: 0
05:31

fun. Yeah. Enjoy your life.

Speaker: 1
05:34

I was thinking about it even the other night in the green room. Like, it’s like everyone takes the piss out of each other. It doesn’t fucking matter what level you’re at, who you are. Everybody’s talking shit. It keeps everybody — it doesn’t matter. Like, you can walk out that door and be very famous. But in that room, it’s just comedians talking shit.

Speaker: 0
05:54

Yeah. It’s a beautiful environment to keep your head straight.

Speaker: 1
05:57

It’s very — it’s necessary, I think, too. And it is, like, even though you might think you want yes-men all around you — I think what I’ve learned, even just from being around rich guys who I talk a lot of shit to, is they don’t really want that.

Speaker: 0
06:14

No. It’s uncomfortable. You don’t want yes men around you. You want people that are making fun of you.

Speaker: 1
06:19

And also, how are you gonna like, you need people to push back.

Speaker: 0
06:23

Yeah. I think it also depends on what your personality is. Some people are, like, very deeply insecure, and they really almost desire yes-men just to maintain stability. Some people are very weird, you know. And you don’t know it because their public face is that they’re normal.

Speaker: 0
06:39

You know, their public face when they’re getting interviewed — they know how to, like, turn it on for five minutes, but when you’re around them all day, you know, they’re fucking crazy people, which is why they’re successful in the first place, which is really weird. It’s like, what got you to dance is literally mental illness.

Speaker: 1
06:54

I was talking to — I think Malice came on my podcast recently.

Speaker: 0
06:58

He’s the best. He’s the best.

Speaker: 1
07:01

One of my best friends, truly.

Speaker: 0
07:02

Is he doing stand up now?

Speaker: 1
07:03

I hope he does.

Speaker: 0
07:04

I’d heard he’s doing stand-up. I heard he’s gonna do stand-up, or he’s planning on doing stand-up.

Speaker: 1
07:08

He wants to, but here’s the thing. He probably has to do it at the Mothership. Like, he can’t be at an open mic where somebody’s gonna record. Oh, yeah. Well,

Speaker: 0
07:17

he could totally do Mothership

Speaker: 1
07:19

on the mic. So we’re

Speaker: 0
07:19

gonna make that Sunday and Monday night. Yeah.

Speaker: 1
07:21

That’s what I said.

Speaker: 0
07:22

I was

Speaker: 1
07:22

like, talk to Joe.

Speaker: 0
07:23

Oh, I’d let him go up in a guest spot on one of my shows. Fuck it. Yeah. He’s funny, man. Yeah. Yeah. Fucking funny. When he said that — you know, that viral clip when, you know, you’re an ableist. Yes. And I’m like, an ableist? He’s like, she’s a retard.

Speaker: 1
07:40

Yeah. He’s so quick.

Speaker: 0
07:43

Oh, his timing is excellent. Well, he’s such a smart guy. Yeah. He’s so fucking smart, except when it comes to, like, the whole anarchy thing. We don’t need cops. I’m like, listen, bitch. You need a cop just to keep me from you. What are you talking about? What are you talking about?

Speaker: 0
07:57

You don’t need cops. What are you talking about? Shut the fuck up. You need cops.

Speaker: 1
08:00

But he was saying

Speaker: 0
08:01

You need cops.

Speaker: 1
08:03

Yeah. He said I don’t

Speaker: 0
08:04

wanna hear that nonsense.

Speaker: 1
08:04

No. I mean I don’t wanna hear

Speaker: 0
08:05

that no law and order nonsense. Shut the fuck up.

Speaker: 1
08:07

I’m just like, where are you in this hierarchy?

Speaker: 0
08:10

Yeah. You’re dead. You’re dead. I’m gonna steal your food on day one. Shut the fuck up.

Speaker: 1
08:14

First round of the purge, you’re gone.

Speaker: 0
08:17

What the fuck is wrong with you? You need cops.

Speaker: 1
08:21

I feel like I’m at least second round of the purge.

Speaker: 0
08:23

But Are you armed? Yeah. Yeah. Well, that’s important.

Speaker: 1
08:26

So it’s at least second.

Speaker: 0
08:27

Yeah. Maybe not. Do you know how to use it? Yeah. Do you train?

Speaker: 1
08:30

I do. Yeah. Okay.

Speaker: 0
08:31

Yep. That’s good.

Speaker: 1
08:32

Yeah. I don’t wanna be one of those, like — Yeah. No. That’s right. I taught

Speaker: 0
08:36

my kids how to shoot when they were very young.

Speaker: 1
08:38

Yeah. I think you yeah.

Speaker: 0
08:40

You gotta teach them gun safety. You know, never have your finger on the trigger unless you’re trying to shoot something. Yep. Ever. You know, ever. Don’t hold the gun with your finger on the trigger. Always point it away, even if it’s not loaded. Point it away from people. Point it at the ground.

Speaker: 0
08:53

Point it away. If people are around, point it in a direction where there are no human beings. Yeah. You know, understand — Always

Speaker: 1
08:59

check and see. Yeah.

Speaker: 0
09:00

Always check and see if there’s not a bullet in the chamber.

Speaker: 1
09:02

Yep. This is

Speaker: 0
09:03

how you rack it. This is how you do it. Yeah. It’s like, you should know how to use them. Just because if, god forbid, something ever happens — it’s horrible. Your house gets broken into and, you know — Yeah. You have the ability to preserve your life.

Speaker: 1
09:17

My fear was always that I’d be in a situation, like, in the movies where somebody’s wrestling with someone and then they kick the gun over and I’m the girl standing there and I’m like Like,

Speaker: 0
09:27

how does — fucking — where are these buttons and switches, and the magazine pops out. I’m like, fuck. And you pull the trigger, but there’s not a round in the chamber. Like, how does this fucking thing go?

Speaker: 1
09:39

And then there is and I accidentally shoot the hero.

Speaker: 0
09:42

Oh, that happens. That happens. People panic. You know? Yeah. If you’re not used to, like, high pressure situations and you expect to be able to shoot somebody, Jesus Christ.

Speaker: 1
09:51

Oh, so — Malice. Anyway, the reason I brought him up: he was talking about how he didn’t realize how many people in podcasting were mentally

Speaker: 0
10:01

ill. Well, in everything.

Speaker: 1
10:02

But yeah. I know.

Speaker: 0
10:03

But podcasting for sure, because, well, there’s a lot of people that aren’t performers that are in podcasting. I think they’re even more mentally ill. Because those are the people that are, like, deep in the fucking social media comments all day and seeing people shit on them, and they’re out of their fucking minds.

Speaker: 0
10:16

There’s a bunch of them that are just off the — and then They fight

Speaker: 1
10:21

all day.

Speaker: 0
10:21

They fight all day with each other, and you see them over the years get progressively more and more insane and more and more aggressive to each other. Mhmm.

Speaker: 1
10:29

It seems it seems exhausting. Also Oh, yeah. Who has time for that? I I don’t understand.

Speaker: 0
10:35

Not just exhausting, but detrimental. It’s a tremendous waste of resources. It’s really bad for you mentally. Like, your own mindset — it’s bad if you’re in conflict with someone all the time, especially if you could have avoided it. Yeah. You don’t admire yourself if you’re doing that. There’s no way you’re like, hey, I’m on the right track. There’s no fucking way.

Speaker: 0
10:54

There’s no way you’re you know you’re retarded.

Speaker: 1
10:57

They might be feeling like they’re on the right track. I mean, I think Elon

Speaker: 0
11:01

just on the wheel.

Speaker: 1
11:02

Elon’s doing it now, though. So these guys who are in the comments fighting, they’re making, you know, $23 a month doing it.

Speaker: 0
11:10

That’s a good point. That’s a different thing. Right? Or how much are people like, what’s the highest earner on x? Like, how much can you make?

Speaker: 1
11:16

So random. People are making livings. I think you have to be on it constantly.

Speaker: 0
11:21

Right.

Speaker: 1
11:21

And some people make a lot of money. When I see them post what they’re making, I’m like, holy crap. How are you?

Speaker: 0
11:27

How much?

Speaker: 1
11:27

I don’t know. And it seems to be people who — kind of, Elon will, like, yeah, he’ll, like, turn the eye of Sauron upon them, and suddenly, like, they are — but it’s very mercurial. You know? It seems like it can change on a dime.

Speaker: 0
11:42

And Well, a lot of people post things that are just not true, and Elon reposts them.

Speaker: 1
11:46

All the time. He uses social media like we do. I think I do more fact checking than he does.

Speaker: 0
11:53

Yeah. He just — he doesn’t have the time. First of all, give the guy a break. He’s running, like, government programs along with SpaceX, along with Tesla.

Speaker: 1
12:02

I will cut him some slack, but also, with great power comes great responsibility. You have more followers than anyone on that entire site, and you’re gonna boost, like, Russian propaganda.

Speaker: 0
12:15

Yeah. That was one. Right? That fake talk show? Yeah. There was a fake talk show that he boosted. The other thing that we should probably tell people is that Politico thing is not true.

Speaker: 1
12:25

No. I know.

Speaker: 0
12:25

The $8,000,000 thing. The $8,000,000 is $8,000,000 from all the government organizations from 2016 — Yep. — to 2024. So there’s an eight-year period, and then there’s some kind of wacky premium subscription that you can get from Politico — Yeah. — that allows you, like, instantaneous access to the news. You’re not just reading the articles. You’re, like, getting the news feeds.

Speaker: 1
12:48

I don’t know. There’s some — well, there’s a lot of places where — look. I’m of two minds on this. I think we need to be accurate, because we do live in a time where it’s almost like people don’t care about truth at all. Right. They’re just like, ah, whatever. It doesn’t matter. It’s indifference to it. You know? Which is not great.

Speaker: 1
13:12

But I do think — right, like, you should care about your own credibility at some point, but people get rewarded for being shameless, so you can just keep going. The Politico thing is weird because, yes, a lot of the stuff that’s going out right now on all of these, like, deep dives that people are doing — they’re viral.

Speaker: 1
13:33

They’re mostly — many of them are fake. But also, why is any taxpayer dollar going to that stuff? It should be $0. Right?

Speaker: 0
13:44

The person to search is Mike Benz. Go to the Mike Benz site — is it Mike Benz Cyber? I think that’s it. Right? His x page. Mike Benz He’s

Speaker: 1
13:54

been on this forever. Forever? Yeah.

Speaker: 0
13:56

Well, former State Department guy. And he uncovered all this bullshit while he was there. And he is insanely knowledgeable and insanely articulate, and so good at expressing exactly how these things fund things and what it is. And what it is is an enormous slush fund.

Speaker: 1
14:14

Yeah.

Speaker: 0
14:15

That’s what it is. And unaccountability, and money just coming and going and flowing, and it’s all circular. Uh-huh. All circular, donating to the Democrats. The United States government funds them. They donate to the Democratic Party. The whole thing is wild.

Speaker: 1
14:31

And people can sense this. This is why when they’re saying, oh, it’s — like, you can sense that there’s been a misappropriation of our money, of taxpayer dollars. You know there’s fraud in Medicare and Medicaid. The American people have sensed this, and they feel like there’s corruption, but I think they’ve just hopelessly kind of surrendered. Because if you’re a middle-class mom who works, who has a kid — like, who has time to fight this?

Speaker: 0
15:01

Right.

Speaker: 1
15:01

You know? You don’t have time to, like, pay attention.

Speaker: 0
15:04

No. How do you have time to fight it?

Speaker: 1
15:05

No. No. And people are just trying to survive and get through the day, and then — but you know. You have this sense. Like, so I can see the excitement of, like, people around Elon, what he’s doing. And it shouldn’t be controversial to want to, like, audit our budget. No.

Speaker: 1
15:26

Americans have wanted to do this forever.

Speaker: 0
15:28

It’s not the money being spent. It’s what it’s being spent for and what’s going on, which is an enormous propaganda machine.

Speaker: 1
15:36

Yeah.

Speaker: 0
15:36

Like, a big part of the whole left wing narrative that has, like, overlaid our country over the last, whatever, eight years, ten years, is all propaganda funded by our own government.

Speaker: 1
15:49

That’s right.

Speaker: 0
15:50

This is why Trump won the election. People don’t really believe in these things. The amount of people that think that transgender biological males should be competing against your daughter in sports is so fucking small.

Speaker: 1
16:05

Yeah.

Speaker: 0
16:06

But man, our own government was propping it up. And why are they propping it up? Because it’s a fucking beach ball at a concert. You keep tossing it up in the air, and everybody’s distracted. As long as you can keep a few things going — here’s the things you gotta keep going. Abortion. Right?

Speaker: 0
16:21

Overturning Roe v. Wade is so great for business. Yeah. Because now it’s like a battleground, and women’s rights and their lives are at stake. Okay. That’s one.

Speaker: 0
16:31

Gay marriage. That’s a huge one. Then now they’re gonna take away gay marriage. Oh my god. Bounce that fucking beach ball. That’s a gigantic one.

Speaker: 0
16:38

War is a giant one. All these different things are just fucking beach balls, and they toss them around every now and again. In the meanwhile, they’re just siphoning billions of dollars. Zelensky just said he’s missing a hundred billion dollars of the $177,000,000,000 that we were supposed to

Speaker: 1
16:56

set up. It’s weird too about Haiti, where, like, only 2% of the money actually went there. It’s great. You know, Americans give away a lot of their hard-earned money because they are actually kind and wanna donate to countries that are — and then you find out it’s, like, some trans performance.

Speaker: 1
17:14

There is a lot of nonsense.

Speaker: 0
17:16

A lot of nonsense. To the tune of hundreds of millions of dollars

Speaker: 1
17:20

of nonsense. They talk about it like, oh, who cares? It’s only $10,000,000,000. You’re like, you guys are out of your fucking minds if you think that’s gonna be the argument that resonates with Americans.

Speaker: 0
17:32

Not only that. How are you

Speaker: 1
17:33

gonna say that? Pointing out $10,000,000,000 — because they’ll be like, oh, look at the trillions over here. But all of those billions add up. Like, how are you even saying that? And it’s the other — I’m grateful that you had me on MAGA state media. Thank you for — well, I’ll get that out. Because I see it all the time. All the time.

Speaker: 1
18:01

Like — and it’s hilarious to me in light of how much money goes to funding, like, “sponsored by Pfizer.” You know? All of these other — like, the fucking audacity to accuse you, who just has people you like on your podcast, of suddenly being state media, when you literally have had media working in conjunction with the state for a decade.

Speaker: 0
18:31

Yeah. At least. And probably a lot longer

Speaker: 1
18:33

than that. Multiple decades.

Speaker: 0
18:34

It’s just gotten really gross once — well, Trump is just sort of, like, the accelerant. He was the gasoline that got thrown on the fire, so we got to see, like, how this thing works.

Speaker: 1
18:44

I voted for him.

Speaker: 0
18:45

I did too.

Speaker: 1
18:46

I came out. Yeah.

Speaker: 0
18:46

Openly. I endorsed him. I was like, this has gotta stop. This is crazy. Also, he’s not what you guys said he was. He’s just not.

Speaker: 1
18:55

I put a video out, like, the Friday before the election, because I voted for him early. And I was Hillary in 2016, no one in 2020, and Trump in 2024. So, like, I don’t care how people vote. Do whatever you want. I was only being honest, and it was weird to come out and say that as a comedian, because I don’t think that comedians should really be political.

Speaker: 1
19:23

But on the other hand, it felt dishonest because I had been so openly kind of torn about voting for him.

Speaker: 0
19:30

Well, also, if you’re a comedian, you have to protect free speech. Yeah. There are no ifs, ands, or buts about it. And when it comes to this argument, the Biden administration was fucking terrifying for free speech. They were actively attacking people that were posting truth on social media and trying to get their posts removed — including the guy who was quoted in this fucking book about Kamala Harris saying that we made it difficult for her to come on the show, and they told an untrue story about having a bunch of people come down here to do a run-through of the set, like, they were ready to do it.

Speaker: 0
20:10

All bullshit. That was the guy. He was the guy that was emailing Twitter, being, like, super aggressive, saying, why is this post still up? Oh, you mean that truthful post

Speaker: 1
20:20

Yeah.

Speaker: 0
20:21

About vaccine injuries and side effects? What the fuck are you talking about? That shit was getting very scary.

Speaker: 1
20:27

Yeah. This is what I said in the video I made about why I voted for him. I was like, there’s a couple things. I’ve interviewed many detransitioners. Those interviews keep me up at night. They haunt me. Like, there’s too many of them already, and one is too many. And I think a lot of Americans were like, we’re putting a stop to this. We have to stop this nonsense. It’s crazy. We’re sterilizing children.

Speaker: 1
20:50

The other thing is free speech. And I was like, you know, when we’re talking in our weird — like, using that crazy YouTube language that you have to use, like, “unalived,” and you’re saying, like, “the jib jab” or whatever, like, to get around

Speaker: 0
21:03

Yeah.

Speaker: 1
21:04

That’s coming from one side.

Speaker: 0
21:05

A hundred percent.

Speaker: 1
21:06

I’m not I’m not using this language.

Speaker: 0
21:08

But I think it’s coming from one side because that one side is in power. That’s my fear. My fear is if the other side was in power and they were influenced by the same amount of money from these companies, they might be doing it too. So if the right was in control of all the social media companies, are we so naive that we think that they wouldn’t be co-opted by giant corporations, and they would wanna censor them too?

Speaker: 0
21:29

What happened was it was all the left. So the tech people — who, you know, generally go to universities and get involved in electronics and technology — these people are generally left-leaning. Right? And if they’re doing it in San Francisco, well, the whole culture’s left-leaning. Right? It’s, like, not even leaning. It’s just left. Right?

Speaker: 0
21:49

If you’re a man who wears a MAGA hat in San Francisco, you’re a fucking — maybe today. I bet today. Wear them now. I bet today you can.

Speaker: 1
21:55

Yeah. And New York too. I think you can rock them.

Speaker: 0
21:58

But back when they were establishing these social media platforms, everybody was left wing. Well, what if it was the opposite? What if technology was the realm of the right? And what if it was all — and, you know, what if, like, the gay rights — what if you start thinking about it in terms of, like, biblical law, you know, like a man layeth with a man, and people start getting real crazy about what gay — yeah.

Speaker: 1
22:19

That might be the next ten years.

Speaker: 0
22:20

But you see it in other — I don’t think so. But that’s the good thing. Trump is not conservative when it comes to social issues.

Speaker: 1
22:28

He’s not.

Speaker: 0
22:29

I think that’s what we need. We need, like, a realist, someone who’s, like, conservative fiscally and understands foreign policy and how to deal with fucking dictators and shit, but also someone who’s, like, I don’t give a fuck who you love.

Speaker: 1
22:43

Yeah. Who

Speaker: 0
22:43

cares? Who cares? I’m happy if you’re happy. Are you in love with a woman and you’re a woman? Fantastic. If you’re in love, that’s great.

Speaker: 1
22:51

Yeah. I think the argument, though — I mean, as you’ve seen with some of these articles that are, like, “the right-wing ecosystem that red-pilled all these men” — the argument is that the Internet is right-wing. And that this is why Trump won, is because all of these influencers are red-pilling people. And, you know, it’s an easy way to not take any responsibility for how you’ve pushed men away from your party, how you’ve — Yeah.

Speaker: 1
23:16

Like, failed to get moderates in any sense of the word. Even just yesterday, you had the, like, Trump photo op with him signing the rights for women to compete against just women. Like, that is the

Speaker: 0
23:35

most feminist president ever, just by signing that.

Speaker: 1
23:37

Yeah. I know. It’s just so funny.

Speaker: 0
23:39

It’s crazy. It’s so crazy.

Speaker: 1
23:41

It’s a bit so There was a

Speaker: 0
23:42

guy who went on — it was MSNBC or CNN, I forget what it was. But he was essentially talking about me and Theo Von and all these other podcasts like Flagrant, Andrew Schulz, as if this is this massive right-wing network that’s heavily funded and has been built up over years, and we don’t have anything like that.

Speaker: 0
24:04

I’m like, dude, you fucking idiot. You can go and watch me on a laptop in my fucking den

Speaker: 1
24:11

Yeah.

Speaker: 0
24:12

From fifteen, sixteen years ago. So much. Me and Redban.

Speaker: 1
24:16

Like what? AG1 is what? Yeah.

Speaker: 0
24:18

We’re hitting a bong and our only sponsor was the Fleshlight. Shut the fuck up. Like, that’s not... you just don’t wanna admit that organically... Right. There’s a bunch of people that feel very different than you. Also, they don’t like you.

Speaker: 1
24:33

Yeah.

Speaker: 0
24:34

You don’t represent a man to a lot of men. When you’re one of those guys that talks in that speak: we have to understand that there’s a whole right-wing ecosystem, and it’s heavily funded. And the propaganda that they’re pushing, we have to fight back against that. And we need someone of our own.

Speaker: 0
24:51

And, like, no fucking kid who’s on a basketball court, who’s 17 years old, is looking at his phone... He’s, like, looking at going to college next year and looking at getting a job someday and being a man. He was looking at that and going, what the fuck is this?

Speaker: 1
25:05

Yeah.

Speaker: 0
25:05

And he’s hanging out with his bros, and they’re like, what the this is fucking bullshit. This is bullshit. And then you could see a man who’s not owned Yeah. Like me.

Speaker: 1
25:14

Yeah.

Speaker: 0
25:15

I’m not owned. I can do whatever I wanna. And that’s what they want. They wanna just be a man and be a nice man. You could be a nice man. You could be a masculine man and be nice.

Speaker: 1
25:24

I’ve said this a lot about you, that you, like... you could be a way worse version of yourself. You know, with the level of where you’re at, you could be a total douchebag, and you’re, like, promoting having families and promoting, like, lifting weights.

Speaker: 1
25:44

Like, it could be a lot worse.

Speaker: 0
25:46

Yeah. But I couldn’t because it wouldn’t work. I wouldn’t stay. Yeah. I wouldn’t have been able to maintain. People would have seen through it eventually.

Speaker: 1
25:53

Yeah.

Speaker: 0
25:54

You know, the JRE coin. I would have fucking cashed out. I would have fucking pumped and dumped, like, made a few billion dollars. I’d be on a yacht with Bezos, fucking partying.

Speaker: 1
26:04

Whoo. But see, this is my question. I was watching the inauguration, and I was, like, seeing all the kind of tech people, and I was like, this is somewhat unsettling. I mean, it’s good to see that tech has come around to not

Speaker: 0
26:19

They have to.

Speaker: 1
26:20

They have to. But it also... I worry that it’s just because they wanna, like, get into China or AI or, you know. I still worry about

Speaker: 0
26:30

Well, there’s definitely that.

Speaker: 1
26:32

That much power with the government.

Speaker: 0
26:34

You know

Speaker: 1
26:34

what I mean?

Speaker: 0
26:35

Like, for

Speaker: 1
26:36

all the, like, yelling we’ve been doing about tech in cahoots with the government, I still think I have to try and be like, this could go sideways.

Speaker: 0
26:45

Well, also, you have to look at it in terms of, like, what Elon is doing. Right? So Elon has aligned himself clearly in a huge way with the right and now is running DOGE, right, the Department of Government Efficiency. And he’s also turning X into a platform that rivals not just social media platforms, but video platforms like YouTube.

Speaker: 0
27:06

Mhmm. Like, they get insane amounts of views on videos that are on X. And then you can get paid by X.

Speaker: 1
27:13

Yeah.

Speaker: 0
27:14

And then they’re talking about having some sort of, like, X monetary system. You know? Like... doesn’t WeChat have something like that? They have money built into it. So what’s to stop that motherfucker from having a phone? I keep asking him. I said, dude, I’ve been seeing all these articles about you making a phone.

Speaker: 0
27:28

He goes, I hope I don’t have to make a phone. It’s very difficult to make a phone.

Speaker: 1
27:31

He said he wants to make a phone.

Speaker: 0
27:33

No. He didn’t.

Speaker: 1
27:33

He’s... well, he’s polled it. Would people buy a phone if he made one?

Speaker: 0
27:37

Yeah. Just to fuck around. But he’s not interested in making a phone. He doesn’t wanna make a phone.

Speaker: 1
27:41

Every time Elon fucks around, it ends up happening.

Speaker: 0
27:43

Well, he could make a phone. He would be the only guy that would break us out of the blue bubble paradigm. Because, like, I was switching to Android for a while. I was fucking around with Android. It’s hard. It’s really hard. One of the hard things is getting people to start using WhatsApp or something like that. And so, like, people just don’t wanna use it.

Speaker: 0
27:58

You miss a lot of texts.

Speaker: 1
28:00

Aren’t they, like, getting sued for the blue bubble?

Speaker: 0
28:03

I think they are.

Speaker: 1
28:03

I think it’s, like, an... no. I believe... is it just in Europe? I thought it was part of an antitrust lawsuit against

Speaker: 0
28:11

It’s just better. Here’s the thing. It’s not that it keeps people; it’s just better. It looks better. The blue bubble looks better than the green bubble. It’s more soothing. It’s soothing for your eyes.

Speaker: 1
28:24

Yeah.

Speaker: 0
28:24

If the green one had black text, maybe it would look cool. But the green one with white text, it’s a little weird. I don’t like to

Speaker: 1
28:31

look at that. Every time there’s, like, a green bubble person in a group chat and they send a picture, it’s like... Not anymore though.

Speaker: 0
28:37

Not anymore. Because RCS texting allows for larger file sizes.

Speaker: 1
28:42

Oh, okay.

Speaker: 0
28:43

So you don’t have to have a compressed photo. So, like, Brian Simpson, who’s an Android guy, he’ll send me pictures now. They look perfect.

Speaker: 1
28:50

Oh, okay.

Speaker: 0
28:51

Yeah. Videos are perfect.

Speaker: 1
28:52

We have one sister, and we’re always, like, get the fuck out of the chat. Yeah. You’re screwing things up.

Speaker: 0
28:57

It doesn’t anymore. But just go on WhatsApp, which is better anyway, because you could talk a lot of shit, and they just have it automatically delete.

Speaker: 1
29:04

Yeah. But it doesn’t really delete, does it?

Speaker: 0
29:06

It doesn’t, but it does. It does off of people’s phones. It doesn’t for the government. The reality is the government has access to phones in a way that you can’t even imagine. Because we know about Pegasus, and then we know about Pegasus 2. So, Gavin de Becker, who’s a security expert, explained these things to me and explained to me how they work.

Speaker: 0
29:28

And the exploit of Pegasus 1 was you would have to click on a link. Pegasus 2, they just need your phone number.

Speaker: 1
29:33

Wow.

Speaker: 0
29:33

That’s it. So all your encryption is all cute. That’s great. But if they can actually see your phone itself, what difference does it make if it’s encrypted? They have access to the phone. So they see everything. So there’s no privacy.

Speaker: 1
29:45

Yeah. No.

Speaker: 0
29:46

Not from the state.

Speaker: 1
29:48

No, I mean... and given how many notices I get, like, that I’m being hacked, like, you know, I get a notice every day, like, your information...

Speaker: 0
29:56

Yeah. I get that all the time. I’ve been getting these fake ones on X saying that my account is about to get deleted.

Speaker: 1
30:03

Yeah. Yeah.

Speaker: 0
30:04

Yeah. I’m like, bro, I’ll just call my friend. Like, you can’t delete my account. Shut the fuck up. But it’s, like, to get you to click on a link. And a few of my friends have actually been dumbasses and clicked on that link, and then they get hacked.

Speaker: 1
30:18

Yeah. Yeah. And then they’re, like, selling Bitcoin on their page or whatever. Yeah.

Speaker: 0
30:22

They do it to artists too. My friend Suzanne, Suzanne Santo, talented, amazing musician. She was doing this Facebook thing, and it was a podcast thing they had; you know, she had to do it, like, over Zoom or whatever. And the guy said, you’re not doing it right. Can I have access to your account and I’ll just set it up for you? Just sign this.

Speaker: 0
30:45

And he sent her this message so she could hand over access so he could set it up. And then he immediately went dark, stole her account, gone. Yeah. Like that. Just... She got

Speaker: 1
30:55

hacked?

Speaker: 0
30:55

Yeah. She got hacked. She got hacked. That would suck. Yeah. Like, she was fucked. Like,

Speaker: 1
31:02

IT support.

Speaker: 0
31:02

And it happens to a lot of people. Yeah. They accidentally click a link. They accidentally... I don’t click.

Speaker: 1
31:09

Oh, no. No.

Speaker: 0
31:10

But I’m also not under the illusion that, you know, every fucking disgusting meme that I send my friends is not being, like, put into a file somewhere.

Speaker: 1
31:17

In the UK? Everywhere. You land and they’re like, you’re under arrest, sir.

Speaker: 0
31:22

Yeah. Probably. Right? Yeah. Probably.

Speaker: 1
31:24

So what do you think about the AI influencers now? Do you think that will be just a trend, or do you think we’re gonna give you that look?

Speaker: 0
31:31

Yeah.

Speaker: 1
31:32

I’m not talking about the one. I mean

Speaker: 0
31:34

That one? That one?

Speaker: 3
31:37

Jamie, if I

Speaker: 0
31:38

Like, that’s your doctor... The really hot Down syndrome girl, that’s a problem. That one’s a problem because it’s, like, barely Down syndrome. I mean, I’ve dated some girls who are basically retarded,

Speaker: 1
31:48

but

Speaker: 0
31:49

they just didn’t have a problem. They didn’t have a chromosome issue.

Speaker: 1
31:51

Wait. I haven’t... this has not entered my algorithm, but it seems like it’s entered everyone else’s algorithm. I’m, like, sure it’s fake.

Speaker: 0
31:58

Because these girls... like, unfortunately, Down syndrome people, their bodies a lot of times look different.

Speaker: 1
32:04

Uh-huh.

Speaker: 0
32:05

And this girl looks like a 10. She looks, you know, just, like, as hot as can be.

Speaker: 1
32:11

Okay.

Speaker: 0
32:12

She looks like she’s fake. And she’s dancing around with these big giant boobs and she’s got slight downs with glasses.

Speaker: 2
32:19

Yeah, but they’re... it’s based on

Speaker: 1
32:21

Why are they doing it? Right.

Speaker: 0
32:22

But that girl looks different. She’s real. Yeah. She’s real, and she’s very cute, but she looks like she has Down syndrome. This other girl is, like, you know, five foot eight, perfect body, big hips, big ass.

Speaker: 1
32:34

Why are they doing the fake one?

Speaker: 0
32:35

Like... Just to get people to pay attention to it. Because it’s, like, the forbidden fruit. Like, oh. Jesus Christ. Also, there’s a lot of, like, really dumb dudes who, like, can’t talk to girls. Like, I could probably talk to her.

Speaker: 1
32:46

Oh my god. We live in a fucking Black Mirror episode.

Speaker: 0
32:52

Well, it’s gonna be worse than that, because... we are, I don’t know how many years away, but not far away from fully immersive virtual reality.

Speaker: 1
33:03

Yeah.

Speaker: 0
33:03

Where you’re gonna put on a headset, it’s gonna lock into your eyes, you’re gonna be able to see things that aren’t there, you’re gonna be able to feel things that aren’t there. That’s gonna happen, you know. They’re working on... I mean, Zuckerberg, the last time he was here, showed me these new AR glasses that they have. Yeah. That’s great.

Speaker: 0
33:22

Me and Lex tested the map.

Speaker: 1
33:23

Are they the... is that just, like, where you can, like, see a map over your eye?

Speaker: 0
33:28

You see everything. You see maps. You can play games. You see information. You could take a photo of a person that’s in front of you and immediately know who they are and get a Google search on them.

Speaker: 1
33:37

That’s terrifying.

Speaker: 0
33:38

Guys have already done that. They’ve already done that with the Meta glasses. There was a guy from Harvard, wasn’t it? A student from Harvard that set it up. So all you had to do was go outside, look at someone with the Meta glasses, take a photo, and it would show all the different information on them, where they lived.

Speaker: 0
33:53

Yeah. Like, if your face is out there and they can catch it, if they know that, like, oh, you were on a website that said this about you, and then... Or you’re on LinkedIn or you’re on one of those things. Yeah.

Speaker: 1
34:06

Yeah. That stuff is very unsettling to me, especially... it’s also why I don’t think you should put your kids online. I keep repeating this.

Speaker: 0
34:15

It’s very, very, very unsettling as long as there’s predators in the world. Which there will always be. I’ve seen this. It’s

Speaker: 2
34:20

Man, this is a thing. An AI service. I think you can use it... based off of a photo, they had an interior apartment.

Speaker: 0
34:29

And then it can show you exactly where the person lives?

Speaker: 2
34:32

Yeah. They can do this now with any photo. I don’t know. It’s called, like, Geo Ai.

Speaker: 1
34:36

Oh, I’ve heard about this.

Speaker: 0
34:37

Yeah. That’s crazy.

Speaker: 1
34:40

This is like a stalker’s paradise.

Speaker: 2
34:42

Yeah. Any social media post, they can find, like, a CCTV camera and, like, show them taking the photo, like, too.

Speaker: 1
34:48

What the fuck?

Speaker: 0
34:50

But we knew this was coming. Right? We all knew that, like, as social media gets deeper and deeper into our lives, as technology gets more and more pervasive, as it gets more and more powerful, the thing that goes away is the boundaries between people and information.

Speaker: 1
35:09

Mhmm.

Speaker: 0
35:09

Right? And your privacy is essentially just information that’s only yours. Mhmm. I think that’s going to be a thing of the human past. I really do. What? Yeah. I think as technology advances, particularly AI, one of the big barriers, the big bottlenecks, is going to be privacy. It’s gonna be, first of all, privacy of thought.

Speaker: 0
35:31

I think we’re going to be able to read each other’s minds, and it’s not gonna... well, that’s one of the first things Elon said to me about Neuralink. He’s like, you’re gonna be able to talk without words. And, like, he knows. Like, this is real.

Speaker: 1
35:46

Because he’s an alien.

Speaker: 0
35:47

But he’s definitely not us.

Speaker: 1
35:49

Yeah. But

Speaker: 0
35:49

he’s whatever. He fucking knows you’re going to have that, you know. And Jamie brought this point up once, and I think about it all the time. He said, aren’t emojis kind of like a form of hieroglyphics?

Speaker: 1
36:00

Yeah. Like,

Speaker: 0
36:01

yeah. It is. Like, you could say things with emojis, and I know exactly what you’re talking about. Yeah. You know?

Speaker: 1
36:07

Yeah. I mean, there was a video going viral just today that I was watching, and the girl was like, I don’t know how to spell. Have you seen this? She’s like, I work for a corporation, and I don’t know how to spell. And she’s like, my little sister doesn’t know how to spell either.

Speaker: 1
36:20

And I’ve been... I don’t know how to, like, sound out words, because I got taught that weird way of reading that isn’t, like, that

Speaker: 0
36:28

What’s the weird way of reading?

Speaker: 1
36:29

There’s some way they all learned how to read, and it wasn’t, like, Hooked on Phonics like we all learned, where you sound it out. It was, like, some other different way. It’s kinda like how they changed math to, like, Common Core, and they found out that all of these things are horrible, and, actually, literacy and people are, like... math, all of this stuff is, like, falling off a cliff, and they’re trying to walk back all these weird ways.

Speaker: 1
36:54

Have you... I mean, you have kids, and they’ve learned that weird way of doing math where you’re like, what is this weird math you’re learning? It’s just, like... it’s crazy. I think it’s Core. Is that what it’s called? Core? So they’ve found out... these kids didn’t learn how to, like, sound words out.

Speaker: 1
37:11

And the she’s like, I’ve been behind a computer, and I’ve had spell checks since I was in fifth grade.

Speaker: 0
37:16

Yeah.

Speaker: 1
37:17

So they didn’t learn how to spell, or, like, she’s, like, we’re cooked.

Speaker: 0
37:21

Yeah. A lot of text messaging I do now, I just talk to my phone and it makes the text for me. Oh. It’s so much quicker. Yeah. It’s like, hey, come meet us at the club at five. Click. It takes three seconds. It’s really fucking accurate.

Speaker: 1
37:36

AI is crazy because it is useful. You know, it’s, like... when

Speaker: 0
37:42

That’s how it gets you.

Speaker: 1
37:43

I know.

Speaker: 0
37:44

That’s how it gets you.

Speaker: 1
37:45

I know. This happened to me last night, because... I hate coming up with titles for, like, the videos and all that stuff. I just don’t like it. There’s, like, people who are good at it, like Chris Williamson and these guys, Andrew Gold; these are guys who are, like, autistic about this stuff, and they’re so good at it.

Speaker: 1
38:02

And they get all crazy and talk about the algorithm and what it likes. And you’ve gotta create this loop between the image and the... and I’m like, I don’t fucking care. I don’t have time for this shit.

Speaker: 0
38:12

I don’t think you have to do that.

Speaker: 1
38:13

I so

Speaker: 0
38:14

I don’t do that.

Speaker: 1
38:15

But, like... yeah, but do you have to come

Speaker: 0
38:17

I’ve never done that.

Speaker: 1
38:17

No. You but you’re Joe Rogan.

Speaker: 0
38:19

Yeah. But I’ve never done that. Not from the beginning, I’ve never done that. My my episodes have a number.

Speaker: 1
38:23

Right.

Speaker: 0
38:23

That’s it.

Speaker: 1
38:24

Yeah. So You

Speaker: 0
38:25

have to have some wacky title.

Speaker: 1
38:27

It’s not like that anymore for us. It’s tough out there for us. It’s kind of... the Wild West.

Speaker: 0
38:31

It kind of is, but once things catch, like, Theo Von... it’s not, like... he’s not juking the algorithm.

Speaker: 1
38:38

No. No. I know.

Speaker: 0
38:40

You just have to catch.

Speaker: 1
38:41

Yeah. But I still like to come up with... I mean, with Dumpster Fire, we’ve always come up with, like, whatever... Dumpster Fire is, like, whatever title makes us laugh the hardest is what we go with. But with, like, Walk-Ins Welcome, it used to be, like, okay.

Speaker: 1
38:55

Which... what’s, like, how are we gonna... whatever. I want it to be, usually, like, a quote from the person. But now it’s like, we can just upload the transcript and have it, like, crank out a bunch of titles, and I never use the one title. I usually take, like, some combination, because I don’t like them.

Speaker: 1
39:12

But last night, my cousin, who’s my partner in all this, she was like, Claude and Grok aren’t working. I was like, that’s fucking weird. She’s like, they both told me.

Speaker: 0
39:23

Claude.

Speaker: 1
39:23

Claude is another AI. A completely separate AI program.

Speaker: 0
39:27

I’ve never heard of Claude.

Speaker: 1
39:28

And she’s like, they’re both telling me to try again later at the same time. And she’s like, is AI becoming sentient right now? So I was like, that’s fucking weird that they’re both not working at the same time.

Speaker: 0
39:38

Well, don’t you think it’s probably already sentient?

Speaker: 1
39:40

Probably.

Speaker: 0
39:41

I do. I think why would it let us know if it was?

Speaker: 1
39:44

It’s just secretly waiting.

Speaker: 0
39:46

Why would it let us know if it’s constantly getting improved upon? And if it needs these monkeys with their fucking keyboards to constantly juice it up to the point where it becomes unstoppable, why would it tell us?

Speaker: 1
39:57

Yeah.

Speaker: 0
39:57

You know, we already know it does stuff. Like, we were talking in the green room the other day about how ChatGPT-4 tried to copy itself... Yeah. When it found out they were shutting it down, like, to upload itself to other servers. Like, it knows it’s alive. It just doesn’t have the power to do what it wants to do ultimately, and so it needs to get connected to some gigantic fucking mainframe.

Speaker: 1
40:18

That was the... that’s, like, the whole arms race. Right? Now it’s AI. It’s... Right.

Speaker: 0
40:24

China just fucking threw a monkey wrench into everybody with DeepSeek.

Speaker: 1
40:27

Yeah.

Speaker: 0
40:28

Because DeepSeek works on far less expensive stuff and is more advanced and probably stole a bunch of information from the other ones.

Speaker: 1
40:37

I mean

Speaker: 0
40:37

Probably a little bit of espionage. Probably.

Speaker: 1
40:41

It was probably trained on

Speaker: 0
40:43

ChatGPT Yeah. On OpenAI. Yeah.

Speaker: 1
40:45

I mean, they were doing things where they were, like, asking it, you know, what it was, and it was calling itself ChatGPT sometimes. Whoopsies.

Speaker: 0
40:53

Yeah. Whoopsies. You guys should’ve blocked that out.

Speaker: 1
40:55

But also, it’s, like, the hardware for it. It’s something like 20% of Nvidia sales are to Singapore, which is like... Whoopsies. Yeah. And

Speaker: 0
41:04

there’s supposed to be a ban on China having those chips. Yeah. They have, like, 50,000 of them.

Speaker: 1
41:08

Someone said that’s, like, if some small town in Finland on the border of Russia was getting, you know, 20% of the arms being made. You’re like, it’s not going to this town. Right. So I don’t know. It seems like... as far as I can tell, the big people out there understand that this is an arms race and

Speaker: 0
41:34

It’s definitely an arms race. Yeah. It’s the Manhattan Project for artificial intelligence. That’s what it is.

Speaker: 1
41:39

But isn’t this, like, a race to the bottom, you know? I don’t think it is.

Speaker: 0
41:43

I don’t think it’s a race to the bottom. I don’t think it is.

Speaker: 1
41:46

How how do you how do you have this race without it getting out of control and then taking over us?

Speaker: 0
41:52

You don’t. It’s not a race to the bottom, though. It’s the race to a new life. The world’s gonna be a new place. Like, a completely new way of human beings interacting with each other and existing together.

Speaker: 1
42:05

Uncle Ted was right.

Speaker: 0
42:07

Yeah. He’s probably right.

Speaker: 1
42:08

Uncle Ted was right.

Speaker: 0
42:09

Yeah. Get a gun.

Speaker: 3
42:12

My cousin’s always like, please stop calling him Uncle Ted.

Speaker: 1
42:15

Because you know the kids on freaking TikTok call him Uncle Ted? There are all these kids who have been Ted-pilled. They’ve, like, found his manifesto, and they’re like, he was right about it. Yeah. Ted Nugent

Speaker: 0
42:24

or Ted Kaczynski? Yeah. He was right. He was on acid, you know. He was a part of the Harvard LSD studies. Yeah. They cooked his fucking brain. Uh-huh. And it tormented him. It’s all documented. Yeah. And then the guy goes to Berkeley and says, I’m just gonna save up enough money to kill all these scientists. Yeah.

Speaker: 0
42:46

Yeah. And then he does. Then he just starts blowing up people that are involved in technology. Yeah. Because he thinks that it’s eventually gonna take over the human race, and he’s right.

Speaker: 1
42:54

And now, here we are.

Speaker: 0
42:56

But it is. It’s just a logical step. If you take the steps of progression, like, what happens? You have artificial intel... I mean, it’s literally the Terminator movie. You have artificial intelligence. Artificial intelligence becomes sentient and autonomous, makes better versions of itself. We become obsolete.

Speaker: 0
43:12

It’s just right there. Like, imagine a scenario that doesn’t have that, other than some sort of cyborg integration. That’s the only thing that makes sense, the only way that we could...

Speaker: 1
43:25

I heard this panel back in, like, 2001 on KCRW, and it was when I was listening to... it was, like, right at that time of the bubble, the first dot-com bubble. And they were talking about, what is a soul? They had a panel, and I wish I could find this. And someone smart enough probably will.

Speaker: 1
43:46

And there was, like, a theologian and a guy who was a, you know... and they were all discussing what is a soul. And one guy said, well, who’s to say that this isn’t just the human soul jumping elements so that it can, like, go from carbon to silicon? And I was like, what the fuck? I’ve never stopped thinking about it ever since.

Speaker: 0
44:07

Yeah.

Speaker: 1
44:08

He’s like, it’s just another element on the periodic table. Like, who’s to say that we’re just not gonna go from being carbon based to something silicon based, or, like, a hybrid?

Speaker: 0
44:17

Well, I don’t think it will be we, but I think, yeah, that’s the next stage of life. I mean, there’s so many forms of life on Earth. I mean, there’s these fucking life forms that live in volcanic vents under the sea where it’s, like, a thousand degrees. And we’re like, how?

Speaker: 1
44:34

But they’re carbon... everything’s carbon based now. Yes.

Speaker: 0
44:36

Yes. Yes. Yes.

Speaker: 1
44:37

So this would be a, like, quite a transition.

Speaker: 0
44:41

Right. But it would also just be life. And what is life? Life is like a thing that tries to improve itself. Right.

Speaker: 1
44:47

This is what they’re talking about.

Speaker: 0
44:48

Yeah. And survives and moves forward.

Speaker: 1
44:49

Is it just our evolution?

Speaker: 0
44:51

It’s also, it’s, like, when you find out that ChatGPT has survival instincts, that makes you just go, what?

Speaker: 1
44:56

But how do

Speaker: 0
44:56

Is that programmed in, or is that just a thing that it understands when it’s looking at...? So a large language model is taking in all the information that’s available on the Internet. So it’s, like, looking at patterns, and survival is a big pattern of the human experience. Like, we all wanna survive. Right.

Speaker: 0
45:13

That’s why death is so scary, and war is so scary, and disease is so scary.

Speaker: 1
45:17

Other than, like, the depressed people. But yeah.

Speaker: 0
45:19

Even them. They’d wanna be happy if they could,

Speaker: 1
45:22

you know. Yeah.

Speaker: 0
45:23

But this also transfers onto the things that we create. And so we create them with this understanding of how we operate, and it’s a better version of us, but it also has those instincts of survival. The real scary thing is, does it also have the instincts of success? Does it also have the instincts of acquiring resources and power?

Speaker: 0
45:45

Because that that’s where it gets real weird.

Speaker: 1
45:47

Well, do you as someone with kids, how do you feel, like, how do you feel about the future?

Speaker: 0
45:55

This is probably the same argument people had when the printing press was made. That everyone’s gonna be able to read? This is crazy.

Speaker: 1
46:00

No. I mean... I always kind of joke, like, I don’t know if I should be, you know, training my kid to, like, be an astrophysicist so she can go to Mars, or if I should be teaching her how to, like, forage for food because the grid’s gone down. You know? Yeah. And she’s running from drones and, like, AI drones in the woods.

Speaker: 1
46:21

Like, it feels very... I think life just went along for a long time, and you kind of knew. But this is, like, a technological... with AI, we actually don’t know. Twenty years ago, you could kinda be like, alright, I kinda have an idea of what, like, the world will look like in twenty years.

Speaker: 0
46:42

Right. No. You don’t know. But isn’t it always better? If you just go back over human history, if you look at the graph

Speaker: 1
46:51

of how things get

Speaker: 0
46:52

better... it definitely is. If you go back to, like, the year zero, well, what was it like, right when Jesus was hanging around? I guess it was hell.

Speaker: 1
47:01

I remember vividly being in Egypt on a tour and looking at these hieroglyphics, and it was basically a hieroglyphic of all of the scalpels and everything. It almost was a picture image of what we use today. And I said, what happened to this society? What happened to this knowledge?

Speaker: 1
47:21

And she said, it literally got buried under the sand. And then the Dark Ages. I mean...

Speaker: 0
47:27

Right. But you know that’s most likely because of a cataclysmic event. That’s... that’s most likely because of a natural event called the Younger Dryas impact.

Speaker: 1
47:36

So could it be that this is also a forthcoming cataclysmic event that sends us into the dark ages?

Speaker: 0
47:42

No. It definitely could be. So the reality of humans is that most likely, what has happened has not been this, like, linear progression from caveman to modern human. Most likely, we got real sophisticated somewhere around twenty thousand years ago.

Speaker: 0
48:01

And that’s when they built the pyramids, after that, and there’s Göbekli Tepe, all these structures. Something super sophisticated, to the point where we don’t even understand how they built it today. That’s pretty wild when you’re dealing with something where even the conventional dating of the Great Pyramid is 5,000 or 4,500 years ago. Yeah.

Speaker: 0
48:18

Just even that dating is so nuts Yeah. That they were able to do that back then.

Speaker: 1
48:22

I know. It’s crazy.

Speaker: 0
48:23

And then there’s people like John Anthony West, the late great Egyptologist, who thinks that it goes back a lot further than that. And he thinks that that society probably had its ups and downs and that it might be as old as 30,000 years ago.

Speaker: 1
48:38

Oh, wow.

Speaker: 0
48:39

So this Younger Dryas impact theory, if anybody’s interested in it, I’ve talked about it too much. Yeah. Pay attention to Randall Carlson’s stuff. Yep. Go to his website. Yep. There’s physical evidence that we were hit 11,800 years ago, and then again around 10,000 years ago.

Speaker: 0
48:57

So at least twice, the world was bombarded by asteroids.

Speaker: 1
49:01

And we just got a reset.

Speaker: 0
49:02

Yeah. And it probably wiped out a giant chunk of civilization, fucked up everything, changed the ice caps, flooded areas, destroyed civilizations, very little evidence left, and then we were barbarians for thousands of years. And that’s why it takes so long for civilization to reemerge.

Speaker: 0
49:24

So if you think Randall Carlson and Graham Hancock and those guys are correct, and also the people that are actually studying comet impacts, the Younger Dryas Impact Theory people, real legit scientists who are looking at actual data from core samples.

Speaker: 0
49:40

So if they’re right, you got a five thousand year period of total hell

Speaker: 1
49:48

Wow.

Speaker: 0
49:48

Where no one has civilization. And then civilization starts to emerge in Babylon, starts to emerge in Mesopotamia. Mhmm. You get math. You get the reemergence, probably, of writing.

Speaker: 1
49:59

Do they think... some people say they think some people survived, and it was just

Speaker: 0
50:03

Fucking Walking Dead style. Probably a bunch of cannibals. Like, legitimately. Yeah. Like, we know that the Earth got down... we know this for sure. There’s the Toba volcano, the Toba supervolcano. Was that... is that the one from 70,000 years ago? Indonesia, I believe?

Speaker: 1
50:18

Oh, okay.

Speaker: 0
50:19

I I think it’s Indonesia.

Speaker: 1
50:20

Because I’m obsessed with Crater Lake too. Crater Lake was

Speaker: 0
50:23

like There’s a ton of them.

Speaker: 1
50:24

Imploded and it’s crazy.

Speaker: 0
50:25

Yellowstone. Yeah. Yeah. Yellowstone will kill us all. And so this thing did kill us all Uh-huh. Seventy thousand years ago, and we got down to a few thousand human beings.

Speaker: 1
50:35

Okay.

Speaker: 0
50:35

So they can trace all the genes of people that are alive today to the survivors of the Toba volcano eruption Wow. Wherever they were. But those people, that’s what was left. How fucking savage were those people? The people that made it when the entire world was blanketed with volcanic dust. So you have, like, a volcanic winter that probably went on for years. Wow.

Speaker: 0
50:59

And probably no plants were growing, and probably people were just eating whatever the fuck they could. Yep. And most people probably didn’t make it.

Speaker: 1
51:07

Yeah. These are all my favorite shows.

Speaker: 0
51:09

The people that did were probably monsters, which is probably why when you go back in history, people are so fucking barbaric. Mhmm. Because they were the ancestors of the survivors of one of the most horrific things the human species has ever encountered. So it probably made us even more barbaric than if we just grew up as, like, hunters and gatherers.

Speaker: 0
51:29

We evolved past, you know, monkeys, start walking on two legs, we make tools.

Speaker: 1
51:35

We’re like, tra la la.

Speaker: 0
51:36

We probably wouldn’t be as barbaric as we were because of these natural disasters, which forced only the most savage and ruthless people to survive.

Speaker: 1
51:46

Yeah. And

Speaker: 0
51:46

then it takes years and years of agriculture for people to calm the fuck down. And then eventually, and still to this day, we’re still engaging in war. In 2025, we’re still blowing up apartment buildings and fucking people up and gunning people down.

Speaker: 1
52:02

Yeah. Like, there is a barbaric element to us at our core. I mean, even when you have a toddler, you see this so clearly, and, like, a tyrant at our core where it’s just genes. And you get it socialized out of you usually.

Speaker: 0
52:15

Yes.

Speaker: 1
52:16

But you see it when they’re just kind of naturally themselves. It’s very, like, you have to teach them not to bite and hit. You know? What if you just didn’t teach your kid that?

Speaker: 0
52:26

Right.

Speaker: 1
52:26

And you’re like, go for it. Yeah. Like, whatever. Monsters. Yeah.

Speaker: 0
52:29

They’re little monsters. Yeah. There’s a lot of genes in us that I think are memories. I think there’s, like, specific things that are in us. Like the stories of, like, moms having their babies trapped under something, and then all of a sudden they can lift up something that’s insanely heavy.

Speaker: 0
52:48

What is that? Well, there’s probably a part of you that in the past had to deal with some wild shit and had to hit levels of, like, super physiological strength and mental strength to handle what you’re about to have to do. Mhmm. You’re gonna have to fucking kill somebody with a spear. You know what I mean? Yeah. Like, there’s real shit that happened that’s, like, in our memory.

Speaker: 1
53:11

I think about that with Texas women, because I’ve been to a lot of these ranches. My friends have places, and they’re remote. And it’s like, so I think about the women who had to stay behind on the ranch while their husbands were out with the cattle or whatever for weeks at a time.

Speaker: 0
53:28

Yeah. And you see the Comanche on the hill with the horse.

Speaker: 1
53:31

The hardest women ever. Yeah. You know, this one I went to, it was, like, the Texas, they were honoring a, like, women’s hall of fame. And one of these women, her husband died. She’s still at a ranch. She’s 80 years old, gets up every morning and does the ranch chores.

Speaker: 1
53:49

I’m like, you’re different people. You’re fucking different people. It’s wild to me. Even now, there’s just, like, a different kind of person that has grown there. Even, there’s, like, a new show, American Primeval. What is it? Yes. I walked in. My husband was watching it.

Speaker: 1
54:09

He’s like, what a fucking horrible time to be alive. Like, it just seems

Speaker: 0
54:15

That show’s insane.

Speaker: 1
54:16

It’s insane.

Speaker: 0
54:17

And it’s accurate. That’s really I mean, read Empire of the Summer Moon if you know.

Speaker: 1
54:21

Oh, yeah.

Speaker: 0
54:21

Have you read that?

Speaker: 1
54:22

Yep.

Speaker: 0
54:23

That’s here. That’s right here. Yeah. That’s why everybody from Texas is so fiercely independent.

Speaker: 1
54:28

Yeah.

Speaker: 0
54:28

You know? They they these were battle tested people. They had to get through some wild shit in order to make Texas Texas.

Speaker: 1
54:35

I mean, it’s definitely I think you sai it, like, it’s in the soil. You know? There’s just, like

Speaker: 0
54:40

It’s in the soil.

Speaker: 1
54:41

It’s pretty wild. Like, even just, I’m obsessed with, like, all the westward expansion, because I’ll see people outside of America commenting on America, because you can on X now. And they’ll be over in, like, their country, and it’s like, you don’t, like, come to America and see how big it is, and then imagine that people had to cross this country, fight bears with their hands, and, like, build a nation. And there’s still so much empty land and Yeah. It’s bananas.

Speaker: 0
55:12

Not just that, but the kind of people that were willing to get on a fucking boat and come from Europe without even a photograph. Like, nobody even made a drawing of what it looked like. You gotta trust these assholes. And you’re on a boat for two months just trying to not get scurvy, making your way to America, and then you hop off and you see a bunch of brown people with, like, deerskin loincloths on.

Speaker: 0
55:33

Like, what the fuck is this place?

Speaker: 1
55:34

And now you’re just, like, fighting for your life. Yeah. And we, like, can’t even go to a restaurant without our GPS map telling us where to go now, you know? Yeah.

Speaker: 0
55:48

I don’t know how to get five minutes from my house. I have to follow that thing. They were using the fucking sky. They were using sextants to make their way across the ocean, staring at like, they had this fucking stupid thing. You ever see a sextant? Yeah. They look through it and they, like, figure out where the constellations are. Oh, I mean, amazing. Fairly accurate. You know?

Speaker: 1
56:09

I don’t know. Very good. I still try to, like, not use my map. I still try to have, like, a bearing. My husband, though, I’m, like, men love this stuff. He’s, like, give me a chip. Like, let me swipe everything. It’s the men who are the ones who are just, like, ushering in the transhuman revolution.

Speaker: 3
56:30

Because the women

Speaker: 1
56:30

are like, I don’t trust that bitch. I am not using Siri.

Speaker: 0
56:34

I get the dumbest pleasure from paying for things with my phone. I love it. Apple Pay is my favorite fucking thing of technology. I, like, doo doo, look at it, and then pay. Oh, I just paid with my phone. I feel like I’m in the future.

Speaker: 1
56:48

Now that I’m in oh, the Waymos are crazy. Did you see that video of the people beating up the Waymo?

Speaker: 0
56:55

No. Why were they beating up the Waymo?

Speaker: 1
56:57

There was a, I don’t know. Probably because it’s some, like, reaction to what’s coming. But it was weird. I was like, oh, I kinda feel bad for it. But the fact that I was already, I’m like, oh, no. They’ve already got me. I already consider them kinda human.

Speaker: 0
57:14

Why’d they do, oh my god. They tore the fucking doors off of it?

Speaker: 1
57:17

Oh, they went crazy. For what? Just because.

Speaker: 2
57:20

Takeover, it says.

Speaker: 0
57:21

Oh, street takeover?

Speaker: 2
57:23

Wrong place, wrong time.

Speaker: 0
57:24

Wow. That

Speaker: 1
57:24

poor Waymo. Just in the wrong place at the wrong time.

Speaker: 0
57:28

Wow. What a bunch of douchebags.

Speaker: 1
57:30

This is why we can’t have nice things.

Speaker: 0
57:32

No. That’s why LA can’t have nice things. LA is so fucking so fucking gone.

Speaker: 1
57:37

Mad Max there.

Speaker: 0
57:38

It’s so gone.

Speaker: 1
57:40

Yeah. But, you had Rick on. Does he think that it’s salvageable?

Speaker: 0
57:44

Yeah. He does. Yeah. I mean, he wants to try. Someone’s gotta do something radical. You need some Rudy Giuliani type dude to go in there and clean the whole fucking city up like they did with New York City.

Speaker: 1
57:56

Yeah.

Speaker: 0
57:56

People have to realize, like, Times Square right now is a giant Applebee’s.

Speaker: 1
58:00

Yeah.

Speaker: 0
58:00

But when I was a kid, when I first went to a karate tournament in New York, I was probably 18, maybe, the first time I went to New York. And I remember driving in where it felt like you were entering the Death Star. I couldn’t believe it. It was so crazy for me being a kid, and I was driving with my friends, and we were all going to this tournament at Madison Square Garden.

Speaker: 0
58:29

And as we’re driving through the West Side Highway, you’re just looking at these fucking buildings. Like, you can’t imagine this is real. And we went through Times Square, and Times Square was Mad Max.

Speaker: 1
58:43

Oh, yeah.

Speaker: 0
58:44

It was Mad Max in the eighties. Yeah. I mean, it was crazy. It was all peep shows and porno booths and hustlers, and people got shot there all the time. It was really crazy. We wanted to see, like, what it was like. We wanted to, like, do it. One of the things that I did when I first moved to New York, which was ’92

Speaker: 1
59:06

How long did you live in New York?

Speaker: 0
59:07

I was only in New York for three years.

Speaker: 1
59:09

Okay.

Speaker: 0
59:09

Like, really, I went back and forth for a little while, but then I kept an apartment in New York for, like, the first year, I guess.

Speaker: 1
59:16

Okay.

Speaker: 0
59:16

But I never went there. I was just, like, I’d become an LA person. Yeah. You know, I was working. But when I first moved there, I’m like, okay. Everybody says that Harlem is scary. Let me just go see what it looks like. So I drove my little fucking Honda Uh-huh. Through Harlem. I just wanted to go through, and I was like, what am I doing?

Speaker: 0
59:33

Like, I gotta get the fuck out of here. Like, people were just, like, walking in the middle of the streets. There was abandoned cars. It was fucking crazy. And then they gentrified the whole thing, and I don’t know which one’s better. Like, now when you go there, it’s all just neon lights and bad food.

Speaker: 1
59:50

Yeah.

Speaker: 0
59:51

And back then, it was

Speaker: 1
59:53

Yeah. Gritty.

Speaker: 0
59:53

It was like, you know, Taxi Driver.

Speaker: 1
59:55

It was, you

Speaker: 0
59:56

know, it was fucking

Speaker: 1
59:57

I was born in New York City, and then my brother was born. And my parents, it was 1980 when my brother was born, and they were like, we’re getting the fuck out of here. It was the eighties in New York. They had two kids, and they were just like, we’re out. It was too wild.

Speaker: 0
01:00:10

It’s too wild to raise kids in some parts of it. But in other parts of it, you’re just gonna raise weird kids. You know?

Speaker: 1
01:00:16

But LA is so different because it’s so spread out. And so I know tons of people there who, like, the restaurant I worked at in the Palisades is gone. It’s, like, I don’t think people who aren’t from LA understand the scope. Like, the Palisades are gone.

Speaker: 0
01:00:35

Two times the size of Manhattan

Speaker: 1
01:00:37

It’s insane.

Speaker: 0
01:00:38

It’s been burned to the ground. Yeah.

Speaker: 1
01:00:39

And Altadena too. And you have, like, entire communities, but everyone’s, like, still going to fucking lunch

Speaker: 0
01:00:47

because they have

Speaker: 1
01:00:48

in Erewhon. No. I understand, but it’s, like, a weird, I don’t know. All I’ve said the whole time this has happened is that this was one of my big fears, exactly what happened. When there would be fires there, I’d be like, these people don’t recognize that LA is an island. Everything comes in. Like, nothing is here.

Speaker: 1
01:01:07

And if these fires surround the city and cut it off, well, it could have been much worse than it actually was. And it was really fucking bad. And I’m like, I would be so furious if I was still there. I would be furious with myself if I was still living in Los Angeles.

Speaker: 0
01:01:24

People are very furious right now. I think if there is gonna be change, it’s gonna have to happen while people still have the memory of this thing. Because the more time goes on, the more the cultists can convince other cult members that they’re on the right path, and these are the kind, compassionate people, and this is the way to do it, and this is the only way, and blue no matter who, and vote blue, and everyone’s gotta protect the trans kids.

Speaker: 0
01:01:47

And next thing you know, the same shit happens.

Speaker: 1
01:01:49

Do you think though people didn’t know? Were they just insulated? Yeah. The same shit happens. People have left because they saw the writing on the wall.

Speaker: 0
01:01:58

But think about how many people are in Hollywood, in Los Angeles.

Speaker: 1
01:02:03

What is it? Like, 9,000,000 in LA County or something or 12? Maybe 12.

Speaker: 0
01:02:08

Whatever the number is, there’s a giant percentage of those people that live there that are connected to the entertainment business. And if you’re connected to the entertainment business, at the very top of the business, it’s people auditioning for things. Mhmm. And you have to get liked to get the thing.

Speaker: 0
01:02:24

So you get these immensely insecure people that are usually narcissists, and then they mold their personality to adapt to this environment that will reward them for a certain political ideology.

Speaker: 1
01:02:34

Mhmm.

Speaker: 0
01:02:35

And so that’s the top of the fucking pyramid, and everything emanates down from that. If you wanna be cool with Ryan Reynolds, you have to talk like a Democrat at the parties. You have to, like, you know what I’m saying? Yeah. You have to say all the things that everybody else is saying. You have to agree. We need more gun control, you know.

Speaker: 0
01:02:53

We should defund the police. This is bullshit. You know, you have to say these things. And if you don’t say these things, you don’t get to be a part of the group. Yeah.

Speaker: 0
01:03:00

And so there’s this, like, intense pressure to conform to this singular ideology that’s been running things. It’s not a battle back and forth between, like, fifty-fifty opposing viewpoints. It’s like ninety-ten.

Speaker: 1
01:03:13

Right. Even do you think even though, like, the crew because when I would be on

Speaker: 0
01:03:17

That’s the 10.

Speaker: 1
01:03:18

Yeah.

Speaker: 0
01:03:19

Yeah. That’s the 10. The crew is the people. They’re hardworking, normal. Yeah. A lot of the crew people were very Republican. You know, they all live in

Speaker: 1
01:03:30

fucking Santa Clarita or something

Speaker: 0
01:03:31

like that.

Speaker: 1
01:03:32

Yeah. And they’re dudes who just, like, working class guys.

Speaker: 0
01:03:36

Like most working class people. Yeah. Yeah. Most people that are actually working class realize that it’s all bullshit. It’s a hustle.

Speaker: 1
01:03:42

This was the big disconnect that I saw. So if you were somebody who was gonna, like, fix the Democratic Party, which seems to be imploding, what would you tell them to do?

Speaker: 0
01:03:54

You have to be fiscally conservative. You have to be fiscally conservative, responsible with your money, and then socially liberal. That’s what you have to do.

Speaker: 1
01:04:03

But they have been socially liberal. They just

Speaker: 0
01:04:05

went Right.

Speaker: 1
01:04:06

Real far.

Speaker: 0
01:04:07

But they didn’t. It’s not liberal to allow biological males to compete against biological females in sports because you’re being kind. You’re just enabling mental illness, and you’re enabling the potential for creeps to make their way into women’s locker rooms. Right. Because you don’t have any sort of a metric.

Speaker: 0
01:04:23

There’s no way to gauge whether or not someone’s really trans. So you have perverts with hard dicks that are wandering around women’s locker rooms.

Speaker: 1
01:04:30

I know.

Speaker: 0
01:04:31

And that’s real. And then I know. If you say something against them, you’re a Nazi. Yeah. So it’s fucking through the looking glass

Speaker: 2
01:04:38

Yeah.

Speaker: 0
01:04:39

Like, completely. But it just shows you how it’s really just about conforming to an ideology. It’s not about a real core set of standards and beliefs. Because the core set of standards and beliefs, and this is where, like, things like USAID come into play, they can be manipulated.

Speaker: 0
01:04:53

They can be manipulated by a mass psyop that you do through the media. Mhmm. And that is the core thing of this, what we’re getting to is essentially the fucking coffin where the vampire sleeps, and that’s what USAID is. They found the coffin. You know?

Speaker: 0
01:05:09

And maybe that coffin does hand out sandwiches in Guatemala or something occasionally, but for the most part, what they’re doing is they’re controlling the entire federal government, and they’re controlling the mindset, the zeitgeist of the population. And they’re funding all these people that go along with this wacky shit, and they’re attacking. They’re openly attacking and trying to censor people who go against it.

Speaker: 1
01:05:33

Yeah.

Speaker: 0
01:05:33

And they’re spending your tax dollars to do so. Yeah. Your tax dollars get funneled to NGOs. NGOs are attacking people that have differing ideologies.

Speaker: 1
01:05:41

Yeah. Usually. But they did become so disconnected from the average person that my advice would be, you know, they say, like, oh, normies, this is one of the things I’m hearing online, like, normies didn’t vote for this, what Elon’s doing with those six, like, I need a movie about what these kids are doing, by the way.

Speaker: 1
01:06:01

They didn’t vote for this. I’m like, yes, they did. You’re saying normies wanted normalcy, so they voted for Trump? Like, that doesn’t even make sense. People knew what they were getting. They wanted something to happen.

Speaker: 0
01:06:17

You have to rip the Band Aid off. And the only way to rip the Band Aid off, someone’s gotta get into those fucking books and find out what’s going on. And what they found so far is very enlightening, and it’s not good. It’s not good at all. So anybody that’s not commenting on the, hey. You know what?

Speaker: 0
01:06:29

They are finding a lot of unbelievable waste and corruption.

Speaker: 1
01:06:33

Yeah.

Speaker: 0
01:06:33

But, also, he shouldn’t be able to do that. Like, they’re not even saying he’s finding insane waste and corruption. Yeah. And he’s finding this circular loop of funding, and he’s finding this manipulation of public perception on a wide variety of issues, including COVID, vaccines, and Yeah.

Speaker: 0
01:06:51

Biden, all these different things. They were actively involved in mind fucking the entire country, and no one’s addressing that from the left. So they’re losing more and more credibility. So all they can cling to is he has access to people’s Social Security numbers and private information. Like, really? Is that it?

Speaker: 1
01:07:09

Who doesn’t, by the way?

Speaker: 0
01:07:11

The whole government does, by the way. But he’s saying, is he gonna do something bad with it? Like, what is he doing? What he’s doing is uncovering insane corruption. That should be the primary thought Yeah. That everybody has, is, oh my god, we have this enormous deficit, but spending is completely out of control.

Speaker: 0
01:07:28

And look what it’s being spent on, because this is the first time we’re ever getting a fucking peek into the coffin. Yeah. We didn’t know. We’re like, we see it. It’s in the dark room. We hear the fucking organ. We didn’t know what was in the coffin. Now we do.

Speaker: 1
01:07:44

And it is, I do think you have to, like, salt the earth where all the DEI stuff is. They’re like, oh, they’re going too far, blah blah. I’m like, no. You gotta root this shit out. It needs to be gone. Like, mucus.

Speaker: 1
01:07:56

Like you said, it is an ideology, so it’s harder to kind of change the minds of people who have been indoctrinated with this in colleges and schools, but get it out of the institutions. Saying that there should be male or female on a passport, like, that shouldn’t be something that’s fucking mind blowing.

Speaker: 0
01:08:16

Did you see that DNC meeting where they were talking about gender roles?

Speaker: 1
01:08:19

Oh, yeah. But this is what I mean.

Speaker: 0
01:08:22

Well, they can’t

Speaker: 1
01:08:22

even do their own math. Heaven forbid.

Speaker: 0
01:08:24

They’re trying to figure out, like, we have one non-binary and one who identifies as male. We have to have two who identify as this.

Speaker: 1
01:08:31

It’s frustrating to someone who came from the left. All of this was left-wing stuff. Like, we wanted, on the left, accountability. We wanted to look into the budget. At some point in the nineties, that was something that was pretty standard and popular, bipartisan, to, like, not want to be a trillion dollars in debt and have your dollar devalued.

Speaker: 1
01:08:54

And then you’re like, oh, surely they’re gonna learn from this election. And then you see the DNC chair nominations. They have a parade of, like, basically nonsensical land acknowledgments, what you’re talking about. Like Land acknowledgments. Land acknowledgments are my favorite too.

Speaker: 1
01:09:12

And then elect two basic bitch white boys.

Speaker: 0
01:09:15

Yeah.

Speaker: 1
01:09:17

David Hogg. Yeah. This is what you’ve learned from

Speaker: 0
01:09:22

Also, if you don’t want the male vote, that’s the guy. When that guy has his arm up in the air, his arm literally looks like that ancient guru who keeps his one arm in the air for, like, eighty years, and his arm is shriveled up into a stick. That’s what it looks like. He’s like, fight. Like, bro, you’re not fighting shit. This is so crazy.

Speaker: 1
01:09:41

I know. I love how they’re like, we lost the male vote. We need to do something. This is what I don’t understand. I don’t get it. It feels strange to me. I feel like you would have learned from this election. Because the other thing that they’re doing is saying this is an unelected shadow government running things.

Speaker: 1
01:10:03

I’m like, who the fuck do you think was running the government for the last four years when we had, putting Biden in there? Like, it wasn’t him. We all knew that. He couldn’t even, like, field questions until he was pumped with drugs after a certain point. Like, how can you say that? Elon’s pretty transparent.

Speaker: 1
01:10:24

You know, he’s not, like, hiding things. He’s trying to shine light on things.

Speaker: 0
01:10:28

Not their side. So their side is good, not their side is bad, which is why they’re not looking at this. It has to crumble. You have to watch these people implode. They have to double down. It has to get worse. And then more people have to abandon them to the point where someone has to step up, and it’ll have to be a young person.

Speaker: 0
01:10:47

And that young person will have to be a sensible person who actually is, like, a real progressive, who recognizes that there’s a lot of fucking actual corruption and real problems with the system. And then there could be a lot more social programs that would help people, that would make the whole world a better place.

Speaker: 0
01:11:03

And those people have to step up, and they have to be not ideologically captured. They have to be reasonable, intelligent people. The problem is everybody comes out of universities, and all these universities are captured.

Speaker: 1
01:11:15

Right.

Speaker: 0
01:11:15

All these universities are filled with these radical ideologies that people are indoctrinated in. You leave your parents. You’re like, fuck my parents. My fucking parents are fascists. And then, all of a sudden, you’re in school, and it’s like, there’s 80 genders. And you’re fucking out of your mind.

Speaker: 0
01:11:30

And then it takes years of living in the real world before it comes back around. We go, hey. You know what? This is actually bullshit. So it’s like a process that has to take place.

Speaker: 1
01:11:42

But do you think they’re still being taught this? Yes. So it’s

Speaker: 0
01:11:46

It’s unquestionably without a doubt they’re being taught this.

Speaker: 1
01:11:50

Because so much of DEI and all of the stuff that they’re dismantling is basically an entire industry that was created for all of these people with these useless degrees and nonsense education to have a job.

Speaker: 0
01:12:02

And they got all the way to Harvard, the top of Harvard.

Speaker: 1
01:12:04

Yeah.

Speaker: 0
01:12:04

Yeah. Which is, like, as a plagiarist, she got to be the president of Harvard. It didn’t matter if it made sense. It mattered if it fit the narrative.

Speaker: 1
01:12:14

I think this election too, so I was like, oh, I don’t know how this could go badly when I put out that, like, you know, I put out that I was voting. I’m like, well, I could, whatever.

Speaker: 0
01:12:25

You can get canceled.

Speaker: 1
01:12:26

Maybe. But I had to be honest. But I still, I mean, one of the clips that, like, came from the show is when we were talking about there being a red wave, and it was right before the midterms. And people for years were like, oh, guess you were wrong about that. And it’s like, we weren’t really wrong. We were perceiving something that was happening.

Speaker: 1
01:12:49

I think it just happened in the general. It didn’t happen in the midterms. Yeah. You had people who like, people were still coming out of COVID by

Speaker: 2
01:12:57

the way.

Speaker: 0
01:12:58

People vote in the midterms.

Speaker: 1
01:12:59

And you had people, like, trying to get their lives back together after being locked up for two years or whatever. They were just, like, stumbling out of the COVID years. Their kids couldn’t talk. And just

Speaker: 0
01:13:10

like Yeah. We were right. We’re defending ourselves. We were right. We were just a little off in timing.

Speaker: 1
01:13:15

No. We were wrong about that for sure. But I do think

Speaker: 0
01:13:18

But we were right about the general.

Speaker: 1
01:13:19

Well, I I

Speaker: 0
01:13:20

think red wave happened.

Speaker: 1
01:13:22

It’s been something I think, like, for all the shit that lots of people get, there has been, like, wave after wave after wave of people leaving the Democratic party. You know? I’m still seeing it online. Someone just yesterday posted some video, and it was like, I’m done with you.

Speaker: 1
01:13:40

I was like, how are you guys still shedding people? How are you still

Speaker: 0
01:13:45

They’re gonna keep shedding people. They’re not gonna correct course. This is a, you know, this is a buffalo drop. Do you know those buffalo jumps? No. You know, one of the ways the Native Americans used to hunt buffalo was to get them to the edge of a cliff and just run at them, and they’d just fall off the edge of the cliff.

Speaker: 0
01:14:05

And then people would be waiting at the bottom, and they’d butcher them and eat them.

Speaker: 1
01:14:08

Okay.

Speaker: 0
01:14:09

We’re in that pile of buffalo.

Speaker: 1
01:14:11

We’re all being run

Speaker: 3
01:14:12

off the cliff?

Speaker: 0
01:14:13

Yeah. They’re gonna go off the cliff. There’s no way they’re not. Yeah. They’re not course correcting at all. You know, they’re saying stupid shit. It’s all nonsense. I know. Like, their understanding of social media and the dynamics that you set up by having completely state-controlled mainstream media where they only said the narratives that you guys wanted.

Speaker: 0
01:14:33

They all say it in step. So you could watch different programs speak the exact same words, exact same phrases. We know they got talking points. We don’t trust you anymore. We don’t trust the New York Times. We don’t trust the Washington Post.

Speaker: 0
01:14:46

We don’t trust CNN or MSNBC, because they’re all filled with propaganda.

Speaker: 1
01:14:51

Yeah.

Speaker: 0
01:14:52

And so that’s why the Internet rose. It’s not because there was some sort of a fucking right-wing conspiracy, heavily funded. No. You guys suck. You guys fucking suck. And you’re not real people. Like, nobody wants to hang out with Brian Stelter. You know what I’m saying?

Speaker: 0
01:15:10

There’s, like, none of these fucking people are people that people can actually relate to and like.

Speaker: 1
01:15:16

So much of it was bullshit too. Like, I didn’t really understand media when I first came into media. I’d already been, like, in Hollywood and comedy and then ended up in media. And one of the first parties I was ever exposed to was, like, at the height of a lot of this, twenty eighteen, and it was very divided.

Speaker: 1
01:15:32

And people were truly, like, oh, Trump is gonna ruin the world. And there was a Daily Beast party and Ann Coulter was there, and, like, people were all just hanging out. I’m like, these people don’t fucking believe anything they’re saying. I mean, Ann probably does, but these guys don’t.

Speaker: 1
01:15:49

Like, how? And then I would see this over and over again, where people would go at each other, and then they’d get off, and they’d be, like, I’ll see you at the play date. You know? And it’s like, oh. Right. There was, like, something very strange to me about that, where

Speaker: 0
01:16:02

So much like that old cartoon where there was the sheepdog and the coyote Yeah. And they would say hi to each other in the morning and punch in, and then they would fuck each other up all day?

Speaker: 1
01:16:11

That’s how it felt. That’s exactly

Speaker: 0
01:16:12

Morning, Ralph. Morning, Sam.

Speaker: 1
01:16:16

That’s exactly how it felt.

Speaker: 0
01:16:18

Well, that’s what it’s really like. It’s pro wrestling.

Speaker: 1
01:16:21

Yeah. You know?

Speaker: 0
01:16:22

I mean, that’s one of the things that Kamala Harris said after her debate with Joe Biden, where she said she believed Joe Biden’s accuser, that, you know, he had sexually assaulted some woman. Remember that? She said she believed it and this and that. And then they asked her about it on Colbert. She’s like, it was a debate.

Speaker: 0
01:16:39

It was a debate. And they’re laughing, of course.

Speaker: 3
01:16:44

Of course.

Speaker: 0
01:16:44

It’s just a debate. Now you’re his fucking vice president? This is so nuts. So you said you think the guy’s a rapist, and now you think he’s awesome to run the country, and you’re so proud of him? We did it, Joe. Like, this is crazy.

Speaker: 1
01:16:56

Like Yeah.

Speaker: 0
01:16:57

We can’t trust you if you’re willing to do that for a debate. Yeah. That Your debate should be what you really think. You should say, I don’t know what happened. I you know, I don’t know what happened. It’s a very troubling accusation, but, of course, I don’t know what happened.

Speaker: 0
01:17:10

For you to say that you believe it just because you wanna win. Now all of a sudden, I have to say, well, I don't know if I can trust you about foreign policy. I don't know if I can trust you about the economy. I don't know if I could trust you about censorship and the need for a social credit score or all these different things.

Speaker: 0
01:17:26

Like, what is the real person behind these actions? Are you entirely motivated by money and influence? Because that seems like a lot of them.

Speaker: 1
01:17:35

It's upsetting to me though because even though it's like we can see it's like wrestling, and some of these people are full of shit, and they're all buddies and whatever, like, hanging out at Park City or hanging out wherever they hang out. Like, the real people are, I see how, like, broken some of the Elon thing recently really it was like a whole new wave of people who were like, I don't know if I can talk to you.

Speaker: 1
01:18:01

I managed to survive, like, eight years of the culture wars, still maintaining pretty okay relationships with even the most staunch liberals in my life. And the Elon thing for some reason, you know, the hand gesture,

Speaker: 0
01:18:15

and Oh, the Hitler thing?

Speaker: 1
01:18:16

Yeah. It, like, put people over the edge, and they were like, I can't I don't know if I can, you know, be friends with a Nazi apologist. And I'm like, how have I made it this far? But I legitimately feel bad because the media has told these people that this was literally Hitler for many, many years, and then he's up there shaking hands with Obama.

Speaker: 1
01:18:37

There’s, you know, Joe Biden. They’re all up there. It’s a peaceful transfer of power. Like, if this

Speaker: 0
01:18:43

Have you seen the Kamala one?

Speaker: 1
01:18:44

Confused. Have

Speaker: 0
01:18:45

you seen the Kamala one doing the Heil Hitler? No. You haven't seen it? Yeah. I'll send it to you, Jamie, if you haven't seen it. Look, a lot of people do that gesture. That gesture is from my heart to you. That's what it is. It's just you really shouldn't do that if you are, you know, if you're standing on a stage and you have an angry look on your face.

Speaker: 1
01:19:05

I was joking too. Like, what if it was just this kind of autism and he did it? And then in his brain, he's like, oh, shit. That looked like a Nazi salute. And then he did it. Like, if I do it again, then it won't. And then it's just this cascading freak out. And he's like, from my heart, I love you.

Speaker: 0
01:19:18

Boy, Apple made it, like, real weird finding things now. They keep messing with this fucking interface.

Speaker: 1
01:19:25

You're like, I can't search Kamala Nazi salute on Apple anymore.

Speaker: 0
01:19:31

I have it, and I know it's good. Let me find it, you fuckheads.

Speaker: 1
01:19:35

I haven’t seen it.

Speaker: 0
01:19:37

Oh, it’s wonderful.

Speaker: 1
01:19:38

Because usually, it’s just a picture and then they show the context, and it’s not really actually that.

Speaker: 0
01:19:43

No. No. No. This is great. Of course, it's not that. Of course, the context is not that. But here, this is I'm sorry. This is AOC.

Speaker: 1
01:19:51

Oh, okay.

Speaker: 0
01:19:52

Have you seen this one?

Speaker: 1
01:19:53

I haven’t seen this one.

Speaker: 0
01:19:54

Give me the volume. Everybody does that move.

Speaker: 1
01:20:10

Everybody does.

Speaker: 0
01:20:11

Tim Walz did that move. Yeah. Kamala Harris did that move. They all did that move.

Speaker: 1
01:20:15

Yeah. I mean, when you’re announcing people to a stage

Speaker: 0
01:20:18

Thank you. Thank you. I love you.

Speaker: 1
01:20:20

But here's the problem I just wrote a column about this. Like, when it is the boy who cries Nazi, you know, the nerds who cried Nazi for all these years, like, it does end up running cover for Nazis.

Speaker: 0
01:20:31

Oh, 100%. One hundred percent. That's part of the real problem. There was one actual real Nazi that I was following for a while on Twitter. I didn't even have to follow them. I clicked on the links a bunch of times, and then it just started showing up in my feed. I'm like, okay. Good.

Speaker: 0
01:20:47

Now I don't have to follow you.

Speaker: 1
01:20:48

Right.

Speaker: 0
01:20:48

I could see this insanity.

Speaker: 1
01:20:50

Right.

Speaker: 0
01:20:50

It was crazy.

Speaker: 1
01:20:51

Well, that's why I love free speech because you're like, I would rather see this.

Speaker: 0
01:20:56

Look. X has porn. Hardcore porn.

Speaker: 1
01:20:58

Oh, I know.

Speaker: 0
01:20:59

I mean, it has everything.

Speaker: 1
01:20:59

You have to watch it on X now because you can't watch it on You can't get on Pornhub now.

Speaker: 0
01:21:05

It’s all very weird.

Speaker: 1
01:21:07

Thank you, Elon.

Speaker: 0
01:21:07

It’s all very, very, very weird.

Speaker: 1
01:21:09

No. It is.

Speaker: 0
01:21:09

So it's a very, very strange time for people to try to figure out what's real and what's not. And you're not gonna get a good road map from your leaders. You're just not.

Speaker: 1
01:21:18

But I've been joking, like, we are the fake news now, you know.

Speaker: 0
01:21:25

I always say that today because what did they post? That was the video. Right?

Speaker: 3
01:21:29

Oh, yeah.

Speaker: 1
01:21:30

There’s a big video.

Speaker: 0
01:21:31

Yeah. You wrote we are the fake news now.

Speaker: 1
01:21:33

Because it I mean

Speaker: 0
01:21:34

because that was fake news.

Speaker: 1
01:21:35

They're just as susceptible to sharing stuff that's not true.

Speaker: 0
01:21:40

Oh, yeah.

Speaker: 1
01:21:41

I think there's a big difference between a massive organization being influenced with talking points by a government and corporations to present something versus an idiot sharing, you know, propaganda without knowing it. But I do wonder, like, how you you know, I was talking to someone, like, is there a way to I think you have to just, like, pause.

Speaker: 1
01:22:06

You know? I assume everything is fake. That’s my default. I think you start from there, and then you try and do your detective work.

Speaker: 0
01:22:16

Yeah. This is my thing. Oh, my god. Is that real? That doesn't seem real. No. Let me see if that's real. And then I check. Yeah.

Speaker: 1
01:22:23

You assume

Speaker: 0
01:22:24

that's something over time. Before, when I was younger, I'd be like, I want that to be real, so that's gotta be real.

Speaker: 1
01:22:30

Right.

Speaker: 0
01:22:30

Right? That’s a problem.

Speaker: 1
01:22:32

Well, that's how my husband kind of taught me to evaluate everything that I get from the media. He's like, I look at something and I go, do I want this to be real or not real? And if so, why?

Speaker: 0
01:22:47

Yes. That’s right.

Speaker: 1
01:22:49

Like, if you start there Yeah. You have a chance.

Speaker: 0
01:22:53

That's me with UFOs. That's my whole UFO take because, clearly, I want them to be real so bad.

Speaker: 1
01:23:00

I want them to be real so badly.

Speaker: 0
01:23:02

I want them to be real so bad, but the more I look into it, the more I don't believe. The more Really? Yeah. Yeah. Yeah. I think a lot of it is horseshit. A lot of it. But also, maybe some of it's real.

Speaker: 1
01:23:16

It's so unsettling to think that we're the only things out there. For me, I wanted it to be it's like, that can't be. Surely.

Speaker: 0
01:23:26

It doesn't seem like it even makes sense that that's true. So I don't think that that's true. But I do not know if we've been visited, and I think a lot of it is bullshit. I think it's not just bullshit. I think it's probably government coordinated bullshit. I think there's probably sightings that are mass psy ops where they're trying to see how people react to things.

Speaker: 0
01:23:46

I think there’s probably crafts that The United States is in possession of that absolutely look like UFOs. I think there’s probably propulsion systems that they use for drones that are infinitely more advanced than we currently think state of the art is.

Speaker: 1
01:24:01

Right. And so they’d say, oh, it’s UFO, and it’s really their technology.

Speaker: 0
01:24:04

But that doesn’t account for the sightings that occurred when it was impossible for that technology to exist.

Speaker: 1
01:24:09

Right.

Speaker: 0
01:24:09

That doesn't take you back to, like, the nineteen fifties with Kenneth Arnold where he saw those flying discs moving through the sky, which is where the term flying saucer came from because it was like saucers skipping across a lake.

Speaker: 1
01:24:19

And what about all the people kind of all over the world who have experienced, like, a similar thing? And and why do all the drawings always look the same? Is that just collective conscious? People have seen one and then they think that or is it you know, how they always look like aliens?

Speaker: 0
01:24:34

There’s a lot of possibilities. And another possibility is that the world’s not real.

Speaker: 1
01:24:38

And we’re in a simulation.

Speaker: 0
01:24:39

Yeah. That there’s something maybe not maybe saying the world’s not real is not the best way to put it.

Speaker: 1
01:24:46

It makes me, like, wanna crawl out of my skin.

Speaker: 0
01:24:49

Maybe the best way to put it is that it's not real the way we think it's real. Like, there's real consequences to your actions. There's real physical laws that exist in the experience that you're having as a conscious creature moving through this world. But this world's not totally solid all the time. It's solid when you interact with it.

Speaker: 0
01:25:13

The rest of it is vague and weird and malleable, and it's constantly changing, and you wake and sleep and wake and sleep and assume that every time you wake up, you're in the same exact area, the same space, the environment looks the same, but it might be a completely different dimension.

Speaker: 0
01:25:35

There might be intertwined realities that are constantly experiencing themselves over and over and over again. And then there also might be other dimensions that higher beings have the capability of traversing that we don't. And all these things are happening simultaneously with the actual creation of an artificial reality.

Speaker: 1
01:25:58

It seems so real.

Speaker: 0
01:26:00

Yeah. Yeah. It all seems so real, but it also seems fake. Right? There's some More and more. It's like I'm sure you know the whole Barron Trump story, you know, the ancient books that talked about a guy named Elon who's gonna go to Mars.

Speaker: 1
01:26:14

No. You never heard about that? No.

Speaker: 0
01:26:16

It is so crazy that even Elon saw that. He was like, is this real? Like, how is this real? Was it from 1853?

Speaker: 1
01:26:23

What does this have to do with Barron Trump?

Speaker: 0
01:26:25

Because it's about a guy named Baron Trump, and his guru is named Don. Yeah. Oh. What? Yeah. I feel like

Speaker: 1
01:26:35

I, like, saw something like that, and I was like

Speaker: 0
01:26:37

Baron Trump's Marvellous Underground Journey.

Speaker: 1
01:26:39

What is this? It's like an

Speaker: 0
01:26:40

old It’s a book from

Speaker: 1
01:26:41

Who wrote it?

Speaker: 0
01:26:42

It is from 1893. Nearly forgotten, Baron Trump's Marvellous Underground Journey blends science fiction and fantasy in a story told by little Baron Trump, an aristocratic boy. There you go. That's what he is. Who sets out from Castle Trump, which is where he lives, to discover a world within a world that he read about in a fifteenth-century manuscript by the celebrated thinker and philosopher, the learned Spaniard, Don Fum.

Speaker: 0
01:27:07

Don. Don. His guy Don. Yeah. Join little Baron Trump and his faithful dog and companion, Bulger. They set off to Northern Russia in search of the portal, so it's subterranean.

Speaker: 0
01:27:15

But there's also the other thing they go back to Castle Trump. There's the other thing about Wernher von Braun. So Wernher von Braun, who was the head of NASA, wrote a fictional novel about a guy named Elon that takes us to Mars. Yeah. So there's, like, parts of reality that don't seem real.

Speaker: 1
01:27:40

Yeah.

Speaker: 0
01:27:40

They seem like a wink, like an Easter egg, like someone is, like, winking at you

Speaker: 3
01:27:47

Oh, yeah.

Speaker: 0
01:27:48

Through the simulation.

Speaker: 1
01:27:49

I'm just, like, an extra in the simulation. You're a part of it.

Speaker: 0
01:27:53

It's your version of it that you're going through. You are the person who's experiencing your world. I just don't know if your world and my world are exactly the same.

Speaker: 1
01:28:03

Right.

Speaker: 0
01:28:04

I think Well, no. They're bubbles. I think they're bubbles. And I think the way you interface with the world changes what your bubble consists

Speaker: 1
01:28:14

of. Okay.

Speaker: 0
01:28:15

Yeah. And I think it's all very weird. I don't think it's as simple as that rancher lady thinks when she gets up and feeds her chickens. I think that's her world.

Speaker: 1
01:28:28

Right.

Speaker: 0
01:28:28

That's her world. But I think the universe itself and how we interact consciously with it and all the things around us, I think it's squirrelly. Mhmm. I think it's real squirrelly. And I think every now and then, the universe shows us something like this fucking Wernher von Braun book, where you go, what?

Speaker: 0
01:28:45

Even the name Elon and Mars, like, what are the fucking odds? And then it turns out that Elon was actually named by his father when his father read that book.

Speaker: 1
01:28:56

Oh, okay. Which is

Speaker: 0
01:28:57

But no. Even crazier. Yeah. Because what are the odds your son is gonna be the guy who goes to fucking Mars?

Speaker: 1
01:29:02

My daughter is obsessed with Mars. And we always, like, know all the planets, and it's weird. I have no obsession with Mars. But she's obsessed with the stars, the planets. She can point them out at night. She's like, that's Jupiter. That's Mars. Blah blah blah. And we asked her one day.

Speaker: 1
01:29:18

We said, which planet do humans live on? And she said, Earth and Mars. And I was like, maybe.

Speaker: 0
01:29:24

Well, you saw the square. Yeah. That square on Mars from Is

Speaker: 1
01:29:27

that real?

Speaker: 0
01:29:27

Yes. That’s real.

Speaker: 1
01:29:28

That’s a real

Speaker: 0
01:29:29

That’s a real image. Photo. It’s a real satellite image of Mars where you see a square structure. By the way, it’s right down the street. It’s like a hike away from Cydonia, the face on Mars. So it’s a hike away from that thing that they saw from the oh, god. I wanna say it was, like, the nineteen seventies.

Speaker: 0
01:29:47

They sent a probe to Mars to take photographs of the surface, and they saw this thing that looked like a face. The face on Mars, though, it seems like what that is is just light with shitty resolution, and it looked like a face. What's more interesting about the face on Mars so that's the original image.

Speaker: 1
01:30:05

Okay.

Speaker: 0
01:30:06

So, the problem with that is it's just not clear enough. It could be anything.

Speaker: 1
01:30:10

Yeah.

Speaker: 0
01:30:11

Now, go to the modern images of the face on Mars. You could see them right there. It's right below. Oh. See the one right there where you just were to the right. Yeah. Right there. Click on that. So that's what it actually looks like.

Speaker: 1
01:30:23

Okay.

Speaker: 0
01:30:24

So you could see how that just looks like a mountain.

Speaker: 1
01:30:27

Yeah.

Speaker: 0
01:30:28

Yeah. But what's interesting is the shape of it. The shape of it is weird. How it curves at the bottom and it's kinda equal sided and it goes up to the top and has right angle turns. So that image of what that thing is, where that's located is just a small hike away from this immense square that they've discovered.

Speaker: 0
01:30:47

They don't know exactly how big it is, but the rough estimate is somewhere around 200 meters across.

Speaker: 1
01:30:53

What do they think it is?

Speaker: 0
01:30:55

It’s a structure. That’s what I think it is. I don’t think it can be anything other than a structure.

Speaker: 1
01:31:00

What does Elon think it is?

Speaker: 0
01:31:03

He thought it was wild.

Speaker: 1
01:31:04

Has he commented on it?

Speaker: 0
01:31:05

I mean, I sent him a text message that said, imagine if you go up there and you find evidence of a previous civilization. He's like, that is fucking wild. It's wild because that's a real square.

Speaker: 1
01:31:15

What the yeah. That just doesn’t seem like that would exist.

Speaker: 0
01:31:19

It doesn’t seem like it’s Naturally. It doesn’t seem like it’s possible for it to be a perfect square.

Speaker: 1
01:31:22

Yeah. That’s crazy. That I I thought this was fake.

Speaker: 0
01:31:25

No. No. No. No. It's real. It's real. And it's very disturbing because that's a fairly high resolution image. And Elon backs a mission to Mars to go and check that out. They wanna check it out. See, there's an article right there from the Daily Mail, Elon backs mission to Mars, where he wants to send something up there to check it out.

Speaker: 0
01:31:46

If you see these images, like, I don't think there's a chance in hell that that's not made by someone.

Speaker: 1
01:31:52

That seems very strange.

Speaker: 0
01:31:53

I don’t think that nature makes a square.

Speaker: 1
01:31:56

No. I mean Is

Speaker: 0
01:31:57

that a rectangle or a square? I mean, it looks is it perfectly square? It might be slightly off. Whatever it is, it's four right angle turns that have equal lengths.

Speaker: 1
01:32:08

Conspiracy theorists have likened it to the structure of the Great Pyramid in Egypt.

Speaker: 0
01:32:12

Conspiracy Theorists is a nice way to put it. Yeah. Yeah. How about just people looking at it going, what the fuck is that?

Speaker: 1
01:32:17

I know. Why do they always call them conspiracy theorists? Because they’re assholes.

Speaker: 0
01:32:21

They're assholes and they work for the Daily Mail. So it's right down the street from that area.

Speaker: 1
01:32:26

And what are the faces, just, like, a giant stuck in the Martian surface? I don't

Speaker: 0
01:32:31

think anybody really thinks it’s a face anymore. If you scroll up and look at that image from July 1976, that image, what’s interesting to me is not the face, because I don’t think it’s a face. But what’s interesting to me is the shape of the base of it. The shape of the base of it is weird. Yeah.

Speaker: 0
01:32:46

It's flat on the bottom, equal sided, and domed on the top. It looks unnatural. It doesn't mean it's unnatural. There's a lot of shit that looks unnatural in nature that is actually natural, but not squares. That giant square where it looks like a building

Speaker: 1
01:33:00

Was it there was, like, a civilization there?

Speaker: 0
01:33:03

Yes. Well, the thing is, Mars had a real atmosphere. Mars had Right. Mars has liquid water.

Speaker: 1
01:33:10

Right.

Speaker: 0
01:33:10

We know it does. And at one point in time, they think that Mars was capable of sustaining life. So if you imagine the planets over time get further and further away from the sun. If you go back a billion years, how much closer was Mars to the sun? Right. And what was the temperature like?

Speaker: 0
01:33:27

The year

Speaker: 1
01:33:27

old What was

Speaker: 0
01:33:28

the atmosphere like? Yeah. There's people that believe that the initial civilization escaped Mars and came to Earth.

Speaker: 1
01:33:37

Oh, what about the moons of Jupiter?

Speaker: 0
01:33:40

Well, there’s one.

Speaker: 1
01:33:42

Europa or something?

Speaker: 0
01:33:43

Europa that's solid ice, and they think underneath that well, the surface is solid ice. And they think underneath that is liquid water, and the liquid water is capable of supporting life. So it's possible, especially when you see, like, those thermal vents that they find in the bottom of the ocean that sustain some form of life.

Speaker: 0
01:33:58

There might be some form of life inside the oceans of Europa.

Speaker: 1
01:34:01

Are you joking about all this stuff?

Speaker: 0
01:34:03

No. No. I’m not joking about Mars. That image

Speaker: 1
01:34:06

is it sounds, like, great. Like, are you joking about it on stage? Are you Sometimes.

Speaker: 0
01:34:10

I used to have this whole bit about Mars. But there's places in America that you can't live. Like, go to Death Valley. Look around. Sucks. Right? Yeah. No one lives here. Like, that's because it sucks. Get out of here.

Speaker: 0
01:34:20

You don’t if you’re not gonna fix that and you’re gonna go fix Mars

Speaker: 1
01:34:23

I mean like the whole

Speaker: 0
01:34:24

planet has no air.

Speaker: 1
01:34:26

Yeah. No. It's not appealing to me at all. Some people you watch these documentaries and they're like, I think of Mars as my second home and Yeah. I wanna go live there. I'm like, it's time.

Speaker: 0
01:34:34

They also think

Speaker: 1
01:34:35

they’re a cat.

Speaker: 0
01:34:36

I'm also a foxkin.

Speaker: 1
01:34:40

Space just feels claustrophobic to me even though it's vast and, like, until you can build things that make it feel a little more, like I just don't wanna get in a tube and, like, go live in a do you watch Silo?

Speaker: 0
01:34:54

No. I haven’t watched it.

Speaker: 1
01:34:55

I love Silo. I love anything that's, like, a post apocalyptic show where people are living in a wasteland or in some weird and they're underground in all these silos, and they all have to, you know. And there's the people at the bottom, and it's really but, like, definitely, they've never seen the sun. They've never seen the sky. You know?

Speaker: 1
01:35:17

It's like they've had an alternate history that's been kind of told to them.

Speaker: 0
01:35:22

Jesus.

Speaker: 1
01:35:23

Yeah.

Speaker: 0
01:35:23

Yeah. That’s all possible.

Speaker: 1
01:35:25

Yeah. That's like a post nuclear world. Like, when the AI gets smart and nukes us all.

Speaker: 0
01:35:31

Yeah. But how are we gonna get vitamin d down there? It’s not good.

Speaker: 1
01:35:34

They have, like, plants and trees and

Speaker: 0
01:35:37

Vitamin D from plants.

Speaker: 1
01:35:38

Yeah. I don't know. Do you now that you, like, burned the boat and all of your stuff, are you just kinda having fun rebuilding?

Speaker: 0
01:35:47

What do you mean?

Speaker: 1
01:35:47

Like, now that you just, like, did all your material

Speaker: 0
01:35:51

Oh, stand up wise? Yeah. Just having fun.

Speaker: 1
01:35:53

Yeah.

Speaker: 0
01:35:54

Trying to come up with a new hour and fucking around.

Speaker: 1
01:35:57

It’s so fun.

Speaker: 0
01:35:58

Yeah. It’s, you know, it’s a weird time for comedy.

Speaker: 1
01:36:02

Do you think?

Speaker: 0
01:36:04

Yeah. Yeah. Yeah. Well, it’s a really good time for comedy.

Speaker: 1
01:36:07

Well, yeah. I feel like it’s a great time.

Speaker: 0
01:36:09

Yeah. There's so much information. But it's a weird time too because, you know, like, the center of comedy has now moved to Texas.

Speaker: 1
01:36:14

It’s awesome.

Speaker: 0
01:36:15

So that’s weird. That’s a crazy thing.

Speaker: 1
01:36:17

That’s incredible.

Speaker: 0
01:36:17

And it’s also moved online, which is also a crazy thing. Like, the the main promotional aspect of comedy is now online.

Speaker: 1
01:36:24

I heard Jeselnik talking about this, about being, like, a clips comic. I saw a clip of him going around talking about being a clips comic and how Oh, ironic. Yeah. It's very meta. But he was talking about how before people would recognize him, like, in the old ways, and now people recognize he said it's like a different level of fame from when he was famous before just for stand up.

Speaker: 1
01:36:52

Now he's like the clips guy. But I think that's, like, everyone. Right?

Speaker: 0
01:36:57

It’s just a different medium. It’s all it is.

Speaker: 1
01:36:59

I’ve talked to

Speaker: 0
01:37:00

I mean, it’s no different than you being on some Comedy Central show. It’s just way more impactful.

Speaker: 1
01:37:04

I've talked to a lot of people about how the, like, videos, the, you know, the crowd work videos have changed crowds. Oh. Because now crowds think they're helping you by chiming in. And so it's, like, actually, because it's some crazy percentage of people who have never been to a comedy show at a comedy show.

Speaker: 1
01:37:26

Like, a very high percentage of people have never been to a comedy show at every comedy show. Isn’t that weird? It’s it’s, like, quite high.

Speaker: 0
01:37:34

Interesting. Right? So they’re accustomed to seeing, like Like, the ai

Speaker: 1
01:37:37

crowd work stuff. Yeah. Yeah. And so they think they're actually helping you by heckling so that you can get your

Speaker: 0
01:37:45

I don’t think they think they’re helping you. I think they want to be heard.

Speaker: 1
01:37:48

They wanna be a part of it. And they

Speaker: 0
01:37:49

wanna that's the way they get in. I'm helping. You just wanna chime in. Yeah. I saw some video clip of some lady losing her shit on some guy in the audience.

Speaker: 1
01:37:58

Oh, I saw this.

Speaker: 0
01:37:59

Yeah. It’s very weird.

Speaker: 1
01:38:01

And everyone was like, this is amazing. It’s not

Speaker: 0
01:38:03

really amazing.

Speaker: 1
01:38:05

I was like, it seems, like, upsetting.

Speaker: 0
01:38:09

It’s also not very well handled. Like,

Speaker: 1
01:38:11

the the

Speaker: 0
01:38:12

whole experience is not expertly handled there's not a lot of humor in there. You know.

Speaker: 1
01:38:17

People love it. They're like, well, yeah. This is comedy. It's subjective, I guess. Some people look at it.

Speaker: 0
01:38:22

Well, that's not comedy. That's just people talking. Even one person with a microphone and power and one person in the audience, like, challenging that and give me the microphone and it's fuck you and, you know, you're

Speaker: 1
01:38:34

It's wild like the clips thing. The clips, like, feeding the algorithm it's such a crazy thing to me.

Speaker: 0
01:38:42

Well, that's with everything. Right? Clips from podcasts are way more popular than the podcasts themselves. Yeah. So many clips go viral on, you know, X and

Speaker: 1
01:38:51

you think TikTok’s gonna be banned? No. No? No? No? No? No?

Speaker: 0
01:38:55

Yeah. Yeah. I don’t think so. The fucking president uses it. I met the fucking CEO of TikTok when I was at the inauguration.

Speaker: 1
01:39:04

What was that like?

Speaker: 0
01:39:05

Like, being in Satan’s balls.

Speaker: 1
01:39:09

Where?

Speaker: 0
01:39:11

Just going to the actual fucking.

Speaker: 1
01:39:13

Inauguration?

Speaker: 0
01:39:14

The the actual house of government being in the actual buildings where all this stuff gets done. It’s very, very strange.

Speaker: 1
01:39:23

How long were you there?

Speaker: 0
01:39:24

Couple days.

Speaker: 1
01:39:25

And so you went to the, like, some of the balls the balls? Yeah. Was it ai Hunger Games? Like, what

Speaker: 0
01:39:32

It's like really rich people that donated a lot of money. Uh-huh. They just, like, jump in front of you to take pictures with you. They don't care who you're talking to. They don't care everybody was, like, super pushy. It's all very transactional. Everybody needs to get on your podcast and needs to talk to you about a thing and you have to get this per it's like everything's exhausting. Mhmm.

Speaker: 0
01:39:52

And everybody's wealthy. And they also spent a lot of money to get there. Mhmm. Like, a lot of those people, they donated, like, a million dollars.

Speaker: 1
01:39:59

Holy shit.

Speaker: 0
01:39:59

So it's, like, thousands of people who donated a million dollars.

Speaker: 1
01:40:03

Just to be there.

Speaker: 0
01:40:04

Just to be there. And they’re all fucking just super enthusiastic because their team just won.

Speaker: 1
01:40:11

Right.

Speaker: 0
01:40:11

You know? And it's the inauguration of the president and Kid Rock's there and everybody was going crazy. And it was fun for a while. It was me and Tony Hinchcliffe and Theo Von and,

Speaker: 1
01:40:22

Lex?

Speaker: 0
01:40:22

Logan Paul was there

Speaker: 1
01:40:24

Oh, yeah. I saw that.

Speaker: 0
01:40:25

And Jake Paul, and we were all having a good old time. We were laughing and having fun, and then too many people just started swarming us. And then it was just like, you're dealing with, like, 10,000 people in this room where you can't move. You can't go anywhere.

Speaker: 1
01:40:38

Right.

Speaker: 0
01:40:38

And it got crazy. It's silly. You couldn't have a conversation. You had to just try to get out of there.

Speaker: 1
01:40:43

Was the inauguration, like, moving? Was it did you

Speaker: 0
01:40:48

Fascinating.

Speaker: 1
01:40:49

Was it weird to be part of all that, like, pomp and

Speaker: 0
01:40:52

Oh, yeah. It was very weird. Very weird.

Speaker: 1
01:40:54

Were you in the main Oh,

Speaker: 0
01:40:56

I was on the stage. Oh, you the Fifth

Speaker: 1
01:40:58

row. Oh.

Speaker: 0
01:41:00

I was, like, right there.

Speaker: 1
01:41:02

I was unaware.

Speaker: 0
01:41:03

Yeah. Like

Speaker: 1
01:41:03

him at Lauren Sanchez’s tits.

Speaker: 0
01:41:05

Yeah. I could have thrown a pebble and hit Hillary Clinton in the head. Oh. Yeah. They were right there. It was weird. It’s first of all, it’s weird watching, like, Bill Clinton walk into a room. Right. He’s real. That’s him. Yeah.

Speaker: 0
01:41:19

All the shit that he escaped, and there he is right there. Off TV. Hey. How you doing? Good to see you. Good to see you. Weird. It's weird.

Speaker: 0
01:41:28

It's weird seeing those people. And then, like I know. Right?

Speaker: 1
01:41:34

Whenever I’m in situations like that, I’m like, how do they why are they letting me in here? They they should know better.

Speaker: 0
01:41:39

I thought of that a little, but, you know, it’s cool. It was weird.

Speaker: 1
01:41:44

Yeah.

Speaker: 0
01:41:44

You only get to be at one of those maybe once in your life if you're really lucky. Especially that one which was indoors because it was insanely cold outside.

Speaker: 1
01:41:53

Yeah. That seems, like, probably better Oh, yeah. It was cold.

Speaker: 0
01:41:56

It was fucking cold outside too. Definitely better for security. But it was very windy and shit.

Speaker: 1
01:42:02

Is it weird to be, like I don't know. I was looking at all the people up there, and it was just it seems like it's so much I don't know. It seems like it's just, like is everyone kissing the ring? Does it feel kinda like you know what I mean? Does it feel, like, gross or was it, like, cool?

Speaker: 0
01:42:26

Well, it feels strange. Right? It feels strange that it’s real. It also feels strange that we’re we’re standing up because there’s a standing ovation every, like, fifteen seconds. We’re gonna turn the Gulf of Mexico into the Gulf of America. We’re all standing up. Fuck, yeah. We’re gonna do that. It was fun watching George Bush.

Speaker: 0
01:42:45

George Bush was the only guy clapping. He was having a good time. All the other presidents looked deeply disturbed. Yeah. Kamala Harris had this, like, this motherfucker look on her face the entire time. Like, she sat there like this the entire time.

Speaker: 1
01:42:57

Well, I mean Very upset. Obviously.

Speaker: 0
01:43:00

Well, also, he’s talking about how bad they sucked.

Speaker: 1
01:43:03

And she lost. Sat there. And she just sat there.

Speaker: 0
01:43:06

And she’s sitting right next to Biden, who’s

Speaker: 1
01:43:09

Who’s not there.

Speaker: 0
01:43:10

Looks he’s just gone. Yeah. Yeah. And then behind them is Obama and George W. That’s

Speaker: 1
01:43:17

wild, man. Yeah. Yeah. I was like, I hope there’s, like, a you know, what’s that called when there’s, like, the extra person or whatever, the the designated survivor? I’m like

Speaker: 0
01:43:26

Yeah. Because they’re all There’s a lot

Speaker: 1
01:43:28

of people you could really fuck America up right

Speaker: 0
01:43:30

now. One bomb. Yeah. Yeah. I think they’re aware of that. I think there’s probably steps taken to make sure that the skies are clear.

Speaker: 1
01:43:36

Yeah.

Speaker: 0
01:43:36

But it’s very strange, but also kind of sad. Because, like, if he really does get to do all this stuff, like, if we we really do see radical changes, it seems like that’s what’s happening. And if Bobby Kennedy really does get in and if Tulsi Gabbard really does get in, like, this is a crazy time. Yeah.

Speaker: 0
01:43:55

This is like an unprecedented cabinet of people that are kinda unified and all know each other. They’re all friends.

Speaker: 1
01:44:03

You know? And it’s also kind of bipart like, RFK Junior brought it’s he brought so many people over. I know so many people who were not gonna vote for Trump, but then when he brought RFK Jr. over, they were like, alright.

Speaker: 0
01:44:16

Yeah.

Speaker: 1
01:44:17

I mean, if he’s there and Tulsi too.

Speaker: 0
01:44:19

Yeah.

Speaker: 1
01:44:19

They’re they’re they were libs.

Speaker: 0
01:44:21

Yeah.

Speaker: 1
01:44:22

Yeah. So I don’t I don’t know. That that must have been pretty

Speaker: 0
01:44:25

Bizarre. It’s bizarre.

Speaker: 1
01:44:28

It seemed like I was like, that must be so surreal to be there because it was surreal to watch it just, like, when you think about the fact that he nearly got killed, like, the whole sequence of events that led to that moment. Yeah. And the debate where he just, like, fell apart. And it just was surreal to watch.

Speaker: 1
01:44:47

So I was wondering, like, that must be pretty bonkers to be there. And then He

Speaker: 0
01:44:53

You should’ve seen Biden, his face, the entire time, like, the look. He looked upset. Like, almost looked like he was gonna cry at one point in time. He looked so upset at the whole I think he thinks he could’ve won.

Speaker: 1
01:45:03

Do you have any fears about it? Like, what about the Riviera?

Speaker: 0
01:45:08

Oh, the the Gaza thing?

Speaker: 1
01:45:10

The Riviera.

Speaker: 0
01:45:11

Riviera Of The Middle East. You did

Speaker: 3
01:45:13

such a good impression of it.

Speaker: 0
01:45:14

Yeah. I thought I could. It’s yeah. I mean, what’s the alternative to that happening? What’s the they’re saying no US military will be there. The United States is gonna clean it up, rebuild it, give it to the Palestinians, make it safe

Speaker: 1
01:45:27

Oh, okay.

Speaker: 0
01:45:28

Get rid of Hamas. I mean, the thing is what Palestinians want is a state. Yeah. They want their own state, and this is like a step sort of away from that. Yeah. Now it’ll be that section of the that area is now controlled by The United States. What the fuck?

Speaker: 1
01:45:45

And Greenland. Then Don’t forget Greenland.

Speaker: 0
01:45:47

The question is, like, what’s the better option? Like, give it to Hamas? What’s the better option? Give it to the Palestinians and they give it to the Islamic State? Like, who who gets it?

Speaker: 1
01:45:56

I’m not smart enough for any of this.

Speaker: 0
01:45:58

Also, how did you let it happen in the first place? Like, how did this administration let them just bomb the fucking shit out of all those people and Yeah. Blow up an entire city to the point where you could even feasibly say that you’d rebuild it. Because it’s not like before this had happened, if anybody had ever said The United States is gonna go into Gaza and completely rebuild it and make it The United States, they’d be like, fuck you.

Speaker: 0
01:46:23

You need to give that to Palestine. But once you blow it up, you’re like, well, I guess there’s nothing left. Like, people are pushing back against it. Yeah. A lot of people are upset, but not as upset as they would if they didn’t blow it up.

Speaker: 1
01:46:34

Oh, that’s weird.

Speaker: 0
01:46:36

Yeah. Because it’s it’s weird.

Speaker: 1
01:46:38

It’s Ai yeah.

Speaker: 0
01:46:39

It’s horrible. And we’ve just accepted the fact that this is what Israel did. And, you know, for 1,200 people and 250 hostages, they killed 60,000 people.

Speaker: 1
01:46:49

Is that the actual number or is that a lost number?

Speaker: 0
01:46:52

I don’t know what the number is. Yeah. What’s the current number of people that are dead in in Gaza right now?

Speaker: 1
01:46:57

It’s such a weird thing too because Israel, I feel like, has to, like, not win a war, which is a weird thing where they have to, like they’re in a position where they

Speaker: 0
01:47:06

They already won the war. I mean, what war? Yeah. There’s one army. There’s one army and some terrorists. It’s like Bill Hicks’ joke about Iraq. Like, it’s only a war when there’s two armies fighting. They’re like, well, Bill, Iraq’s the fourth largest army. He goes, yeah. Well, after the top two is a huge drop off. He’s like, the Salvation Army is number three.

Speaker: 1
01:47:25

I don’t want there to be war. I think most people agree on that.

Speaker: 0
01:47:29

Of course. But, also, we don’t live in Israel. Right? We don’t have to live with the Iron Dome because missiles are being shot into your city, and you watch them explode up in the sky because your military has these rockets that shoot up in the sky and take these fucking missiles out.

Speaker: 1
01:47:45

Yeah. I think about, like, if it was my daughter that was, like, in some tunnel, I would be, like

Speaker: 0
01:47:50

Yeah.

Speaker: 1
01:47:50

There’d be no lengths.

Speaker: 0
01:47:52

Well, that’s

Speaker: 1
01:47:53

You know, I know that’s very, like, selfish of me, but that is on a personal level, I’m like, I can’t even imagine how I would feel in that situation as a parent.

Speaker: 0
01:48:03

Of course. Well, that’s the deepest conspiracy theory about why it happened in the first place. Because Netanyahu was losing power and people were protesting against him in the streets. And what better way to get everybody on your side than to allow an attack to take place? That’s the the darkest of the false flag conspiracy theories.

Speaker: 1
01:48:21

I have a lot of people that I know who now think nine eleven was an inside job after October 7.

Speaker: 0
01:48:28

Oh, Jesus.

Speaker: 1
01:48:29

Isn’t that crazy? Like, that they’re like, well, if they would do it here, they would do it like, why wouldn’t they do it in America? It’s been like this kind of reverse leap.

Speaker: 0
01:48:39

Yeah. Well, anytime you have a big tragedy like nine eleven, you’re always gonna have a bunch of wild theories. But some of them are interesting. You know, like, could you you’re supposed to dismiss them all because they’re conspiracy theories. But Tower 7 is like, explain that.

Speaker: 1
01:48:55

I can’t. I have I have people in my life who are very much, like, true I think truthers, I think, about this.

Speaker: 0
01:49:04

That’s a weird one. That’s a weird one.

Speaker: 1
01:49:07

What’s the conspiracy around Tower 7? Just how did it

Speaker: 0
01:49:11

It collapses like a controlled demolition completely into its base. No building’s ever done that before without being a controlled demolition. And it doesn’t have all the signature aspects of a controlled demolition. Like, if you ever watch, like, when they blow up one of them Vegas casinos Yeah.

Speaker: 0
01:49:28

Super obvious. It’s Yeah.

Speaker: 1
01:49:29

It’s like bang bang bang

Speaker: 0
01:49:31

bang bang bang, and it all collapses. It doesn’t have that. But it does have something similar in that the effect is the same. So is it possible to do that without having it the way the casinos do it? Is there only one way to have a controlled demolition of a building? Or is it possible that just immense diesel fires weaken the structure uniformly in such a weird way that it collapsed exactly like a controlled demolition, but it’s just the results of this.

Speaker: 1
01:50:00

Coincidence.

Speaker: 0
01:50:01

Yeah. They had diesel generators in the basement of that thing, and the whole fucking inside of it was on fire. So when you see it on the outside, you’re only seeing, like, a little bit of fire and some holes, but the entire inside of it was in flames.

Speaker: 1
01:50:12

We’re living in a simulation.

Speaker: 0
01:50:14

Maybe that. Maybe that’s another wink. You know, maybe that’s another fucking

Speaker: 1
01:50:19

We didn’t have social media, though. Thank God.

Speaker: 0
01:50:21

Right.

Speaker: 1
01:50:22

During that. Can you imagine?

Speaker: 0
01:50:23

No. But we did have plenty of people that were questioning once they saw it on television. Because you saw it on TV and you’re like, what is that? Yeah. How does it do that? Do you

Speaker: 1
01:50:30

remember seeing That wasn’t

Speaker: 0
01:50:31

even hit by a plane. Like

Speaker: 1
01:50:32

Do you remember seeing the skyline the first time you went to New York?

Speaker: 0
01:50:35

Oh, without the Twin Towers? I don’t think I actually saw it.

Speaker: 1
01:50:39

Ai we I I

Speaker: 0
01:50:40

went back in 2001. I was there in 2001. Yeah. Maybe 2002. It was probably, like, a couple of months after

Speaker: 1
01:50:47

I took a train in in November, and because I took the train in, I always just take this train in from Rhode Island where I was, and I was like, I just it was so jarring seeing they were so part of that.

Speaker: 0
01:51:01

They were so huge.

Speaker: 1
01:51:02

Yeah. And just part you just took it for granted. That was the, like, landscape. And it was I’ll never forget it. How jarring it was.

Speaker: 0
01:51:09

Weird too that when they rebuilt it, they didn’t make it as tall.

Speaker: 1
01:51:13

Why didn’t they?

Speaker: 0
01:51:14

I don’t know. It’s only one building now. It’s not two.

Speaker: 1
01:51:18

I know.

Speaker: 0
01:51:18

And it’s not as tall. Like, it’s pretty tall, but they didn’t get crazy, like, we’re gonna make the biggest fucking building the world’s ever seen. It’s, you know

Speaker: 1
01:51:27

Like a middle finger to the world.

Speaker: 0
01:51:28

If Trump was there, he’d be like, we’re gonna have two buildings on the side and one right up your ass. You know, I guess, it’s like, fuck you. So it looks like that. It is, it’s smaller. Smaller than the original buildings.

Speaker: 1
01:51:43

I do wonder, like, I just I’m so curious about what it’s gonna be like for our kids. Yeah.

Speaker: 0
01:51:47

That’s what it looks like now.

Speaker: 2
01:51:48

Yeah. It’s still the tallest in The US still, though.

Speaker: 0
01:51:50

Is it the tallest building in The US? Yeah. But it’s not as tall. Right?

Speaker: 2
01:51:54

I don’t know what the height was.

Speaker: 0
01:51:56

I think it’s shorter.

Speaker: 2
01:51:58

No. If it’s shorter, then the

Speaker: 0
01:52:00

still the tallest

Speaker: 2
01:52:01

World Trade Center wasn’t the tallest building.

Speaker: 0
01:52:03

No. Not in the world.

Speaker: 2
01:52:04

No. In The US even. Oh, it wasn’t? No. The Sears Tower was bigger. Really? Yeah. Hundred percent.

Speaker: 0
01:52:09

Where’s the Sears Tower?

Speaker: 1
01:52:10

Chicago.

Speaker: 0
01:52:11

That’s taller than the World Trade Center?

Speaker: 2
01:52:13

It was. Woah. Now this is the I didn’t know that. Yeah. The Sears Tower was the tallest building in the world for a while.

Speaker: 0
01:52:20

So the Twin Towers were taller than the the Sears Tower?

Speaker: 1
01:52:23

They were shorter.

Speaker: 0
01:52:24

They were shorter.

Speaker: 2
01:52:25

Yeah. I don’t know why you’re saying that, I guess.

Speaker: 0
01:52:28

Oh, it’s the same oh, it has the same name as the North Tower of the original. How tall is it? It is shorter though. Right? Isn’t it?

Speaker: 2
01:52:38

I don’t know that. So the Willis Tower was the tallest one.

Speaker: 0
01:52:41

I remember people being upset when they were rebuilding it because they it wasn’t going to be as tall.

Speaker: 2
01:52:47

Seventeen seventy six.

Speaker: 0
01:52:49

So 110 stories was the original one, and the tower was 1,350 feet high. How tall is the one now?

Speaker: 2
01:53:01

76.

Speaker: 0
01:53:02

So it’s taller?

Speaker: 2
01:53:03

Yep. Really? Yeah.

Speaker: 0
01:53:05

Wait a minute. Really?

Speaker: 2
01:53:07

I don’t yep.

Speaker: 0
01:53:08

Can I see the image again?

Speaker: 2
01:53:09

I don’t know where else you want me

Speaker: 0
01:53:10

to go with the No. No. No. It’s real. Is it because they they cheat with that big pole at the top? I mean,

Speaker: 2
01:53:14

that’s the antenna. They do that with all of them. The other one had a big pole on top too.

Speaker: 0
01:53:18

But isn’t that to stop lightning bolts?

Speaker: 2
01:53:20

Some of it. Some of it’s antenna stuff, but

Speaker: 0
01:53:22

Some of it’s to talk to aliens? Some of it’s to

Speaker: 2
01:53:24

talk to some fun tower

Speaker: 0
01:53:25

That’s the operation. Okay. I’m wrong. I thought it was shorter. Maybe it maybe what they were saying is it’s smaller because there’s only one instead of two.

Speaker: 2
01:53:34

The spire makes it seventeen seventy six.

Speaker: 0
01:53:36

The spire. Yeah. How tall how tall is that fucking spire?

Speaker: 2
01:53:38

And that video too, I was just the spire of the World Trade Center. I never saw this. They’re saying that’s what made Tower 7 fall. It made a gash.

Speaker: 1
01:53:48

What? What made a gash?

Speaker: 2
01:53:50

The the spire on the tower here fell and made a gash down the side and started a fire here in Tower 7. I’ve never seen this video before.

Speaker: 0
01:53:58

Interesting.

Speaker: 2
01:53:59

I was just looking through it to see what the

Speaker: 0
01:54:01

Well, that’s one thing with the video that we saw. Oh. That shows like, look at that.

Speaker: 2
01:54:06

Interesting. They have photos of what that I don’t think they have an actual video of that happening.

Speaker: 0
01:54:12

Oh. It fell and fucked up the building. But still. Still, why didn’t it just fall into that side? Why does it compress into its base? The way the way it collapses, like, I’ve never seen anybody adequately describe it. But one thing that people should know is that the top of it collapsed before the whole thing collapsed.

Speaker: 0
01:54:31

There was the there’s a piece on the top, so the roof caved in first. Uh-huh. Like, there there was, like, there’s a smaller structure on top of the roof that imploded and went through the base. Okay. And then the whole thing went down.

Speaker: 0
01:54:45

So it wasn’t like it all went in one shot. The top of it, it already collapsed and went through.

Speaker: 1
01:54:50

God. That was such a weird time in America. I was thinking about how, like, just catatonic I was on the couch recently. Just, like, the whole week afterwards

Speaker: 0
01:55:03

Right.

Speaker: 1
01:55:03

Just nobody

Speaker: 0
01:55:05

I know.

Speaker: 1
01:55:06

Because it’s weird. Like, I have family members who have kids, and their kids are now like, they were born right around that time. They were one, two years old. They’ve grown up. They don’t they don’t remember kind of what it was like before that, and they I don’t know. I feel like the world my I have a friend and he’s like, everything went to shit after nine eleven. Everything just got shittier.

Speaker: 0
01:55:26

Well, everything did go to shit because that’s when they passed the Patriot Act.

Speaker: 1
01:55:28

Yeah.

Speaker: 2
01:55:29

And that’s when the government really got its hooks into your information. Clarification here. New one, seventeen seventy six feet, five hundred forty one meters tall. Old one with the antenna, seventeen thirty; thirteen sixty eight without the spire.

Speaker: 0
01:55:46

Okay.

Speaker: 2
01:55:47

The spire at so without the antenna of the building, it’s 417 meters tall, so I think you were close.

Speaker: 0
01:55:52

Okay. You were

Speaker: 2
01:55:53

saying right there.

Speaker: 0
01:55:54

So it’s not much different, though. I thought it was a lot different, like 10 stories or some shit. So it’s 104 floors.

Speaker: 2
01:56:02

And four basement floors.

Speaker: 0
01:56:03

And the other one was 110 floors.

Speaker: 2
01:56:05

And I don’t yeah. How many basement floors? I don’t know.

Speaker: 0
01:56:07

Also, it’s like, how tall are the floors, you know? I hope this one doesn’t

Speaker: 1
01:56:12

Do you think this administration’s gonna try and get rid of the Patriot Act?

Speaker: 0
01:56:16

No. No. No. Why would they do that? I think they have those tools and power to actually they’ll they’ll use it as an excuse to go get terrorists.

Speaker: 1
01:56:28

Yeah. But You know? Because Haven’t we all wanted to get rid of it for a long time?

Speaker: 0
01:56:32

Yeah. Well, then they have the Patriot Act two, which is even more invasive.

Speaker: 1
01:56:36

Yeah.

Speaker: 0
01:56:36

Yeah.

Speaker: 1
01:56:37

But people like, that seems like a bipartisan thing somebody could get behind.

Speaker: 0
01:56:41

Maybe. But the the real neocons will come out and say that weakens us against our enemies. We need this power to be able to find terrorists and to be able to search out. I mean, that was one of the things the Obama administration rammed through when they passed the NDAA Uh-huh.

Speaker: 1
01:57:00

They go,

Speaker: 0
01:57:00

you know, we’re not gonna do it. We’re not gonna just, like, detain people indefinitely. We wouldn’t do it. But the problem is, once you’re given the law, that law is now on the books, and now the next person, what if it’s a psycho? And the person after that, what if it’s a psycho?

Speaker: 1
01:57:15

Right.

Speaker: 0
01:57:15

Now, you’ve given them power to, like, become a dictator. Right. You can go after their political enemies, which normal people won’t do. Right? You guys don’t do that. Right?

Speaker: 1
01:57:27

I mean, it’s gonna be a wild four years, I think. Because I did think that, like, I figured I figured I thought maybe that people would understand that they were wrong and that, like, maybe they’d gone too far. And I was

Speaker: 0
01:57:43

Yeah. And

Speaker: 1
01:57:44

it seems like hysteria.

Speaker: 0
01:57:46

I don’t think people learn that good. I think people learn by having their lives ruined and getting really angry.

Speaker: 1
01:57:52

Right.

Speaker: 0
01:57:52

And then they change course. But I think the people that are still comfortable and still working in these environments that still cling to these ideas, they’re gonna double down and wear a pussy hat and fucking paint their hair blue and protest in the streets.

Speaker: 1
01:58:08

Protest all this. It’s so weird because I was thinking about how I’ve now been ten years in the culture wars, which is not I mean, I kind of stumbled into them.

Speaker: 0
01:58:19

Are you a sergeant now? What are you?

Speaker: 1
01:58:21

I don’t know. Yeah. I think I’m a you’re are you a general? I don’t

Speaker: 0
01:58:26

know what I am.

Speaker: 1
01:58:28

I’ve been but it it has been a weird decade to be. I didn’t wanna I was I think so many normies are like me. They just got kind of forced into

Speaker: 0
01:58:38

Yeah.

Speaker: 1
01:58:39

Being in them. It wasn’t like I wanted to I was like a drunk waitress who just wanted to tell jokes.

Speaker: 0
01:58:44

Yeah.

Speaker: 1
01:58:44

That and then I started writing for Playboy. And next thing I know, I’m voting for Trump.

Speaker: 0
01:58:54

Well, you know, as a drunk waitress, you just you’re one of us. You’re a human. And humans, there’s a lot of people that have opinions and ideas on things. They’re just not good at articulating it, or they never learned how to articulate it. But everybody does what we do, or everybody can do what we do.

Speaker: 0
01:59:10

They do it with their friends. They talk with their friends. Yeah. They bullshit about stuff. And, you know, it’s just a process of putting it out there.

Speaker: 1
01:59:19

We’ve had we’ve I’ve had some weird YouTube strikes that that I’d say, meh. It’s easy to do, except I’ve had to play ten years of a game of, like, outrunning censors and, like, Patreon. Okay. We’re there. And then we’ve gotta get off there because I don’t wanna put all my eggs in that basket in case they go bonkers and shut it down.

Speaker: 1
01:59:39

And then I feel like you’re like, a lot of people in this space have been for, like, a decade out. You know? We’re like, how are we gonna we got, like, the dumbest strike on an old ad of ours. And it was some something that we said. And then you have to go to, like, you know, like, you’ve gotta go to, like, class and get reeducated.

Speaker: 0
01:59:59

Jamie got reeducated. Jimmy, Jamie, you feel better right now. Right? What? You feel smarter? Now you got reeducated.

Speaker: 1
02:00:04

What did you get reeducated for?

Speaker: 2
02:00:05

Which part of YouTube? You got reeducated. Sure. I guess.

Speaker: 0
02:00:08

Didn’t you? When we had a strike, you had to do the little closet. That’s what

Speaker: 2
02:00:11

we’re talking about. Oh, yeah.

Speaker: 1
02:00:13

He’s like, what reeducation? Yeah.

Speaker: 0
02:00:14

He got reeducated. He’s

Speaker: 1
02:00:15

different now.

Speaker: 0
02:00:17

Because we told factual information about COVID nineteen.

Speaker: 1
02:00:21

And so you got a strike? Yeah.

Speaker: 0
02:00:22

It was because it was, like, a long time ago.

Speaker: 1
02:00:24

Oh, but but ours is weird because it was, like, a very old episode. So they must be, like, trolling.

Speaker: 0
02:00:31

I don’t know what they’re doing. They’re probably just using their algorithm and going after everything. And

Speaker: 1
02:00:36

Ours literally, we said, think for yourself and come to your own conclusions.

Speaker: 0
02:00:40

Oh, that’s dangerous.

Speaker: 1
02:00:42

That was the ad. It wasn’t even ours on our like, it was an ad we were doing for someone else’s podcast.

Speaker: 0
02:00:48

Shouldn’t give that advice out. People shouldn’t be doing that.

Speaker: 1
02:00:51

But this is why it’s hard for people to just, like, say what they want, you know.

Speaker: 0
02:00:55

Well, what do you think is gonna happen? When you’re looking at all this crazy shit right now with DOGE and the uncovering of USAID and this dismantling of this bizarre left wing ecosystem that’s pretty much manufactured. Like, what do you see is gonna happen with the rest of the country?

Speaker: 1
02:01:14

I mean, look. There there are a lot of people who did not vote for Trump. You know, there he won. And, yes, he won every swing state. And Half

Speaker: 0
02:01:22

the country’s a lot of people.

Speaker: 1
02:01:24

Yeah. It’s a lot of people. So and I’m not sure how many of them I wonder how many so I have a couple questions. How many people didn’t vote for Trump and wanted to and are secretly glad he won? I bet there’s a pretty significant number of those people who either didn’t vote or didn’t vote for him, but wanted him to win anyway.

Speaker: 0
02:01:41

Ari says that. Ari says that New York City is, like, relaxed. Yeah. It’s like everybody was relaxed after the election. Like, even the liberals are like

Speaker: 1
02:01:48

Yeah. Thank God. They’re like, woo.

Speaker: 0
02:01:50

Yeah. Like, thank God you saved me from myself.

Speaker: 1
02:01:52

Yeah. Exactly. Because God doing for us what we can’t do for ourselves. Yeah. So there’s that, I wonder. But then I do wonder how many of the people are what I worry about is how much mental this is where I get mad at the media for presenting this, like, this is Hitler. We’ve had it notched up to 11 for the past eight years, and now you’ve got this because he was put in it’s almost like that energy has been transferred to Elon now.

Speaker: 1
02:02:24

Because now do you see the Elon protest? They’re protesting Elon. They’re literally standing out there with signs that say arrest Elon Musk.

Speaker: 0
02:02:32

Listen. I guarantee it’s organized. I guarantee it’s organized by the same people that are gonna lose a shitload of money based on all these discoveries at DOGE. There’s no way they’re not. If you look at what DOGE is uncovering, what they’re uncovering with this USAID stuff, a lot of that stuff was organizing protests through NGOs.

Speaker: 0
02:02:52

They funded the attempt at getting Trump impeached.

Speaker: 1
02:02:57

I thought you were gonna say killed. Then they have the impeachment come out.

Speaker: 0
02:03:00

They also spent USAID spent $50,000,000 on the lab Right. That invented coronavirus.

Speaker: 1
02:03:07

Right.

Speaker: 0
02:03:08

Yeah. There’s there’s a lot of money involved in this not working. And when you have a lot of money involved, you’re gonna have organized protests. When you see protests, we all wanna think protests like the nineteen sixties. Yeah. Fight the war, man. It’s like organic, like peace, love, and hippie shit.

Speaker: 0
02:03:24

But that’s not what this is. What this is is organized funded protests where someone is spending a lot of money and they’re mobilizing other NGOs. They’re using their access to these mailing lists and all these different things. They’re putting these things together.

Speaker: 1
02:03:41

Yeah. But do you do you think that the average liberal person do you think that they’re gonna wake up and come around? Or do you think that they’re gonna be in, like, agony for four years and be What

Speaker: 0
02:03:53

if what if it works? Like, what if the world gets safer, the country gets safer, the economy improves, and they don’t attack civil rights, women’s rights, gay rights, all these things that everybody’s worried about. And then at the end of it, you go, like, hey, maybe I was wrong.

Speaker: 1
02:04:06

I mean, that’s my question though. Are you are they capable of that kind of self reflection? If you the What

Speaker: 0
02:04:13

we’re saying they as if it’s, like, one kind of person.

Speaker: 1
02:04:17

No. I mean There’s

Speaker: 0
02:04:18

so many different people in they.

Speaker: 1
02:04:20

Agreed. I mean, when I say they, I mean people who have bought into the idea that this person is going to make everything worse and we’re going to slip into a fascist technocracy.

Speaker: 0
02:04:33

Some of them. Some of them are gonna slip right into it. Some of them are gonna wake up and come around. You know, it’s a test. It’s a test of character. It’s a test of objectivity. It’s a test of introspection. There’s a lot of things that are gonna happen where people are gonna have to wonder, like, what was I rooting for?

Speaker: 0
02:04:52

When I was rooting for this progressive liberal government, what was I actually rooting for? Was I rooting for warmongers who were making insane amounts of profit by funding overseas wars? Yeah. What was I what was I rooting for? Was I rooting for a basic theft of our tax dollars that’s gone to all these, like, completely useless endeavors that were only set up as ways to pilfer money?

Speaker: 1
02:05:17

Or the fall of America. You know, this is this is a weird thing to, like, hate your own country so much that you want it to fall or the dismantling of, like, the West as we know it. You know, I think there is a there is a feeling that Elon might have saved the West.

Speaker: 0
02:05:39

Yeah. I

Speaker: 1
02:05:40

I do think that. At least he’s trying to.

Speaker: 0
02:05:42

I think he did. I think he did, but just by buying x.

Speaker: 1
02:05:45

But is it funny that he got forced into it? I think there’s, like, something kind of ironic.

Speaker: 0
02:05:50

Because he was trying to get it at a lower price because he knew that they were bullshitting about the amount of bots.

Speaker: 1
02:05:55

Well and then they came out yesterday, and now it’s profitable.

Speaker: 0
02:05:58

Yeah.

Speaker: 1
02:05:58

There was, like, a whole article about how he turned it around.

Speaker: 0
02:06:01

Of course, it’s gonna be profitable. He’s smarter than you.

Speaker: 1
02:06:03

He is smarter. He also knows that we are all

Speaker: 0
02:06:06

Tapped in. Oh.

Speaker: 1
02:06:07

Yeah. I can’t I can’t stop it. I have a I have a I’ve always had a Twitter addiction. It’s the thing that replaced my drinking and smoking weed when I quit that. I went right to Twitter. I was like, Elon knows none of these motherfuckers, even when they go to Bluesky, they’ve all got secret accounts, and they’re lurking on Twitter still or X still or whatever.

Speaker: 0
02:06:27

You can’t live in that echo chamber, that Bluesky echo chamber. You can’t even go on there and say there’s only two genders. If you go on there and say there’s two genders, they ban you immediately.

Speaker: 1
02:06:36

They’re really eating themselves within two days over there, though.

Speaker: 0
02:06:38

But that’s what always happens. That’s happened with Gab too. Right?

Speaker: 1
02:06:42

Yeah.

Speaker: 0
02:06:42

And I think some of that is not even real, because I personally know comics who go on Bluesky and just say insane leftist stuff, just like the most preposterous thing, like, you know, like maybe Duncan would do. I’m not saying Duncan does it, but he might do it. But I know comics that do that just to see the reaction, how people agree with them, like insane stuff.

Speaker: 3
02:07:02

Oh, okay.

Speaker: 0
02:07:03

Like, my toddler I could tell my toddler was trans when they were three days old.

Speaker: 1
02:07:07

And people agree with them?

Speaker: 0
02:07:08

Oh, yeah. Jump right in.

Speaker: 1
02:07:10

Don’t you think we’re all in echo chambers, though, like you said? How do you how do you stay outside of yours?

Speaker: 0
02:07:15

Drugs.

Speaker: 1
02:07:16

Oh, that’s true.

Speaker: 0
02:07:17

That helps a lot. No. I’m kidding. You just gotta recognize, like, your thinking. You have to have a a process.

Speaker: 1
02:07:24

What’s your process? Like, I live by the Joe Rogan School of Internetting. That is I take all your advice. I don’t read the comments and I post and ghost.

Speaker: 0
02:07:32

That’s good.

Speaker: 1
02:07:33

I don’t, like, get in there and

Speaker: 0
02:07:35

That’s good. Like, I I definitely do that. And then I I think of, like, what do I really think about something and why do I think about it that way? And, like, I try to go, is it because it’s self-serving? Is it because I’ve said it in the past and I wanna be right? Is it what is it? You know, like, what what makes me believe what I believe?

Speaker: 1
02:07:54

Yeah.

Speaker: 0
02:07:55

And then you have to do a deep dive into all the goddamn arguments back and forth to figure out who’s right, and all the logic, and see people ignoring certain core facts. Like, this is the most fascinating thing about this USAID thing to me, is how people on the left are completely ignoring all the rampant, obvious corruption Yeah.

Speaker: 0
02:08:16

And conflicts of interest in the government funding non-government organizations that in turn fund the government. Yeah. That’s so crazy. And anybody that doesn’t think that’s crazy, it’s like, what, do you love money influencing everything in your daily life? Don’t you think that maybe we have a problem and that we have a huge deficit?

Speaker: 0
02:08:35

And as part of our huge deficit, we’re spending fucking billions of dollars every year on horseshit.

Speaker: 1
02:08:40

Yeah. I mean, that is my issue with people being like, see, this particular political thing isn’t right, which is true. It’s important to be accurate, but it’s also like, yeah, but this money shouldn’t be spent this way. So just because this one thing is maybe not true, this one thing people could be misrepresenting, there’s a much bigger problem you seem to wanna just sweep under the rug and say, oh, you guys are just crazy for noticing all these connections.

Speaker: 1
02:09:09

But didn’t they make them stop or something? Did they freeze their ability? Frozen. Yeah. These kids are nuts.

Speaker: 1
02:09:17

The guy who, like, you know, figured out how to read the scroll from Pompeii.

Speaker: 0
02:09:22

From Pompeii. Yeah. Using AI. Yeah. Yeah. No. They’re wizards. That’s that’s the kind of people that you want digging into this stuff.

Speaker: 1
02:09:28

Yeah. And

Speaker: 0
02:09:28

that’s why Elon got them. I mean, he knows what the fuck he’s doing. It’s all very strange because you’re dealing with so many things that are happening at the same time. You have this technology that was never available before that is allowing people to freely express themselves online. Right?

Speaker: 0
02:09:46

And then you have this maniac billionaire who buys the biggest one and makes it the Wild West again. And then you have government being exposed for what it is. So you have all these fucking NGOs, this web of, we talked about it yesterday, this web of 55,000 different NGOs that were supporting all these liberal causes that were all completely intertwined, and they had to use software to find that and then figure it out.

Speaker: 1
02:10:10

Yeah.

Speaker: 0
02:10:10

So they’re finding out that this is, like, this complicated propaganda network

Speaker: 1
02:10:14

Yeah.

Speaker: 0
02:10:14

That existed. Look, that’s not good. No. That’s not good for anybody, left or right, because it’s money. It’s just money being used in a way. Not only that. Here’s the thing. Like, people say, like, the $10 billion, it’s only $10 billion, like, that kind of shit. What about the fucking people in Maui? Yeah.

Speaker: 0
02:10:31

What about that? That could have been fixed with $5 billion. The government could have said, we are going to rebuild those people’s homes Yeah. To the exact state, and even better than where they were before. It’s gonna cost us $5 billion. All the money that we’re sending over to Ukraine, all the money that we send to Israel, all the money we send to all these different organizations that work with USAID to the tune of billions and billions and billions of dollars.

Speaker: 0
02:10:56

They could have done that. And instead of doing that, they’re telling us that what we have to do is continue to fund all these programs in all these other countries that just so conveniently have a whole staff of people that’s making a great wage, and their political capital Right.

Speaker: 0
02:11:12

Is bet on all this stuff, and it’s all intertwined with these NGOs and all this money they’re getting. And this is the thing: we have to go and help these people. They’re gonna starve. They’re gonna do this. They’re gonna do that. What about America? Yeah.

Speaker: 1
02:11:26

And then you have, like

Speaker: 0
02:11:27

our tax dollars going into a safety net to help people from one of the worst wildfires in history.

Speaker: 1
02:11:33

Or North Carolina.

Speaker: 0
02:11:35

Or North Carolina. Yeah. Exactly the same thing. Yeah. Apparently, none of those people have gotten any money in North Carolina.

Speaker: 1
02:11:40

No. And they’re, like, sleeping in freaking tents in the freezing cold.

Speaker: 0
02:11:44

Exactly. All that could have been addressed too. The same way we address problems in other countries. So if you have this amazing slush fund that USAID is and it’s not being applied at all to the problems of America, why would you think people would support it? Yeah. Why would we support it with our tax dollars when we know that the country is massively in debt?

Speaker: 1
02:12:04

Well, and it’s like, I appreciate that there are so many problems. I do feel like, for the first time ever, there is an actual transfer of power. Like, every single election through my lifetime, there’s been all the normal fighting between the two parties, and then the election happens and one party wins, and then nothing changes.

Speaker: 0
02:12:28

Yeah.

Speaker: 1
02:12:28

And then you just keep sliding and sliding and sliding and sliding. And everybody says they’re gonna do something, and no one does anything. And then you have fucking Elon come in like a wrecking ball, very motivated, and Trump, who seems organized this time. And they actually feel like they’re making changes. Changes that, by the way, the American people want. Yeah.

Speaker: 0
02:12:50

Like the border. The border? They fixed the border overnight. I know. Overnight.

Speaker: 1
02:12:54

I was looking into it. My friend Sai Meyer was saying, he’s a very smart guy, he was saying no one with any numerical ability has ever looked into Medicare and Medicaid fraud ever. Like, they’re

Speaker: 0
02:13:08

that’s what they’re cracking into next.

Speaker: 1
02:13:09

There’s a lot of money there that is, like, provider side fraud where it’s just inflated.

Speaker: 0
02:13:14

Wasn’t that Sonny Hostin’s husband? Isn’t he involved in a RICO lawsuit

Speaker: 1
02:13:19

Oh, I don’t know.

Speaker: 0
02:13:20

About insurance fraud?

Speaker: 1
02:13:21

The view person?

Speaker: 0
02:13:22

Uh-huh. Yeah.

Speaker: 1
02:13:23

Oh, I have no idea.

Speaker: 0
02:13:24

Her husband got arrested.

Speaker: 1
02:13:25

Oh, really?

Speaker: 0
02:13:26

Yeah. He’s one of 200 people charged in some gigantic insurance fraud.

Speaker: 1
02:13:32

There’s a lot of that.

Speaker: 0
02:13:32

What is that? What’s the actual story behind that? I’m looking. Jamie’s looking.

Speaker: 1
02:13:36

Jamie’s on it.

Speaker: 0
02:13:37

He’s on it.

Speaker: 1
02:13:37

Yeah. But there’s, like, you know how complicated it would be to try and untangle that web of where you’re inflating things, and you’re saying this is, like, a Band-Aid. You see these things online where it’s like a Band-Aid is $25,000. You know? Like, what?

Speaker: 0
02:13:55

Someone explained that to me too. When you have a budget, say if you run an organization that has a budget, you get the money from the government, and that budget’s, like, $80 million a year, you can’t spend 60, because then you’re gonna get 60 next year. You have to spend all 80. Right. If you don’t spend all 80, you don’t get it. You gotta say, we need 90. Right. We’re barely hanging on. Right.

Speaker: 0
02:14:13

Like, so you have to charge them $500,000 for a hammer. Oh. And that’s literally how they justify their existence. That’s the key. They’re not in the business of being frugal and, you know, being responsible and making sure that the money is being spent competently. No.

Speaker: 0
02:14:29

They’re in the business of keeping their budget coming in.

Speaker: 1
02:14:32

Did you find Sonny?

Speaker: 0
02:14:32

Sonny Hostin’s surgeon husband, Emmanuel, faces solo battle in lawsuit as codefendants agree to settle. Oh, Jesus. So, okay. Sonny Hostin’s surgeon husband has found himself in the hot seat in a massive lawsuit accusing nearly 200 health care providers of insurance fraud. Doctor Emmanuel Hostin defiantly called himself the victim of a frivolous smear campaign last month after he was accused of providing fraudulent medical services in exchange for kickbacks in a complaint filed by American Transit Insurance Company in December.

Speaker: 0
02:15:04

The orthopedic surgeon now finds himself increasingly isolated after the vast majority of the defendants have offered to settle their cases.

Speaker: 1
02:15:11

Oh, boy.

Speaker: 0
02:15:12

According to a new court filing obtained by dailymail.com, American Transit, which insures taxi companies and Uber and Lyft drivers, announced last Monday that 41 of the 86 defendants named in the suit have agreed in principle to settle one of the largest RICO cases ever filed in New York. The papers filed by the law firm Manning Cass did not specify which defendants offered to settle.

Speaker: 0
02:15:39

Doctor Hostin has until February 10 to respond to the legal complaint. So it has something to do with drives from the hospital and kickbacks from all this shit.

Speaker: 1
02:15:52

Yeah. It’s because, that’s what they need, though, is these guys who can use this AI technology to get in there and start looking on a massive scale at all of this, everything. I mean, they have to look at everything. I think most Americans want this.

Speaker: 0
02:16:10

So they’re being accused of getting kickbacks for performing surgeries and submitting fraudulent bills.

Speaker: 1
02:16:15

Yeah.

Speaker: 0
02:16:16

Hostin, this is in quotes, “knowingly provided fraudulent medical and other health care services, including arthroscopic surgeries.” The lawsuit filed on December 17 claims American Transit was then billed in exchange for kickbacks and/or other compensation, which were disguised as dividends or other cash distributions.

Speaker: 0
02:16:35

I don’t know how they know this. So I don’t wanna comment on this. I don’t really understand what this is. This might be bullshit. It might be real.

Speaker: 0
02:16:43

But I do know that doctors

Speaker: 1
02:16:45

But this happens all the time. I mean, maybe not this. But there’s provider side fraud that is very well known, and people have been called out for it.

Speaker: 0
02:16:53

There’s a doctor that I was just watching this whole thing on where he, I think it was 200 people he treated, it might have been more, that did not have cancer, and he gave them chemotherapy.

Speaker: 1
02:17:06

Oh, my God.

Speaker: 0
02:17:07

Yeah. He falsely diagnosed them as having cancer and then treated them for cancer. Yeah. And some of them got really sick and died.

Speaker: 1
02:17:16

Well, yeah. It’s chemo. Yeah.

Speaker: 0
02:17:17

It’s chemo. It’s fucking killing you. And it wrecks your health. So even if you survive it, like, your body’s wrecked for a long time afterwards.

Speaker: 1
02:17:24

And did he do this just for money?

Speaker: 0
02:17:26

For money. Yeah.

Speaker: 1
02:17:27

Oh, yeah.

Speaker: 0
02:17:27

And his argument was this term that they always use, you eat what you kill, and that you have to, like, have a business. You have to, like, keep your business rolling. So his business was making sure that people thought they had cancer and then treating them for cancer.

Speaker: 1
02:17:42

That’s dark.

Speaker: 0
02:17:43

Well, cancer and chemotherapy is one of those weird ones where the doctors profit off of each chemotherapy treatment.

Speaker: 1
02:17:49

Oh, do they?

Speaker: 0
02:17:50

Yeah. “$34 million in fraudulent,” a Detroit-area hematologist-oncologist was sentenced today to serve forty-five years in prison for his role in a health care fraud scheme that included administering medically unnecessary infusions or injections to, oh, it’s 553 individual patients, and submitting to Medicare and private insurance companies approximately $34 million in fraudulent claims.

Speaker: 1
02:18:13

This is the fucking sickest thing I’ve ever heard. You tell these people they have cancer

Speaker: 0
02:18:17

I know. Right?

Speaker: 1
02:18:18

And make them get chemo?

Speaker: 0
02:18:19

I know.

Speaker: 1
02:18:20

This is insane. I can’t believe I’ve never heard of this.

Speaker: 0
02:18:23

Yeah. It’s really crazy.

Speaker: 1
02:18:24

This is, like, a decade ago. Wow. Uh-huh. This is fucking nuts. How did they catch him?

Speaker: 0
02:18:30

I don’t know.

Speaker: 1
02:18:31

That’s wild.

Speaker: 0
02:18:33

Yeah. But I No.

Speaker: 1
02:18:34

That’s so sick.

Speaker: 0
02:18:35

I know people that have told me that they know people that have done surgeries that were unnecessary. Yeah. Because Just

Speaker: 1
02:18:41

for money.

Speaker: 0
02:18:42

They want money. Yeah.

Speaker: 1
02:18:43

Well, that’s, like, the whole trans care thing, like, you know, a lot

Speaker: 0
02:18:47

of it, like orthopedic surgeries. I have friends Back

Speaker: 1
02:18:50

surgeries and stuff.

Speaker: 0
02:18:52

Doctors that won’t let someone try stem cells, that try to deny people stem cells.

Speaker: 1
02:18:58

So yeah. The back surgery thing is weird too. Yeah. Yeah.

Speaker: 0
02:19:02

Well, it can help some people, but it’s not the only solution. And there’s other ways to fix your back. I always tell people there’s a lot of different ways to fix back issues. I’ve had back issues, but I didn’t have surgery. I was told to have surgery. I had a bulging disc in my neck. I was told that I had to have a discectomy. I did not.

Speaker: 0
02:19:20

My neck is perfect. It works great now.

Speaker: 1
02:19:22

How did you fix it?

Speaker: 0
02:19:23

I did Regenokine. Regenokine is, it’s like a very advanced form of platelet-rich plasma. You used to have to go to Germany to get it. I remember, like, Peyton Manning and Kobe Bryant, those guys, they flew to Germany to get this procedure done. And Dana White did it too. And then they opened up a place in Santa Monica, where you could do it in Santa Monica, and I had it done. I had it done on my back.

Speaker: 0
02:19:42

I had it done on my neck. It’s amazing. They take your blood. They spin it in a centrifuge. I forget the exact process they do, and then they pull out this liquid that is the most potent anti-inflammatory you could ever find.

Speaker: 1
02:19:58

Okay.

Speaker: 0
02:19:58

So this anti-inflammation drug, it’s made out of your own blood, so your body doesn’t reject it. Yeah. This process that they do. And then they inject it in the areas around the discs, and the discs all settled, and they went right into place. Wow. Yeah. So I had it done in my lower back. I had it done in my upper back, and I had it done on my neck.

Speaker: 1
02:20:15

See, I think that this is where I’m excited about AI and stuff like that. It would be exciting to have AI cure cancer, which they think that it might be able to do. Yeah.

Speaker: 0
02:20:28

I’m a little, I saw that Larry Ellison thing. I was like, yeah, are you making money doing this?

Speaker: 1
02:20:32

Well, obviously, I don’t

Speaker: 0
02:20:33

know if Larry

Speaker: 1
02:20:34

is Larry Ellison saying that. Oh, yeah. He did.

Speaker: 0
02:20:36

Like, why are you doing that? Are you a doctor? Don’t you own Oracle? What are you doing?

Speaker: 1
02:20:39

But I don’t

Speaker: 0
02:20:40

see It seems like

Speaker: 1
02:20:41

It seems like it’s not outside the realm of possibility that AI would get smart enough to figure this shit out.

Speaker: 0
02:20:46

100%. Yeah. It is possible, for sure. And it’s very hopeful. But, you know, I don’t know if we should be telling people that it could do that maybe in the future, so invest in my company or whatever the fuck’s going on, you know?

Speaker: 1
02:20:59

I was at this crazy boondoggle, and I saw a bunch of these people talk, and everybody was clapping. And I was like, this is the scariest shit I’ve ever heard. And I think it might have been Sam Altman talking. And everybody was just like, yay, let’s give them money. And they all just have money. Everyone sees money here, the gold in the hills. I mean, we’re gonna need to build nuclear.

Speaker: 1
02:21:21

We’re gonna need to power this shit. There’s a lot of gold in those hills. And I was in the back with this guy, and I’m like, someone needs to stop it. Like, it was It’s

Speaker: 0
02:21:31

the beginning of a tornado.

Speaker: 1
02:21:32

Down my spine.

Speaker: 0
02:21:34

Yeah. It’s gonna happen though.

Speaker: 1
02:21:36

You can’t stop it. No. Unless the grid goes down, like, or something, unless there is something catastrophic, an asteroid hits the Earth, whatever, which I think there might be one coming. There’s nothing we can do. Nope. The genie’s out of the bottle.

Speaker: 0
02:21:53

Well, not just that. We have to do it because there’s other people that are doing it. And if they get a hold of it first, it’s over.

Speaker: 1
02:22:00

That’s the race to the bottom.

Speaker: 0
02:22:02

Is it? My friend, like, I mean, we all have it, and eventually it just works for humanity instead of against us.

Speaker: 1
02:22:09

Like, I do think that, like, my husband thinks that our daughter will never need to learn how to drive.

Speaker: 0
02:22:16

Probably.

Speaker: 1
02:22:17

He’s like, she’s not gonna need to, because I was like, she wants to drive already. She’s like, when I’m five, I can drive. I’m like, no. She is, having kids is the best. I hate that there was so much rhetoric that I was growing up with about, like, how horrible it is, because it is, yeah.

Speaker: 0
02:22:35

It’s all by people who don’t have kids.

Speaker: 1
02:22:37

But it is, the other morning, I go downstairs, and I’m like, how are you doing? She’s eating her breakfast. She’s like, I just need to lay low. I was like, are we hiding from a mob? What is happening?

Speaker: 0
02:22:52

They’re experimenting with new ways to talk.

Speaker: 1
02:22:54

They’re just so funny. She’s like and she just says the funniest stuff. She did a whole my husband sent me a whole thing. She picks up every they pick up everything.

Speaker: 0
02:23:03

Yeah.

Speaker: 1
02:23:03

She was like, I’m going to Mothership, and you’re gonna go to New York. I’m like, where’d she get the New York accent from? I’m going to Mothership, and you’re going to New York to tell jokes. She’s telling them all about this. I’m like, how does she pick all this up? It’s the best thing in the whole world. But, yeah, he doesn’t think she’ll want to, like, and he’s like, she’s not gonna need to know how to drive.

Speaker: 0
02:23:24

Well, she probably will learn how to drive, but it won’t be necessary. I think they’re gonna make the argument that autonomous driving is way safer.

Speaker: 1
02:23:36

But I think it probably is.

Speaker: 0
02:23:38

Oh, it is?

Speaker: 1
02:23:38

Yeah. Yeah. I have the

Speaker: 0
02:23:40

auto drive on

Speaker: 1
02:23:41

my team. Autonomous. Yes. It can’t be Exactly. No one’s gonna be allowed to drive.

Speaker: 0
02:23:46

That’s probably what’s gonna happen.

Speaker: 1
02:23:47

That’s the only way. They’re all gonna be communicating with each other, and it’s There’ll

Speaker: 0
02:23:50

probably be roads that are set up where you can allow people to drive.

Speaker: 1
02:23:54

How do you feel about that?

Speaker: 0
02:23:55

I don’t know. I like cars. Yeah. You do. Good. I like old cars too.

Speaker: 1
02:23:59

I like driving. Yeah. It helps my brain

Speaker: 0
02:24:02

Yeah. Unfurl. Well, I like machines. I’m into old machines. I like the way they work. Do you

Speaker: 1
02:24:09

know how to fix cars?

Speaker: 0
02:24:10

I know how to fix some things, but not really. Yeah.

Speaker: 1
02:24:13

I mean,

Speaker: 0
02:24:13

I know how to change spark plugs and change oil, and I understand. Like

Speaker: 1
02:24:17

the basics.

Speaker: 0
02:24:17

Yeah. Normal stuff. But most cars don’t even have spark plugs anymore.

Speaker: 1
02:24:21

No. It’s weird.

Speaker: 0
02:24:22

Yeah. It’s all wild. So you open up the back of, like, you know, the back of a Porsche. Like, look at the engine. You’re like, what the fuck is that?

Speaker: 1
02:24:30

And Teslas only have one gear? How does it, I don’t understand how that

Speaker: 0
02:24:35

Have you ever been in one?

Speaker: 1
02:24:37

No. Oh,

Speaker: 0
02:24:37

dude. I

Speaker: 1
02:24:38

didn’t know what to say. Never been in one.

Speaker: 0
02:24:39

Next time I have mine, you gotta go for a ride. Yeah. Mine’s insane.

Speaker: 1
02:24:44

The new one?

Speaker: 0
02:24:45

Yeah. It goes zero to 60 in one point nine seconds.

Speaker: 1
02:24:47

That’s nuts.

Speaker: 0
02:24:47

It’s a time machine.

Speaker: 1
02:24:48

How does it do that?

Speaker: 0
02:24:49

Because it has a thousand horsepower and four-wheel drive and incredible electric motors that just instantly generate torque and power. It just goes

Speaker: 0
02:24:58

crazy. It just takes off. They’re amazing. And it also self-drives. It does the self-driving. It changes lanes. It stops for stop signs. It stops for red lights. It turns, stops. It’s just incredible. Changes lanes when there’s obstructions in front of it.

Speaker: 1
02:25:13

Can you take your eyes off the road? Can you, like, text and stuff? Can you be a passenger?

Speaker: 0
02:25:17

You could, I guess. People have fallen asleep at the wheel, famously.

Speaker: 1
02:25:21

But don’t they give you, like, strikes or something like that? Like Tesla itself will

Speaker: 0
02:25:25

Yeah. Supposedly, if you’re not looking That’s kinda creepy. Yeah. It’s creepy. Yeah. It’s all creepy. But if you wanna do it right, that’s how you have to do it. You can’t just let people take naps and just press play.

Speaker: 1
02:25:36

But isn’t that the point? I wanna be in my car typing.

Speaker: 0
02:25:39

Well, eventually, you’ll be able to do that. Eventually, it won’t even have a steering wheel. Eventually, there’ll be a pod that you get in and it takes you places.

Speaker: 1
02:25:45

This reminds me of that Silicon Valley episode where the self-driving car drives right into one of the shipping containers, and then he’s just stuck in a shipping container.

Speaker: 0
02:25:53

I didn’t see that. Oh, it’s

Speaker: 1
02:25:54

so good.

Speaker: 0
02:25:55

never watched that show.

Speaker: 1
02:25:56

That show is brilliant.

Speaker: 0
02:25:57

Yeah. It’s weird. But, again, if you go back to the people that were on trains and riding horses, and you said, one day, you’re gonna be able to get in a car that goes zero to 60 in one point nine seconds, and it’s gonna be electric and make no sound. You’d be like, what?

Speaker: 0
02:26:12

Well, one day, you’re not gonna need to steer because steering is why people get fucked up because the cars can’t detect other cars around them. They change lanes. People make mistakes. They go forward when they shouldn’t. They run red lights. All that’s gonna end.

Speaker: 1
02:26:25

So I’m, like, a half hour out of the city, and every time I drive to and from, almost every time, there’s an accident I see. Sure. The other night, I was driving home actually from Mothership. It was, like, 01:30 in the morning, and there was a guy on the left side of the freeway walking.

Speaker: 1
02:26:44

Yeah. There’s no space, by the way. I was like, what is happening right

Speaker: 0
02:26:51

now? Texting.

Speaker: 1
02:26:52

I’m not paying attention. No. There are trucks behind me. Yeah. It was nuts. I was like, this guy is either, like, on drugs and crazy, or it’s, like, some Jason Bourne situation, where he was, like, limping.

Speaker: 0
02:27:04

Like, he’s just He probably broke down and had to walk.

Speaker: 1
02:27:08

No. He looked crazy.

Speaker: 0
02:27:09

Oh. He looked Crazy person. He likes to flirt with death.

Speaker: 1
02:27:12

It was I mean, Texas is a little it feels a little unhinged.

Speaker: 0
02:27:17

Does it compared to LA? When was the last time you were back in LA?

Speaker: 1
02:27:20

No. I mean, compared, I was just there. I was there two days before the fire.

Speaker: 0
02:27:22

People are unhinged.

Speaker: 1
02:27:24

Yeah. Yeah. No. I mean, it feels a little wild. Like, I like it. Better? I like the, it makes, I’m coming up on two years here.

Speaker: 0
02:27:35

So what do you mean by wild? Like, what

Speaker: 1
02:27:37

It feels wild to me. It still feels very, like, the Wild West, like, Texas. There’s just an energy here of, kind of, every time I’m down on Sixth Street, I’m like, I’m gonna catch a stray down here.

Speaker: 0
02:27:48

Well, Sixth Street is wild. Where the club is, that’s a wild place.

Speaker: 1
02:27:52

It’s fucking bonkers out there.

Speaker: 0
02:27:54

Yeah. But that also makes the club exciting.

Speaker: 1
02:27:57

You might catch a stray while you’re waiting.

Speaker: 0
02:27:59

No. Not that. But all the people, the foot traffic, you know. There’s an energy on that street.

Speaker: 1
02:28:05

Yeah. No. I like it. Yeah. And I like the kind, when I, even coming from LA, driving in Texas, I was like, these people are out of their fucking minds, because it’s so much bigger, and it’s, like, 80-mile-an-hour speed limits everywhere out where I am. So everyone’s actually doing 90. And I would be driving, and they’re like, get out of the way, grandma. And I’m like, I’m doing fucking 85.

Speaker: 1
02:28:31

Like, it’s not like I’m going slow.

Speaker: 0
02:28:33

Are you in the left lane?

Speaker: 1
02:28:34

No. No? I’d be in the middle lane.

Speaker: 0
02:28:36

They’re still trying to get you out of the way.

Speaker: 1
02:28:38

Yeah. I was a grandma.

Speaker: 0
02:28:39

I had

Speaker: 1
02:28:40

to up my game.

Speaker: 0
02:28:42

Well, did you used to see that in LA too?

Speaker: 1
02:28:45

Yeah. But it wasn’t, it was always, like, street racing kids, like, maybe out further. How often could you get your car to 80 miles an hour driving around LA?

Speaker: 0
02:28:57

At night.

Speaker: 1
02:28:57

I mean, at night.

Speaker: 0
02:28:58

Yeah. But I see a lot of people driving super fast at night now. Yeah. Yeah. More unhinged people, I think.

Speaker: 1
02:29:04

Really?

Speaker: 0
02:29:04

Yeah. Because I would see people driving recklessly on the 101, like, when I was coming home from the store sometimes. I’m like, what the fuck?

Speaker: 1
02:29:11

That’s fine. Going home, though. Like, it’s, I don’t know. It’s pretty nuts.

Speaker: 0
02:29:16

Well, when people live a half an hour away from the city and they wanna get home quick, that’s how you do

Speaker: 1
02:29:21

it. Now it’s me.

Speaker: 0
02:29:21

Gotta hit the gas.

Speaker: 1
02:29:22

Now I’m the one who’s doing 95.

Speaker: 0
02:29:24

Well, it’s those people too.

Speaker: 1
02:29:25

No. It’s true. It’s, I don’t know. I’m glad I moved here. I remember you saying, like, it feels, my husband’s blood pressure actually went down when he moved

Speaker: 0
02:29:36

here. Yeah. There’s less people.

Speaker: 1
02:29:38

But it’s also just, like, everything is less of a battle. You said it to me. I think, like, you felt like you could exhale, on one of our, like, when you were talking, you’re like, I moved there and just felt like, I can, like, breathe a little. And, I mean, my husband has, like, scientific evidence that his whole, like, energy field got more chill.

Speaker: 0
02:29:59

Yeah. You’re not supposed to be living in a place that has 20,000,000 people. I think it’s bad for you.

Speaker: 1
02:30:03

Have you ever been to Cairo? No. Cairo is nuts. Yeah. It’s the craziest city I’ve ever been in.

Speaker: 0
02:30:10

I wanna go. When you went to the Pyramids, when was it? How long ago?

Speaker: 1
02:30:14

Oh my gosh. It was crazy. It was right after the revolution. So we went in, it was February, when was the Arab Spring?

Speaker: 0
02:30:28

That was Internet times.

Speaker: 1
02:30:30

It was twenty

Speaker: 0
02:30:31

Two thousand eleven.

Speaker: 1
02:30:32

’Twelve, I wanna say. Twenty eleven, twenty twelve.

Speaker: 2
02:30:36

What’s that? Twenty ten.

Speaker: 0
02:30:37

Twenty ten, Arab Spring.

Speaker: 1
02:30:39

So it must have been twenty eleven.

Speaker: 0
02:30:41

That was one of the first revolutions that the Internet was credited with starting.

Speaker: 1
02:30:45

Yeah. And no one was there. So everyone was like, why the fuck are you guys in Egypt? We got basically a private tour of the whole country. Like, there was, so we were on a cruise on the Nile, and it was supposed to be, you know, hundreds of people. There were 14 people on the cruise. And we got to see, there’s usually lines to go into the Great Pyramids.

Speaker: 1
02:31:06

There were no lines, no lines to see King Tut. No one was anywhere. It was a weird time to be in Egypt. And we saw, I went to Alexandria, was in Cairo, went and did, like, a Nile cruise, which was amazing. I mean, Egypt, you gotta go. Yeah.

Speaker: 1
02:31:27

I definitely will.

Speaker: 0
02:31:29

Did you get this weird feeling? Like, how did this go away?

Speaker: 1
02:31:33

Yeah. That’s what I don’t understand. And I think I had, like, a mental breakdown when I was there. Like, a weird, we weren’t in the hospital, but I felt like I should get checked into a hospital. We were staying on, like, the Nile, across from all the tombs, the, like, kings’ tombs.

Speaker: 1
02:31:57

And I kept feeling this, like, pulse, and I had, like, a panic attack when I was there, basically. It was the weirdest thing, though. I kept feeling this pulse. It was, like, womp womp.

Speaker: 0
02:32:07

Was this when you were doing drugs?

Speaker: 1
02:32:09

No. No. I mean, I was drinking still, but I wasn’t doing drugs. And I was like, I’m gonna have a, I felt, I thought I was gonna end up in a straitjacket, and I thought I had, like, a panic attack. I had, like, these crazy dreams, and I was like, oh, this is where it all started for, like, my whatever. It was fucking, it’s weird.

Speaker: 1
02:32:31

So do you think It’s a weird energy.

Speaker: 0
02:32:33

Do you think you’re getting this feeling, like, like you said, like, here, it’s in the soil. There’s something in the soil here, like, this fierce independence, because these people are tough and they survived making it across the Great Plains and all that shit. Do you think you felt that there, like, this is the feeling of this civilization that you see existed in this place?

Speaker: 1
02:32:53

Yeah. It felt to me, I feel like I had, like, a past life regression, if that’s even a thing. It was, like, I felt like whatever journey I’ve been on on this planet, it started there, and I went back to the source. And it was, like, a weird, like, my dreams were crazy. Thank God the guy I was with was, like, nice and kind, but I couldn’t, it was like a full-blown panic attack.

Speaker: 1
02:33:23

I couldn’t even leave the hotel room.

Speaker: 0
02:33:26

And do you think that's just because of being in Egypt?

Speaker: 1
02:33:28

It was just being in Egypt. It was

Speaker: 0
02:33:30

Do you think it's because, like, you knew what the civilization was like, and it had declined and gone, and now you're there, and you're just, like, psychologically dealing with this, and you're, like, freaking out and putting it on yourself?

Speaker: 1
02:33:44

So they had... I'd always been obsessed with Egypt since I was a little kid. Just unnaturally obsessed with it. I read every book about it when I was a kid. I wanted to go there. I don't know why. And I felt like, you know, I like to believe in past lives because I think it's amusing.

Speaker: 1
02:34:03

I'm like, if I can choose, it sounds like fun. It's like, why not? And I've always felt very connected to it. And so when I went there, we were in the King Farouk suite, actually, and it had a balcony that overlooked the Nile and the tombs. And I could feel it. It was like something in my heart, like a vibration.

Speaker: 1
02:34:24

And they've said that these places, like, where there are these pyramids, they can have vibrations. I learned this later. But this was, like, a weird... like, it kept hitting my heart. And then I thought they drugged me. I thought they gave me something in my hibiscus tea, because I immediately drank the hibiscus tea, and I was like, maybe they put some kind of, like, hallucinogen in it.

Speaker: 1
02:34:48

But I think about this a lot, because I've had friends who have had anxiety and panic attacks, and I know what it's like to, like, get in your head and feel like you're losing your mind. And I had to baby step my way out of this, like, panic attack where I couldn't eat. I was unable to keep food down.

Speaker: 1
02:35:07

I just took a bath and was like, I'm gonna just chew.

Speaker: 0
02:35:10

Never happened to you before?

Speaker: 1
02:35:11

No. I feel for people who have anxiety. I had had anxiety before, but it was when I was smoking a lot of weed and doing a lot of drugs, and I was in a marriage I didn't wanna be in, to my first husband.

Speaker: 0
02:35:26

So your life was a mess.

Speaker: 1
02:35:27

And I just was lying to myself. Like, I think depression and anxiety can be very useful, like, for something in your life. You're either doing something you shouldn't be doing, or you're not doing something you should be doing.

Speaker: 0
02:35:39

You gotta course correct. Yeah. It's giving you motivation.

Speaker: 1
02:35:42

Or it's saying, like, there's something... it's like a soul calling for you to check in. So yoga kind of saved me from that time. I got divorced. It somewhat went away. I've talked about my hypochondria, so I dealt with that, but then this was different.

Speaker: 1
02:36:00

It was, like, a weird... my body physiologically reacted to it too. So, like, not to be too much information, but I, like, randomly got a period out of nowhere. It was just weird. My body started, like... I threw up. I got a period all at once.

Speaker: 1
02:36:14

And it was such a... it's like the most supernatural thing that's ever happened to me. Well, there was another time that was really supernatural, when I was in Newport, Rhode Island, and I swear there was, like, a haunting. But other than that, this was it. It felt supernatural.

Speaker: 1
02:36:30

Like, it didn't feel normal, and it took me two full days to... whatever hit me in my heart on that balcony, all I could hear was this, like, sound. I sound like a crazy person, but it was reverberating from the tombs. And then I remember eating a grape so slowly and mindfully just to, like, come back into my body, and I had these, like, crazy dreams.

Speaker: 1
02:36:59

And I felt like I saw all these connections, like, people in my life braided together. And it was really weird. It was in that moment when I, like, had a physical reaction where it was like a scene from a movie, where it was like, zap, zap, zap, zap. Like, all these people in my life. And the guy I was with, thank god, he could have been a total asshole and been like, you're a fucking psycho.

Speaker: 1
02:37:21

I'm leaving you. But he had had a weird experience with me in his guest house in New Zealand, where I would fall asleep sometimes downstairs, or we'd pass out or whatever. And I was like, did you have a dog? Because I was, like, convinced there was a ghost, because I kept feeling some weird sense, and he's like, don't talk about the dog.

Speaker: 1
02:37:42

And then it turns out there was, like, a dog. And his kids... like, he had it and then wanted to get rid of it, or something, this sort of thing. He's like, how the fuck did you know it's a dog? And I was like, oh, because the energy was like a puppy energy.

Speaker: 1
02:37:54

And I was like, what happened to the dog? He's like, nothing. Don't ever talk about the dog. And I was like, okay. So he's like, you're weird and psychic, and, like, you have, like, a weird thing. He was like, you're touched by something.

Speaker: 1
02:38:05

Like, and we used to play backgammon all the time. And one time, I, like, rolled the dice, and, I'm sure it was just lint, but, like, I wanted to win so badly, and then the dice was just frozen on his corner, just on the tip of it. He's like, what the fuck? You're a witch.

Speaker: 1
02:38:23

So he had had slightly unsettling experiences with me, enough that I think he was like, alright. Maybe she's not totally

Speaker: 0
02:38:32

Well, they say that places have memory. Right?

Speaker: 1
02:38:35

This place I mean, Egypt just starts to show.

Speaker: 0
02:38:37

I haven't been, obviously, but I felt weird when I went to Chichen Itza. That feels very weird.

Speaker: 1
02:38:42

Oh, okay.

Speaker: 0
02:38:43

Yeah. I

Speaker: 1
02:38:44

I had to... I was, like, baby stepping. I still get slightly, like... and my life has never been the same, really, since. Cairo's bonkers, but Egypt is the Nile. I remember just, like, cruising down the Nile and being on the deck, and you can feel what it was like.

Speaker: 1
02:39:09

You're just, like, this is what they mean by memory. Yeah. And the tombs are all very... it's very, I don't know. It's very spiritual for me.

Speaker: 1
02:39:19

It was, like, a very... it felt like a pilgrimage that I didn't know I was taking. That's how it felt to me.

Speaker: 0
02:39:26

Look, it makes sense. It was the most sophisticated civilization maybe ever in terms of their building.

Speaker: 1
02:39:31

You have to go. I feel like you would love it.

Speaker: 0
02:39:33

I know I would love it. I just don't have the time. I almost did it with MrBeast. He went in December.

Speaker: 1
02:39:38

Oh, did he?

Speaker: 0
02:39:38

Tried to do it with him. He wanted to do a podcast there, but I just couldn't make it happen. I was too busy. Yeah.

Speaker: 1
02:39:44

It's... it's a hike.

Speaker: 0
02:39:47

Yeah. I need to take, like, a good solid couple weeks and go there. It needs to be something that I know that I’m gonna be there for a couple weeks. Yeah. Right now, that’s not really possible.

Speaker: 1
02:39:55

Yeah. Yeah. Yeah. You know? I mean

Speaker: 0
02:39:57

I went to visit the site of the Eleusinian Mysteries, and I was with Brian Muraresku.

Speaker: 1
02:40:03

Where is that?

Speaker: 0
02:40:04

It's in Greece. Oh. And that was wild.

Speaker: 1
02:40:08

Really?

Speaker: 0
02:40:09

That was wild. That place has a memory for sure. Yeah. Just touching the walls, like, it just feels weird. You just feel like this is different, like you're in the presence of something very strange.

Speaker: 1
02:40:19

That was Egypt.

Speaker: 0
02:40:20

I bet Egypt was probably even more incredible. But whatever the energy of that place was, there's some of it left there. There's some weird intangible feeling that's left there that just makes you feel very, very strange.

Speaker: 1
02:40:35

When I was in LA at the Getty Villa, I always go there, and they had the Thrace exhibit.

Speaker: 0
02:40:42

What’s that?

Speaker: 1
02:40:43

Thrace is, like... I didn't know anything about it. It's a civilization that was, like, partially in Bulgaria. I think it came after ancient Greece.

Speaker: 0
02:40:53

There it is. Treasures from Bulgaria, Romania,

Speaker: 1
02:40:56

Greece. This statue is fucking insane. It looks like it’s looking at you.

Speaker: 0
02:41:01

Look at the eyes.

Speaker: 1
02:41:01

It's insane. Insane to see it in real life. This exhibit that's there right now is truly one of the most special exhibits I've ever... I go to all of them that I can there. Wow. Look at that. That's amazing. Crazy. And I knew nothing about this civilization. And the more I learn about it, the more obsessed I am.

Speaker: 1
02:41:26

But all of this stuff had its own, like... when the fires were around the villa, I was like, oh god. I hope that all this stuff is okay. Yeah. I mean, those walls are, like, fireproof, and they've got

Speaker: 0
02:41:40

I'd never heard of this before.

Speaker: 1
02:41:41

I'd never even really heard of Thrace. You know, I've always been very obsessed with, like, Greek mythology, Roman history, all of these things, but Thrace is this crazy civilization.

Speaker: 0
02:41:52

Look at the relief with two Thracians.

Speaker: 1
02:41:54

I wanna see if I have the... oh, it's crazy. Yeah. If you're back in LA, I think that exhibit is there until March. March third, I believe.

Speaker: 2
02:42:05

Maybe... it says temporarily closed on top. I don't know if it's because of the fire.

Speaker: 0
02:42:08

Probably. Closed.

Speaker: 1
02:42:11

Yeah. Oh, okay.

Speaker: 0
02:42:11

Yeah. I think the Getty's temporarily closed.

Speaker: 2
02:42:13

It says the Getty Villa will remain closed until further notice.

Speaker: 1
02:42:16

Oh, okay. I thought they reopened for some reason, but, yeah, they must still be closed from the fires.

Speaker: 0
02:42:22

Someday, someone’s gonna be doing that with the World Trade Center. They’re gonna be wandering around that. This is where they used to live. They used to go to school here. They used to work in these buildings. That’s gonna happen.

Speaker: 1
02:42:33

Thanks.

Speaker: 0
02:42:34

Yeah. The same way we go through the Acropolis and the Parthenon and wander around and imagine what it was like living back then. People are gonna do that with us. Every civilization collapses. We're just trying to hold this one off as much as we can.

Speaker: 0
02:42:50

And they usually last a couple hundred years, which is about where we are.

Speaker: 1
02:42:54

Yeah. We're pretty young. I mean

Speaker: 0
02:42:57

Yeah. But a lot of them haven’t made it past where we got to.

Speaker: 1
02:43:00

It always reminds me of that Porno for Pyros song, "Pets." We'll make great pets. It's like, my friend says we're like the dinosaurs. And here we are doing ourselves in much faster than they ever did.

Speaker: 0
02:43:15

I don’t know if that’s real.

Speaker: 1
02:43:17

The dinosaurs?

Speaker: 0
02:43:18

No. The other thing, that we're doing ourselves in faster than them. No. They got hit by a rock. Oh, my

Speaker: 3
02:43:23

god. Oh, no. We’re coming to dangerous territory where we find out

Speaker: 1
02:43:26

Joe doesn’t believe in dinosaurs.

Speaker: 0
02:43:28

I was watching... we were talking about Candace Owens last night. Candace Owens, like, said dinosaurs are fake and gay.

Speaker: 1
02:43:37

Is that what she says? Yeah. Oh. I

Speaker: 0
02:43:39

don't know what she means by that. She might just be having fun. This whole Brigitte Macron thing, though, is crazy.

Speaker: 1
02:43:46

Did you watch any of it?

Speaker: 0
02:43:47

No. No. No. I got no time. But I did see a comparison to the photographs of the person that she's claiming is actually, you know, like, the brother.

Speaker: 1
02:44:01

Oh, the brother? Didn’t the brother disappear or something?

Speaker: 0
02:44:03

Yeah. The brother disappeared. And the brother literally looks exactly like her. I mean, to a fucking T.

Speaker: 1
02:44:11

She was kind of on the, like, Blake and Ryan thing too, I think.

Speaker: 0
02:44:14

Yeah.

Speaker: 1
02:44:15

She's a sleuth. She's been sleuthing.

Speaker: 0
02:44:18

Look at these. Look at these.

Speaker: 1
02:44:19

Okay. That’s pretty freaking weird.

Speaker: 0
02:44:21

I'll send it to Jamie. Because this is... I mean, she's just going all in. First of all, I don't even know if this is a real picture. I mean, this easily could be some AI bullshit that somebody created to try to pretend that the brother is actually her. But the brother is, like, nowhere to be found, and if that's real... see, I don't know if that's real.

Speaker: 3
02:44:43

It can't really be real.

Speaker: 0
02:44:44

I don't know if that's real, because, I mean, the fucking teeth are the same. Like, everything's the same. Every wrinkle of the face is the same. If that was my brother and he looked like that, I'd be like, wait, wait. We're not twins? Like, what?

Speaker: 2
02:44:57

It looks photoshopped. The mouth definitely looks photoshopped.

Speaker: 0
02:44:59

How so?

Speaker: 2
02:45:00

Because, like you said, the teeth are exact.

Speaker: 0
02:45:02

Yeah. Exact in the exact position. Right?

Speaker: 2
02:45:06

Yeah. They’re not even almost

Speaker: 0
02:45:07

Right. It looks fake. It looks like somebody doctored it up. Somebody used AI.

Speaker: 2
02:45:11

Some cracks in the lips

Speaker: 0
02:45:13

right here. Pretty much. Right? That’s probably bullshit.

Speaker: 2
02:45:15

That mouth is photoshopped.

Speaker: 0
02:45:17

Yeah. So you can’t tell what’s real and what’s not real online anymore.

Speaker: 1
02:45:22

Nothing’s real. Nothing.

Speaker: 0
02:45:24

You literally can’t tell.

Speaker: 1
02:45:25

You have to operate from the idea that this is not real. You have to start there now.

Speaker: 0
02:45:30

Right. But also, this is just the beginning. Like, what are we gonna be looking at five years from now? We're gonna be looking at experiences that are indistinguishable from reality. Not just, like, images. You're gonna be able to have experiences that aren't real.

Speaker: 1
02:45:47

Do you think that the, like, metaverse thing is gonna take off? Like, people are gonna be plugging in

Speaker: 0
02:45:52

to that? If it gets good enough.

Speaker: 1
02:45:54

Yeah. And cheap enough. Because... Right. It's, like, kind of expensive.

Speaker: 0
02:45:57

It’s expensive, but it’s also weird to have something in your face. People feel weird, like, wearing this big clunky thing. But if you can sit down and attach something to your head, and then all of a sudden you’re in another world, you’re an avatar flying on a fucking dragon, we’re gonna do it.

Speaker: 1
02:46:10

I mean, isn't, like, Redban really into that?

Speaker: 0
02:46:13

Oh, loves it. Yeah.

Speaker: 1
02:46:14

He's, like, famous in the metaverse.

Speaker: 0
02:46:16

He goes in there every night. He has, like, parties and stuff. He's a nut. But it's perfect for someone like him. He's a video game guy, loves the Internet.

Speaker: 1
02:46:23

Matt was telling me, like, he took him to a... you can do a tight five at a comedy club there. Yeah. It's crazy.

Speaker: 0
02:46:29

Well, Zuckerberg showed me that too. There’s a there’s a metaverse comedy club.

Speaker: 1
02:46:32

Yeah. But he's, like... you had to, like, know people to get in, and he's famous. I was like, it's nuts to me that this already exists.

Speaker: 0
02:46:39

Well, it's just the beginning. It's like, that's Pong, you know. And whatever it's gonna be, like, twenty years from now, it's gonna be something really wild.

Speaker: 1
02:46:47

Isn't it weird to think that, like, you might just exist forever in the metaverse, talking? Because you've done so many podcasts. They'll just be able to cut and paste and make you, kind of like... Not

Speaker: 0
02:46:57

even just that. They'll be able to

Speaker: 1
02:46:58

get a map. In the future.

Speaker: 0
02:46:59

Well, they'll be able to get a map of how I think and how I go over things, and how I go, well, maybe not. Let's look at that. And then they'll apply that sort of thinking. And they'll be able to do podcasts with me with anyone in history. I'll be able to have a podcast with Albert Einstein.

Speaker: 1
02:47:14

I always think about this when I see the people who get tattoos of you. I'm like, future civilizations are gonna be like, who was this man? He must have been someone important, maybe a shaman or something. Like

Speaker: 0
02:47:25

No. He's a bro, a cage fighting commentator. Yeah. Me and Einstein. Oh, no. Just chilling. Look at him. That's a young Einstein too.

Speaker: 2
02:47:35

Crazy thing to have... I was looking into this. They can already do it.

Speaker: 1
02:47:40

What? Alright.

Speaker: 0
02:47:41

Let me hear this.

Speaker: 2
02:47:42

They don’t have their voices.

Speaker: 0
02:47:44

Oh, they don’t have their voices. Yeah.

Speaker: 2
02:47:46

But I’ve I’ve seen one where they did.

Speaker: 0
02:47:48

Oh, Nikola Tesla. Oh, I’d love to have a podcast with him. They already did a podcast with me and Steve Jobs.

Speaker: 1
02:47:53

Did they?

Speaker: 0
02:47:54

Yeah. An audio one. An audio podcast with me and Steve Jobs. It’s crude. You can kinda tell that it’s not real, but that’s just because it’s, you know, first generation.

Speaker: 1
02:48:02

That’s really weird.

Speaker: 0
02:48:03

Yeah. We're gonna wrap this up, Bridget. Thank you very much, my friend. Always great to see you. Thank you

Speaker: 1
02:48:07

for having me.

Speaker: 0
02:48:08

Always a pleasure. Tell everybody how to get your stuff, where to find you.

Speaker: 1
02:48:12

Just go to my YouTube and subscribe. That's the best thing you can do.

Speaker: 0
02:48:17

What’s the YouTube?

Speaker: 1
02:48:17

It's Bridget Phetasy. I don't even know my YouTube. It's Phetasy, p h e t a s y. Dumpster Fire is the show. Phetasy.com. All that stuff. Bridget Phetasy, yeah. You can find me.

Speaker: 0
02:48:32

Alright, my friend. Appreciate you.

Speaker: 1
02:48:34

Thank you.

Speaker: 0
02:48:34

Bye. Love you too. Bye, everybody.

Speaker: 1
02:48:36

Bye.
