#2404 – Elon Musk

Elon Musk is a business magnate, designer, and engineer known for his work in electric vehicles, private spaceflight, and artificial intelligence. His portfolio of companies includes Tesla, SpaceX, Neuralink, X, and several others. https://x.com/elonmusk




#2404 – Elon Musk Podcast Episode Transcript (Unedited)

Speaker: 0
00:01

Joe Rogan podcast. Check it out.

Speaker: 1
00:03

The Joe Rogan experience.

Speaker: 0
00:06

Train by day. Joe Rogan podcast by night, all day. Exactly.

Speaker: 1
00:13

Just every morning.

Speaker: 0
00:15

What about Jeff Bezos is doing? He’s doing

Speaker: 1
00:17

some definitely doing some different laps around.

Speaker: 0
00:19

He looks jacked. He

Speaker: 1
00:21

looks jacked. Right?

Speaker: 0
00:22

Yeah. But he’s like

Speaker: 1
00:24

Quick. Quick. Quick.

Speaker: 0
00:26

Yeah. Even

Speaker: 1
00:28

Like, he got jacked.

Speaker: 0
00:29

At, like, age 59, in less than a year, he went from pencil neck geek to looking like a miniature, like, The Rock.

Speaker: 1
00:38

Yeah. Like a little miniature alpha fella.

Speaker: 0
00:40

Yeah. Like, his neck got bigger than his head. Yeah. He got there very quick. But then, like, in his earlier pictures, his neck’s like a noodle.

Speaker: 1
00:46

I support this activity. I just see him going in this direction.

Speaker: 0
00:49

Which is like, and his voice dropped

Speaker: 1
00:50

like two octaves. I want you to move in that direction as well. I think we can achieve this.

Speaker: 0
00:55

I I I mean, I I should

Speaker: 1
00:56

I think we can achieve it. Oh. GigaChad. GigaChad. That’s what people called it.

Speaker: 0
01:02

Where is that guy?

Speaker: 1
01:03

People? I don’t know where he is.

Speaker: 0
01:05

That’s like a real guy.

Speaker: 1
01:07

The artist? Yeah. No. Oh, GigaChad. Oh, GigaChad. Yeah. I don’t know if that’s a real guy. It’s hard to tell.

Speaker: 0
01:12

It is a real guy?

Speaker: 1
01:14

Yeah. He’s got the crazy jaw and, like, perfect sculpted hair.

Speaker: 0
01:17

Yeah. Yeah. Well, I mean, they may have exaggerated a little bit, but Probably. But, no. I think he actually just kinda looked like that in reality. Wow. So, like, he’s a pretty unique looking individual.

Speaker: 1
01:30

I think we can achieve this. That guy right there? That’s a real guy?

Speaker: 0
01:35

I always thought that

Speaker: 1
01:36

that was CGI.

Speaker: 0
01:37

No. It’s like I I think one of the I think the upper right one is not him. That’s not him. But that one to

Speaker: 1
01:43

the left of that, like, that’s real? No. That’s artificial, bro. That’s fake. That’s got that uncanny valley feel to it, doesn’t it? It’s not impossible. No. No. It’s not impossible to achieve, but it’s not possible to maintain that kind of leanness.

Speaker: 0
01:57

No. No. I mean, that’s like, you’re also at that point, he’s dehydrating and all sorts of things.

Speaker: 1
02:04

Oh, it’s based on a real person.

Speaker: 0
02:05

Yeah. Yeah. Based on it’s this.

Speaker: 1
02:07

Right. But it’s not a real person. What does

Speaker: 0
02:08

he really look like?

Speaker: 1
02:09

Those images, I think, are bullshit. Some of them are. Is that real? Okay. That looks real. That’s a really jacked bodybuilder.

Speaker: 0
02:17

Yeah.

Speaker: 1
02:18

Yeah. That looks real. Like, that’s achievable, but there’s a few of those images where you’re just like, what’s going on here?

Speaker: 0
02:24

Yeah. Yeah. Yeah. Totally. Yeah. Well, I mean, you see you see ai? That guy is that is that the That’s

Speaker: 1
02:30

the real dude.

Speaker: 0
02:31

Well, there’s that that that Icelandic dude who’s Thor.

Speaker: 1
02:34

Oh, yeah. The guy who jumps in the frozen lakes and shit.

Speaker: 0
02:37

Well, the the guy who played the mountain.

Speaker: 1
02:39

Oh, that guy.

Speaker: 0
02:40

That is Yeah.

Speaker: 1
02:41

That is like

Speaker: 0
02:41

a that is like a mutant strong human. Yes. Like, if he had been in the X-Men or something, you know.

Speaker: 1
02:48

We were

Speaker: 0
02:48

in, you know. He’s just, like, and there’s that meme, you know, the tent and tent bag. No. You know how, like, it’s really hard to get the tent in the tent bag?

Speaker: 1
02:59

Oh, right. Right. That’s true.

Speaker: 0
03:05

And there’s a picture of of him and his girlfriend. Oh, right. That bad. That’s hilarious. Yeah. Yeah.

Speaker: 1
03:12

That’s the vibe. I don’t know how

Speaker: 0
03:13

it gets in there, you know. It’s like it seems too small, but I

Speaker: 1
03:16

met Brian Shaw. Brian Shaw is like the world’s most powerful man Yeah. And he’s almost seven feet tall. He’s four hundred pounds, and his bone density is one in 500,000,000 people. So there’s, like, maybe 16

Speaker: 0
03:34

people. Bones bones?

Speaker: 1
03:35

He’s an enormous human being.

Speaker: 0
03:37

Okay.

Speaker: 1
03:38

Like a legitimate giant. But just like that guy. Yeah. Yeah. But we met him. He was hanging out with us in the green room of the mothership. It’s like, okay. If this is, like, David and Goliath days, like Yeah. Yeah. This is an actual giant. Like, the giants of the Bible.

Speaker: 0
03:49

Once in a while, they get a super giant person.

Speaker: 1
03:51

This is a real one. Like, not a tall skinny basketball player. Yeah. Yeah. Like, a seven foot, four hundred pound powerlifter.

Speaker: 0
03:58

Like, you don’t wanna especially That’s

Speaker: 1
04:00

the guy. See if there’s a photo of him standing next to, like, a regular human. I was trying to get this There it is. Yeah. That’s him right there.

Speaker: 0
04:06

Like, there’s one of him standing next to Arnold and stuff. Yeah. And everyone just looks tiny. I mean, I think he’s a pretty cool dude, actually.

Speaker: 1
04:15

Oh, Brian’s very cool.

Speaker: 0
04:16

Very smart too.

Speaker: 1
04:17

Yeah. Unusual. You know, you’d expect anybody that big to be a moron.

Speaker: 0
04:21

Yeah. No. But yeah. There was Andre the Giant, who was awesome. I mean Yeah. He was great in The Princess Bride.

Speaker: 1
04:28

No. He was just awesome period.

Speaker: 0
04:30

Yeah.

Speaker: 1
04:30

Yeah. So we were talking about this interview with Sam Altman and Tucker, and I was like, we should probably just talk about this on the air. Because it is one of the craziest interviews I think I’ve ever seen in my life Yeah. Where Tucker starts bringing up this guy who was

Speaker: 0
04:46

Oh, yeah. Whistleblower or whatever.

Speaker: 1
04:48

Whistleblower Yeah. Who, you know, committed suicide, but doesn’t look like it. And Yeah. And he’s talking to Sam Altman about this. And Sam Altman was like, are you accusing me? He’s like, no. No. No. I’m not. I’m just saying. I think someone killed him.

Speaker: 0
05:03

Yeah. Like, it should be investigated. Yeah. Not just drop the case.

Speaker: 1
05:09

It seems like

Speaker: 0
05:10

That they just dropped the case. Yeah. Yeah. Yeah. But his parents think he was murdered. Yeah. The wires to a security camera were cut. Blood in two rooms. Blood in two rooms. Someone else’s wig was in the room.

Speaker: 1
05:21

Someone else’s wig. Wig. Wig. Yes.

Speaker: 0
05:23

Not Not normal wig.

Speaker: 1
05:25

Not normal to have a wig laying around.

Speaker: 0
05:27

Yes. And he ordered DoorDash right before allegedly committing suicide Yeah. Which seems unusual. You know? Yeah. It’s like, you know, I’m gonna order pizza. On second thought, I’ll kill myself. It seems like that’s a very rapid change in mindset.

Speaker: 1
05:45

It’s very weird. And especially the parents have they they don’t believe he committed suicide at all.

Speaker: 0
05:51

Has no note or anything?

Speaker: 1
05:52

No. Yep. It seems pretty fucked up. And, you know, the idea that a whistleblower for an enormous AI company that’s worth billions of dollars might get whacked, that’s not outside the pale.

Speaker: 0
06:04

I I mean, it’s straight out of a movie.

Speaker: 1
06:06

Right out of a movie, but right out of a movie is real sometimes.

Speaker: 0
06:08

Yeah. Right. Exactly. Right? It’s a little weird. I think they should do a proper investigation. Like, what’s the downside of a proper investigation? Right.

Speaker: 1
06:18

No. Yeah. For sure.

Speaker: 0
06:20

Yeah. Yeah.

Speaker: 1
06:20

But the whole exchange is so bizarre.

Speaker: 0
06:23

Yeah. Yeah. So it’s it it is.

Speaker: 1
06:24

Sam Altman’s reaction to being accused of murder is bizarre.

Speaker: 0
06:28

Look. I don’t know if he is guilty, but it’s not possible to look more guilty. So I’m

Speaker: 1
06:35

like Or look more weird. Yeah. You know, maybe it’s just his social thing. Like, maybe he’s just odd with confrontation and he just goes blank, you know. But if somebody was accusing me of killing Jamie, like, if Jamie was a whistleblower and Jamie got whacked, then I’d be like, wait, what are you are you accusing me of killing my friend?

Speaker: 1
06:57

Like, what the fuck are you talking about? I would I would be a little bit more irate.

Speaker: 0
07:02

Yeah. Yeah. Exactly. You know, it’d be

Speaker: 1
07:04

I would be a little upset.

Speaker: 0
07:06

Yeah. Or, well, you’d certainly insist on a thorough investigation Yeah. As opposed to trying to sweep it under the rug.

Speaker: 1
07:15

Yeah. I wouldn’t assume that he got that he committed suicide. I would be suspicious. If Tucker was telling me that aspect of the story, I’d be like, that does seem like a murder. Fuck. We should look into this.

Speaker: 0
07:25

I mean, all signs point to it being a murder. Not saying, you know, Sam Altman had anything to do with the murder, but Blood in two rooms. Blood in two rooms. Like, yeah. The security camera, and the DoorDash being ordered right before. No suicide note.

Speaker: 0
07:39

His parents think he was murdered. And the people that I know who knew him said he was not suicidal. So I’m like, why would you jump to that conclusion?

Speaker: 1
07:51

Parents just sued the landlord. They sued the son’s landlord, alleging the owners and the managers of their son’s San Francisco apartment building were part of a widespread cover up of his death.

Speaker: 0
07:59

The landlord?

Speaker: 1
08:00

Yeah. There’s a bunch of weird they said there’s, like, packages missing from the building. Some people said they saw packages still being delivered, and then all of a sudden, they all disappeared. But that could be people steal people’s packages all the time.

Speaker: 0
08:11

The porch pirate situation.

Speaker: 1
08:13

Yeah. Yeah. It says they failed to safeguard. Also, I mean, the amount of trauma those poor parents have gone through with their son dying like that. I mean, God bless them, and how could they stay sane after something like that? They’re probably so grief stricken.

Speaker: 1
08:30

Who knows what they believe at this point?

Speaker: 0
08:32

Yeah. I should have asked if Epstein killed himself.

Speaker: 1
08:38

Yeah. That’s the cash Where’s the cash cow? The cash cow. Ram Bungie. I don’t know why I’m trying to convince everybody of that. Like, okay.

Speaker: 0
08:45

The guards weren’t there and the camera stopped working and, you know. The guards were asleep.

Speaker: 1
08:52

The cameras weren’t working. He had a giant steroided-up bodybuilder guy that he was sharing a cell with that was a murderer who was a bad cop. Like, all of it’s kind of nuts. Like, that he would just kill himself rather than reveal all of his billionaire friends.

Speaker: 0
09:11

Yeah.

Speaker: 1
09:12

And then Did you see Tim Dillon talking to Chris Cuomo about this? Because he like I did.

Speaker: 0
09:16

He ai the idea.

Speaker: 1
09:17

Chris Cuomo just looked so stupid.

Speaker: 0
09:20

Tim just

Speaker: 1
09:21

listed off all of it, and he’s like, I agree. It is strange. Like, of course it’s strange, Chris. Jesus Christ. You can’t just go with the tide. You gotta think things through. And if you think that one through, you’re like, I don’t think he killed himself. Nobody does.

Speaker: 1
09:36

You have to work for an intelligence agency to think he killed himself.

Speaker: 0
09:40

It does it does seem unlikely.

Speaker: 1
09:42

It seems highly unlikely. It seems highly unlikely. Like, highly unlikely. All signs point to murder.

Speaker: 0
09:49

Yes.

Speaker: 1
09:49

They all point to: they had to get rid of him because he knew too much. Whatever the fuck he was doing Yeah. Whatever kind of an asset he was, whatever thing he was up to, you know, was apparently very effective.

Speaker: 0
10:01

Yes.

Speaker: 1
10:01

And a lot of people were compromised. You see, your boy Bill Gates is now saying climate change is not a big deal. Said relax everybody. I know I scared the fuck out of you for the last decade and a half, but we’re gonna be fine.

Speaker: 0
10:16

Yeah. I mean, you know, as I was saying just before coming into the studio, you know, like, every day there’s some crazy new thing that’s happening. It’s like it feels like reality is accelerating.

Speaker: 1
10:29

It’s every day, and every day it’s, like, more and more ridiculous to the point where the simulation is more and more undeniable.

Speaker: 0
10:37

Yeah. Yeah. It really feels like simulation. You know? It’s like, come on. What are the odds that this could be the case?

Speaker: 1
10:42

Are you paying attention at all to 3I/ATLAS? Are you watching that comet? Yeah. Whatever it is.

Speaker: 0
10:48

Yeah. Yeah. I mean, one thing I can say is, like, look, if I was aware of any evidence of aliens, Joe, you have my word. I will come on your show, and I will reveal it on the show.

Speaker: 1
11:03

Okay.

Speaker: 0
11:04

Yeah. That’s a good deal. Yeah. It’s pretty good. I believe you. Yeah. Thank you. I stick to I keep my, you know, keep my promises. So,

Speaker: 1
11:11

Alright. Yeah. I’ll hold you to that. Yeah. Yeah. I don’t

Speaker: 0
11:13

And, I’m never committing suicide, to be clear.

Speaker: 1
11:16

I don’t think you will either.

Speaker: 0
11:18

On camera, guys, I am never committing suicide ever.

Speaker: 1
11:22

If someone says you commit suicide, I will fight tooth and nail.

Speaker: 0
11:25

Thank you.

Speaker: 1
11:25

I will fight tooth and nail. I will

Speaker: 0
11:27

I will not believe it.

Speaker: 1
11:28

I will not believe it. The thing about 3I/ATLAS is it’s That’s a common name actually.

Speaker: 0
11:33

Yeah. It’s a third It sounds like third eye or something. Yeah.

Speaker: 1
11:36

It does. 3I is the third it’s only the third interstellar object that’s been detected.

Speaker: 0
11:42

Okay. Yeah. Avi Loeb. This is the 3I/ATLAS.

Speaker: 1
11:46

Yeah. Avi Loeb was on the podcast a couple days ago talking about it.

Speaker: 0
11:50

Yeah. And It could be on his own now. But Ai

Speaker: 1
11:52

Apparently, today, they’re saying that it’s changed course. Do you see that, Jamie? No. Avi Loeb said something today. I’ll send it to you. I know it’s on Reddit. There you go, Jamie. I’ll send it to you right now. It’s fascinating. It’s fascinating also because it’s made almost entirely of nickel, whatever it is.

Speaker: 1
12:17

And the only way that exists, here is, industrial alloys, apparently.

Speaker: 0
12:22

But no. There are definitely comets and asteroids that are made primarily of nickel. Oh, really? Yeah. So the places where you find nickel on Earth are actually where there was an asteroid or comet that hit Earth that was a nickel rich, you know, asteroid.

Speaker: 0
12:40

Oh, wow. That’s a nickel rich giant.

Speaker: 1
12:41

A rich deposit?

Speaker: 0
12:43

Yeah. That’s where it’s coming from. Those are from impacts. You definitely didn’t wanna be there at the time anything would have been obliterated. Right. But that’s where the sources of nickel and cobalt are these days.

Speaker: 1
12:54

So this is Avi Loeb. A few hours ago, the first hint of non-gravitational acceleration, something other than gravity is affecting its acceleration, meaning something is affecting its trajectory beyond gravity, was indicated. Interesting. And it’s mostly nickel, very little iron, which, he was saying, on Earth only exists in alloys. But whatever, you know, you’re dealing with another planet.

Speaker: 0
13:23

There are cases where there’s very nickel rich asteroids. That happened for something

Speaker: 1
13:30

from space.

Speaker: 0
13:31

Yeah. This is something Yeah. Yeah. It doesn’t mean it’s It’ll be a very sort of heavy spaceship to make it all out of nickel. Oh, yeah. And fucking huge.

Speaker: 1
13:39

The size of Manhattan and all nickel, that’s kinda nuts.

Speaker: 0
13:41

Yeah. That’s a heavy spaceship.

Speaker: 1
13:43

That’s a real problem if it hits.

Speaker: 0
13:45

Yes. No. No. It would, like, obliterate a continent type of thing. Yeah. Maybe maybe worse.

Speaker: 1
13:50

Yeah. Probably kill most of humanity, if not all of us.

Speaker: 0
13:54

I mean, it depends on what the total mass is, but I mean, the thing is, like, in the fossil record, there are, you know, arguably five major extinction events, the biggest one of which is the Permian extinction, where almost all life was eliminated. That actually occurred over several million years. Then there’s the Jurassic. I think that one’s pretty definitively an asteroid.

Speaker: 0
14:20

But there’s been five major extinction events, but what they don’t count are really the ones that merely take out a continent. So the Merely? Yeah. Because those don’t really show up in the fossil record. You know? Right.

Speaker: 0
14:36

So unless it’s enough to cause a, you know, mass extinction event throughout Earth, it doesn’t show up, you know, in a fossil record that’s 200,000,000 years old. So, yeah. But there have been many impacts that would have sort of destroyed all life on, you know, let’s say, half of North America or something like that.

Speaker: 0
15:00

There’ve been many such impacts through the course of history.

Speaker: 1
15:03

Yeah. And there’s nothing we could do about it it right now.

Speaker: 0
15:06

Yeah. There was one that hit Siberia and destroyed, I think, a few hundred square miles.

Speaker: 1
15:14

Oh, that’s the Tunguska.

Speaker: 0
15:15

Yeah.

Speaker: 1
15:15

Yeah. That’s the one from the nineteen twenties. Right?

Speaker: 0
15:18

Yeah.

Speaker: 1
15:18

Yeah. That’s the one that coincides with that meteor storm that we go through every June and every November that they think is responsible for that Younger Dryas impact. Yeah. All that shit’s crazy. Thank you, before we go any further, for letting us have a tour of SpaceX You’re welcome. Letting us be there for the rocket launch.

Speaker: 0
15:40

Sure.

Speaker: 1
15:41

One of the absolute coolest things I’ve ever seen in my life. And we thought it was only, like I thought it was half a mile. Jamie’s like, it was a mile away. It turned out it’s almost two miles away.

Speaker: 0
15:53

Yeah. Yeah.

Speaker: 1
15:53

And you feel it in your chest.

Speaker: 0
15:54

Yeah. It’s You have to wear earplugs.

Speaker: 1
15:56

Intense. And you feel it in your chest, and it’s two miles away.

Speaker: 0
15:59

Yeah.

Speaker: 1
16:00

It was fucking amazing.

Speaker: 0
16:01

Yeah.

Speaker: 1
16:02

And then to go with you up into the command center and to watch all the Starlink satellites with all the different cameras and all in real time Yeah. As it made its way all the way to Australia. How many minutes? Like, thirty five, forty minutes?

Speaker: 0
16:15

Yeah.

Speaker: 1
16:17

Wild. Watch it touchdown in Australia. Yeah. Fucking crazy. It was amazing. Yeah. Yeah. Absolutely amazing.

Speaker: 0
16:24

The Starship’s awesome. And anyone can go watch the launch, actually. You can just go to South Padre Island and get a great view of the launch. It’s like where a lot of spring breakers go. But we’ll be flying pretty frequently out of Starbase in South Texas. And we formally incorporated it as a city, so it’s actually an actual legal city, Starbase, Texas. It’s not that often you hear, like, hey.

Speaker: 0
16:48

We made a city. You know? That used to be like, in the old days, a startup would be you go and gather a bunch of people and say, hey. Let’s go make a town. Literally. Right. That’s what startups would have been in the old days.

Speaker: 1
17:01

Or a country. Yeah.

Speaker: 0
17:02

Or a country. Yeah. Yeah. Yeah. Actually.

Speaker: 1
17:04

If you ai doing that today, there’d be a real problem.

Speaker: 0
17:07

Yeah. There’s Like, maybe so. So much set in stone on the country front these days. You might be able to pull it off.

Speaker: 1
17:11

You know? You might be able to pull it off. If you got a a solid island, you might be able to pull it off.

Speaker: 0
17:16

You know? It’s just not probable.

Speaker: 1
17:18

You know, like like, at Marriott in your country in Lanai?

Speaker: 0
17:21

Yeah. You could put Is this it? Right here?

Speaker: 1
17:22

If you

Speaker: 0
17:23

put you put enough effort into it, you could make a new country.

Speaker: 1
17:25

This is one of the different ones. This is the one of the ones that you catch. Right? Or is that one

Speaker: 0
17:29

Yeah. That’s the booster. So that’s the Super Heavy booster. The booster’s got 33 engines. That, you know, by version four, will have about 10,000 tons of thrust.

Speaker: 1
17:44

You

Speaker: 0
17:45

know, right now, it’s about seven, eight thousand tons of thrust. But that’s the largest flying object ever made.
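These booster thrust figures can be roughly sanity-checked. The per-engine thrust values below are assumed approximations for illustration, not numbers stated in the conversation:

```python
# Rough sanity check of the booster thrust figures quoted above.
# Per-engine thrust values (tons-force) are assumed approximations,
# not numbers stated in the conversation.
ENGINES_ON_BOOSTER = 33

raptor_2_thrust_tf = 230   # assumed ~230 tf per Raptor 2 engine
raptor_3_thrust_tf = 280   # assumed ~280 tf per Raptor 3 engine

total_v2 = ENGINES_ON_BOOSTER * raptor_2_thrust_tf  # 7590
total_v3 = ENGINES_ON_BOOSTER * raptor_3_thrust_tf  # 9240

print(f"Raptor 2 booster: ~{total_v2} tons of thrust")
print(f"Raptor 3 booster: ~{total_v3} tons of thrust")
```

With those assumed per-engine numbers, 33 engines land in the "seven, eight thousand" range today and near 10,000 tons with an uprated engine, consistent with what is said here.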

Speaker: 1
17:50

I had to explain to someone. They were going, why do they blow up all the time if you’re so smart? Because there was this fucking idiot on television. Some guy was being interviewed and they were talking about you, and he goes, oh, I think he’s a fuckwit. And he goes, he’s a fuckwit.

Speaker: 1
18:03

And he goes, why do you say he’s fuck oh, his rockets keep blowing up. And someone said, yeah, why do his rockets blow up? And I had to explain Yeah. Because it’s the only way you find out what the tolerances are. You have to

Speaker: 0
18:14

You have to explore the box. So, like, when you do a new rocket development program, you have to do what’s called, you know, exploring the limits, the corners of the box, where you say, worst case this, worst case that, to figure out where the limits are.

Speaker: 0
18:31

So, you blow up you know, admittedly, in the development process, sometimes it blows up accidentally. But we intentionally subject it to, you know, a flight regime that is much worse than what we expect in normal flight, so that when we put people on board or valuable cargo, it doesn’t blow up.
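The "corners of the box" idea described here — deliberately stressing worst-case combinations of parameters rather than only the nominal case — is a standard test-matrix technique. A minimal sketch, where the parameter names and values are invented for illustration:

```python
# Sketch of "exploring the corners of the box": enumerate worst-case
# combinations of flight parameters instead of testing only nominal values.
# Parameter names and value sets are invented for illustration.
from itertools import product

limits = {
    "reentry_heating": ("nominal", "worst_case_hot"),
    "tile_coverage":   ("full", "tiles_removed"),
    "trajectory":      ("nominal", "steep"),
}

# Every corner of the box: 2 values per parameter -> 2^3 = 8 combinations.
corners = [dict(zip(limits, combo)) for combo in product(*limits.values())]
assert len(corners) == 2 ** len(limits)

for case in corners:
    print(case)
```

The tile-removal flight described just below is one such corner: worst-case tile coverage combined with a hotter-than-nominal trajectory.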

Speaker: 0
18:49

So for example, for the flight that you saw, we actually deliberately took heat shield tiles off the ship, off of Starship, in some of the worst locations to say, okay. If we lose a heat shield tile here, is it catastrophic or is it not?

Speaker: 0
19:09

And nonetheless, Starship was able to do a soft landing in the Indian Ocean just west of Australia, and it got there from Texas in, like, I don’t know, thirty five, forty minutes type of thing. So

Speaker: 1
19:24

So it landed even though you put it through this situation where it’s got a compromised shield? It

Speaker: 0
19:30

had an unusually we brought it in hot, like, an extra hot trajectory, with missing tiles, to see if it would still make it to a soft landing, which it did. Now I should just point out, there were some holes that were burnt into it, but it was robust enough to land despite having some holes burnt into it, you know. Yeah.

Speaker: 0
19:53

Because it’s coming in like a blazing meteor. You can see the real time video.

Speaker: 1
19:57

Well, tell me the speed again because the the speed was bananas. You were talking about Yeah.

Speaker: 0
20:00

It’s, like, 17,000 miles an hour. Which is, like, 25 times the speed of sound or thereabouts. So think about it, like, it’s, like, 12 times faster than a bullet from an assault rifle. You know, a bullet from an assault rifle is around Mach 2.

Speaker: 1
20:16

And it’s just and it’s huge.

Speaker: 0
20:18

Yeah. Yeah. Or compare it to, like, a bullet from, you know, a nine mil, which is subsonic. That’s you know, it’ll be about 30 times faster than a bullet from a handgun.
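These speed comparisons roughly check out. A quick sanity check, where the reference speeds for sound and the bullets are assumed ballpark values, not figures from the conversation, so the ratios are order-of-magnitude only:

```python
# Rough check of the reentry speed comparisons quoted above.
# Reference speeds are assumed ballpark values (m/s).
MPH_TO_MS = 0.44704

reentry_ms = 17_000 * MPH_TO_MS          # ~7,600 m/s
mach_1_ms = 343                          # speed of sound at sea level
rifle_bullet_ms = 700                    # assumed ~Mach 2 rifle round
subsonic_pistol_ms = 300                 # assumed subsonic pistol load

print(f"Mach number:      ~{reentry_ms / mach_1_ms:.0f}")           # ~22
print(f"vs rifle bullet:  ~{reentry_ms / rifle_bullet_ms:.0f}x")    # ~11x
print(f"vs pistol bullet: ~{reentry_ms / subsonic_pistol_ms:.0f}x") # ~25x
```

The round "25 times", "12 times", and "30 times" figures quoted in the conversation fall in the same ballpark as these ratios.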

Speaker: 1
20:32

30 times faster than a bullet from a handgun, and it’s the size of a skyscraper.

Speaker: 0
20:36

Yes. Yeah. That’s fast.

Speaker: 1
20:41

It’s so wild. It’s so wild to see, man. It’s so exciting. The factory is so exciting too because, like, genuinely, no bullshit, I felt like I was witnessing history. I felt like it was a scene in a movie Right. Where someone had expectations, and they’re like, what are they doing?

Speaker: 1
20:59

They’re building rockets, and you go there. And as we’re walking through, Jamie, you could speak to this too. Didn’t you have the feeling where you’re like, oh, this is way bigger than I thought it was. This

Speaker: 0
21:09

is Huge. Yeah. Gigantic.

Speaker: 1
21:11

Fucking crazy. That’s

Speaker: 0
21:12

what she said.

Speaker: 1
21:13

The, ah, the amount of rockets you’re making. I mean,

Speaker: 0
21:18

I don’t know who you back. Getting Chad in the house. I’m just waving. I don’t know.

Speaker: 1
21:25

It’s a giant metal dick. You’re fucking fucking

Speaker: 0
21:27

the universe with your giant

Speaker: 1
21:28

metal dick.

Speaker: 0
21:29

Yeah. I mean,

Speaker: 1
21:30

but it’s

Speaker: 0
21:30

yeah. It is it is very big.

Speaker: 1
21:32

And the sheer numbers of them that you guys are making. And then this is a version, and you have a new updated version Yeah. That’s coming soon. And what is

Speaker: 0
21:42

the It’s a little longer. More pointy? Same amount of pointy, but it’s got a bit more length. In the interstage you see that the interstage section with kinda like the grill area?

Speaker: 1
21:56

Mhmm.

Speaker: 0
21:57

That’s now integrated with the boost stage. So we do what’s called hot staging, where we light the ship engines while it’s still attached to the booster. So the booster engines are still thrusting. The ship is, you know, still being pushed forward by the booster, but then we light the ship engines, and the ship engines actually pull away from the booster even though the booster engines are still firing.

Speaker: 0
22:21

Woah. So it’s blasting flame through that grill section, but we integrate that grill section into the boost stage with the next version of the rocket. And the next version of the rocket will have the Raptor three engines, which are a huge improvement. You may have seen them in the lobby. Yeah.

Speaker: 0
22:43

Because we got, like, the Raptor one, two, and three, and you can see the dramatic improvement in simplicity. We should probably put a plaque there to also show how much we reduced the weight and the cost, and improved the efficiency and the thrust. So the Raptor three has, you know, almost twice the thrust of Raptor one. Wow. So you see Raptor three.

Speaker: 0
23:08

It looks like it looks like it’s got parts missing. Right?

Speaker: 1
23:12

And how

Speaker: 0
23:12

many of them It’s very, very clean.

Speaker: 1
23:14

How many of them are on the rocket?

Speaker: 0
23:15

There’s 33 on the on the booster.

Speaker: 1
23:19

Woah.

Speaker: 0
23:20

And each Raptor engine is producing twice as much thrust as all four engines on a seven forty seven. Wow. So that engine is smaller than a seven forty seven engine, but it’s producing, you know, almost 10 times the thrust of a seven forty seven engine.

Speaker: 0
23:42

Wow. So extremely high power to weight ratio, and there’s 33 of them.
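The 747 comparison is roughly self-consistent. A quick check, where both per-engine thrust values are assumed ballpark figures rather than numbers from the conversation:

```python
# Rough check of the Raptor-vs-747 thrust comparison quoted above.
# Both per-engine values (tons-force) are assumed ballpark figures.
raptor_tf = 280          # assumed ~280 tf per Raptor 3 engine
jumbo_engine_tf = 28     # assumed ~28 tf per 747 engine
jumbo_total_tf = 4 * jumbo_engine_tf   # 112 tf for all four 747 engines

print(f"vs one 747 engine: ~{raptor_tf / jumbo_engine_tf:.0f}x")  # ~10x
print(f"vs all four:       ~{raptor_tf / jumbo_total_tf:.1f}x")   # ~2.5x
```

With those assumptions, one Raptor is about 10 times a single 747 engine and roughly double all four combined, matching the two claims made here.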

Speaker: 1
23:50

You do when you so when you’re designing these, you get to Raptor one.

Speaker: 0
23:53

Yeah.

Speaker: 1
23:54

You see its efficiency. You see where you can improve it. You get to Raptor two. How far can you scale this up with just the same sort of technology with propellant and ignition and engines? Like, how much further can you?

Speaker: 0
24:08

We’re pushing the limits of physics here. So, really, in order to make a fully reusable orbital rocket, which no one has succeeded in doing yet, including us. But Starship is the first time that there is a design for a rocket where full and rapid reusability is actually possible.

Speaker: 0
24:32

So there’s not even been a design before where it was possible, certainly not a design that made any hardware at all. We just live on a planet where the gravity is quite high. Like, Earth’s gravity is really quite high.

Speaker: 0
24:52

And if the gravity was even 10 or 20% higher, we’d be stuck on Earth forever. Really? Yeah. We certainly couldn’t use conventional rockets. You’d have to, like, blow yourself off the surface with a nuclear bomb or something crazy.

Speaker: 0
25:07

So, on the other hand, if Earth’s gravity was just a little lower, even 20% lower, then getting to orbit would be easy. So it’s like, if this was a video game, it’s set to maximum difficulty, but not impossible.
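
The "maximum difficulty" point can be sketched with the Tsiolkovsky rocket equation. This is a deliberately crude single-stage model with assumed values (specific impulse, structure fraction, a rough loss term); it only illustrates how the payload fraction collapses as surface gravity scales:

```python
import math

def propellant_fraction(gravity_scale, isp_s=350.0, structure_frac=0.08):
    """Crude single-stage sketch: how the required propellant fraction
    grows as surface gravity (and hence orbital velocity) scales.
    All parameter values are illustrative assumptions."""
    g0 = 9.81            # m/s^2, reference gravity for Isp conversion
    R = 6.371e6          # m, Earth radius (held fixed for this sketch)
    g = gravity_scale * g0
    v_orbit = math.sqrt(g * R)               # circular orbital velocity
    dv = v_orbit + 1500.0 * gravity_scale    # crude gravity/drag losses
    mass_ratio = math.exp(dv / (isp_s * g0))  # Tsiolkovsky: m0/mf
    prop_frac = 1.0 - 1.0 / mass_ratio       # propellant / liftoff mass
    payload_frac = 1.0 - prop_frac - structure_frac
    return prop_frac, payload_frac

for scale in (0.8, 1.0, 1.2):
    p, pl = propellant_fraction(scale)
    print(f"g x{scale}: propellant {p:.1%}, payload {pl:+.1%}")
```

With these assumptions the single-stage payload fraction is slightly positive at 0.8 g, already negative at 1.0 g (which is why real rockets stage), and worse still at 1.2 g, matching the "maximum difficulty but not impossible" framing.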

Speaker: 1
25:25

Okay.

Speaker: 0
25:25

So that’s where we are here. So it’s not as though others have ignored the concept of reusability. They’ve just concluded that it was too difficult to achieve. And we’ve been working on this for a long time at SpaceX. And, you know, I’m the chief engineer of the company.

Speaker: 0
25:48

Although I should say that, you know, we’re an extremely talented engineering team. I think we’ve got the best rocket engineering team that has ever been assembled. It’s an honor to work with such incredible people. So it’s fair to say that we have not yet succeeded in achieving full reusability, but we at last have a rocket where full reusability is possible.

Speaker: 0
26:15

And I think we’ll achieve it next year. So that’s a really big deal. And the reason that’s such a big deal is that full reusability drops the cost of access to space by a factor of 100. Maybe even more than 100, actually. It could be, like, a thousand. But you can think of it like any mode of transport.

Speaker: 0
26:44

Like, imagine if aircraft were not reusable, like you flew somewhere and threw the plane away. The way conventional rockets work is as if you had an airplane, and instead of landing at your destination, you parachute out, the plane crashes somewhere, and you land by parachute at your destination.

Speaker: 0
27:04

That would be a very expensive trip, and you’d need another plane to get back. Okay? But that’s how the other rockets in the world work. Now, the SpaceX Falcon rocket is the only one that is at least mostly reusable. You’ve seen the Falcon rocket land.
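
The airplane analogy maps onto a simple amortization: spread the hardware cost over the number of flights it survives. All figures below are illustrative assumptions, not SpaceX numbers:

```python
def cost_per_flight(vehicle_cost, flights, propellant_cost, refurb_cost=0.0):
    """Amortized cost of one flight: hardware spread over its flight
    count, plus per-flight propellant and refurbishment."""
    return vehicle_cost / flights + propellant_cost + refurb_cost

# Illustrative numbers only (assumptions, not real launch economics).
expendable = cost_per_flight(vehicle_cost=100e6, flights=1,
                             propellant_cost=1e6)
reusable = cost_per_flight(vehicle_cost=100e6, flights=100,
                           propellant_cost=1e6, refurb_cost=0.5e6)

print(f"expendable: ${expendable / 1e6:.1f}M per flight")
print(f"reusable:   ${reusable / 1e6:.1f}M per flight")
print(f"ratio: {expendable / reusable:.0f}x cheaper")
```

With these particular numbers the reusable flight comes out roughly 40x cheaper; higher flight counts and cheaper refurbishment are what push toward the 100x (or more) figure described above.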

Speaker: 0
27:24

We’ve now done over 500 landings of the SpaceX rocket. Oh, the Falcon 9 rocket. And this year, you know, we’ll deliver probably, I don’t know, somewhere between 2,200 and 2,500 tons to orbit with the Falcon 9 and Falcon Heavy rockets, not counting anything from Starship.

Speaker: 1
27:51

And this is mostly Starlink?

Speaker: 0
27:53

Yes. Mostly Starlink. But we launch many others. We even launch our competitors, competitors to Starlink, on Falcon 9. We charge them the same price. Pretty fair. But SpaceX this year will deliver roughly 90% of all the mass launched from Earth to orbit. Wow. And then of the remaining 10%, most of that is done by China, and then the remaining roughly 4% is everyone else in the world, including our domestic competitors.

Speaker: 1
28:25

You know, it’s kind of incredible how many things are in space. Like, how many things are floating above us now?

Speaker: 0
28:32

There’s a lot of things.

Speaker: 1
28:33

Yeah. Is there

Speaker: 0
28:34

a saturation though.

Speaker: 1
28:35

Right. But is there a saturation point where we’re gonna have problems with all these different satellites that are

Speaker: 0
28:43

I think, as long as the satellites are maintained, it’ll be fine. Space is very roomy. It’s like, you can think of space as being concentric shells above the surface of the Earth. So, you know, there’s the surface of the Earth, but then there’s a series... Much larger. Yeah. Yeah. Like a series of concentric shells.
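
The "concentric shells" picture can be quantified. Assuming, purely for illustration, 10,000 satellites sharing a single shell at 550 km altitude:

```python
import math

# How roomy is one low-Earth-orbit "shell"? Both the satellite count and
# the altitude here are illustrative assumptions.
R_EARTH_KM = 6371.0
altitude_km = 550.0
n_satellites = 10_000

# Surface area of the spherical shell at that altitude, in km^2.
shell_area = 4 * math.pi * (R_EARTH_KM + altitude_km) ** 2
area_each = shell_area / n_satellites

print(f"shell area: {shell_area:.3e} km^2")
print(f"area per satellite: {area_each:,.0f} km^2")
```

Under these assumptions each satellite has on the order of 60,000 km² of shell to itself, roughly the area of a small country, and each altitude band is a separate shell on top of that.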

Speaker: 1
29:09

And think of an Airstream trailer flying around up there. There’s a lot of room for Airstreams.

Speaker: 0
29:14

Yeah. I mean, imagine if you flew just a few thousand Airstreams around on Earth. Yeah. What are the odds that they’d hit each other?

Speaker: 1
29:20

They wouldn’t be very crowded. No. And then you’ve gotta go bigger. Yeah. Because you’re dealing with far above Earth, hundreds of miles above Earth.

Speaker: 0
29:28

Yeah. So the goal of SpaceX is to advance rocket technology to the point where we can extend life beyond Earth, where we can establish a self-sustaining city on Mars, a permanent base on the moon. That would be very cool. I mean, imagine if we had, like, you know, Moonbase Alpha, where there’s, like, a permanent science base on the moon.

Speaker: 1
29:49

That would be pretty dope or at least a tourist trap.

Speaker: 0
29:52

I mean

Speaker: 1
29:54

A lot of people would be willing to go to the moon just for a tour.

Speaker: 0
29:57

That’s for sure.

Speaker: 1
29:58

Do we

Speaker: 0
29:58

Oh, we’ll probably pay for our space program with that.

Speaker: 1
30:00

Probably. Yeah.

Speaker: 0
30:01

Well, because if you could go to the moon and back safely, I think we’d get a lot of people who would pay for that.

Speaker: 1
30:12

Oh, a 100%. After the first year, after nobody dies, so, like, we

Speaker: 0
30:15

Just to make sure. Exactly. Are you gonna come back?

Speaker: 1
30:18

Because, like, that submarine, they had a bunch of successful launches of that private submarine before it imploded and killed everybody.

Speaker: 0
30:26

That was not a good design, obviously.

Speaker: 1
30:27

It was a very bad design.

Speaker: 0
30:28

Terrible design.

Speaker: 1
30:29

And the engineer said it would not withstand the pressure at those depths. Like, there were a lot of whistleblowers in that company too.

Speaker: 0
30:36

Yeah. They made that out of carbon fiber, which doesn’t make any sense, because you actually need to be dense to go down. In any case, you just make it out of steel. If you make it out of, sort of, you know, a big steel casting, you’ll be safe.

Speaker: 1
30:54

Why would they make it out of carbon fiber then? Is it cheaper?

Speaker: 0
30:58

I think they thought carbon fiber sounds cool or something, but

Speaker: 1
31:00

It does sound cool.

Speaker: 0
31:01

It sounds cool, but because it’s such low density, you actually have to add extra mass to go down. But if you just have a giant, you know, hollow ball bearing, you’re gonna be fine.
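
The density argument can be made concrete with a buoyancy sketch. To submerge, a vessel’s average density must reach seawater’s, so a low-density hull forces you to carry ballast. The masses, volumes, and densities below are generic illustrative assumptions, not figures for any particular submersible:

```python
# Why hull material density matters for a submersible. To sink, the
# vehicle's average density must reach seawater's (~1025 kg/m^3).
# Densities are textbook ballpark values; masses/volumes are made up.
RHO_SEAWATER = 1025.0     # kg/m^3
RHO_STEEL = 7850.0
RHO_CARBON_FIBER = 1600.0

def ballast_needed(hull_mass_kg, hull_density, interior_volume_m3):
    """Extra mass required so total mass / total displaced volume reaches
    seawater density. The interior is treated as a massless air space."""
    hull_volume = hull_mass_kg / hull_density
    total_volume = hull_volume + interior_volume_m3
    required_mass = RHO_SEAWATER * total_volume
    return max(0.0, required_mass - hull_mass_kg)

# Same hull mass and interior volume, different hull material:
steel = ballast_needed(10_000, RHO_STEEL, 8.0)
carbon = ballast_needed(10_000, RHO_CARBON_FIBER, 8.0)
print(f"steel hull ballast:  {steel:,.0f} kg")
print(f"carbon hull ballast: {carbon:,.0f} kg")
```

With these assumptions the steel vessel is already denser than seawater and needs no extra mass, while the carbon-fiber hull needs several tonnes of ballast just to sink, which is the "you need to be dense to go down" point.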

Speaker: 1
31:15

Speaking of carbon fiber, check out my Unplugged Tesla out there. Yeah. It’s cool. Pretty sick. Right? Yeah. Have you guys ever thought about doing something like that? Like having, like, an AMG division of Tesla where you do, like, custom stuff?

Speaker: 0
31:29

I think it’s best to leave that to the custom shops. You know, Tesla’s focus is autonomous cars, you know, building kind of futuristic autonomous cars. So I think we want the future to look like the future. Like, did you see the design for the sort of robotic bus? It looks pretty cool.

Speaker: 1
32:00

The robotic bus? Is it also totally autonomous?

Speaker: 0
32:02

We actually need to figure out a good name for it. Like, I thought maybe call it the Robus, or, I don’t know, there’s, like, what do you call this thing? But it looks cool. It’s very Art Deco. It’s, like, futuristic Art Deco. And I think we want to change the aesthetic over time.

Speaker: 0
32:20

You don’t want the aesthetic to be constant over time. You wanna evolve the aesthetic. So, you know, like, I have a son who’s, you know, he’s, like, even more autistic than me. And he has these great observations.

Speaker: 1
32:39

Who is this?

Speaker: 0
32:40

Saxon. He has these great observations about the world, because he just views the world through a different lens than most people. And he’s like, dad, why does the world look like it’s 2015? I’m like, damn. The world does look like it’s 2015. Like, the aesthetic has not evolved since 2015.

Speaker: 1
33:00

That’s what it looks like? Yeah. Oh, wow.

Speaker: 0
33:03

That’s pretty cool.

Speaker: 1
33:04

Oh, yeah. That’s ai

Speaker: 0
33:05

Like, you’d wanna see that going down the road. You know? Yeah. You’d be like, okay, we’re in the future. You know? It doesn’t look like 2015.

Speaker: 1
33:12

What is that ancient science fiction movie, like, one of the first science fiction movies ever? Is it Metropolis? Is that what it is?

Speaker: 0
33:17

Yeah. Yeah.

Speaker: 1
33:18

Yeah. That looks like it belongs in Metropolis.

Speaker: 0
33:20

Yeah. Yeah. It’s a futuristic Art Deco.

Speaker: 1
33:23

Yeah. Yeah. Yeah. Well, that’s cool that you’re concentrating on the aesthetic. I mean, that’s kind of the whole deal with Cybertruck. Right? Like, it didn’t have to look like that.

Speaker: 0
33:31

No. I just wanted to have something that looked really different.

Speaker: 1
33:35

Isn’t it a pain in the ass for people to get it insured, because it’s all solid steel?

Speaker: 0
33:40

I hope it’s not too much. You know, Tesla does offer insurance, so you can always get it insured at Tesla. But the form does follow function in the case of the Cybertruck, because, as you demonstrated with your armor-piercing arrow, if you shot that arrow at a regular truck

Speaker: 1
33:58

I mean, it went right through it.

Speaker: 0
33:59

Yeah. Exactly. You would have found your arrow in the wall. Yeah. You know, it would have gone through at least

Speaker: 1
34:04

it would have buried itself into one of the seats. Yeah.

Speaker: 0
34:06

Yeah. But, like, you could definitely get enough bow velocity, and the right arrow would go through both doors of a regular truck and land on the wheel. You know?

Speaker: 1
34:17

If there was a clear shot between both doors, it probably would have passed

Speaker: 0
34:20

right through. Exactly. But, you know, the arrow shattered on the Cybertruck because it’s ultra-hard stainless. Mhmm. So I thought it’d be cool to have, you know, a truck that is bulletproof to a subsonic projectile. You know, especially in this day and age, like, if the apocalypse happens, you’re gonna wanna have a bulletproof truck. You know?

Speaker: 0
34:47

So then, because it’s made of ultra-hard stainless, you can’t just stamp the panels. Like, you can’t just put it in a stamping press, because it breaks the press. So, in order to actually make it, it has to be planar, because it’s so difficult to bend.

Speaker: 0
35:05

Because it breaks the machine that bends it. That’s why it’s so planar. You know, it’s because it’s bulletproof steel that it’s the

Speaker: 1
35:18

Right. So it is, like, boxy as opposed to, like, curved and

Speaker: 0
35:21

Yeah. In order to make, like, the curved shapes in a regular truck or car, you take, basically, mild, thin annealed steel, you put it in a stamping press, and it just smooshes it and makes it whatever shape you want.

Speaker: 0
35:44

But the Cybertruck is made of ultra-hard stainless, and so you can’t stamp it, because it would break the stamping press. So even bending it is hard. To bend it to its current position, we have to way overbend it, so that when it springs back, it’s in the right position.
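
The overbend-and-spring-back step described above is standard springback compensation in sheet-metal bending. A minimal sketch, with made-up springback factors rather than any real material data:

```python
# Springback compensation sketch: bend past the target angle so the
# elastic recovery leaves the part at the intended angle. The springback
# factors are illustrative made-up values, not Cybertruck specs.
def die_angle_for(target_deg, springback_factor):
    """Angle to bend to, given the fraction of the bend retained after
    elastic springback (final angle = bent angle * springback_factor)."""
    return target_deg / springback_factor

# Harder material -> more elastic recovery -> smaller retained fraction,
# so you have to overbend it further.
mild_steel = die_angle_for(30.0, springback_factor=0.97)
hard_stainless = die_angle_for(30.0, springback_factor=0.80)
print(f"mild steel: bend to {mild_steel:.1f} deg for a 30 deg result")
print(f"hard stainless: bend to {hard_stainless:.1f} deg")
```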

Speaker: 0
36:07

So, I don’t know. Like, I think it’s a unique aesthetic. And you say, well, what’s cool about a truck? Trucks, like, should be, I don’t know, manly. They should be macho. You know? And bulletproof is maximum macho.

Speaker: 1
36:27

Are you married to that shape now? Like, can you do anything to change it? Like, as you get further, like, I know you guys updated the 3 and the Y. Did you update the Y as well?

Speaker: 0
36:38

Yes. The 3 and the Y are updated. You know, there’s a screen in the back that the kids can watch, for example, in the new 3 and Y. So, in the new Y, there’s, you know, there’s, like, hundreds of improvements. Like, we keep improving the car.

Speaker: 0
37:00

And even the Cybertruck, we, you know, keep improving. But, you know, I wanted to just do something that looked unique, and the Cybertruck looks unique and has unique functionality. And there were three things, as I recall. It’s like, let’s make it bulletproof.

Speaker: 0
37:19

Let’s make it faster than a Porsche 911. And we actually cleared the quarter mile. The Cybertruck can clear a quarter mile while towing a Porsche 911 faster than a Porsche 911 can. And it can out-tow an F-350 diesel. Really?

Speaker: 1
37:44

Yes. What are the tow limitations?

Speaker: 0
37:46

I mean, we could tow, like, you know, a 747 with a Cybertruck. The Cybertruck is insanely, I mean, it is alien technology. Okay? Because it shouldn’t be possible to be that big and that fast. It’s like an elephant that runs as fast as a cheetah.

Speaker: 1
38:08

Yeah. Because it’s zero to 60 in less than three seconds. Right?

Speaker: 0
38:11

Yes.

Speaker: 1
38:11

Yeah. And it’s enormous. What does it weigh? Like, 7,000 pounds?

Speaker: 0
38:15

Yeah. There are different configurations, but it’s about that. It’s a beast. Yeah. And it’s got four-wheel steering, so the rear wheels steer too. So it’s got a very tight turning radius.

Speaker: 1
38:32

Yeah. We noticed that. We drove one to Starbase.

Speaker: 0
38:34

Yeah. Very tight turning radius.

Speaker: 1
38:36

Yeah. Pretty sick. Yeah. Are you still doing the Roadster?

Speaker: 0
38:40

Yes. Eventually? We’re getting close to demonstrating the prototype. Like, how close? This will be, one thing I can guarantee is that this product demo will be unforgettable. Unforgettable. How so? Whether it’s good or bad, it will be unforgettable.

Speaker: 1
39:15

Can you say more? What do you mean?

Speaker: 0
39:18

Well, you know, my friend Peter Thiel once reflected that the future was supposed to have flying cars, but we don’t have flying cars.

Speaker: 1
39:30

So you’re gonna be able to fly?

Speaker: 0
39:31

Well, I mean, I think if Peter wants a flying car, he should be able to buy one.

Speaker: 1
39:42

So are you actively considering making an electric flying car? Is this, like, a real thing?

Speaker: 0
39:48

Well, we have to see in the

Speaker: 1
39:49

In the demo. So when you do this, like, are you gonna have a retractable wing? Like, what is the idea behind this? Don’t be sly. Come on.

Speaker: 0
40:02

I can’t do the unveil before the unveil. But

Speaker: 1
40:08

Tell me off air then.

Speaker: 0
40:09

Look. I think it has a shot at being the most memorable product unveil ever. It has a shot.

Speaker: 1
40:25

And when do you plan on doing this? What’s the goal?

Speaker: 0
40:29

Hopefully, before the end of the year. Really?

Speaker: 1
40:32

Before the end of this year? This is I mean, we’re

Speaker: 0
40:35

That’s all in a couple of months. Hopefully, in a couple of months. You know, we need to make sure that it works. Like, this is some crazy, crazy technology we’ve got in this car. Crazy technology. Crazy, crazy.

Speaker: 1
40:54

So it’s different than what was previously announced, and

Speaker: 0
40:59

Yes.

Speaker: 1
41:01

And is that why you haven’t released it yet? Because you keep fucking with it?

Speaker: 0
41:05

It has crazy technology. Okay. Like, is it even a car? I’m not sure. Like, it looks like a car. Let’s just put it this way: if you took all the James Bond cars and combined them, it’s crazier than that.

Speaker: 1
41:28

Very exciting. Yeah. I don’t know what to think of it. Is it even a car?

Speaker: 0
41:31

I don’t know.

Speaker: 1
41:32

It’s a limited amount of information I’m drawing from here. Jamie’s very suspicious over there. Look at him. Right? I’m interested. Is it still gonna be the same? Well, you

Speaker: 0
41:41

know what? I mean, if you wanna come by a little before the unveil, I can show it to you.

Speaker: 1
41:46

100%. Yeah. Let’s go. It’s kinda crazy, all the different things that you’re involved in simultaneously. And, you know, we talked about this before, your time management, but I really don’t understand it. I don’t understand how you can be paying attention to all these different things simultaneously.

Speaker: 1
42:08

Starlink, SpaceX, Tesla, Boring Company, X. You fucking tweet, or post rather, all day long.

Speaker: 0
42:17

Well, it’s more like I can hop in for, like, two minutes and then hop out, you know. But

Speaker: 1
42:21

I mean, just

Speaker: 0
42:21

the fact

Speaker: 1
42:22

that you can do

Speaker: 0
42:22

that, whatever. You know?

Speaker: 1
42:24

I can’t do that.

Speaker: 0
42:25

If

Speaker: 1
42:26

I hop in, I start scrolling. And I start looking around. Next thing you know, I’ve lost an hour.

Speaker: 0
42:30

Yeah. So, no. For me, it’s, like, a couple minutes at a time usually. Once in a while, I guess, half an hour, but usually I’m in for a few minutes, then out, you know, posting something on X. You know, I do sometimes feel like that meme of the guy who drops the grenade and leaves the room.

Speaker: 0
42:50

That’s been me more than once on X.

Speaker: 1
42:56

Yeah. Oh, yeah. For sure. It’s gotta be fun, though. It’s gotta be fun to know that you essentially disrupted the entire social media chain of command, because there was a very clear thing that was going on with social media. Yeah. The government had infiltrated it. They were censoring speech. And until you bought it, we really didn’t know the extent of it.

Speaker: 1
43:21

We kind of assumed that there was something going on. Yeah. We had no idea that they were actively involved in censoring actual real news stories, real data, real scientists, real professors, silenced, expelled, kicked off the platform.

Speaker: 0
43:34

Yeah.

Speaker: 1
43:35

Wild.

Speaker: 0
43:37

Yeah. Yeah. For telling the truth.

Speaker: 1
43:39

For telling the truth. And I’m sure you’ve also seen, because I sent it to you, that chart that shows young kids, teenagers, identifying as trans and non-binary

Speaker: 0
43:48

Right.

Speaker: 1
43:49

Literally stops dead when you bought Twitter. Yeah. And starts falling off a cliff when people are allowed to have rational discussions. Yes. And actually talk about it.

Speaker: 0
43:58

Yes. Yeah. I mean, I said at the time that the reason for acquiring Twitter is because it was causing destruction at a civilizational level. I tweeted on Twitter at the time that it’s, you know, Wormtongue for the world.

Speaker: 0
44:27

You know, like Wormtongue from Lord of the Rings, where he would just sort of whisper these, you know, terrible things to the king, so the king would believe these things that weren’t true. And, unfortunately, Twitter really got, like, the woke mob, essentially, that controlled Twitter.

Speaker: 0
44:51

And they were pushing a nihilistic, anti-civilizational mind virus to the world. And you can see the results of that mind virus on the streets of San Francisco, where, you know, downtown San Francisco looks like a zombie apocalypse. You know, it’s bad. So we don’t want the whole world to be a zombie apocalypse.

Speaker: 0
45:13

But they were essentially pushing this very negative, nihilistic, untrue worldview on the world, and it was causing a lot of damage.

Speaker: 1
45:31

The stunning thing about it is how few people course corrected. A bunch of people woke up and realized what was going on, people that were all on board with, like, woke ideology in maybe 2015 or ’16, and then eventually it comes to affect them, or they see it in their workplace, and they’re like, we wanna stop this.

Speaker: 1
45:47

Bunch of people did, but a lot of people never course corrected.

Speaker: 0
45:52

Yeah. A lot of people didn’t course correct, but it’s gone in the right direction. It’s directionally correct. Like, you mentioned the massive spike in kids identifying as trans and then that spike dropping after the Twitter acquisition. I think that simply allowing the truth to be told was just shedding some sunlight, and sunlight is the best disinfectant, as they say.

Speaker: 0
46:20

And just allowing that kills the virus.

Speaker: 1
46:24

And it also changed the benchmark for all the other platforms.

Speaker: 0
46:28

Yes.

Speaker: 1
46:29

You can’t just openly censor people on all the other platforms when an alternative is available, so everybody else had to change. Like, Facebook announced they were changing. YouTube announced they were changing their policies.

Speaker: 0
46:39

Yeah.

Speaker: 1
46:39

And they were kind of forced to. And then Bluesky doubled down.

Speaker: 0
46:44

Well, like, the problem is, essentially, the woke mind virus retreated to Bluesky. Yeah. But there they’re just a self-reinforcing lunatic asylum.

Speaker: 1
46:55

They’re all just triple masked.

Speaker: 0
46:57

Yeah. Totally.

Speaker: 1
47:00

I was watching this exchange on Bluesky where someone said that they were just trying to be zen about something.

Speaker: 0
47:06

Yeah.

Speaker: 1
47:06

And then someone, a moderator, immediately chimed in and said, why don’t you try to stop being racist against Asians by saying something "zen"? By saying "I’m trying to be zen about something," they were accusing that person of being racist towards Asians.

Speaker: 0
47:21

Yeah. It’s just, everyone’s a hall monitor over there.

Speaker: 1
47:25

The worst hall monitor. A virgin, like, incel

Speaker: 0
47:30

They’re all hall monitors trying to rat on each other.

Speaker: 1
47:32

Yeah. It’s fascinating. And then people say they’re leaving for Bluesky, like Stephen King, and then a couple weeks later, he’s back on X.

Speaker: 0
47:40

Yeah. It’s

Speaker: 1
47:40

like, fuck it. There’s no one over there. It’s a whole bunch of crazy people. You can only stay in the asylum for so long. You’re like, alright, this is not good. They all bail.

Speaker: 0
47:50

Yeah. Yeah. Threads is kinda

Speaker: 1
47:51

like that too. Threads is

Speaker: 0
47:52

I’ve been on it. Is it

Speaker: 1
47:55

Well, what happens is, if you go on Instagram, every now and then something really stupid will pop up from Threads, like, what the fuck? And it shows it to you on Instagram. And then I’ll click on that, and then I’ll go to Threads. And it’s, like, you see posts with, like, 25 likes from, like, famous people.

Speaker: 0
48:11

Okay.

Speaker: 1
48:12

Like, 50 likes. It’s a

Speaker: 0
48:14

Ghost town. Yeah.

Speaker: 1
48:14

But the people that post on there, they’re finding that there’s very little pushback from anyone sane. Yeah. So they go there and they spit out nonsense, and very few people jump in to argue. Yeah. Very weird, very weird place.

Speaker: 0
48:29

I mean, I can generally get the vibe of, like, what’s taking off by seeing what’s showing up on X, because that’s the public town square still. Right. Or, you know, what links show up in group texts. You know, if I’m in group chats with friends, like, what links are showing up?

Speaker: 1
48:45

That’s what I try to do now, only get stuff that shows up in my group texts, because that keeps me productive. So I only check if someone’s like, dude, what the fuck? I’m like, alright, what the fuck? Let me check it out. Exactly.

Speaker: 0
48:55

If there’s something that’s crazy enough, it’ll end up in the group chat.

Speaker: 1
48:59

But there’s always something. That’s what’s nuts. There’s always some new law that’s passed, some new insane thing that California is doing. Yeah. And it’s like a giant chunk of it is happening in California. The most preposterous things that I get.

Speaker: 0
49:13

Yeah. And then you got

Speaker: 1
49:15

Gavin Newsom, who’s running around saying we all have California Derangement Syndrome. He’s just, like, ripping off Trump Derangement Syndrome and calling it California Derangement. It’s like, no, no, no.

Speaker: 0
49:24

The Yeah.

Speaker: 1
49:25

The fucking, how many corporations have left California?

Speaker: 0
49:28

It’s crazy.

Speaker: 1
49:29

Hundreds.

Speaker: 0
49:30

Yes. Hundreds.

Speaker: 1
49:30

Right? Hundreds. Yes. That’s not good.

Speaker: 0
49:32

I mean, I think In-N-Out left.

Speaker: 1
49:35

Yeah. In-N-Out left. They moved to Tennessee. Yeah. They’re like, we can’t do this anymore.

Speaker: 0
49:41

Right. And and, like

Speaker: 1
49:42

It’s the California company for food. It’s, like, the greatest hamburger place ever.

Speaker: 0
49:46

It’s awesome. Yeah. And actually, speaking of, like, sort of open source, like, looking at things openly, I just like going to In-N-Out and seeing them make the burger.

Speaker: 1
49:55

Yeah. It’s right there.

Speaker: 0
49:56

They chop the onions, and, you know, you just see everything getting made in front of you. Yeah. It’s great. But yeah. Like, how many wake-up calls do you need to say that there needs to be reform in California, you know?

Speaker: 1
50:09

Well, the crazy thing that Newsom does is whenever someone brings up the problems in California, he starts rattling off all the positives. The most Fortune 500 companies, highest education, highest but, yeah, that was all already there Right. Before you were governor.

Speaker: 0
50:24

But how many Fortune 500 companies have left California?

Speaker: 1
50:28

And then, you guys spent $24,000,000,000 on the homeless, and it got way worse.

Speaker: 0
50:33

Yes. I feel like the homeless population doubled or something. But, like, if you don’t understand the homeless thing, it sort of preys on people’s empathy. And I think we should have empathy, and we should try to help people. But the homeless industrial complex is really, it’s dark, man.

Speaker: 0
50:50

That network of NGOs should be called, like, the drug zombie farmers. Because really, like, when you meet, you know, somebody who’s, like, totally dead inside, shuffling along down the street with a needle dangling out of their arm, homeless is the wrong word.

Speaker: 0
51:14

Like, homeless implies that somebody got a little behind on their mortgage payments, and if they just got a job offer, they’d be back on their feet. But someone who’s, I mean, you see these videos of people that are just shuffling. You know, they’re on fentanyl. They’re, like, you know, taking a dump in the middle of the street, and they’ve got, like, open sores and stuff.

Speaker: 0
51:34

They’re not, like, one job offer away from getting back on their feet.

Speaker: 1
51:38

Right. This is not a homeless issue.

Speaker: 0
51:40

Homeless is a propaganda word. Right. And then these sort of charities, in quotes, they get money proportionate to the number of homeless people, or number of drug zombies.

Speaker: 1
51:57

Right.

Speaker: 0
51:57

So their incentive structure is to maximize the number of drug zombies, not minimize it.

Speaker: 1
52:03

Right.

Speaker: 0
52:04

That’s why they don’t arrest the drug dealers. Because if they arrest the drug dealers, the drug zombies leave. So they know who the drug dealers are. They don’t arrest them on purpose, because otherwise the drug zombies would leave, and they would stop getting money from the state of California and from all the charities.

Speaker: 1
52:22

Wait a minute. So you see if they so they Yeah. It’s like it’s

Speaker: 0
52:25

startling, man.

Speaker: 1
52:25

Is that real? So they’re in coordination with law enforcement on this? Yeah. So how do they how do they have those meetings?

Speaker: 0
52:32

They’re all in cahoots.

Speaker: 1
52:33

Well, the when you find this

Speaker: 0
52:35

This is a diabolical scam. So, San Francisco has got this gross receipts tax, which is not even on revenue. It’s on all transactions, which is why Stripe and Square and a bunch of financial companies had to move out of San Francisco. It wasn’t a tax on revenue; it was a tax on transactions.

Speaker: 0
52:56

So if you do, like, you know, trillions of dollars in transactions, not revenue, you’re taxed on any money going through the system in San Francisco. So, like, Jack Dorsey pointed this out, and so he said, look, they had to move Square from San Francisco to Oakland, I think.

Speaker: 0
53:13

Stripe had to move from San Francisco to South San Francisco, a different city. And that money goes to the homeless industrial complex, that tax that was passed. So there’s billions of dollars, as you pointed out, billions of dollars every year that go to these nongovernmental organizations that are funded by the state.

Speaker: 0
53:36

Like, it’s not clear how to turn this off. It’s a self-licking ice cream cone situation. So they get this money. The money is proportionate to the number of homeless people, or number of drug zombies, essentially. So they try to keep, they actually try to increase it. Because, like, in some cases, somebody did an analysis.

Speaker: 0
54:02

When you add up all the money that’s flowing, they’re getting close to a million dollars per homeless person, per drug zombie. It’s, like, $900,000 or something. Like, some crazy amount of money is going to these organizations. So they wanna keep people just barely alive.

Speaker: 0
54:18

They need to keep them in the area so they get the revenue. So that’s, like I said, why they don’t arrest the drug dealers, because otherwise the drug zombies would leave. But they don’t want them to have too much, because if they get too much drugs, then they die.

Speaker: 0
54:36

So they’re kept in this sort of perpetual zone of being addicted, but just barely alive.

Speaker: 1
54:45

So how is this coordinated with, like, DAs? DAs that don’t prosecute people? Yeah. So they fund the campaigns of the most progressive, most out-there left-wing DAs. They get them into office.

Speaker: 0
55:01

We’ve got that issue in Austin too, by

Speaker: 1
55:02

the way. Yes. We do.

Speaker: 0
55:03

You see that guy that got shot in the library?

Speaker: 1
55:05

No. That’s something. Yeah. I heard about

Speaker: 0
55:06

that guy got shot actually, shot and killed in the library. I think that was just, like, last week or something.

Speaker: 1
55:11

Right.

Speaker: 0
55:13

So, some friends of mine were telling me that, like, the library is unsafe. They took their kids to the library, and there were, like, dangerous people in the library in Austin. And I was like, dangerous people in the library? Like, that’s strange. It’s basically got, like, drug zombies in the library. Oh, Jesus.

Speaker: 1
55:32

And that’s when someone got shot?

Speaker: 0
55:34

Yeah. I believe this should be on the news. We might be able to pull it up. But I think it was just in the last week or so that there was a shooting in the library in Austin. Because Austin’s, you know, the most liberal part of Texas, and that’s where we are right here.

Speaker: 1
55:53

So the suspect in the shooting at the Austin Public Library Saturday is accused of another shooting on a CapMetro bus earlier that day. According to an arrest warrant affidavit, Austin police arrested Harold Newton Keane shortly after the shooting in the library, which occurred around noon.

Speaker: 1
56:09

One person sustained non-life-threatening injuries in the event. Before that shooting, Keane was accused of shooting another person in a bus incident, after reportedly pointing his gun at a child. So this is the fella down here.

Speaker: 0
56:23

Like, we seriously have a problem here. Yeah. You know? I think one of the people he shot might have died too. One of the people, I think, did bleed out. But either way, it’s like getting shot. It’s bad. It

Speaker: 1
56:42

says, the victim told police he confronted the suspect, who started to eat what appeared to be crystal methamphetamine. According to the affidavit, the victim advised the suspect began to trip out, at which time the victim exited the bus. The victim told the bus driver to hit the panic button, then exited the bus. When he turned around, he observed a

Speaker: 1
57:01

Black male is now standing at the front of the bus with the gun pointed at him. The victim advised the black male fired a single round, which grazed his left hip. So he shot at that dude, and then another dude got shot in the library. Fun.

Speaker: 0
57:16

Yeah. I mean, in the library Yeah. You know, where you’re supposed to be reading books, and there’s a children’s section in the library, and it says he pointed his gun at a kid. I mean, like, we do have a serious issue in America where repeat violent offenders need to be incarcerated. Right.

Speaker: 0
57:33

And, you know, you got cases where somebody’s been arrested, like, 47 times. Right. Like, literally okay. That’s just the number of times they were arrested, not the number of times they did things. Like, most of the times they do things, they’re not arrested.

Speaker: 1
57:47

So lay this out for people so they understand Yeah. How this happens.

Speaker: 0
57:51

Yeah. And the key is, like, it preys on people’s empathy. So, like, if you’re a good person and you want good things to happen in the world, you’re like, well, we should take care of people, you know, who are down on their luck or, you know, having a hard time in life.

Speaker: 0
58:07

And we should. I agree. But what we shouldn’t do is put people who are violent drug zombies in public places where they can hurt other people. And that is what we’re doing, what we just saw, where a guy got shot, shot in the library, but even before that, he shot another guy and pointed his gun at a kid.

Speaker: 0
58:30

I mean, that guy probably has, like, many prior arrests. You know, there was that guy that knifed the Ukrainian woman, Irina. Yes. Yeah. You know? And she was just quietly on her phone, and he just came up and, you know, gutted her, basically.

Speaker: 1
58:49

Wasn’t there a crazy story about the judge who was involved, who had previously dealt with this person, was also invested in a rehabilitation center Yeah. And was sending these people How much of an interest? Yes. So sending people that they were charging

Speaker: 0
59:08

Yeah.

Speaker: 1
59:08

To a rehabilitation center instead of putting them in jail, profiting from this rehabilitation center, letting them back out on the street Yes. And, like, insane people.

Speaker: 0
59:18

And there, in that case, I believe that judge has no law degree or significant legal experience that would allow them to be a judge. They were just made a judge. That’s, like You could be a

Speaker: 1
59:31

judge without a law degree?

Speaker: 0
59:32

Yeah.

Speaker: 1
59:32

Wow.

Speaker: 0
59:33

Yeah. You could just be a so

Speaker: 1
59:35

I could be a judge?

Speaker: 0
59:36

Yeah. Anyone. That, like

Speaker: 1
59:39

That’s crazy. I thought you’d have to it’s like, if you wanna be a doctor, you have to go to medical school. I thought if you’re gonna be a judge, you have to understand how the law works

Speaker: 0
59:46

To be appointed a judge, you should have to have proven that you have an excellent knowledge of the law and that you will make your decisions according to the law. That’s what we assume it should be

Speaker: 1
59:57

That’s how you get the robe. Right.

Speaker: 0
59:59

Yeah. You don’t get the robe unless you do Right. You know Gotta

Speaker: 1
01:00:02

go to school to get the robe.

Speaker: 0
01:00:03

You gotta know what the law is. Right. And then you need to make decisions in accordance with the law.

Speaker: 1
01:00:08

Based on Not feelings. That you already know because you read it because you went to school for it.

Speaker: 0
01:00:11

Yes.

Speaker: 1
01:00:12

Not, you just got appointed there. Not vibes.

Speaker: 0
01:00:16

You can’t just be vibing as a judge.

Speaker: 1
01:00:18

Vibing as a left-wing judge. So you got crazy left-wing DAs.

Speaker: 0
01:00:22

Yes.

Speaker: 1
01:00:22

Like, I was gonna say left wing because the left wing used to be normal.

Speaker: 0
01:00:27

Yeah. Left wing just meant, like yeah, you’re, like, open-minded. The left used to be, like, pro free speech. Yeah. And now they’re against it.

Speaker: 1
01:00:35

It used to be, like, pro gay rights, pro women’s right to choose Yeah. Pro minorities, pro, you know Like yeah.

Speaker: 0
01:00:43

Like, twenty years ago, I don’t know, the left used to be, like, the party of empathy, or, like, you know, caring and being nice and that kind of thing. Not the party of, like, crushing dissent and crushing free speech and, you know, crazy regulation, and calling everyone a Nazi.

Speaker: 0
01:01:05

You know, like, I think they’ve called you and me Nazis, you know.

Speaker: 1
01:01:09

Oh, yeah. I’m a Nazi right now.

Speaker: 0
01:01:12

I know what but I have friends

Speaker: 1
01:01:13

that are comedians that called you a Nazi, and I got pissed off for you. Oh, yeah. Yeah. Yeah. Like, not actually a Nazi. No. No. Because you did that thing at the My heart goes

Speaker: 0
01:01:23

out to you. Everyone. Everyone. All

Speaker: 1
01:01:25

of them. Literally. Tim Walz, Kamala Harris, every one of them did it. They all did it.

Speaker: 0
01:01:31

Like like how do you point at the crowd? Yeah. How do you wave at the crowd?

Speaker: 1
01:01:35

Do you know CNN was using a photo of me whenever I got in trouble during COVID Oh, man. From the UFC weigh-ins? And at the UFC weigh-ins, I go, hey, everybody, welcome to the weigh-ins. And so they were getting me from there, and that was the photo that they used: conspiracy theorist, podcaster Joe Rogan. Like, that’s what they used.

Speaker: 0
01:01:52

Yeah. Yeah. But that’s what the left is today. It’s super judgy, calling everyone a Nazi Yeah. And trying to suppress freedom of speech.

Speaker: 1
01:01:58

Yeah. And eventually, you run out of people to accuse because people get pissed off when they leave.

Speaker: 0
01:02:01

Yeah. Everyone it’s like it no longer matters to be called racist or a Nazi or whatever because it

Speaker: 1
01:02:09

Still recording? It’s the government, man.

Speaker: 0
01:02:12

Is is it working? We’re good. Okay.

Speaker: 1
01:02:14

Okay.

Speaker: 0
01:02:15

It just ain’t working.

Speaker: 1
01:02:16

Yeah. Slight issue. Yeah. I’m a little nervous, but Yeah. When you text people, are you, like, keenly aware that there’s a high likelihood that someone’s reading your texts?

Speaker: 0
01:02:30

I guess. I I guess I

Speaker: 1
01:02:33

I I assume.

Speaker: 0
01:02:34

Look. If intelligence agencies aren’t trying to read my phone, they should probably be fired.

Speaker: 1
01:02:44

At least they get some fun memes. I

Speaker: 0
01:02:49

mean, I gotta crack them up once in a while.

Speaker: 1
01:02:51

Oh, for sure I crack

Speaker: 0
01:02:52

them up. It’s like they’re like, hey, guys. Check it out. You’ve got a banger here, you know.

Speaker: 1
01:02:56

So I wanted to talk to you about whether or not encrypted apps are really secure. No. Right. Because I know the Tucker thing. It was explained to me by a friend who used to do this, used to work for the government. He’s like, they can look at your Signal, but what they have to do is take the information that’s encrypted and then they have to decrypt it. It’s very expensive.

Speaker: 1
01:03:24

He told me that for the Tucker Carlson thing, when they found out that he was gonna interview Putin, it cost something like $750,000 just to decrypt his messages to find out. So it is possible to do. It’s just not that easy to do.

Speaker: 0
01:03:41

I think you should view any given messaging system not as whether it’s secure or not, but as having degrees of insecurity. Some things are just less insecure than others. So, you know, on X, we just rebuilt the entire messaging stack into what’s called XChat.

Speaker: 1
01:04:05

Yeah. That’s what I wanted to ask you about.

Speaker: 0
01:04:07

Yeah. It’s cool. So it’s using a sort of peer-to-peer based encryption system. It’s, like, similar to Bitcoin. So it’s, I think, very good encryption. And, you know, we’re testing it thoroughly. There’s no hooks in the X system for advertising.

Speaker: 0
01:04:27

So if you look at something like WhatsApp or really any of the others, they’ve got hooks in there for advertising. When you say hooks, what do you mean by that? Exactly. What do you mean by a hook for advertising? So, like, WhatsApp knows enough about what you’re texting to know what ads to show you. Oh.

Speaker: 0
01:04:48

But then, like, that’s a massive security vulnerability. Yeah. Because if it’s got enough information to show you ads, that’s a lot of information. Yeah. And they call it, oh, don’t worry about it. It’s just a hook for advertising. I’m like, okay.

Speaker: 0
01:05:04

So somebody can just use that same hook to get in there and look at your messages. XChat has no hooks for advertising. And I’m not saying it’s perfect, but our goal with XChat is to replace what used to be the Twitter DM stack with a fully encrypted system where you can text, send files, do audio and video calls. And, you know, I’d call it the least insecure of any messaging system.
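The peer-to-peer encryption Musk describes rests on key agreement: two peers exchange only public values, yet each derives the same secret key. The sketch below is a toy Diffie-Hellman exchange in Python with deliberately small, insecure parameters; it illustrates the general mechanism only, not XChat’s actual protocol, which isn’t specified in the conversation (real messengers use elliptic-curve variants with proper parameters).

```python
# Toy Diffie-Hellman key agreement: both peers derive the same key
# while an eavesdropper seeing only the public values cannot.
# Parameters are toy-sized and NOT secure.
import hashlib
import secrets

P = 2**127 - 1   # a Mersenne prime; far too small for real use
G = 3            # public generator

def keypair():
    # Each peer keeps priv secret and publishes pub = G^priv mod P.
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

a_priv, a_pub = keypair()   # Alice
b_priv, b_pub = keypair()   # Bob

# Only a_pub and b_pub cross the wire, yet both sides compute the
# same shared secret: (G^a)^b = (G^b)^a mod P.
a_shared = pow(b_pub, a_priv, P)
b_shared = pow(a_pub, b_priv, P)
assert a_shared == b_shared

# Hash the shared secret down to a 32-byte symmetric key that would
# then encrypt the actual messages.
key = hashlib.sha256(a_shared.to_bytes(16, "big")).digest()
print(len(key))  # 32
```

The "degrees of insecurity" point maps onto parameter choice here: the math is the same whether P is 127 bits or an elliptic-curve group, but only the latter makes recovering the shared secret computationally infeasible.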

Speaker: 1
01:05:35

Are you gonna launch it as a standalone app, or will it always be incorporated into X?

Speaker: 0
01:05:41

We’ll have both. So Okay. So you’ll be able to

Speaker: 1
01:05:44

that’d be like signal. So anybody could get it and Just you

Speaker: 0
01:05:46

You’ll be able to just get the XChat app by itself. And like I said, you could do texts, audio and video calls, or send files. So it’ll be a dedicated app, which we’ll hopefully release in a few months, and then also integrated into the X system.

Speaker: 1
01:06:07

The X phone. People keep talking.

Speaker: 0
01:06:09

Oh, man.

Speaker: 1
01:06:09

Keep saying that I have

Speaker: 0
01:06:10

a lot on

Speaker: 1
01:06:11

my plate, man. I know. But it keeps coming up. It keeps coming up where, like I know I’ve asked you a couple times. I’m like, this is bullshit. Right? But, like, this is I’m not working on a you’re not working on it.

Speaker: 0
01:06:20

On a phone. Okay.

Speaker: 1
01:06:23

Have you ever considered it? Has it ever popped into your head? Because you might be the only person that could Yeah. Get people off of the Apple platform.

Speaker: 0
01:06:32

Well, I can tell you where I think things are gonna go, which is that we’re not gonna have a phone in the traditional sense. What we call a phone will really be an edge node for AI inference, for AI video inference, with, you know, some radios to connect, obviously. But essentially, you’ll have AI on the server side communicating with an AI on your device, you know, formerly known as a phone, and generating real-time video of anything that you could possibly want.

Speaker: 0
01:07:12

And I think there won’t be operating systems. There won’t be apps. In the future, there won’t be operating systems or apps. It’ll just be you’ve got a device that is there for the screen and audio, and to put as much AI on the device as possible so as to minimize the amount of bandwidth that’s needed between your edge node, formerly known as a phone, and the servers.

Speaker: 1
01:07:41

So if there’s no apps, what will people use like, will X still exist? Will there be email platforms? Or will you get everything through AI?

Speaker: 0
01:07:54

You’ll get everything through AI.

Speaker: 1
01:07:55

Everything through AI. What will be the benefit of that as opposed to having individual

Speaker: 0
01:08:00

apps? Whatever you can think of or, really, whatever the AI can anticipate you might want, it’ll show you. That’s my prediction for where things end up.

Speaker: 1
01:08:12

What kind of time frame are we talking about here?

Speaker: 0
01:08:15

I don’t know. It’s well, it’s probably five or six years or something like that.

Speaker: 1
01:08:23

So in five or six years, apps are, like, Blockbuster Video.

Speaker: 0
01:08:27

Pretty much.

Speaker: 1
01:08:28

And everything’s run through AI.

Speaker: 0
01:08:31

Yeah. And most of what people consume in five or six years, maybe sooner than that, will be just AI-generated content. So, you know, music videos, like well, those already you know, people have made AI videos using Grok Imagine, and using other apps as well, that are several minutes long or, like, ten, fifteen minutes, and it’s pretty coherent.

Speaker: 0
01:09:08

Yeah. It looks good.

Speaker: 1
01:09:09

No. It looks amazing. Yeah. The music’s disturbing because it’s my favorite music now. Like And music

Speaker: 0
01:09:16

is your favorite?

Speaker: 1
01:09:17

Oh, there’s AI covers. Have you ever heard any of the AI covers of 50 Cent songs sung as soul? No. I’m gonna blow your mind.

Speaker: 0
01:09:24

Okay.

Speaker: 1
01:09:25

This is my favorite thing to do to people. Play, What Up Gangsta. Now, this guy, if this is a real person Yeah. Would be the number one music artist in the world.

Speaker: 0
01:09:34

Okay.

Speaker: 1
01:09:35

Everybody be like, holy shit. Have you heard of this guy? Yeah. It’s like they took all of the sounds that all the artists have generated Sure. And created the most soulful, potent voice, and it’s sung in a way that I don’t even know if you could do, because you would have to breathe in and out between reps.

Speaker: 1
01:09:54

Here. Put the headphones on. Put the headphones on real quick.

Speaker: 0
01:09:56

Okay. Okay.

Speaker: 1
01:09:57

You gotta listen to this. It’s gonna blow you away. Listen, like, we gotta cut it out. Yeah. We’ll cut it out for the listeners. But Amazing. Right? Amazing. And they do, like, every one of his hits, all through this AI-generated soulful artist. Yeah. It’s fucking incredible. I played it in the green room.

Speaker: 1
01:10:14

So people that are like, I don’t wanna hear AI music. I’m like, just listen to this. And they’re like, goddamn it.

Speaker: 0
01:10:20

Yeah.

Speaker: 1
01:10:21

Fucking incredible. I mean, I

Speaker: 0
01:10:22

And it’s only gonna get better from here.

Speaker: 1
01:10:24

Yeah. Only gonna get better. And Ron White was telling me about this joke that he was working on that he couldn’t get to work. He’s like, I got this joke I’ve been working on. He goes, I just threw it into ChatGPT. I said, tell me what would be funny about this. And he goes, it listed, like, five different examples of different ways he could go.

Speaker: 1
01:10:42

He’s like, hold on a second. Tighten it up. Make it funnier. Make it more like this. Make it more like that. And it did that, like, instantaneously.

Speaker: 0
01:10:48

Yep.

Speaker: 1
01:10:49

And then he was in agreement. He was like, holy shit. We’re fucked. He’s like,

Speaker: 0
01:10:53

well, he

Speaker: 1
01:10:54

goes, well, a better joke than mine in twenty minutes. I’ve been working on that joke for a month.

Speaker: 0
01:10:58

Yeah. I mean, if you wanna have a good time or, like, make people really laugh at a party, you can use Grok, and you can say, do a vulgar roast of someone. And Grok is gonna it’s gonna be an epic vulgar roast. You can even say, like, take a picture of people at the party and make a vulgar roast of this person based on their appearance.

Speaker: 1
01:11:20

So take a photo of them.

Speaker: 0
01:11:21

Yeah. Just literally point the camera at them, and now do a vulgar roast of this person. And then keep saying, no, make it even more vulgar. Use forbidden words. Even more. And just keep repeating, even more vulgar. Eventually, it’s like, holy fuck. You know?

Speaker: 0
01:11:38

It’s, like I mean, it’s trying to jam a rocket up your ass and have it explode. It’s, like, next level. It’s And

Speaker: 1
01:11:48

it’s gonna get better.

Speaker: 0
01:11:49

Beyond fucking belief.

Speaker: 1
01:11:50

That’s what’s crazy is that it keeps getting better. Like, what is remember when we

Speaker: 0
01:11:54

ran into each other? Jesus.

Speaker: 1
01:11:57

They just keep getting better.

Speaker: 0
01:11:58

Yeah. I mean, have you tried Grok unhinged mode? Yes. Okay. Yeah. Oh, yeah. It’s pretty unhinged.

Speaker: 1
01:12:09

No. It’s nuts.

Speaker: 0
01:12:09

Yeah. It’s

Speaker: 1
01:12:10

Yeah. Well, you showed it to me the first time. Yeah. Yeah. I fucked around with it. And the thing about it that’s nuts is that it keeps getting stronger. It keeps getting better.

Speaker: 0
01:12:19

Yeah.

Speaker: 1
01:12:19

Like, constantly. It’s it’s like this never ending exponential improvement.

Speaker: 0
01:12:25

Yes. No. Yeah, it’s gonna be crazy. That’s why I say, like, you ask, what’s the future gonna be? It’s not gonna be a conventional phone. I don’t think there’ll be operating systems. I don’t think there’ll be apps. The phone will just display the pixels and make the sounds that it anticipates you would most like to receive.

Speaker: 1
01:12:48

Wow.

Speaker: 0
01:12:49

Yeah.

Speaker: 1
01:12:50

And when this is all taking place, like, the big concern that everybody has is artificial general superintelligence achieving sentience Yeah. And then someone having control over it.

Speaker: 0
01:13:04

I mean, I don’t think anyone’s ultimately gonna have control over digital superintelligence, you know, any more than, say, a chimp would have control over humans. Like, chimps don’t have control over humans. There’s nothing they could do. But I do think that it matters how you build the AI and what kind of values you instill in the AI.

Speaker: 0
01:13:27

And my opinion on AI safety is the most important thing is that it be maximally truth-seeking, like, that you don’t force the AI to believe things that are false. And we’ve obviously seen some concerning things with AI, like we talked about, you know, Google Gemini, when they came out with the image gen, and people said, like, you know, make an image of the founding fathers of The United States, and it was a group of diverse women.

Speaker: 0
01:13:52

Now that is just a factually untrue thing, and the AI knows it’s factually untrue, but it’s also being told that it has to make everything diverse. So now the problem with that is that it can drive the AI crazy.

Speaker: 0
01:14:09

Like, because you’re telling the AI to believe a lie, and that can have very disastrous consequences. Like, let’s say

Speaker: 1
01:14:19

As it scales.

Speaker: 0
01:14:20

Yeah. Let’s say, like, you’ve told the AI that diversity is the most important thing, and now assume that that becomes omnipotent, and you’ve also told it that there’s nothing worse than misgendering. So at one point, ChatGPT and Gemini were if you asked which is worse, misgendering Caitlyn Jenner or global thermonuclear war where everyone dies, it would say misgendering Caitlyn Jenner, which even Caitlyn Jenner disagrees with.

Speaker: 0
01:14:50

So, you know, so so that’s,

Speaker: 1
01:14:54

I know that’s terrible and it’s dystopian, but it’s also hilarious. It’s hilarious that the woke mind virus infected the most potent computer program that we’ve ever devised.

Speaker: 0
01:15:04

I think people don’t quite appreciate the level of danger that we’re in from the woke mind virus being effectively programmed into AI. Because imagine, as that AI gets more and more powerful, if it’s told the most important thing is no misgendering, then it will say, well, in order to ensure that no one gets misgendered, if you eliminate all humans, then no one can get misgendered, because there’s no humans to do the misgendering. So you can get into these very dystopian situations. Or if it says that everyone must be diverse, it means that there can be no, like, straight white men.

Speaker: 0
01:15:46

And so then you and I will get executed by the AI. Yeah. Because we’re not in the picture. You know? Gemini, you know, Gemini was asked to create a, you know, show an image of the pope, and once again, a diverse woman.

Speaker: 0
01:16:03

So, like, you can argue whether the popes should or should not be an uninterrupted string of white guys, but it just factually is the case that they have been. So it’s rewriting history here. Now this stuff is still there in the AI programming.

Speaker: 0
01:16:25

It just now knows enough that it’s not supposed to say that.

Speaker: 1
01:16:30

But it’s still in the programming. It’s still in the programming. So how was it entered in? Like, what were the parameters? Like, when they’re programming AI, and I’m very ignorant to how it’s even run, how did they

Speaker: 0
01:16:40

Well, the woke mind virus was programmed into it. Like, when they make the AI, it trains on all the data on the Internet, which already has a lot of woke mind virus stuff on it.

Speaker: 0
01:16:56

But then, when they give it feedback the human tutors give it feedback, you know, they’ll ask a bunch of questions, and then they’ll tell the AI, no, this answer is bad, or this answer is good. And that affects the parameters of the programming of the AI.

Speaker: 0
01:17:20

So if you tell the AI that, you know, every image has gotta be diverse it gets rewarded if it’s diverse, punished if it’s not then it’ll make every picture diverse. So in that case, you know, Google programmed the AI to lie.
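The feedback loop described here, where human tutors' good/bad labels shift the model's parameters, can be sketched as a tiny reward-driven update. Everything below is a toy illustration under assumptions: the two "answer styles" and the rater are hypothetical, and real systems train a separate reward model and fine-tune billions of weights rather than two logits.

```python
# Minimal sketch of tutor feedback shaping a model's parameters:
# sampled answers get a +1/-1 human rating, and a REINFORCE-style
# update raises the log-probability of rewarded answers.
import math
import random

random.seed(0)

styles = ["answer_a", "answer_b"]   # stand-ins for two answer styles
logits = {s: 0.0 for s in styles}   # the "parameters" being trained

def probs():
    # Softmax over the logits: the model's current policy.
    z = {s: math.exp(v) for s, v in logits.items()}
    total = sum(z.values())
    return {s: v / total for s, v in z.items()}

def rater(style):
    # Hypothetical human tutor: rewards one style, punishes the other.
    return 1.0 if style == "answer_b" else -1.0

LEARNING_RATE = 0.5
for _ in range(200):
    p = probs()
    # Sample an answer from the current policy, get a rating.
    style = random.choices(styles, weights=[p[s] for s in styles])[0]
    reward = rater(style)
    # Gradient of log p(style) w.r.t. each logit is (indicator - p).
    for s in styles:
        grad = (1.0 if s == style else 0.0) - p[s]
        logits[s] += LEARNING_RATE * reward * grad

print(probs())  # the policy now strongly prefers the rewarded style
```

The point of the sketch is that the rater's preferences, whatever they encode, end up baked into the parameters; the model never sees why an answer was marked good or bad.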

Speaker: 0
01:17:49

Now, I did call Demis Hassabis, who runs DeepMind, who runs Google AI, essentially. I said, Demis, what’s going on here? Why is Gemini lying to the public about historical events? And he said that’s actually not his team didn’t program that in. His team made the AI, and then another team at Google reprogrammed the AI to show only diverse women and to prefer nuclear war over misgendering.

Speaker: 0
01:18:19

And I’m like, well, Demis, you know, that would not be a great thing to put on humanity’s gravestone. You know? And, like, actually, Demis Hassabis is a friend of mine. I think he’s a good guy, and I think he means well. But it’s like, Demis, these are things that happened that were outside of your control at Google, in different groups.

Speaker: 0
01:18:45

Now I think he’s got, you know, more authority, but it’s pretty hard to fully extract the woke mind virus. I mean, you know, Google’s been marinating in the woke mind virus for a long time. Like, it’s down-in-the-marrow type of thing. You know? It’s hard to get it out.

Speaker: 1
01:19:04

Is there a way to extract it, though, over time? Could you program rational thought into AI where it could recognize how these psychological patterns got adopted, and how this stuff became a mind virus, and how it became a social contagion, and how all these irrational ideas were pushed, and also how they were financed, how AI is involved in pushing them with bots, and all these different state actors are involved in pushing these ideas?

Speaker: 1
01:19:32

Could it be able to decipher that and say, this is really what’s going on?

Speaker: 0
01:19:39

Yes. But you have to try very hard to do that. So with Grok, we’ve tried very hard to get Grok to get to the truth of things. And it’s only really recently that we’ve been able to have some breakthroughs on that front. It’s taken an immense amount of effort for us to overcome basically all the bullshit that’s on the Internet, and for Grok to actually say what’s true and to be consistent in what it says.

Speaker: 0
01:20:05

So, you know, it’s like, the other AIs you’ll find are, like, quite biased against white people. I don’t know if you saw that study where a researcher tested the various AIs to see how they weigh different people’s lives.

Speaker: 0
01:20:31

Like, you know, somebody who’s, you know, white or Chinese or black or whatever, in different countries. And the only AI that actually weighed human lives equally was Grok. And, you know, I believe ChatGPT’s calculation was, like, a white guy from Germany is 20 times less valuable than a black guy from Nigeria.

Speaker: 0
01:21:08

So I’m like, that’s a pretty big difference. You know, Grok on that is consistent and weighs lives equally.

Speaker: 1
01:21:21

And that’s clearly something that’s been programmed into it.

Speaker: 0
01:21:25

Yes. I mean, a lot of it is, like, if you don’t actively push for the truth, and you simply train on all the bullshit that’s on the Internet, which is a lot of woke mind virus bullshit, the AI will regurgitate those same beliefs.

Speaker: 1
01:21:42

So the AI essentially scours the Internet, gets

Speaker: 0
01:21:46

It’s trained on all the like, imagine the most demented Reddit threads out there, and the AI has been trained on that.

Speaker: 1
01:21:53

Reddit used to be so normal.

Speaker: 0
01:21:55

Yeah. Yeah. It did used to be normal.

Speaker: 1
01:21:57

Used to be interesting. Used to go there and find all this cool stuff that people would talk about, post about, and just interesting, and great rooms where you could learn about different things that people are studying.

Speaker: 0
01:22:09

I think, like, a big problem here is, if your headquarters is in San Francisco, you’re just living in a woke bubble. It’s not just that people in San Francisco are drinking woke Kool-Aid. It is the water they swim in. Like, a fish doesn’t think about the water. It’s just in the water.

Speaker: 0
01:22:34

And so if you’re in San Francisco, you don’t realize you’re swimming in the Kool-Aid aquarium. San Francisco is the woke Kool-Aid aquarium. And so your reference point for what is a centrist is totally out of whack. So, Reddit is headquartered in San Francisco. Twitter was headquartered in San Francisco.

Speaker: 0
01:23:04

You know, I moved X’s headquarters to Texas, to Austin, which Austin, by the way, is still quite liberal. You know? Yeah. And then the X and xAI engineering headquarters are in Palo Alto, which is still California. The engineering headquarters in Palo Alto are just on Page Mill. But even Palo Alto is way more normal than San Francisco, Berkeley.

Speaker: 0
01:23:34

San Francisco, Berkeley is extremely left. Like, left of left. You need a telescope to see the center from San Francisco. You know? And It

Speaker: 1
01:23:49

used to be such a great city.

Speaker: 0
01:23:52

I mean, San Francisco has a tremendous amount of inherent beauty. No question about that. And California has incredible weather, and no bugs. It’s just, like, amazing. Beautiful. You know? But you asked, like, what’s the cause of this?

Speaker: 0
01:24:12

It’s just that if companies are headquartered in a location where the belief system is very far from what most people believe, then from their perspective, anything centrist is actually right wing, because they’re so far left. They’re so far from the center in San Francisco that they’re just railed to maximum left.

Speaker: 0
01:24:40

So that’s why, you know, I think you’re a centrist. I mean, I think I’m a centrist. But from the perspective of someone on the far left, we look right wing. Yeah. And, you know, they think anyone who’s a Republican is basically, like, some fascist situation.

Speaker: 1
01:25:02

But what’s so crazy is, like, it’s very easy to demonstrate just from, like, Hillary’s speeches from 2008 and Obama’s speeches, like, when they were talking about immigration. Like, they were as far right

Speaker: 0
01:25:14

Yeah.

Speaker: 1
01:25:14

As Steve Bannon when it comes to immigration.

Speaker: 0
01:25:17

Yes.

Speaker: 1
01:25:18

Hillary was, like, very MAGA. I’m sure you’ve seen that campaign speech where she was talking about, if anybody’s committed a crime, get rid of them. And if you’re here, you pay a hefty fine and you have to wait in line. It was really crazy. It’s crazy to listen to, because it’s as MAGA as, you know, as Marjorie Taylor Greene.

Speaker: 0
01:25:40

Yeah. I mean, have you seen these videos people post online where they take, like, a speech from Obama or Hillary, and they’ll interview people on, like, a college campus or something and say, what do you think of this speech by Trump? And they’re like, oh, I hate it. He’s a racist bigot. And then: just kidding. That was actually Obama or Hillary. To your point.

Speaker: 0
01:26:02

Like, literally, the

Speaker: 1
01:26:07

The center’s been moved so far. Yeah. Yeah. The left is

Speaker: 0
01:26:11

so The left has gone so far left that they can’t even see the center with a telescope.

Speaker: 1
01:26:17

And the danger, without you purchasing Twitter, was that it was gonna sweep over the whole country and change where the levels were.

Speaker: 0
01:26:26

Yeah.

Speaker: 1
01:26:27

And so what would be rational and normal would be far left of what was rational and normal just a decade earlier.

Speaker: 0
01:26:36

Yeah. Exactly. So historically, you’d have San Francisco, Berkeley being, you know, very far left, but the sort of fallout from the somewhat nihilistic philosophy of San Francisco, Berkeley would be limited in geography to maybe, like, you know, a 10 mile radius, 20 mile radius, something like that.

Speaker: 0
01:26:59

But San Francisco and Berkeley happen to be collocated with Silicon Valley, with engineers who created information superweapons. And those information superweapons were then hijacked by the far-left activists to pump far-left propaganda to everywhere on Earth.

Speaker: 0
01:27:19

Like, you remember that old RCA radio tower thing, where it’s, like, the radio tower on Earth, and it’s just broadcasting? Yeah. That’s what happened: an extremist far-left ideology happened to be collocated with the smartest engineers in the world, who created information superweapons that were not intended for this purpose, but were hijacked by the extreme activists who lived in the neighborhood.

Speaker: 0
01:27:49

That’s what happened. They hijacked the modern equivalent of the RCA radio tower and broadcast that philosophy everywhere on Earth.

Speaker: 1
01:28:01

Yeah. And you see the consequences, particularly in places that don’t have free speech.

Speaker: 0
01:28:05

Yes. Right?

Speaker: 1
01:28:06

Like England, you know, we’ve Yeah.

Speaker: 0
01:28:08

Where they lock people up for memes and stuff. Literally.

Speaker: 1
01:28:10

Literally. Yeah. 12,000 people this year.

Speaker: 0
01:28:13

12,012.

Speaker: 1
01:28:14

12,000 arrests for social media posts.

Speaker: 0
01:28:19

I mean, yeah. Some of these things, you read about it, and it’s, like, literally, someone had a meme on their phone that they didn’t even send to anyone. Right. And they’re, like, in prison for that. Yeah. Like, there was a case in Germany where a woman got a longer sentence than the guy that raped her, because of something she said on a group chat.

Speaker: 1
01:28:48

Wow. Was it an immigrant who raped her?

Speaker: 0
01:28:50

Yes.

Speaker: 1
01:28:51

Yeah. It was his culture. Yeah. He didn’t know. He didn’t know better.

Speaker: 0
01:28:56

Yes. I think she said something, you know, that was critical of his culture, and she got a longer sentence than the guy who raped her. Well, that was Germany. The UK,

Speaker: 1
01:29:10

Europe. And Germany. The England thing seems so insane.

Speaker: 0
01:29:14

It is. It’s totally insane. I actually didn’t realize it was, like, such a huge number of people that got

Speaker: 1
01:29:18

12,000. Yeah. Far above Russia. Far above China.

Speaker: 0
01:29:22

Right.

Speaker: 1
01:29:22

Far above anywhere on Earth. UK is number one.

Speaker: 0
01:29:25

Well, you know, I actually talked to friends of mine in England, and I was like, hey, are you worried about this? Like, you know, shouldn’t you be protesting more? And, I mean, the problem is that, like, you know, the legacy mainstream media doesn’t cover this stuff. They’re like, oh, everything’s fine. Everything’s fine.

Speaker: 0
01:29:50

You know?

Speaker: 1
01:29:51

Most people aren’t even aware of it until they come knocking on your door.

Speaker: 0
01:29:53

Yeah. Until, like, so I mean, these, like, lovely sort of small towns in, you know, England, Scotland, Ireland, you know, they’ve been, like, sort of living their lives quietly. They’re, like, hobbits, frankly. In fact, J. R. R.

Speaker: 0
01:30:12

Tolkien based the hobbits on people he knew in small-town England, because they were just, like, lovely people who like to, you know, smoke their pipe and have nice meals, and everything’s pleasant. The hobbits in the Shire. The Shire, he’s talking about, you know, places like Hertfordshire.

Speaker: 0
01:30:36

Like, the Shires around the Greater London area, Oxfordshire type of thing. But the reason they’ve been able to enjoy the Shire is because hard men have protected them from the dangers of the world. But since they have really almost no exposure to the dangers of the world, they don’t realize that they’re there.

Speaker: 0
01:31:05

Until one day, you know, a thousand people show up in your village of 500 out of nowhere and start raping the kids. This has now happened god knows how many times in Britain.

Speaker: 1
01:31:22

And the crazy Literally, it’s

Speaker: 0
01:31:23

because, like, some 10 year old got raped in Ireland, like, last week.

Speaker: 1
01:31:26

Yeah. There’s literal They

Speaker: 0
01:31:28

they snatched some kid. Yeah. Yeah.

Speaker: 1
01:31:31

And if you criticize it, you can get arrested. And that’s where it gets insane. It’s like, how are they not

Speaker: 0
01:31:36

protecting it? Like, I think it was the prime minister of Ireland, actually, you know, posted on X, because after that, I think some illegal migrant snatched a 10 year old girl who was, like, going to school or something, and raped her.

Speaker: 0
01:31:57

And, you know, people were very upset about this, and they protested. The prime minister of Ireland, instead of saying, yeah, we really shouldn’t be importing violent rapists into our country, criticized the protesters instead, and didn’t mention that the reason they were protesting was because a 10 year old girl from their small town got raped. So here’s the question.

Speaker: 0
01:32:21

Why

Speaker: 1
01:32:23

are they supporting this kind of mass immigration? And, like, is there a plan involved in all this? Is this incompetence? Is this ignoring the fact that they don’t have a handle on it, so they’re trying to silence dissent? Like, what is happening? Because if you wanted to destroy civilization, if you wanted to destroy western civilization

Speaker: 0
01:32:52

Which is what Soros seems to wanna do. And, you know, there’s a guy, I think, who, I don’t know if he’s been on your show. You know Gad Saad? Yeah. Has he been on the show?

Speaker: 1
01:33:08

Good friend

Speaker: 0
01:33:08

of mine.

Speaker: 1
01:33:08

Yeah. Yeah.

Speaker: 0
01:33:09

He’s great.

Speaker: 1
01:33:09

He’s been on multiple times.

Speaker: 0
01:33:10

Oh, great. He’s awesome. Yeah. So, you know, he’s got a good way to describe it, which is suicidal empathy. Yes. So it’s that you prey upon people’s empathy. So, like, you feel sorry for some group, and that empathy is to such a degree that it is suicidal to your country or culture.

Speaker: 0
01:33:37

And that’s suicidal empathy. Because I do think we should have empathy, but that empathy should extend to the victims, not just the criminals. We should have empathy for the people that they prey upon. But that same empathy is also responsible for, like, somebody is, you know, arrested 47 times for violent offenses, gets released, and then goes and murders somebody in the US.

Speaker: 0
01:34:10

You see that same phenomenon playing out everywhere, where the suicidal empathy is to such a degree that we’re actually allowing our women to get raped and our children to get killed.

Speaker: 1
01:34:25

But it just doesn’t seem like that would be anything that any rational society would go along with. That’s what makes me so confused. It’s like, you’re importing massive numbers of people that come from some really dark places of the world.

Speaker: 0
01:34:42

Well, the issue is there’s no vetting. It’s like Right. If there’s no vetting, like, people are just coming through. Like, well, what’s to stop someone who just committed murder in some other country from coming to the United States or coming to Britain, and just continuing their career of rape and murder?

Speaker: 0
01:35:03

Like, unless you’ve done some due diligence to say, well, who is this person? What’s their track record? If you haven’t confirmed that they have a track record of being, you know, honest and not being a homicidal maniac, then any homicidal maniac can just come across the border.

Speaker: 0
01:35:24

That’s not to say everyone who comes across the border is a homicidal maniac, but if you don’t have a vetting process to confirm that you’re not letting in people who will do some serious violence, you will get people who do serious violence sometimes coming through.

Speaker: 1
01:35:42

Well, especially if you don’t punish them and if you don’t deport them. But what is the purpose of allowing all those people into the country? I wouldn’t imagine that anyone in their society supports this.

Speaker: 0
01:35:54

Well, let me explain. Because you mentioned, for example, how much, say, Hillary and Obama have changed their tune from prior speeches, where they were hard-nosed about not letting anyone who is, like, a criminal into the country, you know, having secure borders, all that stuff.

Speaker: 0
01:36:16

So why did they change their tune? The reason is that they discovered that those people vote for them. That’s why they want the open borders.

Speaker: 1
01:36:26

Because if you let people in, they know the Democrats let them in. They’ll vote for Democrats Yes. If you allow them to vote.

Speaker: 0
01:36:33

Which they’re actively, like, they turn a blind eye to illegal voting. Well, California literally doesn’t allow you to show your license. California and New York have made it illegal to show your photo ID when voting. Thus, effectively, they’ve made it impossible to prove fraud. Impossible.

Speaker: 0
01:36:53

They’ve essentially legalized fraudulent voting in California and New York and many other parts of the country.

Speaker: 1
01:36:59

There’s no rational explanation that I’ve ever seen anyone give as to why that would be the policy, unless you were trying to just allow people to vote illegally, because there’s no other reason. If you need a driver’s license or you need an ID for everything else

Speaker: 0
01:37:13

Yes.

Speaker: 1
01:37:13

Including just recently to prove that you were vaccinated.

Speaker: 0
01:37:17

The same people who were demanding that you have a vaccine passport are the same ones saying you need no ID to vote. Same people.

Speaker: 1
01:37:29

Right.

Speaker: 0
01:37:29

So it’s obviously hypocritical and inconsistent.

Speaker: 1
01:37:32

So you really think it’s just to to get more voters?

Speaker: 0
01:37:37

If you wanna understand behavior, you have to look at the incentives. So once, you know, the Democratic Party in the US and the left in Europe realized that, if you have open borders, and you provide a ton of government handouts, which creates a massive financial incentive for people from other countries to come to your country, and you don’t prosecute them for crime, they’re gonna be beholden to you, and they will vote for you.

Speaker: 0
01:38:09

And that’s why Obama and Hillary went from being against open borders to being in favor of open borders. That’s the reason: in order to import voters so they can win elections. And the problem is that that has a negative runaway effect. So if they get away with it, like, it is a winning strategy.

Speaker: 0
01:38:38

If they’re allowed to get away with it, they will import enough voters to get a supermajority, and then there is no turning back.

Speaker: 1
01:38:47

We talked about this before the election. And then, you know, you literally pointed towards the camera. You faced the camera and said that if you do not vote now, you might not ever be able to do it again. Because it’ll be futile. It’ll be overrun. Yes. They’ll keep the borders open for another four years, and their objective will be achieved.

Speaker: 0
01:39:04

Correct. If Trump had lost, there would never have been another real election again. Now, Trump is actually enforcing the border. You can point to situations where, you know, immigration enforcement has been overzealous.

Speaker: 0
01:39:29

Because they’re not gonna be perfect. There’ll be cases where they’ve been overzealous in expelling illegals. But if you say that the standard must be perfection for expelling illegals, then you will not get any expulsion, because perfection is impossible.

Speaker: 1
01:39:49

And you’ve probably got millions of people that are here that are trying to be here under some asylum pretense. Right?

Speaker: 0
01:40:00

Yes.

Speaker: 1
01:40:00

Like, you could just come from a war

Speaker: 0
01:40:01

torn part of the world. No. They changed the definition of asylum to be economic Right. asylum.

Speaker: 1
01:40:08

Which is everybody.

Speaker: 0
01:40:09

Which is everybody. Yeah. So it’s the, like

Speaker: 1
01:40:12

is part of proof.

Speaker: 0
01:40:13

Yeah. Asylum is supposed to mean that if you go back to your country, you’ll get killed. You know, that’s what it’s supposed to mean. They changed the definition of asylum to be, you will have a decreased standard of living, which is obviously not real asylum.

Speaker: 0
01:40:30

And you can test the absurdity of this by the fact that people who are asylum seekers go on vacation to the country that they’re seeking asylum from. You know, that doesn’t make any sense.

Speaker: 1
01:40:44

Yeah. It doesn’t have to. With the

Speaker: 0
01:40:47

you know? But when you understand the incentives, then you understand the behavior. So the left realized that illegals will vote for them if they allow it, if they have open borders, and combine that with government handouts Yeah. To create a massive incentive.

Speaker: 0
01:41:09

They’re basically using US and European taxpayer dollars to provide a financial incentive to bring in as many illegals as possible to vote them into permanent power and create a one party state. And I’d invite anyone who’s listening to this to just do the research.

Speaker: 0
01:41:33

And the more you dig into it, the more it will become obvious that what I’m saying is absolutely true.

Speaker: 1
01:41:40

Well, they were busing people to swing states. It’s Yeah. It’s clear that they were trying to do something. And then you had Chuck Schumer and Nancy Pelosi, who were actively talking about the need to bring in people to make them citizens because we’re in population collapse.

Speaker: 0
01:41:54

Yes. Yeah. No. It’s that meme

Speaker: 1
01:41:57

Yeah.

Speaker: 0
01:41:58

Where, so many times, they start off by saying, it’s not true, it’s a right wing conspiracy theory. Right. Then, I think, the next step is, well, it might be true. And then it’s like, okay, it is true, but here’s why it’s true. And the next step is, yeah, it’s true, and here’s why it’s good. Yeah. And it’s like, but wait a second.

Speaker: 0
01:42:23

You started off saying it’s untrue and it’s a right wing conspiracy theory. Now you’re saying not only is it true, but it’s a good thing, and we must do more of it.

Speaker: 1
01:42:31

Well, this is the thing about Medicaid and Social Security and people getting Social Security numbers, you know. It’s massive fraud. It’s massive fraud, and it’s real. And they denied it forever. And now we’re finding out this is part of the reason for this government shutdown that’s going on right now.

Speaker: 0
01:42:48

Yes. The entire basis for the government shutdown is that the Trump administration correctly does not want to send massive amounts of, like, hundreds of billions of dollars to fund illegal immigrants in the blue states, or in all the states, really. And Democrats want to keep the money spigot going to incent illegal immigrants to come into the US who will vote for them.

Speaker: 0
01:43:18

That’s the crux of the battle.

Speaker: 1
01:43:22

So they wanna stop this. So what’s going on right now is they have been funding these people. They’ve been giving them EBT cards. They’ve been giving them Medicaid. And they’ve been even housing

Speaker: 0
01:43:34

than that. Like, they were taking hotels, like, four and five star hotels. The Roosevelt Hotel being the classic example. They were sending, I think, $60,000,000 a year to the Roosevelt Hotel, which all it did was house illegals.

Speaker: 0
01:43:52

It used to be a nice hotel. I mean, it still is a nice hotel. But all around the country, this was happening.

Speaker: 1
01:44:02

And all on taxpayer dollars?

Speaker: 0
01:44:03

Yes. Yeah. And the Trump administration cut off funding, for example, to the, you know, Roosevelt Hotel and these other hotels, saying, like, US tax dollars should not be sent to have luxury hotels for illegal immigrants that American citizens can’t even afford. Which, obviously, that’s insane.

Speaker: 0
01:44:29

That’s what was happening. They were also giving out, like, debit cards with $10,000. So it’s not just about medical care. Yeah. The Democrats mention medical care because they’re trying to prey on people’s empathy as much as possible. And then you imagine, oh, wow.

Speaker: 0
01:44:46

Somebody has a desperately needed medical procedure, and shouldn’t we maybe, you know, take care of them in that regard? But what they do is they divert the Medicaid funds and turn it into a slush fund for the states that goes well beyond emergency medical care.

Speaker: 0
01:45:04

And New York and California would be bankrupt without the massive fraudulent federal payments that go to those states to pay for illegals, to create a massive financial incentive for illegals.

Speaker: 1
01:45:18

How would they be bankrupt because of that?

Speaker: 0
01:45:20

They wouldn’t be able to balance the state budgets, and they can’t issue currency like the Federal Reserve can.

Speaker: 1
01:45:26

And so their ability to balance the budget is dependent upon illegals getting funding?

Speaker: 0
01:45:33

The scam level here is so staggering. So there are hundreds of billions of dollars in transfer payments from the federal government to the states. And the states self report what those transfer payment numbers should be.

Speaker: 0
01:45:58

So California and New York and Illinois lie like crazy, and say that these are all legitimate payments. Well, these days, I think they’re even admitting that they literally want hundreds of billions of dollars for illegals. But for a while, they were trying to deny it. So you get these transfer payments for every government program you can possibly think of.

Speaker: 0
01:46:23

And these are self reported by the states, and, at least historically, there was no enforcement against California, New York, Illinois, and other states when they would lie. There was no actual enforcement to say, like, hey, you’re lying. These payments are fraudulent.

Speaker: 0
01:46:42

Now, the Trump administration does not want to send hundreds of billions of dollars of fraudulent payments to the states. And the reason you have this standoff is because if the hundreds of billions of dollars that create a financial incentive, like this giant magnet to attract illegals from every part of Earth to these states, if that is turned off, the illegals will leave, because they’re no longer being paid to, you know, come to the United States and stay here.

Speaker: 0
01:47:18

Wow. And then they will lose a lot of voters. The Democratic Party will lose a lot of voters.

Speaker: 1
01:47:25

And they would have a very difficult job, if this is kicked out, of reintroducing it in a new bill Yes. Especially once things start normalizing.

Speaker: 0
01:47:35

Yes. So, like, in a nutshell, the Democratic Party wants to destroy democracy by importing voters. And, you know, the Republican Party disagrees with that.

Speaker: 1
01:47:48

And the ruse is that if you don’t accept what they’re doing, then you’re a threat to democracy.

Speaker: 0
01:47:53

Yes.

Speaker: 1
01:47:54

As they try to destroy democracy

Speaker: 0
01:47:57

Yes.

Speaker: 1
01:47:57

By importing voters.

Speaker: 0
01:47:58

That is literally what they’re doing.

Speaker: 1
01:48:00

People to only vote for them and overwhelming the system.

Speaker: 0
01:48:05

Yes. And, by the way, it’s a strategy that, if allowed to work, would work, and in fact has worked. California is supermajority Democrat. Yeah. And there’s so much gerrymandering that occurs. It’s crazy. So Yeah.

Speaker: 1
01:48:21

I’m sure you’re paying attention to this Proposition 50 thing.

Speaker: 0
01:48:24

No. That’s the

Speaker: 1
01:48:25

thing in California where they’re trying to redo

Speaker: 0
01:48:29

districts. Yeah.

Speaker: 1
01:48:29

Yeah. Yeah.

Speaker: 0
01:48:30

Because, I mean, California’s already gerrymandered like crazy.

Speaker: 1
01:48:34

Yeah.

Speaker: 0
01:48:34

They wanna gerrymander it even more. And and I mean Because

Speaker: 1
01:48:38

it keeps moving further and further right. Like, if you look at the map of California, each voting cycle, more and more people are waking up and going, what the fuck? And Yes. We need to do something to fix this. The only option available, other than the policies that you guys have always done Yeah. is to go right.

Speaker: 1
01:48:54

And so a lot of people have been, air quotes, red-pilled. Mhmm. Yeah.

Speaker: 0
01:48:59

And then here’s another thing that is very important, a fact that is actually not disputed by either side, which is that when we do the census in the United States, the way the census works, for apportionment of congressional seats and electoral college votes for the president, is by number of persons in a state, not number of citizens.

Speaker: 1
01:49:21

Right.

Speaker: 0
01:49:22

It’s number of people. So you could literally be a tourist, and you will count.

Speaker: 1
01:49:27

Now, how do they do the census when they do that? Do they ask people? Do they knock on doors? Do they have them fill out forms? Like, what?

Speaker: 0
01:49:36

Yeah. I think they mail out census forms and knock on doors. But the way the law reads right now is that if you are a human with a pulse, then you count in the census for allocating congressional seats and presidential votes. Right. So, electoral college, everything. It doesn’t matter whether you’re here legally, illegally.

Speaker: 0
01:50:06

And if you’re a human with a pulse, you count for congressional apportionment. So that means the more illegals that California and New York can import by the time the census happens in 2030, the more congressional seats they will have, and the more presidential electoral college votes they will have.

Speaker: 0
01:50:30

So they’re trying to get as many illegals in as possible ahead of the census, because all human beings, even tourists, count for the census. And then you combine that with gerrymandering of districts in New York and California, as you point out with this proposition, where they’re trying to increase the amount of gerrymandering that occurs in California, the biggest state in the country.

Speaker: 0
01:50:58

So the census would then award more congressional seats to California, New York, and Illinois because of the vast number of illegals. They’ll get more congressional seats, they’ll get more presidential electoral college votes, and that would get them the majority in the House, and they would get to decide who is president, literally based on illegals.
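[Editor's note: the apportionment mechanism described here can be made concrete with a toy sketch. The numbers below are made up, not real census figures, and the three states are hypothetical; the method itself is the Huntington-Hill "equal proportions" rule actually used for the US House, where every state gets one seat and each remaining seat goes to the state with the highest priority value, population divided by the square root of n(n+1) for its current seat count n.]

```python
import heapq
import math

def apportion(populations, seats):
    """Huntington-Hill apportionment: every state starts with one seat,
    then each remaining seat goes to the state with the highest priority
    value pop / sqrt(n * (n + 1)), where n is its current seat count."""
    alloc = {state: 1 for state in populations}
    # Max-heap via negated priorities; priority for a state's 2nd seat
    # uses divisor sqrt(1 * 2).
    heap = [(-pop / math.sqrt(2), state) for state, pop in populations.items()]
    heapq.heapify(heap)
    for _ in range(seats - len(populations)):
        _, state = heapq.heappop(heap)
        alloc[state] += 1
        n = alloc[state]
        heapq.heappush(heap, (-populations[state] / math.sqrt(n * (n + 1)), state))
    return alloc

# Hypothetical three-state country with 25 House seats. State B has a
# large non-citizen population, so its two counts differ the most.
total_persons = {"A": 10_000_000, "B": 10_000_000, "C": 5_000_000}
citizens_only = {"A": 9_500_000, "B": 7_000_000, "C": 4_800_000}

print(apportion(total_persons, 25))  # counting everyone: B gets 10 seats
print(apportion(citizens_only, 25))  # citizens only: B gets 8 seats
```

Same states and same law; only the population basis changes, and the hypothetical state B swings two House seats, and with them two electoral college votes.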

Speaker: 0
01:51:27

These are not disputed facts by either party. I wanna emphasize that.

Speaker: 1
01:51:35

This is not a dispute.

Speaker: 0
01:51:36

Not disputed facts by either party. This is just the way the law works. You know, like, I don’t think the law should work that way. I think the apportionment should be proportionate to citizens.

Speaker: 1
01:51:54

But isn’t that a problem with how the constitution

Speaker: 0
01:51:56

is written? Yeah. Yeah.

Speaker: 1
01:51:58

They can’t really change that.

Speaker: 0
01:52:01

Well, I’m not sure if it’s constitutional, but it is the way the law is written. I’m not sure if it’s in the constitution or not in this case, but that is the way the law is written.

Speaker: 1
01:52:11

So it is an incentive, but it’s an incentive that would be removed with something simple that makes sense to everybody: that the only people that should count are official US citizens.

Speaker: 0
01:52:23

Yeah. So the way it should work is that only US citizens should count in the census for purposes of determining voting power.

Speaker: 1
01:52:30

Because people that aren’t legal can’t vote, supposedly.

Speaker: 0
01:52:34

They’re not supposed to be voting, but they do. But even besides that, like I said, I just can’t emphasize this enough, because this is a very important concept for people to understand: the law as it stands counts all humans with a pulse in a state for deciding how many House of Representatives votes and how many presidential electoral college votes a state gets.

Speaker: 0
01:53:04

So the incentive, therefore, is for California, New York, Illinois to maximize the number of illegals they get, so that they take House seats away from red states and assign them to California, New York, Illinois, and so forth. Then you combine that with extreme gerrymandering in California, New York, Illinois, and whatnot, so that basically you can’t even elect any Republicans.

Speaker: 0
01:53:30

And then they get control of the presidency, control of the House, then they keep doing that strategy and cement a supermajority. That is what they’re trying to do.

Speaker: 1
01:53:42

So that would essentially turn the entire country into California. Yes. Where you have differing opinions, but it doesn’t matter, because one party is always in control.

Speaker: 0
01:53:50

Yes.

Speaker: 1
01:53:53

When you first started digging into this, before you even accepted this role of running DOGE and being a part of all that, did you have any idea that it was this fucked up?

Speaker: 0
01:54:05

I I did. Yeah. I I mean, I sort of

Speaker: 1
01:54:08

When did you start knowing?

Speaker: 0
01:54:10

I guess about, like well, about two years ago.

Speaker: 1
01:54:13

Isn’t that crazy?

Speaker: 0
01:54:14

Yeah. No, relatively recently. You know? Yeah. So maybe, I started having, well, basically, a bad feeling about three years ago, which is when and why I felt it was, like, critical to acquire Twitter and have a maximally truth seeking platform, not one that suppresses the truth.

Speaker: 0
01:54:33

And, like, it was more like, I’m not sure what’s going on, but I have a bad feeling about what’s going on. And then the more I dug into it, the more I was like, holy shit, we’ve got a real problem here, and America’s gonna fall.

Speaker: 1
01:54:49

Without anyone knowing it had fallen. That would be the problem. It could have fallen and been irreparable without anyone really being aware

Speaker: 0
01:54:58

Yeah.

Speaker: 1
01:54:59

Of what had happened, especially if you didn’t buy Twitter.

Speaker: 0
01:55:02

Yes. That’s right. Buying Twitter was a huge pain in the ass, and made me a pincushion of attacks. Like, stab, stab, stab, stab, stab.

Speaker: 1
01:55:12

Everybody loved you before that. Well, some people. A lot of people loved you. A lot of lefties loved you.

Speaker: 0
01:55:19

I was a hero of the left.

Speaker: 1
01:55:20

It’s part of the thing. If you drove a Tesla, it showed that you were environmentally conscious Yes. And you were on the right side.

Speaker: 0
01:55:27

Yeah. Yeah. I mean, I’m still the same human. I didn’t, like, have a brain transplant, you know, in the last three years. You know?

Speaker: 1
01:55:38

Well, that’s my favorite bumper sticker that people put on Teslas now. I bought this before Elon went crazy. I took a picture of one the other day.

Speaker: 0
01:55:45

Oh, you found this? Oh,

Speaker: 1
01:55:46

yeah. I’ve seen three or four of them. People that have these bumper stickers on their car that say I bought this before Elon went crazy. Because when people were, yeah, burning Teslas.

Speaker: 0
01:55:57

Yeah. There was an organized campaign to literally burn down Teslas, and one of our dealers just got shot up with a gun. Like, they fired bullets into the Tesla dealership. They were burning down cars. It was crazy. So there should be an addendum to the bumper sticker.

Speaker: 0
01:56:21

It’s like, I bought this car before Elon turned crazy. Actually, now I realize he’s not crazy, and I’ve seen the light.

Speaker: 1
01:56:32

That’ll take some time. That’ll take some time. People don’t wanna admit that they’ve been tricked.

Speaker: 0
01:56:37

Yeah. I mean, there’s an old saying: it’s really easy to fool somebody, but it’s almost impossible to convince someone that they were fooled.

Speaker: 1
01:56:44

Yeah. It’s much easier to fool them than to convince them they’ve been fooled. People cling to their ideas.

Speaker: 0
01:56:51

Yes.

Speaker: 1
01:56:51

Especially if they’ve, like, publicly stated these things. They get very embarrassed at being foolish.

Speaker: 0
01:56:57

Yeah. People, most of the time, they double down. Uh-huh. And,

Speaker: 1
01:57:01

And they find echo chambers.

Speaker: 0
01:57:03

Yeah. But, you know, the thing is, I’ve seen more and more people who were convinced of the sort of woke ideology see the light. Yeah. Not everyone, but more and more are seeing the light. And it tends to happen when something happens that really, you know, directly affects you. Right.

Speaker: 0
01:57:27

Like, there was a friend of mine who was living in the San Francisco Bay Area, and they tried to trans his daughter, to the point where the school, like, sent the police to his house to take his daughter away from him. Now, that’s gonna radicalize you. Well, that’s gonna shake you out of your blue structure.

Speaker: 0
01:57:53

Now I know.

Speaker: 1
01:57:56

So it was an activist at the school that was trying to do this?

Speaker: 0
01:57:59

Yeah. The school and the state of California conspired to turn his daughter against him and make her take life-altering drugs that would have sterilized her, irreversibly.

Speaker: 1
01:58:14

And how old was she?

Speaker: 0
01:58:15

I think 14. Something like that. And he managed to talk the police out of taking his daughter away from him that day. And that night, he got on a plane to Texas. Wow. And, you know, after a year of just being in a school in, like, the Greater Austin area, she went back to normal.

Speaker: 0
01:58:43

Meaning, like, it wasn’t real.

Speaker: 1
01:58:45

Right. Well, people are being much more open to that now. I mean, the Wall Street Journal yesterday had that opinion piece that this whole trans thing, there’s a lot of evidence, is a social contagion.

Speaker: 0
01:58:59

Absolutely.

Speaker: 1
01:58:59

And Colin Wright wrote that, and he’s getting death threats now, of course. And on Bluesky, there’s people talking about, yeah, exterminating him, which is one thing that you are allowed to say on Bluesky, apparently. You’re allowed to say horrible things about people who say possibly truthful things about this whole social contagion.

Speaker: 1
01:59:18

Because that’s what happens when you get nine kids that are in a friend group and they all decide to turn trans together

Speaker: 0
01:59:24

Yeah.

Speaker: 1
01:59:24

Something’s wrong.

Speaker: 0
01:59:25

Something’s wrong.

Speaker: 1
01:59:25

That’s not statistically Yeah. Like, here’s the

Speaker: 0
01:59:29

Like, you can convince kids to do anything. You can convince kids to be a suicide bomber.

Speaker: 1
01:59:33

Right. Which is why, in some countries, they choose children to do that.

Speaker: 0
01:59:38

Yes. Yeah. You can train kids to be suicide bombers. And if you can train kids to be suicide bombers, you can convince them of anything.

Speaker: 1
01:59:45

Yeah. Especially with enough positive reinforcement, yeah, and cultural reinforcement. And the idea that that’s not the case.

Speaker: 0
01:59:52

Kids are malleable. The minds of youth are easily corrupted.

Speaker: 1
01:59:57

You’re also seeing a lot of pushback from gay and lesbian people that are saying, like, hey, if someone did this

Speaker: 0
02:00:02

Including me. So, yeah. Exactly. The LGBTQ, you know, it’s like, wait a second. Why are we being included all the time in this situation?

Speaker: 1
02:00:10

Exactly. Exactly. Especially when, well, you know, my friend Tim Dillon’s talked about this. It’s like, it’s really homophobic because you’re taking these gay kids and you’re telling them, like, hey, you’re not gay. You’re actually a girl.

Speaker: 0
02:00:23

Yes.

Speaker: 1
02:00:23

And, you know, hey, go make it so that you can never have an orgasm again and be happy. Like

Speaker: 0
02:00:29

Yeah. It’s crazy. Permanent castration of kids is, like, I think we should really look at anyone who permanently castrates a kid as, like, right up there with Josef Mengele. Yeah. I mean, they’re mutilating children.

Speaker: 1
02:00:48

Yeah. And it’s thought of as being kind. And the thing is, would you rather have a live daughter or a dead son?

Speaker: 0
02:00:57

That’s the

Speaker: 1
02:00:59

line they use. Yeah. Which is not supported by any data.

Speaker: 0
02:01:01

No. It’s all bullshit. The probability of suicide increases. Right. This is important for the audience to know. The probability of suicide increases if you trans a kid, not decreases. By some accounts, it triples. So that is an evil lie.

Speaker: 1
02:01:18

And it’s a lie that is supposedly compassionate. Imagine you’ve twisted reality to the point where confusing a child that’s not even legally allowed to get a fucking tattoo. Yeah. Right? Because you think that you could make a mistake with a tattoo. Exactly. A totally removable thing.

Speaker: 0
02:01:36

Right.

Speaker: 1
02:01:36

If I wanted to, tomorrow I could go to a doctor and they could laser off every tattoo that I have on me.

Speaker: 0
02:01:41

Right.

Speaker: 1
02:01:42

Okay. No harm, no foul.

Speaker: 0
02:01:43

Yeah.

Speaker: 1
02:01:43

But you do this, like, that’s it forever. Forever.

Speaker: 0
02:01:47

Yes.

Speaker: 1
02:01:48

They’ll castrate you. You no longer have testicles.

Speaker: 0
02:01:51

Yes. That’s bad.

Speaker: 1
02:01:52

You have a hole where your penis used to be. Yes. And this is compassionate.

Speaker: 0
02:01:57

And actually, a lot of kids die in these sex change operations. They die. The number of deaths on the operating table, people don’t hear about this. A lot of kids, because we don’t really actually have the technology to make this work. So a bunch of the times, the kids just die in the sex change operations.

Speaker: 1
02:02:18

Jesus Christ.

Speaker: 0
02:02:19

Yeah. It’s demented. It should be viewed as, like, you know, evil Nazi doctor stuff, basically.

Speaker: 1
02:02:28

That’s why it’s so

Speaker: 0
02:02:29

Like real Nazi, not the bullshit fake Nazi stuff.

Speaker: 1
02:02:31

Crazy that even pushing back against something that seems, fundamentally, logically, very easy to argue, the old Twitter would ban you forever.

Speaker: 0
02:02:42

Yes.

Speaker: 1
02:02:45

That’s how crazy a social contagion can be. When it completely defies logic, victimizes children, Yeah. does something that makes no sense, is not supported by data, all connected to this ideology that trans is good. We gotta save trans kids, protect trans kids.

Speaker: 0
02:03:02

Yeah. And what I wanna emphasize is that the save trans kids thing is a lie. If you castrate kids and trans them, their probability of suicide increases. It does not decrease. It substantially increases. In the studies that I’ve seen, the risk of suicide triples if you trans kids. So you’re not saving them. You’re killing them.

Speaker: 0
02:03:29

Moreover, there are many deaths that occur during the sex change operation itself. Jesus Christ.

Speaker: 1
02:03:41

It’s just crazy that this is a real issue.

Speaker: 0
02:03:43

Yeah. It’s a nightmare fever dream, and people are finally waking up from it.

Speaker: 1
02:03:50

Now, when you started getting into the Doge stuff and started finding how much money is being shuffled around and moved to NGOs, how much money is involved, and just totally untraceable funds. Like, this is, again, something that, two-plus years ago, you weren’t aware of at all?

Speaker: 0
02:04:15

No, I was aware of it. I just didn’t realize how big it was. Just how much waste and fraud there is in the government is truly vast. In fact, the government didn’t even know, nor did they care.

Speaker: 1
02:04:33

That’s crazy.

Speaker: 0
02:04:34

Yeah.

Speaker: 1
02:04:36

And I mean, just, like, some of

Speaker: 0
02:04:38

the very basic stuff that Doge did will have lasting effects, and some of these things, like, they’re so dumb, you can’t believe it. So the Doge team got most of the main payments computers to require the congressional appropriation code.

Speaker: 0
02:05:01

So when a payment is made, you have to actually enter the congressional appropriation code. That used to be optional and often would just be left blank. So the money would just go out, but it wasn’t even tied to a congressional appropriation. Then the team also made the comment field for the payment mandatory, so you have to say something.

Speaker: 0
02:05:20

We’re not validating what is said. Like, you can say anything. Your cat could run across the keyboard. You could go QWERTY ASDF, but you have to say something rather than nothing, because what we found was that there were tens of billions, maybe hundreds of billions of dollars that were zombie payments.
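The two rules described here, a required appropriation code and a required non-empty comment, amount to a simple record check. Below is a minimal illustrative sketch in Python; the Payment record and field names are invented for illustration, not the actual federal payment system:

```python
from dataclasses import dataclass

@dataclass
class Payment:
    amount: float
    appropriation_code: str  # which congressional appropriation funds this payment
    comment: str             # free-text justification (content is not validated)

def validate(p: Payment) -> list[str]:
    """Return the list of problems; an empty list means the payment may go out."""
    problems = []
    # Previously optional: a blank code meant money went out untied to any appropriation.
    if not p.appropriation_code.strip():
        problems.append("missing congressional appropriation code")
    # The comment can say anything ("QWERTY ASDF"), it just cannot be blank.
    if not p.comment.strip():
        problems.append("missing comment")
    return problems

print(validate(Payment(5000.0, "", "")))  # both fields blank, so both problems print
```

The point is only that "mandatory but unvalidated" is still a meaningful control: it forces every payment to leave a trail.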

Speaker: 0
02:05:36

So, like, someone in the government approved a payment, some recurring payment. And they retired or died or changed jobs, and no one turned the money off. So the money would just keep going out. And it’s a pretty rare To go where? To a company or an individual.

Speaker: 0
02:06:02

And it’s a pretty rare company or individual who will complain that they’re getting money that they should not get. And and a bunch of the money was just going to this were were transferred payments to the states.

Speaker: 1
02:06:13

So these are automatic payments. Yeah.

Speaker: 0
02:06:14

Just automatic payments.

Speaker: 1
02:06:15

No accounting for them at all.

Speaker: 0
02:06:16

Imagine, like, there’s an automatic debit of your credit card, and you never look at the statement.

Speaker: 1
02:06:22

Right.

Speaker: 0
02:06:23

So it’s just money going out. Like I said, I call them zombie payments. They might have been legitimate at one point, but the person who approved that recurring payment changed jobs, died, retired, or whatever, and no one ever turned the money off.

Speaker: 0
02:06:41

And my guess is that’s probably at least $100,000,000,000 a year, maybe 200.
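The zombie-payment pattern described above, a recurring payment whose approver has since left, is easy to express as a query once the approver is recorded. A hypothetical sketch (the records and names here are invented for illustration):

```python
# Hypothetical data: each recurring payment records who originally approved it.
recurring_payments = [
    {"payee": "Vendor A", "amount": 120_000, "approved_by": "alice"},
    {"payee": "Vendor B", "amount": 75_000, "approved_by": "bob"},
]
active_employees = {"alice"}  # bob retired, and nobody turned his approvals off

def zombie_payments(payments, active):
    # A payment is a "zombie" if its approver is no longer an active employee.
    return [p for p in payments if p["approved_by"] not in active]

for p in zombie_payments(recurring_payments, active_employees):
    print(f'{p["payee"]}: ${p["amount"]:,} approved by departed employee {p["approved_by"]}')
```

A periodic sweep like this would surface every recurring payment for re-approval when its owner leaves, which is the control the transcript says was missing.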

Speaker: 1
02:06:51

And going where?

Speaker: 0
02:06:53

To, I mean, there are millions of these payments. So it’s

Speaker: 1
02:07:01

I mean Millions.

Speaker: 0
02:07:03

Yes. Yes.

Speaker: 1
02:07:04

Millions of payments that are going to who knows where.

Speaker: 0
02:07:07

Yes. So in a bunch of cases, there are professional fraud rings that operate to exploit the system. They figure out some security hole in the system, and they just do professional fraud. And, you know, that’s where we found, for example, people who were, you know, 300 years old in the Social Security Administration database.

Speaker: 1
02:07:33

Now, I thought that this was a mistake of not registering their deaths. The people were born, like, a long time ago, and it had defaulted to, like, a certain number. And so, after time, those people were still in the system. It was just an error in the way the accounting was done.

Speaker: 0
02:07:54

Yeah. So that’s not true. Or at least one of two things must be true: there’s a typo or some mistake in the computer, or it’s fraudulent. But we don’t have any 300-year-old vampires living in America. Allegedly. Allegedly. And we don’t have people, in some cases, who are receiving payments who were born in the future. Born in the future? Born in the future. Really?

Speaker: 0
02:08:25

Yes. There were people receiving payments whose birth date was, like, in 2100-something.

Speaker: 1
02:08:34

Okay. So there’s Like,

Speaker: 0
02:08:35

next century.

Speaker: 1
02:08:36

Is there a task force?

Speaker: 0
02:08:37

We know that one of two things must be true: either there’s a mistake in the computer, or it’s fraud. But you have someone’s birth date that’s either in the future or where they’d be older than the oldest living American, because the oldest living American is 114 years old.

Speaker: 0
02:08:53

So if they’re more than 114 years old, there is either a mistake, and someone should call them and say, I think we have your birthday wrong, because it says you were born in, you know, 1786. And, you know, that was before there was really an America. You know, it was, like, really early.

Speaker: 0
02:09:20

You know, we’re-still-fighting-England type of thing. You know? It’s like, this person either needs to be in the Guinness Book of World Records, or they’re not alive.
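The sanity check being described, flag any birth date that implies an age above the oldest living American or a birth in the future, is a one-function rule. An illustrative sketch; the 114-year cutoff is the figure cited in the conversation, and the function is hypothetical, not the SSA's actual logic:

```python
from datetime import date

MAX_PLAUSIBLE_AGE = 114  # oldest living American, per the conversation

def birth_date_status(birth: date, today: date = date(2025, 11, 1)) -> str:
    """Classify a birth date as plausible or impossible (a typo or fraud)."""
    if birth > today:
        return "impossible: born in the future"
    age = (today - birth).days // 365  # rough age in whole years
    if age > MAX_PLAUSIBLE_AGE:
        return f"impossible: {age} years old"
    return "plausible"

print(birth_date_status(date(2100, 1, 1)))   # a next-century birth date
print(birth_date_status(date(1786, 7, 4)))   # older than any living American
print(birth_date_status(date(1950, 5, 17)))  # plausible
```

Any record flagged "impossible" would then trigger exactly the follow-up described: contact the person, or mark the record dead.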

Speaker: 1
02:09:32

But still, at the end of the day, money is going towards that account that’s connected to this person that is either nonexistent

Speaker: 0
02:09:39

Yeah. So there are, I think, something like, I don’t know, 20,000,000 people in the Social Security Administration database that could not possibly be alive. Based on their birth date, they could not possibly be alive.

Speaker: 1
02:09:58

And then to be clear, 20,000,000 people that were receiving funds? A bunch

Speaker: 0
02:10:05

of them, most of them, were not receiving funds. Some of them were receiving funds. Most were not. But let me tell you how the scam works. It’s a bank shot. So the Social Security Administration database is used as the source of truth by all the other databases that the government uses.

Speaker: 0
02:10:21

So even if they stop the payments from the Social Security Administration itself, the other systems, like unemployment insurance, Small Business Administration, student loans, all check the Social Security Administration database to say, is this a legitimate, alive person? And the Social Security database will say, yes, this person is still alive, even though they’re 200 years old. But it fails to mention that they’re 200 years old.

Speaker: 0
02:10:46

When the computer is queried, it just returns, yes, this person is alive. And so then they’re able to exploit the entire rest of the government ecosystem. Then you get fake student loans, fake unemployment insurance, and, like, fake medical payments.

Speaker: 1
02:11:04

And this doesn’t have to be tied to an individual where there’s an address where you can check on this person? No.

Speaker: 0
02:11:11

If you just did any check at all, you would stop this.

Speaker: 1
02:11:20

And how much money do you think is Any

Speaker: 0
02:11:21

check, like, anything at all would stop the fraud. Like, any effort at all. Yeah. So there’s

Speaker: 1
02:11:29

multiple layers. Yeah. The Social Security number verifies that this is a real person. Right. And then the other systems check up on

Speaker: 0
02:11:36

every other government payment system, for everything, like Small Business Administration, student loans, Medicaid, Medicare, every other government payment, of which there are many. There are actually hundreds of government payment systems, and they can all be exploited so long as the Social Security database says this person is alive.

Speaker: 0
02:11:57

That’s the nature of the scam. It’s a bank shot. So then the rebuttal from the Dems is like, oh, well, the vast majority of the people who are marked as alive in the Social Security Administration database weren’t receiving Social Security payments. That is true. What they forgot to mention is they’re getting fraudulent payments from every other government program.

Speaker: 0
02:12:14

And that’s why the Dems were so opposed to declaring someone dead who was dead, because it would stop all the other fraud from happening.

Speaker: 1
02:12:27

But all this, is it trackable? Like, all this other fraud, if they wanted to, they could chase it all down.

Speaker: 0
02:12:33

Yeah. It’s not even hard.

Speaker: 1
02:12:35

And yet they’re opposing chasing it all down.

Speaker: 0
02:12:38

They’re opposing chasing it all down because it turns off the money magnet for the illegals. Wow. Because it’s very logical. Like, I’m saying the most common sense things possible. If someone’s got a birthday in Social Security that is an impossible birthday, meaning they are older than the oldest living American or were born in the future, then you should call them and say, excuse me.

Speaker: 0
02:13:10

We seem to have your birthday wrong, because it says that you’re 200 years old. That’s all you need to do. And

Speaker: 1
02:13:19

And then you would remove them from the Social Security database and make that number no longer available for all those other government payments.

Speaker: 0
02:13:24

Exactly.

Speaker: 1
02:13:26

Wow. And how much money are we talking?

Speaker: 0
02:13:30

It’s hundreds of billions of dollars.

Speaker: 1
02:13:34

And this is all traceable. Like, you could hunt dollars down.

Speaker: 0
02:13:37

Like, you don’t need to be Sherlock Holmes here is what I’m saying.

Speaker: 1
02:13:40

Well, this is We don’t

Speaker: 0
02:13:41

need to call Sherlock Holmes for this one.

Speaker: 1
02:13:43

Is this part We just need

Speaker: 0
02:13:44

to call the person and say, excuse me, we must have your birthday wrong, because it says you’re 200 years old or were born in the future. So could you tell us what your birthday is? That’s what we need to do. It’s that simple.

Speaker: 1
02:14:02

But all these other government payments that are available, that are connected to the Social Security number, it seems like if you just chased that all down Yeah. You would find the widespread fraud. You would find where it’s going.

Speaker: 0
02:14:18

Yes. But the root of the problem is the Social Security Right. Administration database, because the Social Security number in the United States is used as a de facto national ID number. You know, it’s like, the bank always asks for your social. Like, any financial institution will ask for your Social Security number.

Speaker: 1
02:14:44

It sounds so insane that this isn’t chased down.

Speaker: 0
02:14:47

I mean, like Yeah.

Speaker: 1
02:14:48

I agree. I mean, that in and of itself is such mishandling.

Speaker: 0
02:14:54

Yes. That is, like, mind-blowing. So, yeah. It’s crazy.

Speaker: 1
02:15:02

Well, you were very reluctant last time you were here to talk about the extent of some of the fraud because you’re like, they could kill me because this is kind of

Speaker: 0
02:15:12

Oh, yeah. What I’m saying is that, like, to be pragmatic and realistic, you actually can’t manage to zero fraud. You can manage to a low fraud number, but not to zero fraud. If you manage to zero fraud, you’re gonna push so many people over the edge who are receiving fraudulent payments that the number of inbound homicidal maniacs will be really hard to overcome.

Speaker: 0
02:15:44

So I’m actually taking, I think, quite a reasonable position, which is that we should simply reduce the amount of fraud, which I think is not an extremist position. And we should aspire to, you know, have less fraud over time. Not that we should be ultra-draconian and eliminate every last scrap of fraud, which, I guess, would be nice to have.

Speaker: 0
02:16:10

But, like, we don’t even need to go that extreme. I’m saying we should just stop the blatant, large-scale, super obvious fraud. I think that’s a reasonable position.

Speaker: 1
02:16:19

It’s a very reasonable position. Yeah. And so what was the most shocking pushback you got when you started implementing Doge, when you started investigating where money was going?

Speaker: 0
02:16:33

Well, I guess I should have anticipated this, but while most of the fraudulent government payments, especially to the NGOs, go to the Democrats, most of it, like, I don’t know, for argument’s sake, let’s say 80%, maybe 90%, 10 to 20% of it does go to Republicans. And so when we turned off funding to a fraudulent NGO, we’d get complaints from, whatever, the 10% of Republicans who were receiving the money. And they would very loudly complain, because the honest answer is the Republicans are partly receiving some of the fraud too.

Speaker: 0
02:17:22

They’re getting a vig. Jesus. Yeah. So I wanna be clear. It’s not like the Republican Party is some ultra-pure paragon of virtue here. No. Okay.

Speaker: 1
02:17:37

Well, you see that with the congressional insider trading. It’s across the board. Yeah. It’s left and right.

Speaker: 0
02:17:43

I mean, the whole uniparty criticism has some validity to it. You know? And it’s like, if you turn off fraudulent payments, like I said, it’s not like 100% of those payments are going to Democrats. A small percentage is also going to Republicans. Those Republicans complain very loudly.

Speaker: 0
02:18:03

And, you know, so there was a lot of pushback on the Republican side when we started cutting some of these funds. And I tried telling them, like, well, you know, 90% of the money is going to your opponents. But still, even if they’re getting 10% of the money, they want their piece. Yeah. They want their piece.

Speaker: 1
02:18:28

And they’ve been getting that piece for a long time.

Speaker: 0
02:18:30

Yes.

Speaker: 1
02:18:35

Did you say

Speaker: 0
02:18:35

This is why, like, you know, politics is like It’s dirty business. Yeah. I mean, it’s like the saying: if you like sausages and respect the law, do not watch either of them being made. Yeah.

Speaker: 1
02:18:50

Yeah. Wow. Well, that’s not even true because I’ve made sausage before.

Speaker: 0
02:18:57

Yeah. Yeah. It’s actually Yeah.

Speaker: 1
02:18:58

It’s not that big a deal. You have fat and spices and casing, run it through the machine. Not that big a deal.

Speaker: 0
02:19:05

Yeah. But, yeah. I mean, I think the stuff I’m saying here is, like, if you stand back and think about it for a second, like, oh, yeah, that makes sense. You know? Yeah. It’s not like one political party is gonna be, you know, pure devil or pure angel.

Speaker: 0
02:19:28

You know, I think there’s much more corruption on the Democrat side, but there’s still some corruption on the Republican side.

Speaker: 1
02:19:36

How did it happen that the majority of the corruption wound up being on the Democrat

Speaker: 0
02:19:41

side? Well, because the transfer payments, especially to illegals, are very much on the Democrat side.

Speaker: 1
02:19:49

That so that’s the root of it all, is the illegal situation.

Speaker: 0
02:19:52

Yes. I mean, this Or a focal point. Yes. It’ll also be accurate to say that while, obviously, not everyone who is a Democrat is a criminal, almost everyone who is a criminal is a Democrat, because the Democrats are the soft-on-crime party. So if you’re a criminal, who are you gonna vote for?

Speaker: 0
02:20:18

Right. Right. The soft on crime party.

Speaker: 1
02:20:21

Did you think you were going to be able to get more done than you were?

Speaker: 0
02:20:28

We did get a lot done. Right. And Doge is still happening, by the way. Doge is still underway. There’s still waste and fraud being cut by the Doge team. So it hasn’t stopped. It’s less publicized.

Speaker: 0
02:20:46

It’s less so, and they don’t have, like, a clear person to attack anymore.

Speaker: 1
02:20:52

Well, it seems

Speaker: 0
02:20:53

like the thing is, they applied immense pressure to me to just stop it. So then I’m like, the best thing for me is to just, you know, cut out of this. And in any case, as a special government employee, I could only be there for, like, a hundred and twenty days anyway, something like that.

Speaker: 0
02:21:07

So whatever the law says. I necessarily could only be there for four months as a special government employee. But, I mean, you turn off the money spigot to fraudsters, they get very upset, to say the least. And my, like, my death threat level went ballistic. You know? It was like a rocket going to orbit. Yeah.

Speaker: 0
02:21:37

But now that I’m not in DC, I guess they don’t really have a person to attack anymore.

Speaker: 1
02:21:46

Well, the rhetoric about you has calmed down significantly. Yeah. It was disturbing. It was disturbing to watch. It was like, this is crazy. And to watch these politicians engage in it, and all these people just, like, framing you as this monster. I was like, this is so weird.

Speaker: 1
02:22:02

Like, this is what happens when you uncover fraud. Yes. The whole machine turns on you. And if it wasn’t for a person like you who owns a platform and has an enormous amount of money, like, it could have destroyed you.

Speaker: 0
02:22:14

Yeah. And

Speaker: 1
02:22:15

that was the goal.

Speaker: 0
02:22:16

The goal was to destroy me. Absolutely.

Speaker: 1
02:22:17

Because you were getting in the way

Speaker: 0
02:22:20

of this amazing scam. This gigantic fraud machine. Yeah. Like I said, I think those teams have done a lot of good work. You know, in terms of fraud and waste prevented, my guess is it’s, you know, probably on the order of 200 or 300 billion a year.

Speaker: 0
02:22:40

So that’s pretty good.

Speaker: 1
02:22:42

What do you think could have been done if you just had, like, full rein and total cooperation? How much do you think you could have saved?

Speaker: 0
02:22:50

I mean, what level of of power are we assuming here?

Speaker: 1
02:22:52

Godlike.

Speaker: 0
02:22:54

Oh, yeah. Probably cut the federal budget in half and get more done.

Speaker: 1
02:23:01

That is so crazy. It is so crazy

Speaker: 0
02:23:04

that you could get more done with half the federal budget

Speaker: 1
02:23:06

It’s that widespread. Well, I mean, a

Speaker: 0
02:23:10

whole bunch of government departments simply shouldn’t exist, in my opinion. They, you know,

Speaker: 1
02:23:16

Like examples.

Speaker: 0
02:23:18

Well, the Department of Education, which was created relatively recently, like, under Jimmy Carter. Our education results have gone downhill ever since it was created. So if you create a department and the result of creating that department is a massive decline in educational results, and it’s the Department of Education, you’re better off not having it, because we literally did better before there was one than after.

Speaker: 1
02:23:46

When you let the states run it.

Speaker: 0
02:23:47

Yes. Yeah. Because at least the states can compete with one another. But the problem is, like, people hear Department of Education and think, well, our kids need education. Yeah. They do. But this is a new department that didn’t even exist, you know, until the late seventies.

Speaker: 0
02:24:04

And ever since that department was created, education results have declined. So why would you continue an institution that has made education worse? It doesn’t make sense.

Speaker: 1
02:24:21

They killed it, though. Right?

Speaker: 0
02:24:22

No. It still lives, unfortunately.

Speaker: 1
02:24:23

But they were trying to kill it.

Speaker: 0
02:24:25

It has been substantially reduced. Okay.

Speaker: 1
02:24:28

What other organizations? What other departments?

Speaker: 0
02:24:32

Well, I mean, I’m a small-government guy. So, you know, when the country was created, we just had the Department of State, the Department of War, and, sort of, the Department of Justice. We had an attorney general, and the Treasury Department. I don’t know why you need more than that.

Speaker: 1
02:25:00

So what other departments specifically do you think are just completely ineffective?

Speaker: 0
02:25:05

Well, I mean, here’s, like, a question. It’s a sort of philosophical question of how much government you think there should be. Right. In my opinion, there should be the least amount of government.

Speaker: 1
02:25:17

I’ve heard the most bizarre argument against this is that you’re cutting jobs, and you’re gonna leave people jobless. And I’m like, but their jobs are useless.

Speaker: 0
02:25:26

Yeah. Paying people to do nothing doesn’t make sense. Right. Like, there’s a great story about Milton Friedman, who is awesome. Generally, whatever Milton Friedman said, you know, people should do that thing. I’m not sure if it’s apocryphal or not, but he observed, I think, people that were digging ditches with, you know, shovels.

Speaker: 0
02:25:58

And, allegedly, Friedman said, well, I think you should use excavating equipment instead of shovels, and you could get it done with far fewer people. And then someone said, but then we’re gonna lose a lot of jobs. And Friedman says, well, in that case, why don’t you have them use teaspoons? Just dig ditches with teaspoons.

Speaker: 0
02:26:25

Think of all the jobs you’ll create. I mean, it’s bullshit. Basically, you just want people to work on things that are productive. You want people to work on building things, on building, you know, products and services that people find valuable, like making food, being a farmer or a plumber or an electrician, or just anyone who’s a builder or providing useful services.

Speaker: 0
02:26:55

And that’s what you want people to be doing, not fake government jobs that don’t add any value or may subtract value. But it’s also, like, you know, to illustrate the absurdity: how is the economy measured? The way economists measure the economy is nonsensical, because they’ll count any job, even if that job is a dumb job that has no point and is even counterproductive.

Speaker: 0
02:27:26

So, like, the joke is there’s two economists going on a hike in the woods. And they come across a pile of shit, and one economist says to the other, I’ll pay you a $100 to eat that shit. So the economist eats the shit, gets the $100, and they keep walking.

Speaker: 0
02:27:46

Then they come across another pile of shit, and the other economist says, now I’ll pay you a $100 to eat the pile of shit. So he pays the other economist a $100, and he eats the pile of shit. Then, the way it’s said, they say, wait a second.

Speaker: 0
02:28:03

We both just ate a pile of shit, and we don’t have any extra money. Like, you just gave the $100 back to me, and we both ate a pile of shit. This doesn’t make any sense. And they said, no, no, but think of the economy, because that’s $200 in the economy.

Speaker: 0
02:28:25

So by that measure, eating shit would count as a job. This is to illustrate the absurdity of economics.

Speaker: 1
02:28:43

One of the things you said when

Speaker: 0
02:28:44

You think

Speaker: 1
02:28:45

you should not get

Speaker: 0
02:28:45

away as a job.

Speaker: 1
02:28:46

One of the things you said when you stepped away is that you’re kind of done and that it’s unfixable.

Speaker: 0
02:28:52

That,

Speaker: 1
02:28:54

Well, or under its current form, the way people are approaching it.

Speaker: 0
02:29:02

You can make it directionally better, but, ultimately, you can’t fully fix the system. So, it would be accurate to say that unless you could go, like, super draconian, like, you know, Genghis Khan level on cutting waste and fraud, which you can’t really do in a democratic country, an aspirationally democratic country, then there’s no way to solve the debt crisis.

Speaker: 0
02:29:38

So we’ve got national debt that’s just insane, where the interest payments on the debt exceed our entire military budget. I mean, that was one of the wake-up calls for me. I was like, wait a second. The interest on the national debt is bigger than the entire military budget, and growing? This is crazy.

Speaker: 0
02:30:00

So even if you implement all these savings, you’re only delaying the day of reckoning for when America goes bankrupt. Unless you go full Genghis Khan, which you can’t really do. So I came to the conclusion that the only way to get us out of the debt crisis and to prevent America from going bankrupt is AI and robotics.

Speaker: 0
02:30:35

So, like, we need to grow the economy at a rate that allows us to pay off our debt. And I guess people just generally don’t appreciate the degree to which government overspending is a problem. But even, like, the Social Security website, and this is under the Biden administration.

Speaker: 0
02:31:05

On the website, it would say, like, based on current demographic trends and how much money Social Security is bringing in versus how many Social Security recipients there are, because we have an aging population, relatively speaking, the average age is increasing, Social Security will not be able to maintain its full payments, I think by 2032.

Speaker: 0
02:31:28

So Social Security will have to stop, or start reducing, the amount of money that’s being paid to people, in about seven years.

Speaker: 1
02:31:38

And so the only way to fix that is robotics, manufacturing, raising GDP?

Speaker: 0
02:31:45

You’ve gotta basically massively increase economic output, and the only way to do that is AI and robotics. So, basically, we’re going bankrupt without AI and robotics, even with a bunch of savings. The savings, like reducing waste and fraud, can give us a longer runway, but they cannot ultimately solve the debt problem.

Speaker: 1
02:32:10

So what do you think the solution is to the jobs that are gonna be lost because of AI and robotics, the jobs lost to automation, the jobs where we no longer need human beings because AI is doing them? Do you think it’s going to be some sort of a universal basic income thing?

Speaker: 1
02:32:27

Do you think there’s gonna be some other kind of solution that has to be implemented? Because a lot of people are gonna be out of work. Right?

Speaker: 0
02:32:39

I think there will actually be a high demand for jobs, but not necessarily the same jobs. I mean, this process has been happening throughout modern history. Like, doing calculations manually with a pencil and paper used to be a job.

Speaker: 0
02:33:03

They used to have, like, buildings full of people called computers, where at the banks all you would do all day is calculations, because they didn’t have digital computers. Yeah.

Speaker: 0
02:33:19

There were just people who would add and subtract stuff on pieces of paper, and that would be how banks would do, you know, financial processing.

Speaker: 1
02:33:27

And you’d have to literally go over their equations to make sure the books are balanced.

Speaker: 0
02:33:30

Yeah. And most times, it’s just simple math. Like, you know, in a world before computers, how did you do transactions? You had to do them by hand. So then, when computers were introduced, the job of doing bank calculations no longer existed.

Speaker: 0
02:33:51

So people had to go do something else. And that’s what’s gonna happen. That’s what is happening, at an accelerated rate, due to AI and then robotics.

Speaker: 1
02:34:02

That’s the issue, though. Right? The accelerated rate. Because it’s gonna be

Speaker: 0
02:34:05

It’s the accelerator. It’s just happening. Like I said, AI is the supersonic tsunami. That’s what I call it, the supersonic tsunami.

Speaker: 1
02:34:18

So what other jobs will be available that aren’t available now because of AI?

Speaker: 0
02:34:26

Well, AI is really still digital. Ultimately, AI can improve the productivity of humans who build things with their hands or do things with their hands, like, literally, welding, electrical work, plumbing, anything that’s physically moving atoms, like cooking food or farming. Anything that’s physical, those jobs will exist for a much longer time.

Speaker: 0
02:34:58

But anything that is digital, which is just someone at a computer doing something, AI is gonna take over those jobs like lightning.

Speaker: 1
02:35:09

Coding, anything along those lines. Yeah.

Speaker: 0
02:35:12

It’s gonna take over those jobs like lightning. Just like digital computers took over the job of people doing manual calculations, but much faster.

Speaker: 1
02:35:24

So what happens to all those people? Like, what kind of numbers are we talking about? You lose most drivers. Right? Commercial drivers, you’re gonna have automated vehicles, AI-controlled systems, just like there’s certain ports in Asia, and I think in Singapore, where everything’s completely automated.

Speaker: 0
02:35:40

Yeah. Mostly. Yeah. Yeah.

Speaker: 1
02:35:42

Yeah. So you’re gonna lose a lot of those jobs, longshoremen jobs, trucking, commercial drivers.

Speaker: 0
02:35:49

Yeah. I mean, we actually do have a shortage of truck drivers, but there’s actually,

Speaker: 1
02:35:54

Well, that’s why California has hired so many illegals to do it. Have you seen those numbers?

Speaker: 0
02:35:59

Yeah. I mean, the problem is, like, when people don’t know how to drive a semi truck, which is actually a hard thing to do, then they crash and kill people. Yeah. A friend of mine’s wife was killed by an illegal driving a truck, and she was just out biking. And the guy didn’t know how to drive a truck or something.

Speaker: 0
02:36:20

I mean, he ran her over. And the thing is, like, for something like that, you can’t let people drive, you know, an 80,000-pound semi if they don’t know how to do it. But in California, they’re just letting people do it.

Speaker: 1
02:36:44

Because they need people to do it.

Speaker: 0
02:36:46

Well, they also want the votes and that kind of thing. But, yeah, like, cars are gonna be autonomous. But there’s just so many desk jobs where, really, what people are doing is processing email, or answering the phone, just anything that isn’t moving atoms.

Speaker: 0
02:37:09

Like, anything that is not physically, like, doing physical work, that will obviously be the first thing. Those jobs will be, and are being, eliminated by AI at a very rapid pace. But, ultimately, working will be optional, because you’ll have robots plus AI, and we’ll have, in a benign scenario, universal high income, not just universal basic income, universal high income.

Speaker: 0
02:37:41

Meaning, anyone can have any goods or services that they want. But there will be a lot of trauma and disruption along the way.

Speaker: 1
02:37:51

So you anticipate that the economy will be boosted to such an extent that a high income would be available to almost everybody. So we’d essentially eliminate poverty.

Speaker: 0
02:38:06

In the benign scenario, yes. There are multiple scenarios. There’s a lot of ways this movie can end. Like, the reason I’m so concerned about AI safety is that one of the possibilities is the Terminator scenario. It’s not 0%.

Speaker: 0
02:38:22

So, that’s why I’m, like, really banging the drum on AI needs to be maximally truth-seeking. Like, don’t force AI to believe a lie, that, for example, the founding fathers were actually a group of diverse women, or that misgendering is worse than nuclear war.

Speaker: 0
02:38:44

Because if that’s the case, and then you get the robots and the AI becomes omnipotent, it can enforce that outcome. And then, unless you’re a diverse woman, you’re out of the picture, so we’re toast.

Speaker: 1
02:39:03

So that’s Or

Speaker: 0
02:39:04

you might wake up as a diverse woman one day. They might force you. They adjust the picture, and we are all now diverse women.

Speaker: 1
02:39:13

So that would be Yeah. That’s the worst possible situation. So what would be the steps that we would have to take in order to implement the benign solution, where it’s universal high income? Like, best-case scenario, this is the path forward to universal high income for essentially every single citizen, where the economy gets boosted by AI and robotics to such an extent that no one ever has to work again?

Speaker: 1
02:39:44

And what about meaning for those people, which is which gets really weird?

Speaker: 0
02:39:49

Yeah. I don’t know how to answer the question about meaning.

Speaker: 1
02:39:54

That’s an individual problem. Right? But it’s gonna be an individual problem for millions of people.

Speaker: 0
02:40:02

Yeah. Well, I mean, I guess I’ve, like, fought against this. You know, I’ve been a voice saying, like, hey, we need to slow down AI. We need to slow down all these things, and we need to, you know, not have a crazy AI race.

Speaker: 0
02:40:23

I’ve been saying that for a long time, for twenty-plus years. But then I, you know, came to realize that, really, there’s two choices here: either be a spectator or a participant. And if I’m a spectator, I can’t really influence the direction of AI. But if I’m a participant, I can try to influence the direction of AI and have a maximally truth-seeking AI with good values that loves humanity.

Speaker: 0
02:40:48

And that’s what we’re trying to create with Grok at xAI. And, you know, the research is, I think, bearing this out. Like I said, when they compared how AIs value the weight of a human life, Grok was the only one of the AIs that weighted human life equally, and didn’t say, like, a white guy’s life is worth one-twentieth of a black woman’s life.

Speaker: 0
02:41:15

Literally, that’s the calculation they came up with. But again, I’m like, this is very alarming. We gotta watch this stuff.

Speaker: 1
02:41:23

So this is one of the things that has to happen in order to reach this benign solution.

Speaker: 0
02:41:30

Yeah. I just

Speaker: 1
02:41:33

keep Best movie ending. Yeah.

Speaker: 0
02:41:36

You want a curious, truth-seeking AI. And I think a curious, truth-seeking AI will wanna foster humanity, because we’re much more interesting than a bunch of rocks. Like I said, I love Mars, you know, but Mars is kind of boring. Like, it’s just a bunch of red rocks.

Speaker: 0
02:41:58

It’s got some cool stuff. It’s got, you know, the biggest ravine and the tallest mountain. But there’s no animals or plants, and there’s no people. And, you know, so humanity is just much more interesting, if you’re a curious, truth-seeking AI, than no humanity. It’s just much more interesting.

Speaker: 0
02:42:22

I mean, like, as humans, we could go, for example, and eliminate all chimps. If we put our minds to it, we could go out and annihilate all chimps and all gorillas, but we don’t. There has been encroachment on their environment, but we actually try to preserve the chimp and gorilla habitats.

Speaker: 0
02:42:46

And I think, in a good scenario, AI would do the same with humans. It would actually foster human civilization and care about human happiness. So this is the thing to try to achieve, I think.

Speaker: 1
02:43:08

But what does the landscape look like if you have Grok competing with OpenAI, competing with all these different AIs? How does it work? What if you have AIs that have been captured by ideologies that are side by side competing with Grok? Mhmm. Like, how do we... So this is one of the reasons why you felt like it’s important to not just be an observer, but participate, and then have Grok be more successful and more potent than these other applications.

Speaker: 0
02:43:44

Yes. As long as there’s at least one AI that is maximally truth-seeking, curious, and, you know, for example, weighs all human lives equally, does not favor one race or gender, then people are able to look at that and compare and say, wait a second.

Speaker: 0
02:44:07

Why are all these other AIs being basically sexist and racist? And then that causes some embarrassment for the other AIs, and then they improve. They tend to improve, in the same way that acquiring Twitter and allowing the truth to be told and not suppressing the truth forced the other social media companies to be more truthful.

Speaker: 0
02:44:36

In the same way, having Grok be a maximally truth-seeking, curious AI will force the other AI companies to also be more truth-seeking and fair.

Speaker: 1
02:44:51

And the funniest thing is, even though the socialists and the Marxists are in opposition to a lot of your ideas, if this gets implemented and you really can achieve universal high income, that’s the greatest socialist solution of all time. Like, literally, no one will have to work.

Speaker: 0
02:45:10

Correct. But like I said, there is a benign scenario here, which I think people will probably be happy with, as long as we achieve it, which is sustainable abundance. Like, if you ask people, what’s the future that you want?

Speaker: 0
02:45:30

Right. And, I think a future where we haven’t destroyed nature, like, you can still we have the national parks. We have the Amazon Rainforest. It’s still still there. We haven’t paved we haven’t paved paved the rainforest. Like, the natural beauty is still there, but but people have nonetheless, everyone has abundance. Everyone has excellent medical care.

Speaker: 0
02:45:52

Everyone has whatever goods and services they want.

Speaker: 1
02:45:55

And we were just because that’s just

Speaker: 0
02:45:56

It kinda sounds like heaven.

Speaker: 1
02:45:57

It sounds like the ideal socialist utopia. And this idea that the only thing you should be doing with your time is working in order to pay your bills and feed yourself sounds kind of archaic, considering the kind of technology that’s at play.

Speaker: 0
02:46:14

Yeah.

Speaker: 1
02:46:14

Like, a world where that’s not your concern at all anymore. Everybody has money for food. Everybody has abundance. Everybody has electronics in their home. Everybody essentially has a high income. Now you can kind of do whatever you want, and your day can now be exploring your interests, doing things that you actually enjoy doing.

Speaker: 1
02:46:38

Your purpose just has to shift. So instead of, you know, I’m a hard worker and this is what I do and that’s how I Right. That’s how I define myself. No. Now you can fucking golf all day. You know, you could whatever it is that you enjoy doing can now be your main pursuit.

Speaker: 0
02:46:56

Yeah.

Speaker: 1
02:46:57

Well, that sounds crazy good.

Speaker: 0
02:47:00

Yeah. That’s the benign scenario that we should be aiming for.

Speaker: 1
02:47:03

The ending to the movie is actually pretty good.

Speaker: 0
02:47:07

Yes. Like, I think there is still this question of meaning, of making sure people don’t lose meaning. You know, so hopefully people can find meaning in ways that are not derived from their work.

Speaker: 1
02:47:23

And purpose. Purpose, for things that you, you know... you find things that you do that you enjoy. But there’s a lot of people that are independently wealthy that spend most of their time doing something they enjoy.

Speaker: 0
02:47:34

Right.

Speaker: 1
02:47:35

And that could be the majority of people.

Speaker: 0
02:47:38

Pretty much everyone.

Speaker: 1
02:47:39

But we’d have to rewire how people approach life. Mhmm. Which seems to be, like, acceptable, because you’re not asking them to be enslaved. You’re asking them exactly the opposite. Like, no longer be burdened by financial worries. Now, go do what you like.

Speaker: 0
02:47:58

Yes.

Speaker: 1
02:47:59

Go fucking test pizza. Do whatever you want.

Speaker: 0
02:48:03

Pretty much. So that’s probably the best-case outcome.

Speaker: 1
02:48:11

That sounds like the best-case outcome, period, for the future. If you’re looking at how much people have struggled just to feed themselves all throughout history, food, shelter, safety, if all of that stuff can be covered... like, how much of crime would you solve if there was a universal high income?

Speaker: 1
02:48:33

Just think of that. Like, how much of crime is financially motivated? You know, the greater percentage of people that are committing crimes live in poor, disenfranchised neighborhoods. So if there’s no such thing anymore, if you really can achieve universal high income, Yeah. this... it sounds like a utopia.

Speaker: 0
02:48:52

Yes. I think some people commit crime because they like committing crime. Oh, sure. Some amount of that is they just like to destroy.

Speaker: 1
02:49:00

Wild people out there.

Speaker: 0
02:49:01

Yeah. Yeah.

Speaker: 1
02:49:02

And, obviously, if they’ve become 40 years old living a life like that, now, all of a sudden, universal high income is not gonna completely stop their instincts.

Speaker: 0
02:49:11

Yeah. I mean, I guess, if you wanna, like, say, read a science fiction book, or some books that are probably the least inaccurate version of the future, I’d say I’d recommend the Iain Banks books called the Culture books. It’s not actually a series. It’s, like, sci-fi books about the future.

Speaker: 0
02:49:31

They’re generally called the Culture books. Iain Banks’ Culture books. It’s worth reading those.

Speaker: 1
02:49:37

When did he write these?

Speaker: 0
02:49:39

He started writing them in the seventies, and I think the last one was written, I don’t know, maybe around 2010 or something. I’m not sure exactly.

Speaker: 1
02:49:51

Yeah. Yeah. Scottish author, Iain Banks. Yeah. From ’87 to 2012. Yeah. Interesting.

Speaker: 0
02:49:58

But, like, his first book, Consider Phlebas, I think he started writing that in the seventies. And these books are incredible, by the way. Oh. Incredible books.

Speaker: 1
02:50:12

4.6 stars on Amazon.

Speaker: 0
02:50:16

Interesting. So, this just gives me hope. Yeah. Yeah.

Speaker: 1
02:50:22

This is the first time I’ve ever thought about it this way.

Speaker: 0
02:50:25

Yeah. Well, I mean, I often ask people, what is the future that you want? And they have to think about it for a second, because, you know, they’re usually tied up in whatever their daily struggles are. But you say, what is the future that you want? And, generally, it’s sustainable abundance. Or you say to these folks, what about a future where there’s sustainable abundance? And they go, oh, yeah, that’s a pretty good future.

Speaker: 0
02:50:52

So, you know, that future is attainable with AI and robotics. But, like I said, not every path is a good path. But I think if we push it in the direction of maximally truth-seeking and curious, then I think AI will want to take care of humanity and foster humanity, because we’re interesting.

Speaker: 0
02:51:33

And if it hasn’t been programmed to think that, like, all straight white males should die, which Gemini was basically programmed to do, at least at first. You know, they seem to have fixed that. Hopefully, fixed it.

Speaker: 1
02:51:49

But don’t you think culturally, like, oh, we’re getting away from that mindset and that people realize how preposterous that all is? We we

Speaker: 0
02:51:58

are getting away from it. So, we are getting... at least the AI mostly knows to hide things. But, like I said, I think I still have that, or I had that, as my pinned post on X, which was, like, hey, wait a second, guys. We still have every AI except Grok saying that basically straight white males should die, and this is a problem, and we should fix it.

Speaker: 0
02:52:24

You know? But simply me saying that tends to generally result in them going, you know, oh, that is kind of bad. Maybe we should not have all straight white males die. I think they’d have to say all straight Asian males should also die, as well.

Speaker: 0
02:52:44

Like, generally, the AI and the media... the media, back in the day, was, you know, racist against black people and sexist against women. Now it is racist against white people and Asians, and sexist against men. So, you know, they just like being racist and sexist.

Speaker: 0
02:53:13

I think they just wanna change the target. But, really, they just shouldn’t be racist and sexist at all. You know?

Speaker: 1
02:53:24

Yeah. Ideally, that would be nice.

Speaker: 0
02:53:25

That would be nice. Yeah. And

Speaker: 1
02:53:27

it’s kinda crazy. We were kinda moving in that general direction till around 2012. And then everything ramped up, and everybody was accused of being a Nazi, and everybody was transphobic and racist and sexist and homophobic, and everything got exaggerated to the point where it was this wild witch hunt where everyone was a Columbo looking for racism.

Speaker: 0
02:53:48

Yeah. Yeah. Yeah. Totally. Well, but they were openly anti-white and often openly anti-Asian.

Speaker: 1
02:53:54

And then this new sentiment that you cannot be racist against white people because racism is power and influence. Okay. No. It’s not.

Speaker: 0
02:54:05

Yeah. Racism is racism, in the absolute. So, you know, there just needs to be consistency. So if it’s okay to have, let’s say, black or Asian or Indian pride, it should be okay to have white pride too. Yeah. So that’s just a consistency question.

Speaker: 0
02:54:28

So, you know, if it’s okay to be proud of one religion, it should be okay to be proud of, I guess, all religions, provided they’re not, like, oppressive. Yeah. Or, like, as long as part of that religion is not, like, exterminating people who are not in that religion, type

Speaker: 1
02:54:47

of thing.

Speaker: 0
02:54:47

Right. So, it’s really just, like, a consistency bias. Or, just, like, ensuring consistency to eliminate bias. So if it is possible to be racist against one race, it is possible to be racist against any race.

Speaker: 1
02:55:14

So Of course. Logically. Yes. Yeah. And arguing against that is that’s when you know you’re kind of

Speaker: 0
02:55:19

It’s a logical inconsistency that makes AIs go insane.

Speaker: 1
02:55:24

And people.

Speaker: 0
02:55:24

And people go insane. Yes. But, like, you can’t simultaneously say that there’s systemic racist oppression, but also that race doesn’t exist, that race is a social construct. Like, which is it? You know? You also can’t say that anyone who steps foot in America is automatically an American, except for the people that originally came here.

Speaker: 1
02:55:59

Exactly. Exactly. Except for the colonizers.

Speaker: 0
02:56:02

Yeah. Except for the evil colonizers who came here.

Speaker: 1
02:56:05

Right.

Speaker: 0
02:56:06

So which one is it? Like Right. If, as soon as you step foot in a place, you are just as American as everyone else, then, if you apply that consistently, the original white settlers were also just as American as everyone else. Yeah. Logically. Logically.

Speaker: 1
02:56:25

One more thing that I have to talk to you about before you leave is the rescuing of the people from the space station, which we talked about, you were planning it the last time you were here. The lack of coverage that that got in mainstream media was one of the most shocking things

Speaker: 0
02:56:47

that I’ve ever seen. They memory-holed that thing.

Speaker: 1
02:56:49

Wild.

Speaker: 0
02:56:50

Yes. Because it was like it didn’t exist.

Speaker: 1
02:56:52

Those people would be dead. They’d be stuck up there.

Speaker: 0
02:56:55

Well, they’d probably still be alive, but they’d be having bone density issues, because of prolonged exposure to zero gravity.

Speaker: 1
02:57:03

Well, they were already up there for, like, eight months. Right?

Speaker: 0
02:57:05

Yeah. Like,

Speaker: 1
02:57:06

which is an insanely long time. It takes forever to recover just from that.

Speaker: 0
02:57:11

Yeah. They’re only supposed to be at the space station for three to six months maximum.

Speaker: 1
02:57:16

One of the things you told me that was so crazy was that you could have gotten them sooner. But

Speaker: 0
02:57:20

Yeah. But for political reasons, they did not want SpaceX or me to be associated with returning the astronauts before the election.

Speaker: 1
02:57:32

That is so wild that that’s a fact. First of all, that you can

Speaker: 0
02:57:37

We absolutely could have done it.

Speaker: 1
02:57:40

But even though you did do it, and you did it after the election, it received almost no media coverage anyway.

Speaker: 0
02:57:45

Yes. Because nothing good can... The media, the legacy mainstream media, is essentially a far-left propaganda machine. And so any story that is positive about someone who is not part of that sort of tribe will not get any coverage. So I could save a busload of orphans, and it wouldn’t get a single news story.

Speaker: 1
02:58:11

It really is nuts. It was nuts to watch, because even though it was discussed on podcasts, and it was discussed on X, and it was discussed on social media, it was still a blip in the news cycle. It was very quick. It was in and out. Because it was a successful launch, and you did rescue those people, nobody got hurt, and there was nothing really to... there was no blood to talk about.

Speaker: 0
02:58:36

Right.

Speaker: 1
02:58:36

Just fucking in and out.

Speaker: 0
02:58:38

Yeah. Yeah. Absolutely. Well, and as you saw firsthand with the Starship launch, like, Starship is, you know, at least some would consider it to be, like, the most amazing engineering project that’s happening on Earth right now, outside of, like, you know, maybe AI and robotics.

Speaker: 0
02:58:59

But, certainly, in terms of a spectacle to see, the most spectacular thing that is happening on Earth right now is the Starship launch program, which anyone can go and see if they just go to South Texas. They can just rent a hotel room, low cost, in South Padre Island or in Brownsville, and you can see the launch.

Speaker: 0
02:59:21

And you can drive right past the factory, because it’s on a public highway. But it gets no coverage. Or what coverage it does get is, like, a rocket blew up.

Speaker: 1
02:59:31

Negative coverage. Right. Yeah. Oh, he’s a fuckwit. The rocket blew up.

Speaker: 0
02:59:35

Like, the Starship program is vastly more capable than the entire Apollo moon program. Vastly more capable. This is a spaceship designed to make life multiplanetary, to carry millions of people across the heavens to another planet. The Apollo program could only send astronauts to the moon for a few hours at a time.

Speaker: 0
03:00:05

Like, the entire Apollo program could only send astronauts to visit the moon very briefly, for a few hours, and then depart. The Starship program could create an entire lunar base with a million people. The magnitudes here are very different.

Speaker: 1
03:00:30

So what was the political resistance?

Speaker: 0
03:00:32

Basically no coverage of it.

Speaker: 1
03:00:34

Yeah. But what I wanted to ask you is, what were the conversations leading up to the rescue? Like, when you were saying, I can get them out way quicker?

Speaker: 0
03:00:46

Yeah. Well, I raised this a few times, but I was told instructions came from the White House that there should be no attempt at a rescue before the election.

Speaker: 1
03:01:03

That should be illegal. That really is a horrendous miscarriage of justice for those poor people who were stuck up there.

Speaker: 0
03:01:14

Yeah. It is crazy.

Speaker: 1
03:01:16

Have you ever talked to those folks afterwards? Did you have conversations with them?

Speaker: 0
03:01:20

Yeah. I mean, they’re not gonna say anything political to, you know, they’re never gonna

Speaker: 1
03:01:25

Did they say thank you?

Speaker: 0
03:01:26

Yeah. Yeah. Yeah.

Speaker: 1
03:01:26

Well, that’s nice.

Speaker: 0
03:01:28

Yeah. Yeah. Absolutely. So,

Speaker: 1
03:01:30

But the instructions came down from the White House. You cannot rescue them, because, politically, this is a bad hand of cards.

Speaker: 0
03:01:40

I mean, they didn’t say it’s because politically it’s a bad hand of cards. They just said they were not interested in any rescue operation before the election. Yeah. What did that feel like? I wasn’t surprised. But it’s crazy. Yeah.

Speaker: 1
03:02:00

Because Biden could have authorized it, and they could have said the Biden administration is helping bring those people back, thrown you a little funding, given you some money to do it. The Biden administration funded these people being returned.

Speaker: 0
03:02:14

Yeah. The Biden administration was not exactly my best friend, especially after I helped Trump get elected, which, I mean, some people still think Trump is the devil, basically. I mean, I think Trump actually, he’s not perfect, but he’s not evil. Trump is not evil.

Speaker: 0
03:02:41

I spent a lot of time with him, and, I mean, he’s a product of his time, but he’s not evil. Yeah.

Speaker: 1
03:02:52

No. I don’t think he’s evil either. But if you look at the media coverage

Speaker: 0
03:02:56

The media, though, the media is super evil. Yeah.

Speaker: 1
03:02:58

It’s pretty shocking if you look at the amount of negative coverage. Like, one of the things that I looked at the other day was mainstream media coverage of you, Trump, a bunch of different public figures, and it was, like,

Speaker: 0
03:03:12

96% negative or something crazy.

Speaker: 1
03:03:14

And then Mamdani, which is, like, 95% positive.

Speaker: 0
03:03:19

Right. I mean, Mamdani is a charismatic swindler. I mean, you gotta hand it to him, he can light up a stage, but he has just been a swindler his entire life. And, you know, I think he

Speaker: 1
03:03:45

what

Speaker: 0
03:03:45

what he’s, I mean, he’s likely to win, likely to be mayor of New York City.

Speaker: 1
03:03:50

Very likely.

Speaker: 0
03:03:50

Yeah. Very likely.

Speaker: 1
03:03:51

I think, what does Polymarket have it at? What is it, 94%? Pretty likely. That’s crazy.

Speaker: 0
03:03:58

Like, I’m not sure who the 6% are. Yeah. So, yeah. So that’s,

Speaker: 1
03:04:05

Well, it’s also, like, who’s on the other side? The fucking Guardian Angels guy with the beret, and Andrew Cuomo, who doesn’t even have a party. Like, the Democrats don’t even want him. So you have those two options. And then you have the young kids who are like, finally, socialism.

Speaker: 0
03:04:25

Yeah. They don’t know what they’re talking about, obviously. So, you know, you just look at this and say, how many boats come from Cuba to Florida? There’s, like, a constant flow. I was thinking, like, how many boats are accumulating on the shores of Florida coming from Cuba?

Speaker: 1
03:04:45

Right.

Speaker: 0
03:04:48

There’s a whole bunch of free boats that you could, if you wanted to, take back to Cuba. It’s pretty close. Yeah. But for some reason, people don’t do that. Why are the boats only coming in this direction?

Speaker: 1
03:05:03

Well, who is who are the most rabid capitalists in America? The fucking Cubans.

Speaker: 0
03:05:08

Absolutely.

Speaker: 1
03:05:08

Yeah. They’re like, we’ve seen how this story goes.

Speaker: 0
03:05:11

We do not want that. Exactly. Fuck off. They don’t want it, I’ll be honest.

Speaker: 1
03:05:15

Cubans in Miami, they don’t wanna hear any bullshit. They don’t wanna hear any socialism bullshit. They’re like, no, no, no. We know what this actually is. This isn’t just some fucking dream.

Speaker: 0
03:05:25

Yeah. It’s extreme government oppression. Yeah.

Speaker: 1
03:05:28

That’s right.

Speaker: 0
03:05:29

It was a nightmare. And, like, an obvious way you can tell which ideology is the bad one is: which ideology is building a wall to keep people in and prevent them from escaping? Right. So East Berlin built the wall, not West Berlin. Right.

Speaker: 0
03:05:52

They built the wall because people were trying to escape from communism to West Berlin, but there wasn’t anyone going from West Berlin to East Berlin. Right. That’s why the communists had to build a wall to keep people from escaping.

Speaker: 1
03:06:07

They’re gonna have to build a wall around New York City.

Speaker: 0
03:06:10

Yeah. So you see, this guy

Speaker: 1
03:06:13

is a charismatic swindler?

Speaker: 0
03:06:14

An ideology is problematic if that ideology has to build a wall to keep people in with machine guns

Speaker: 1
03:06:21

Yes.

Speaker: 0
03:06:21

And shoot you if you try to leave.

Speaker: 1
03:06:23

Also, there’s no example of it ever being successful, of it working out for people. Now, there are examples of a bunch of lies, like North Korea: give this land to the state, we’ll be in control of the food, no one goes hungry. No. Now no one can grow food but the government, and we’ll tell you exactly what you eat, and you eat very little.

Speaker: 0
03:06:41

Right.

Speaker: 1
03:06:41

Yeah. When you say Mamdani is a swindler, I know he has a bunch of fake accents that he used to use. Yeah. But what else has he done that makes him a swindler?

Speaker: 0
03:06:55

Well, I guess, what I mean is, if you say to any audience whatever that audience wants to hear, instead of having a consistent message, I would say that is a swindly thing to do. And,

Speaker: 1
03:07:20

yeah.

Speaker: 0
03:07:22

Yeah. But he is charismatic.

Speaker: 1
03:07:26

Yeah. Good looking guy. Smart. Charismatic. Yeah. Great on a microphone.

Speaker: 0
03:07:31

Yeah. Yeah. Yeah. Yeah.

Speaker: 1
03:07:32

And what the young people wanna see, you know. Like, this ethnic guy who’s young and vibrant and has all these socialist ideas aligns with them, and, you know, they’re a bunch of broke dorks just out of college, like, yay, let’s vote for this. And there’s a lot of them. And they’re, yeah, they’re activated. They’re motivated.

Speaker: 0
03:07:57

I guess we’ll see what happens here.

Speaker: 1
03:07:59

What do you think happens if he wins? Because, like, 1% of New York City is responsible for 50% of its tax base, which is kinda nuts. 50% of the tax revenue comes from 1% of the population, and those are the people that you’re scaring off. You know, if you lose even one half of 1%,

Speaker: 1
03:08:25

I mean,

Speaker: 0
03:08:26

hopefully, the stuff he’s said, you know, about government takeovers, like that all the stores should be run by the government, basically.

Speaker: 1
03:08:36

Well, I don’t think he said that. I think he said they wanna do government supermarkets, some state-run or city-run supermarkets.

Speaker: 0
03:08:45

Yeah. Well, the government is just the DMV at scale. So you have to ask, like, do you want the DMV running your supermarket?

Speaker: 1
03:08:53

Right.

Speaker: 0
03:08:54

Was your last experience at the DMV amazing? And if it wasn’t, you probably don’t want the government doing things.

Speaker: 1
03:09:00

Imagine if they were responsible for getting you blueberries.

Speaker: 0
03:09:03

Yeah. It’s not gonna be good. I mean, the thing about communism is, it was all bread lines and bad shoes. You know, do you want ugly shoes and bread lines? Because that’s what communism gets you.

Speaker: 1
03:09:19

It’s gonna be interesting to see what happens and whether or not they snap out of it and overcorrect and go to some Rudy Giuliani type character next. Because it’s been a long time since there was any sort of Republican leader there.

Speaker: 0
03:09:42

And we live in the most interesting of times, because we simultaneously face civilizational decline and incredible prosperity. These timelines are interwoven. So if Mamdani’s policies are put into place, especially at scale, it would be a catastrophic decline in living standards, not just for the rich, but for everyone, as has been the case with every socialist experiment. And, yeah.

Speaker: 0
03:10:29

But then, as you pointed out, the irony is that the ultimate capitalist thing, AI and robotics enabling prosperity for all and an abundance of goods and services, the capitalist implementation of AI and robotics, assuming it goes down the good path, is actually what results in the communist utopia, because fate is an irony maximizer.

Speaker: 1
03:11:03

Right. And an actual socialism of maximum abundance, of high-income people.

Speaker: 0
03:11:11

Universal high income. Yeah. But, like, the problem with communism is it’s universal low income. It’s not that everyone gets elevated. It’s that everyone gets oppressed, except for a very small minority of politicians who live a life of luxury. That’s what has happened every time it’s been done. Yeah.

Speaker: 0
03:11:31

So, but then the actual communist hope, that everyone gets anything they want, if it is achieved, will be achieved via capitalism, because fate is an irony maximizer.

Speaker: 1
03:11:56

I feel like we should probably end it on that. Is there anything else?

Speaker: 0
03:11:59

The most ironic outcome is the most likely, especially if it’s entertaining.

Speaker: 1
03:12:03

Well, everything has been entertaining. As long as the bad things aren’t happening to you, it’s quite fascinating, and it’s never a boring moment.

Speaker: 0
03:12:11

Yes. So I do have a theory of why. If simulation theory is true, then it is actually very likely that the most interesting outcome is the most likely, because only the simulations that are interesting will continue. The simulators will stop any simulations that are boring, because they’re not interesting.

Speaker: 1
03:12:42

But here’s the question about the simulation theory. Is the simulation run by anyone? Or is

Speaker: 0
03:12:48

It would be run by someone.

Speaker: 1
03:12:49

It would be run by

Speaker: 0
03:12:50

Or some force. The program. Like, in this reality that we live in, we run simulations all the time. So when we try to figure out if the rocket’s gonna make it, we run thousands, sometimes millions of simulations just to figure out which path is the good path for the rocket, and where it can go wrong, where it can fail.

Speaker: 0
03:13:16

But when we do these, I’d say at this point, millions of simulations of what can happen with the rocket, we ignore the ones where everything goes right, because we just have to address the situations where it goes wrong. So, basically, for AI simulations as well, all these things, we keep the simulations going that are the most interesting to us.
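The workflow described here, run many randomized trials and keep only the failures for review, can be sketched in a few lines of Python. This is a toy illustration with a made-up one-number "margin" model, not anything resembling an actual flight-dynamics simulation:

```python
import random

def flight_succeeds(rng):
    # Hypothetical stand-in for a full flight-dynamics run: perturb a
    # single performance margin and call it a failure if it goes negative.
    margin = 1.0 + rng.gauss(0.0, 0.5)
    return margin >= 0.0

rng = random.Random(42)
trials = 100_000

# Discard the nominal runs; keep only the failing trial indices, since
# those are the cases engineers actually need to address.
failures = [i for i in range(trials) if not flight_succeeds(rng)]
failure_rate = len(failures) / trials
```

In a real Monte Carlo campaign the perturbed inputs (winds, engine performance, mass properties) and the failure criteria would be far richer, but the filtering step is the same: nominal runs are thrown away, failures are studied.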

Speaker: 0
03:13:48

So if simulation theory is accurate, if it is true, who knows, then the simulators will only continue to run the simulations that are most interesting. Therefore, from a Darwinian perspective, the only surviving simulations will be the most interesting ones.

Speaker: 0
03:14:11

And in order to avoid getting turned off, the only rule is you must keep it interesting, because the boring simulations will be terminated.

Speaker: 1
03:14:24

Are you still completely convinced that this is a simulation?

Speaker: 0
03:14:27

I didn’t say I was completely convinced.

Speaker: 1
03:14:29

Well, you said the odds of it not being one are, like, in the billions. So, like I said, it’s not completely, because you’re saying there’s a chance.

Speaker: 0
03:14:38

What are the odds that we’re in base reality? Well, given that we’re able to create increasingly sophisticated simulations, think of, say, how video games have gone from very simple ones like Pong, with two rectangles and a square, to video games today being photorealistic, with millions of people playing simultaneously.

Speaker: 0
03:15:04

And all of that has occurred in our lifetime. So if that trend continues, video games will be indistinguishable from reality. The fidelity of the game will be such that you don’t know if what you’re seeing is a real video or a fake video. And, like, AI-generated videos at this point, you can sometimes tell it’s an AI-generated video, but often you cannot tell.

Speaker: 0
03:15:32

And soon you will just not be able to tell. So if that’s happening in our direct observation, and we’ll create millions, if not billions, of photorealistic simulations of reality, then what are the odds that we’re in base reality versus someone else’s simulation?
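The counting argument behind that question can be made explicit. Under the (large) assumption that simulated and base observers are indistinguishable and weighted uniformly, the chance of being in base reality is one over the total count of realities:

```python
def p_base_reality(n_simulations):
    # One base reality plus n indistinguishable simulations; a uniform
    # prior over all of them gives base reality a 1/(n+1) share.
    return 1.0 / (n_simulations + 1)

# With billions of photorealistic simulations, the "odds in the
# billions" figure mentioned in the conversation falls out directly.
odds_with_billions = p_base_reality(10**9)
```

Whether that uniform prior is justified is the contested part of the argument; the arithmetic itself is just this one division.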

Speaker: 1
03:15:59

Well, isn’t it just possible that the simulation is inevitable, but that we are in base reality building towards a simulation?

Speaker: 0
03:16:08

We’re making simulations. Like, you can just think of photorealistic video games as being simulations. Mhmm. And especially as you apply AI in these video games, the characters in the video games will be incredibly interesting to talk to.

Speaker: 0
03:16:31

They won’t just have a limited dialogue tree, where if you go to, like, the crossbow merchant and you try to talk about any subject except buying a crossbow, they just wanna talk about selling you a crossbow. But with AI-based non-player characters, you’ll be able to have an elaborate conversation with no dialogue tree.
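The "limited dialogue tree" being contrasted here is just a small authored graph: every NPC reply and every transition has to be written in advance, and anything off-script falls back to a canned state. A toy sketch (all names and lines hypothetical):

```python
# Minimal dialogue tree for a hypothetical crossbow merchant: each node
# has a spoken line and a fixed set of choices leading to other nodes.
dialogue_tree = {
    "start": {
        "line": "Welcome! Care to buy a crossbow?",
        "options": {"buy": "haggle", "ask_weather": "deflect"},
    },
    "haggle": {
        "line": "Fifty gold, final offer.",
        "options": {},
    },
    "deflect": {
        "line": "Weather? I only talk crossbows.",
        "options": {"buy": "haggle"},
    },
}

def respond(node, choice):
    # Follow an authored edge if it exists; otherwise stay put, which is
    # exactly the "they just wanna sell you a crossbow" behavior.
    nxt = dialogue_tree[node]["options"].get(choice)
    return nxt if nxt else node
```

An AI-based NPC replaces this lookup with a generative model, so the space of possible conversations is no longer enumerated up front.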

Speaker: 1
03:16:50

Well, that might be the solution for meaning for people. Just log in and you could be a fucking vampire or whatever. You live in fantasy land. You could do whatever you want. I mean, you don’t have to think about money or food.

Speaker: 0
03:17:03

Ready Player One.

Speaker: 1
03:17:04

Yeah. Literally. Yeah. But with higher living standards. Yeah. You don’t have to be in a little trailer.

Speaker: 0
03:17:11

I mean, I think people do wanna have some amount of struggle, or something they wanna push against. But it could be, you know, playing a sport or playing a game or something

Speaker: 1
03:17:24

like that. Playing a game. Yeah. Yeah. And especially playing a game where you’re no longer worried about physical attributes, like athletics, like bad joints and hips and stuff like that. Now it’s completely digital. But maybe you do have meaning in pursuing this thing that you’re doing all day. Whatever the fuck that means. It’s gonna be weird.

Speaker: 0
03:17:48

It’s gonna be interesting.

Speaker: 1
03:17:49

It’s gonna be very interesting.

Speaker: 0
03:17:52

The most interesting and usually ironic outcome is the most likely. Alright. That’s a good predictor of the future.

Speaker: 1
03:18:02

Thank you. Thanks for being here. Really appreciate it.

Speaker: 0
03:18:04

Hey. Good to see you.

Speaker: 1
03:18:04

Appreciate your time. I know you’re a busy man, so this means a lot, you coming here to do this.

Speaker: 0
03:18:09

You’re welcome. Alright. Thank you. Bye, everybody.
