#2247 – Duncan Trussell

Duncan Trussell is a stand-up comic, host of the "Duncan Trussell Family Hour" podcast, and voice of "Hippocampus" on the television series "Krapopolis." www.duncantrussell.com




#2247 – Duncan Trussell Podcast Episode Summary

In this episode of the Joe Rogan Experience, several key topics and themes are explored. The discussion touches on the nature of reality and perception, with a focus on how media influences public attention and perception. The conversation suggests that media often curates a narrative that dictates what people should focus on, which can lead to a loss of personal agency and critical thinking.

A significant portion of the episode is dedicated to research, discussed on the podcast "The Telepathy Tapes," involving nonverbal autistic children who appeared able to communicate telepathically with their parents. The findings challenge conventional understanding and are often dismissed as pseudoscience, which the hosts use to argue for keeping an open mind toward unconventional ideas.

The episode also features a discussion about the existence of aliens and the potential implications of such revelations on society. This conversation raises questions about government transparency and the public’s readiness to accept such information.

Eric Weinstein and Terrence Howard are mentioned in the context of a previous podcast episode, where Weinstein used his expertise to challenge Howard’s ideas, demonstrating the importance of informed debate and respectful discourse.

Actionable insights from the episode include the encouragement to critically evaluate media sources and to focus on personal interests rather than being swayed by curated narratives. The episode also emphasizes the value of open-mindedness and respectful dialogue when discussing complex or controversial topics.

Overall, the recurring theme is the importance of questioning mainstream narratives and maintaining personal agency in understanding the world. The episode encourages listeners to seek knowledge, engage in respectful discussions, and remain open to new ideas.

This summary was created automatically by Speak.

#2247 – Duncan Trussell Podcast Episode Transcript (Unedited)

Speaker: 0
00:01

Joe Rogan podcast. Check it out.

Speaker: 1
00:03

The Joe Rogan experience.

Speaker: 0
00:06

Train by day. Joe Rogan podcast by night, all day. Oh, shit. I didn't know we had bells.

Speaker: 2
00:14

Yeah, bro. We got bells. It’s probably super annoying to people listening.

Speaker: 0
00:18

Bro, it's fucking Christmas. The war on Christmas must end. It did.

Speaker: 2
00:24

How dare we say Merry Christmas?

Speaker: 0
00:25

How dare you say that? It offends me.

Speaker: 2
00:29

Did you ever see Kamala Harris do that when she had this speech?

Speaker: 0
00:32

She's like, how dare you say fucking Christmas?

Speaker: 2
00:34

You never seen it?

Speaker: 0
00:35

No. Oh my god.

Speaker: 2
00:36

Okay. Let’s start with this because it’s so crazy. I don’t understand the context. So, like, I wish I could be charitable and say, well, there’s probably a context where this makes sense.

Speaker: 0
00:47

Yeah. Satan is the lord of the earth is the context.

Speaker: 2
00:51

But you know, you see something that it’s only a 15 second clip, and you’re like, okay. Let me just be the nicest person possible.

Speaker: 0
00:58

Yes.

Speaker: 2
00:58

Like, what what could be the reason why you would say, how dare we say Merry Christmas? Yeah. Unless you're playing a character.

Speaker: 0
01:04

Right. Well she’s

Speaker: 2
01:05

on stage doing a play. She’s like, I wanna read ram my college play where I was the Grinch.

Speaker: 0
01:11

Can you imagine saying that? Like, it seems like a nightmare that you would wake up from.

Speaker: 2
01:16

It says Harris fumed at people saying Merry Christmas before illegal migrants were protected in resurfaced clips.

Speaker: 0
01:24

Oh, you're not allowed to say it until there's absolute peace and harmony on the planet, then we could start saying

Speaker: 2
01:28

it again. This is so scolding and weird.

Speaker: 1
01:30

Only when they cleared that vet did we give them DACA status, and now we’re talking about taking it away. It is morally wrong.

Speaker: 0
01:41

No. Thank you. I did not And

Speaker: 1
01:43

when we all sing happy tunes and sing Merry Christmas and wish each other Merry Christmas, these children are not gonna have a Merry Christmas. How dare we speak Merry Christmas? How dare we? They will not have a Merry Christmas.

Speaker: 0
02:01

Who are you to say that?

Speaker: 1
02:02

Don’t know if they will be here in a matter of days, weeks, and months. Since September 5th, over 12,000 have lost their status.

Speaker: 2
02:14

This is the the here’s why you can’t, be charitable because it’s just a bad perspective. It’s just a bad perspective.

Speaker: 0
02:23

Charitable. What do you mean?

Speaker: 2
02:24

Because if you wanted to, like, look, does anything she said make sense? Oh. They're not mutually exclusive. Alright? You can't. It's like celebrating joy and happiness and some people suffering. It's like you can't you can't say no one is going to suffer anywhere before I celebrate. No.

Speaker: 2
02:47

Because that's crazy. Now you're taking in the entire Earth's consciousness and all of its decisions Yeah. As to whether or not you will or will not be happy. Like, you and I didn't force anybody to work in the cobalt mines. Not yet.

Speaker: 2
03:02

We buy these fucking phones. We buy these fucking phones, and we know we know that electronics that have cobalt in them were probably pulled out of the ground by slaves. Should we never celebrate anything again until those people are free?

Speaker: 0
03:18

No. Never. Never. We should just be shitting in our hands, rubbing it in our faces, whipping our backs until the whole world experiences it simultaneously, then Merry Christmas to you.

Speaker: 2
03:30

But if you were a drone, so let's just say they really are intergalactic beings, and you're watching all of our hypocrisy and our scolding of each other and these, like, untested perspectives just jizzed out into the world Yeah. And you're looking at all this craziness, like, the the manufacturing of almost everything that we have that comes from overseas is probably from horrible conditions.

Speaker: 0
03:59

Yeah.

Speaker: 2
03:59

And we’ve just accepted that. Yeah. Like, if aliens were watching this to be, like, who are they bullshitting? Who are they bullshitting each other? They’re bullshitting themselves?

Speaker: 0
04:08

Right. They’re

Speaker: 2
04:08

trying to figure out how many genders there are. They're trying to they're trying to decide, like, who's got the most protected status, who you can't say anything about? Yeah. Currently, that's illegal immigrants?

Speaker: 0
04:20

Yeah. Well, you know, I was just dude, for I don't know why I started doing this. Highly recommend it. I started listening because I forgot a lot of the new age stuff. So I started listening to new age channeled audiobooks, aliens channeled through new age people.

Speaker: 2
04:39

Oh, cheers, my brother.

Speaker: 0
04:41

Cheers. Merry Christmas.

Speaker: 2
04:42

To a great start. Isn’t that Seth Speaks? Isn’t that It’s a whole genre. But is that Seth Speaks person? Is that that’s the whole deal behind that. Right?

Speaker: 0
04:53

It's okay. So it's again, I'm like, my mom got into it briefly because she dated this new age dude, and I fucking hated it. He wore Birkenstocks. He'd force us to go on hikes. He wouldn't let me take my fake gun. I'm like, well, you know, you're a kid. You wanna take your fake gun on the hike? He's like, we don't do that on hikes. Oh. You know?

Speaker: 0
05:11

The fucking, fascist hike where you’re forced to recognize the beauty of nature, and it’s like, dude, don’t put that on me. I I’ll I’ll find it on my own. But he got my mom in a new age stuff, and this was prime new age time. This is like this is when they all killed themselves.

Speaker: 0
05:30

It's like they were part of it too. You remember the the they were wearing the sneakers. What were they called? Heaven's Gate. Oh, yeah.

Speaker: 0
05:37

That was a new age cult. So, I remember, like, watching these old grainy VHS tapes with my mom and this dude and, thinking they were cool or even, like, there was some sound that was playing in one, and my mom looks at me, like, hopefully. Like, do you recognize that sound?

Speaker: 0
05:55

Because I guess

Speaker: 2
06:00

here’s the thing about all this. I think some telepathy is real. It

Speaker: 0
06:07

is real.

Speaker: 2
06:07

I think it is real. Have you listened to the telepathy tapes? No. You haven’t?

Speaker: 0
06:12

I haven’t listened to it.

Speaker: 2
06:13

It's a new podcast that's out, and it's all about this scientific research that was done with nonverbal autistic kids and their parents. And they were able to go into another room, and they would, bring up things to one, whether it was I think it I think they'd bring things up to the mom or the mom would say things.

Speaker: 2
06:35

But the kid was accurate 95% of the time. Wow. With numbers, with colors Yeah. Like, not not like three numbers in a row. Yeah.

Speaker: 2
06:46

But you know how crazy that is just to guess three numbers in a row 95% of the time? Yeah. Like, whatever it is, they think it's real. I'm only on episode 2. So, but it's really fascinating, man, because it's a dismissed thing. It's a woo woo thing.

Speaker: 0
07:01

Sure.

Speaker: 2
07:02

But if it's real, shouldn't we study it like it's real? And it seems like through scientific study, it's real.

Speaker: 0
07:09

Yeah. I think I would it’s definitely you’ve probably experienced it. I’ve experienced it.

Speaker: 2
07:13

I think it's an emerging part of human consciousness that we don't we don't agree to or we don't admit to. Like, we know there's something there, but we're like, that's too silly. It's just there's so many people that fake it. That's the problem. Right. Because everybody, like, wants to have some special thing that they have. You always have a special thing. You have a special thing, Duncan. You have a special talent.

Speaker: 0
07:39

I mean, think of all Yeah. The, like, Carrie Yeah. Stranger Things. Yeah. This is the fantasy. When I was a kid, dude, I would sit when my dad was working in his apartment and try to make shit on the table move with my mind because I'd been reading books on telekinesis. One day, you know, like, when when, like, you've got a cold drink and it gets a little wet on the bottom? Uh-huh.

Speaker: 0
08:05

One day while I'm doing that, because of that, it slid forward and, like, I I was, like, totally freaked out because I thought I had used telekinesis to slide it. It was just luck. It was just luck.

Speaker: 2
08:18

It was just a badly balanced floor.

Speaker: 0
08:20

It was a it was just a shitty fucking apartment in College Station. But, you know, that’s the that once you recognize the flaw in the operating system in humans is so like, as a kid, like, for a few days, I was like, shit. I might be telekinetic. But, like, once you know people want that or wanna believe in it and how easy it is to manufacture those moments and then claim responsibility.

Speaker: 0
08:43

Holy shit, dude. You can really pull some strings on people because there’s an assumption. Let’s say some I do know. I I really believe in telepathy. I’m positive it exists. But the assumption there then would be, like, you get around a telepathic person. Well, they must be good because they’re telepathic. Right?

Speaker: 0
09:02

They’re they’re they’re a magic, so we should trust them. This is where people get real fucked up. These are called in India, they call them siddhis, which is if you meditate a lot

Speaker: 2
09:12

Right.

Speaker: 0
09:13

You begin to, like well, I would say comedy is a siddhi. Oh, you know, it's not special. I was talking to Luis Gomez about sales. You know? That's like that's the really good people getting in your head and getting you to buy shit. He was saying it's, like, basically magic. And it's like so Hypnosis. Hypnosis.

Speaker: 0
09:31

Yeah. But, man,

Speaker: 2
09:33

Have you ever been hypnotized?

Speaker: 0
09:34

Yes. I have. It's interesting. Right? Dude, my mom hypnotized me when I had a wart because she had heard you could hypnotize people and the wart goes away. Hypnotized me, said something about the wart going away. Within a couple of weeks, I swear to you, that wart fucking dried up and just fell off my hand.

Speaker: 2
09:54

Woah. Yeah.

Speaker: 0
09:56

Woah. What the fuck?

Speaker: 2
09:57

Woah. What the fuck? So that's the placebo effect. The placebo effect is real. You know, I had a guy tell me this once. He was like a a kind of a wacky healing chiropractic type guy. Yeah. And he was telling me that if you believe what I'm saying is true because I was asking him, like, how does this work? Like, how how is this working?

Speaker: 2
10:18

Like, how are you healing people by by working on, like, passing hands on things? If you believe it works. Oh, so it's a lie, but if I believe the lie so what are you selling?

Speaker: 0
10:30

I’m like,

Speaker: 2
10:30

you're just you're just like fucking manipulating people and saying mumbo jumbo muscular structural words.

Speaker: 0
10:37

Yeah.

Speaker: 2
10:38

And you’re doing you’re doing hypnosis, kind of. Because you’re you’re sort of admitting that by healing, like, a person who’s gonna heal you with words and talking and touching you, you’re they’re tricking you into doing it yourself.

Speaker: 0
10:51

Well, I mean, the placebo effect

Speaker: 2
10:54

It’s real.

Speaker: 0
10:54

Is real. Like, I've heard it's one of the most powerful effects in medicine.

Speaker: 2
10:58

Is it really?

Speaker: 0
10:59

Yeah. Well, I mean, yeah. You think of, like, the new cancer drugs. They tell your immune system what to attack. Right? So if somehow you you could do that without the drug, if there and and that's where it gets interesting. Right? Because we these are our bodies Right. Perfectly metabolizing, transforming so many things instantaneously. The heart effortlessly beating all the fucking time. So theoretically, purely theoretically, what if you could control more of it?

Speaker: 0
11:34

Like, how much of this thing can we actually control? And by the way, that's a really fun thing to think about because, like, not much. And so do you ever think about that? Like, you sort of think like, okay. Like, I'm how much of my body can I really do anything about?

Speaker: 0
11:52

I can eat good food. I can ai. But all the quantum processes that are happening within, all of the things, you kinda realize you’re just the tip of the iceberg. You’re just the little yappy tip of the iceberg. Yeah.

Speaker: 0
12:04

And underneath it is all this stuff that is you, but really isn’t you if having control of yourself is, like, a way to identify this is me. So what are you in that swirl of particulates? Like, what are you in there?

Speaker: 2
12:20

Yeah. What are you? Yeah. It’s That’s most people, and that’s one of the reasons why ideologies are so interesting because it’s the same thing. It’s the same person. It just they’ve agreed to one thing or they agreed to the other thing. And it could be how you were raised or it could be you rebelling or it could be but people find a way to fucking slip into a groove Yeah.

Speaker: 2
12:45

And it’s so much easier Yeah. Than trying to look at, like, what is this? Yeah. What is this thing we’re doing where I’m making noises with my mouth? You’re reading my mind. Yeah. And we’re, like, broadcasting it to the world. By making noises with your mouth, we’re we’re speaking through each other’s minds.

Speaker: 0
13:03

Yeah. And also though, you know, when you get into the telepathy idea, which is sort of like the question is like, you know, right now we identify our minds as some kind of neurological process. Right? So the idea is, like, we have this, like, ai computer and somewhere in there is our mind.

Speaker: 0
13:25

Everything out here, not our mind, even though everything out here from a neurological perspective is our mind. Everything you're seeing is an instantaneous, interpretation of a variety of phenomena that gets compressed into reality. And then you say, oh, out there is that that's not me, but it is you.

Speaker: 0
13:49

It's like it's you in the way if you put on VR goggles, you know, except in this case, the VR goggles, it's your neocortex. It's all the processes that are making color, light, sound, etcetera. So if we're sort of sharing a dual reality, which is all the phenomena that's being interpreted into our minds, somewhere in there is a possibility that we're we're we all share a mind.

Speaker: 0
14:10

So from that perspective, all these other things become possible. Telepathy, all of this stuff. Like, you know, you get around funny people, you get funnier. When I when I was doing The Midnight Gospel, I was around all these artists. I got better at drawing. Like, you share a mind.

Speaker: 0
14:26

It's the gestalt or, you know, the where two or three or more of you gathered, there will I be. Like, something else comes in the room. And

Speaker: 2
14:34

I think we’re collaborating with something that we’re we’re we we don’t truly understand because we’re still trapped in primate bodies.

Speaker: 0
14:42

Yes.

Speaker: 2
14:43

So I think I think we have these moments of recognition of these connections, you know, in in great moments in life and these beautiful things that can happen. And it’s all being twisted up by this ape, this wild ape that had to survive for 1000 and 1000 of years by killing its neighbors

Speaker: 0
15:06

Yeah.

Speaker: 2
15:07

And and eating monkeys and and fucking running around and clubbing things to death and eating raw meat Yeah. Until it figured out how to harness fire. And then it had to deal with neighboring tribes coming in with hordes of people with swords and spears. Yeah. You had to run for the hills.

Speaker: 2
15:24

They killed your kids in front of you. They fucked your wife in front of you. Yep. They cut your dick off and stuffed it in your mouth, and this was thousands and thousands of years. Yeah. Yeah. This thing we’re doing right now is so recent.

Speaker: 2
15:39

Yeah. This thing where you can meet strangers, you don’t have to worry about killing them

Speaker: 0
15:43

Yeah.

Speaker: 2
15:44

Is super recent.

Speaker: 0
15:46

Dogs aren’t there yet. That’s why dogs freak out when someone comes to your door. They’re not there yet. They still remember the old days. And

Speaker: 2
15:52

the the That’s a great point.

Speaker: 0
15:54

That that they're still like, dude, usually, if someone's coming up Yeah. We have to kill it. Like and they're reminding you of that. You know? That and and it's true. I mean but and if you look at that collective epigenetic trauma as an egregore, as a ghost, a ghost haunting the planet, the ghost of, like, not that long

Speaker: 2
16:16

ago. The primate past.

Speaker: 0
16:18

The gross ghost of primate past haunts us. Yeah. And it that’s why it’s so easy to slide into, aggressive patterns and defensive patterns that are completely unnecessary.

Speaker: 2
16:30

And that's what they are too. This is what you have to realize. It's not you. It's patterns that you've selected and you've you've selected them over and over and over again, and they've become you. It's like you went down a groove. You don't have to stay on that groove. No. You don't have to.

Speaker: 2
16:46

But I think you have to find something in life, that’s physical that you enjoy because I think that’s one of the best ways to manage this fucking weirdness. The absolute best way is through getting physically exhausted. Sure. Just get get on purpose, get physically exhausted, and then you can manage the craziness Right. Of existing. Yeah.

Speaker: 2
17:12

Because everybody wants to pretend that it's normal. Everyone wants to pretend that existence is, like, oh, you know, you get up in the morning, you fucking have your eggs and your bacon and you doodle your

Speaker: 0
17:21

shit. Ai Chaparron sai?

Speaker: 2
17:23

Yeah. Exactly. Exactly. It's like every day. It's like, oh my god. This is happening. What do you think the drones are? What do you think the drones are? You know? How much did Nancy Pelosi make this week in the stock market?

Speaker: 0
17:35

Yeah. Oh,

Speaker: 2
17:36

yeah. Like, what what is this?

Speaker: 0
17:39

Right.

Speaker: 2
17:40

What are we doing now?

Speaker: 0
17:42

Did you see the new shit that they found out about consciousness in the human brain? This popped up on my feed. This dude, Penrose, this guy used to be an anesthesiologist. He already knew about these neurological structures that are these quantum tubules that apparently anesthesia impacts, and he began to think maybe consciousness is not associated as much as we thought with the, with with neurons, but is a microstructure within the brain, these quantum tubules that get shut down when there's anesthesia.

Speaker: 0
18:15

And so there’s this new controversial sort of emergent theory of consciousness, which is that, when you are awake, you go from being a wave to a particle. You in other words, the whatever you wanna call it, the I am, the, all one situation that we actually are experiencing gets compressed Right.

Speaker: 0
18:38

Into a particle, which is your experience of reality. But when you fall asleep or take acid, you go into a superposition. When that's that feeling of being connected to everything, part of everything, not even being there anymore. So we're those things simultaneously? And and and I guess as far as the default reality that that you're talking about, that's a situation where it's a bunch of particles that have focused in on a on, like, a buffet of, moments that the news curates.

Speaker: 0
19:11

So the news is like, okay. Be mad at this person. This person’s wrong. This person’s right. Here’s what you should be afraid of. Here’s a celebrity that sucks.

Speaker: 0
19:20

You know? That’s the whole business model. The business model.

Speaker: 2
19:27

And that’s the way we get the news.

Speaker: 0
19:29

That’s it.

Speaker: 2
19:30

Isn’t that crazy? And sponsored by pharmaceutical drug companies.

Speaker: 0
19:34

There you go.

Speaker: 2
19:34

And everybody else I was watching a regular movie the other night. I was in a hotel, and so the only thing they had in the hotel was regular movies on TV. Yeah. Yeah. So I was watching John Wick on TV, and it's every 5 minutes

Speaker: 0
19:51

Oh, yeah.

Speaker: 2
19:52

You’re bombarded with nonsense. They stop the show and give you 5 minutes of nonsense.

Speaker: 0
20:00

That’s right.

Speaker: 2
20:01

Just nonsense about the drugs and side effects.

Speaker: 0
20:04

It’s unnerving. And, also, when you realize we think the show is John Wick, that ain’t the show. No. The show is the nonsense that’s happening in between John Wick. Because when you see go out when you’re watching a good movie, you relax, you calm down, you open up. It’s the perfect, perfect state of consciousness to manipulate people.

Speaker: 2
20:27

I also thought it was incredible that they bleeped out all the bad words when the commercials were far more offensive. Yeah. They bleeped out fuck. They bleeped out this I was like, I’m like, how are they gonna handle this scene? Because there’s a scene where the the Russian mobster, his, son, comes home from this job in Atlantic City and after he did this thing with John Wick.

Speaker: 2
20:52

And, the dad's like, who the fuck nobody? He goes, that fucking nobody is John Wick. And it's like this is the whole setup of John Wick. And it's that nobody.

Speaker: 0
21:04

Like, no. You're gross. You can't say fuck.

Speaker: 2
21:07

You took out the fuck, but meanwhile, you're telling me about a bloody diarrhea that might kill you if you take this drug. Yeah. You're you're telling me about its side effects that are like suicide, like, all kinds of, like, wild shit, depression, anxiety, fear, violent tendencies Dude. Gambling addictions.

Speaker: 0
21:28

This is so when you think about the idea that if, you know, what a group of witches is called a coven. A group of Christians is called a church. The idea is, sort of simultaneous prayer causes change. Now there's different words where some people call the prayer spells. Some people call the prayer a pep rally. We're gonna go go go. You look at the football game.

Speaker: 0
21:52

You're seeing covens of witches cheering to direct energy at the team they want to try to, like, move the needle a little bit. But when you consider the power of directing little bubble universes, which is every single human, focusing that beam of attention onto certain things.

Speaker: 0
22:12

Dude, you could you’re you’re you not only are you going to create whatever it is you wanna create in the case of an advertiser, make some money, but, theoretically, you could guide history that way. And the and and the last thing you want them to figure out is if they all stop focusing on what you’re telling them to focus on and trust themselves enough to focus on what they wanna focus on, which is usually not bad, then all of a sudden you would lose that kind of magical control.

Speaker: 0
22:41

You lose the actual steering wheel of the, of the weird vehicle we’re in. You know? And and we you know, they’re like, it’s democracy. The steering wheel is your vote and the president and the elected officials who guide the country. But the real steering wheel is here’s what what we’re gonna get you to pay attention to. You need to pay attention to this right now.

Speaker: 0
23:01

And if we all pay attention to that, it like, where attention goes, energy flows.

Speaker: 2
23:07

You know you know what I think it is? I think it’s like if we’re in a factory if we’re in a factory and there’s certain gears that turn certain machines, and they think they’re the only thing that exists. Yeah. But it’s a chain of things that have to take place in order to manufacture something, like a Tesla. Yeah.

Speaker: 2
23:30

Like, imagine if you are if if we just don’t realize it, but if everything has a consciousness, at least in some sort of a limited capacity. Yeah. Literally everything, even tables. Yeah. Everything has something. Yeah. We’re just we’re we’re super egotistic, and and we believe that only we possess this.

Speaker: 2
23:48

But we know dogs have it too, which gets where it gets weird. Yeah. Animals have it. We know that, you know. But this whole thing that we’re doing is trying to understand how we we interface. Ai, how are we doing this?

Speaker: 2
24:06

Like, if if we're in a world where it's 2024 and there's drones flying over New Jersey, and they're gaslighting us saying they're all airplanes. They're saying we have it under control. Yeah. And then it appears there was a satellite that was shot out of the sky. Yeah. Have you seen that? No. No. You haven't seen that?

Speaker: 0
24:25

I missed it. So this

Speaker: 2
24:26

is the big conspiracy. And, again, I have done no research, so do not believe me, ladies and gentlemen.

Speaker: 0
24:32

Okay, elf.

Speaker: 2
24:33

The big conspiracy is that these are Chinese drones, and they're being piloted by a satellite that they shot out of orbit. Oh. That's the and this is a conspiracy? Unfounded conspiracy? Unfounded. But I'm just for funsies for funsies.

Speaker: 0
24:48

Well, I mean, do you remember when those weird green fucking lights showed up in Hawaii?

Speaker: 2
24:54

Well, I remember where there was a ship. Right? And there was, like, these triangle looking things that were flying over a ship.

Speaker: 0
25:00

Laser lights that shot out of the sky. Oh, yeah. Remember that? That’s right. These

Speaker: 2
25:05

are what was that?

Speaker: 0
25:07

What are the drones? I mean, that’s what’s what I love about the drones is I mean, aside from the obvious, like, you know, getting to imagine ai, it could be they’re chasing orbs and the orbs or whatever. What I love about the drones is that it’s another step in shaking people awake. You know what I mean?

Speaker: 0
25:26

Because it’s like part of living in default reality, I think, is you sort of lean into the idea that the government is you could trust. You can trust the government. Of course. Like, you have

Speaker: 2
25:39

Trust the people that make the weapons.

Speaker: 0
25:41

You could trust them.

Speaker: 2
25:42

They’re really ai good guys.

Speaker: 0
25:43

Sure. Yes. Some of them, you know Are hyper violent. A little or whatever. Yeah. But, ultimately, we can trust these people. And so then you have over fucking New Jersey these vehicles that people are filming. Welcome to Earth, bitch. Do you see that one? And it's so funny. People in New Jersey are reacting to them.

Speaker: 2
26:06

And you see the guy shooting into the sky at the drone? Yeah.

Speaker: 0
26:09

Oh, cool. I mean, I'm surprised more people haven't.

Speaker: 2
26:11

Well, the problem with that, you fucking idiots, is that bullets fall. Okay? And they fall with almost the same kind of velocity. I mean, I'm sure that they lose a they lose a lot of speed, but it's enough to kill people. People definitely died from bullets falling.

Speaker: 0
26:24

You know what else falls? Drones. Experimental fucking drones. The government’s flying over fucking New Jersey hoping those fuckers don’t crash. They’re apparently half the size of a car. You know? Some of them are bigger, bro.

Speaker: 2
26:36

Some of them are suburban size.

Speaker: 0
26:37

Yeah. And these are flying over oversized. Flying over houses.

Speaker: 2
26:40

Oh, yeah.

Speaker: 0
26:40

So it's just like, no. Listen. Number 1, most of what you guys are seeing, it's stars or you're seeing, commercial vehicles Yeah. Mostly. And and the other stuff, we don't really know. And so then at that point, you're like, wait a minute. I'm paying almost half of my income in fucking taxes, so you should know what the mystery things flying over the cities are.

Speaker: 0
27:05

And you don't know what that fucking is? What am I paying you for? You know what I mean? You're making a lot of money, man. You should know what the drones are. And so but then when you see what's his name? Bolton? Is it Bolton?

Speaker: 0
27:17

The the the

Speaker: 2
27:19

The guy with the mustache?

Speaker: 0
27:20

Not Bolton. When you see, I don't know, the DOD dudes up there and the way that they're just lying their fucking asses off. Did you see the press secretary talking about it, and she's wearing a necklace that looks like a UFO?

Speaker: 2
27:33

Is it Karine Jean-Pierre, that lady?

Speaker: 0
27:34

Not Jean-Pierre. It was another one. It was some— A new one?

Speaker: 2
27:37

It— I’m— it was a— How do they just shuffle new people in without announcing?

Speaker: 0
27:40

I don’t think they got rid of Jean-Pierre. Okay.

Speaker: 2
27:43

I hope not. Because most of them don’t last as long as Jean-Pierre. She’s, like, the marathoner. Oh, dude. Most of them, they quit that job. They’re like, fuck this job. I just have to lie all the

Speaker: 0
27:53

time. Horrible.

Speaker: 2
27:55

Horrible. Imagine, like, Duncan, this is what you’re gonna sell: war with Sudan. Okay. Okay? Here’s the reason why. Alright. The rebels. Children problems Children problems. Economy Okay. Pollution.

Speaker: 0
28:07

Okay. Okay.

Speaker: 2
28:08

We gotta vaccinate them.

Speaker: 0
28:09

Got it. Got it. There’s a lot of data right now, though. But then what about the data showing vaccinations are bad for you?

Speaker: 2
28:15

No. No. Fuck that data. These people are in trouble, and we need to help them. We need to help them. So war, Sudan.

Speaker: 0
28:22

War, Sudan. War, Sudan. Got it. Okay. No problem.

Speaker: 2
28:24

Sell it.

Speaker: 0
28:25

Sell it. I got it. I’ll sell it. I’m going out there.

Speaker: 2
28:27

You just put on your Rachel Maddow glasses.

Speaker: 0
28:31

The Rachel Maddow

Speaker: 2
28:32

glasses. Everybody’s wearing them to look super serious.

Speaker: 0
28:35

Well, this is— it’s part of the costume, isn’t it? Like It is. And, you know, we are also—

Speaker: 2
28:40

wearing costumes while we’re saying this.

Speaker: 0
28:44

We are in elf costumes. But, you know, but, yeah, I don’t understand the Rachel Maddow glasses phenomenon, but I have done research into it because I wanted to create a vision board of all the people wearing those glasses, and it’s a thing. It’s like a thing on the left.

Speaker: 0
29:01

They wear those fucking glasses, and it identifies that you have a certain set of beliefs if you’re wearing the Maddow glasses. I mean, it’s a real thing.

Speaker: 2
29:09

It is a real thing. Yeah. If you if you have those glasses on and you’re a Republican, you’re an assassin. Like, you’re a guy who kills people for a living. You’re you’re a very strange person.

Speaker: 0
29:21

Dude, isn’t that— but that’s, to me, the Invasion of the Body Snatchers experience. I love— have you seen that movie in a while? Or the remake?

Speaker: 2
29:29

I I I saw the remake, but it was a long time ago. Right? Wasn’t it, like, 5 years ago? How long ago

Speaker: 0
29:34

was it? The one I like is from the seventies.

Speaker: 2
29:37

Oh, the Donald Sutherland one. Yeah. Yes. That one’s amazing. Amazing.

Speaker: 0
29:42

Yeah. Right. Like—

Speaker: 2
29:43

they point at you and make that noise?

Speaker: 0
29:44

Yeah. Oh. And me. That’s it. Oh my god. Sutherland, you’re killing it. Like how creepy his eyes are. Yeah. That’s it, bro.

Speaker: 2
29:55

You imagine— can you imagine that reality? And by the way, not that hard. You know what’s way harder than that? Building a planet.

Speaker: 0
30:05

Right? Yeah.

Speaker: 2
30:06

Right. There is plenty of planets. Yeah. That is not that hard. That is essentially what happens all the time Yeah. In, like, the insect kingdom where they get infested by another parasite that controls their brain.

Speaker: 0
30:16

There you go. Dude, I went down a deep rabbit hole with this shit because, you know, I was, like, looking right after Trump won, which, by the way— can I remind you? I’m sorry. I don’t wanna pat myself on the back. But when we were hunting for Bigfoot, do you remember I said to you, one day, you’re gonna get a president elected?

Speaker: 2
30:34

Did you say that? No. Back then, that would have been the least likely scenario. I’m in the woods with the Fear Factor guy, and we’re legitimately looking for Bigfoot. Legitimately. Legitimately with Bigfoot experts.

Speaker: 0
30:51

Experts. That was one of my favorite camping trips I’ve ever had

Speaker: 2
30:53

in my life. Fun. I would love that

Speaker: 0
30:55

to do that again. Hunting for Bigfoot. It’s like— because hunting for animals, which I know you love, I have nothing against it, but you still gotta kill an animal. Hunting for Bigfoot, you just, like, look for a twig out of place, and you get to imagine he’s real. And it’s really fun. Squatching is fun.

Speaker: 2
31:12

Somebody asked me if, I saw Bigfoot, would I kill it? Because if I could kill it, then I could show people that it’s real.

Speaker: 0
31:21

Interesting. Interesting. Would you?

Speaker: 2
31:23

No. Why would I kill Bigfoot?

Speaker: 0
31:26

Well

Speaker: 2
31:26

But this, like, why would you do that? It doesn’t make any sense. Just to prove that it’s real? Well, guess what? I guess we can’t prove it. There’s no other way. It’s not like I tell you where it happened, and you fucking close off a 1,000 square miles and start pushing in with soldiers.

Speaker: 2
31:40

Well— just don’t kill it. Why would you kill it, stupid? I’m telling you where it is. Why do you want me to shoot an arrow through it?

Speaker: 0
31:46

You know, man Why don’t

Speaker: 2
31:47

you just trust me and spend $1,000,000,000 on drones and imagine, like, why did we spend all this money? Joe said he saw Bigfoot, and so we went looking.

Speaker: 0
31:57

But by the fucking way, no

Speaker: 2
31:59

one would spend any money to go look for Bigfoot.

Speaker: 0
32:01

If you can fly a swarm of drones over fucking New Jersey You

Speaker: 2
32:06

could find Bigfoot.

Speaker: 0
32:07

You could fly them in the Pacific Northwest, and we’d know once and fucking for all. See, this— you know, they talk about democracy, and sometimes I like to think about what actual democracy would look like. And, you know, it wouldn’t look like some dude getting in front of a microphone and gaslighting your ass about experimental craft. It would be like, alright.

Speaker: 0
32:26

I’m just gonna tell you guys we figured out anti gravity. Those are anti gravity drones. We wanted to show you, and tomorrow, we’re gonna drop ketamine on the neighborhoods. Democracy. But, but you know what I mean?

Speaker: 0
32:43

Like, that would be true democracy versus what we have right now, which is sort of democracy. It does work. The voting works and all that stuff, but, ultimately, our impact— the nonpolitical class’s impact is very little. And the political class’s impact is very little when you consider now there’s a security class.

Speaker: 0
33:04

So you have the politicians like Harry Reid trying to figure out what the fuck is going on with the UAPs, and even they can’t do it because there’s another level. And that level is, like— those are the people controlling things because they know the secrets. It’s just so infuriating to me that now they feel comfortable enough to fly whatever the fuck these things are over a major city and not tell us what they are and then say we don’t know what they are.

Speaker: 0
33:33

Because if the reality is they don’t know what they are, if we’re going to believe them, which I guess you’re just not supposed to— You know what I mean? Are you supposed to? Like, no. I don’t think you’re— I think at the point where they’re just telling you it’s, like, stars Okay.

Speaker: 2
33:49

But let’s be honest. If you are in possession of the actual information, you know what it is. You know it’s China or you know it’s aliens or you know it’s a combination of both. Yeah. Or it’s US government or it’s all 3.

Speaker: 0
34:04

Sure.

Speaker: 2
34:04

Maybe it’s all of the above. How the fuck do you tell people that? How do you tell people that while you’re also governing? You’re also doing all these different things. You’re very busy. How does the president get on television and say, ladies and gentlemen, aliens are real?

Speaker: 2
34:20

Let’s do it. We are being visited on a regular basis by nonhuman intelligence that is far superior to our own. Sure. We don’t understand why they’re here. We have been working with them. We have back engineered their products, and that’s how you got fiber optics Yeah. And capacitors. Right.

Speaker: 0
34:38

And all

Speaker: 2
34:38

these things that sort of emerged Yeah. After Roswell. Yeah. Everything. That’s the most fun one.

Speaker: 0
34:44

You were bioengineered. They, you know, they seeded your culture with

Speaker: 2
34:47

your religions. Everything.

Speaker: 0
34:49

It’s for a good cause. Everything.

Speaker: 2
34:51

Yeah. You— We’re a piece of the fucking factory, dude. That’s what we are. Right. We’re a piece of the factory that doesn’t recognize that there’s a whole other building connected to this that’s filled with machines.

Speaker: 0
35:03

Yeah. We’re a piece. That— well, that’s— so okay. So that is exactly what you want pieces of your factory to think, right? And that is why at any moment, anybody can actually just turn the channel. You’re not a piece of the fucking factory. Actually, you’re the universe.

Speaker: 2
35:22

You are the universe.

Speaker: 0
35:23

And you’re the universe who has been— dude, I mean, look what they can do to lions at a circus. It’s a deadly fucking thing. They can make it jump through hoops. They can make it catch a Frisbee. Right? So think of the

Speaker: 2
35:38

of the time.

Speaker: 0
35:40

That’s right.

Speaker: 2
35:40

They can’t. Those those make for some wild Instagram videos.

Speaker: 0
35:45

Oh, they do. And when and when There’s a lot

Speaker: 2
35:46

of those out there.

Speaker: 0
35:47

There is. It’s the assassination of a fucking CEO. And by the way, like, I am not, like, assigning any kind of, like— I think it’s a slippery slope if we start publicly fucking executing CEOs. Like, if you start the— you know what I mean? That’s a real slippery slope.

Speaker: 2
36:03

Super slippery.

Speaker: 0
36:03

But I’ll tell you. If you sort of look at the factory, the way it works is, like, number 1, you really aren’t supposed to identify the, like, what’s causing a lot of suffering. Once you start making those identifications, then you follow through with some kind of action based on those identifications.

Speaker: 0
36:26

Number 1, the action can’t be there based on the rules of the factory. Of course, the factory is gonna create rules. You can go out with your fucking signs or whatever if you’re at the right place, not at Amazon where they arrested those people protesting. But there’s places in the factory where you can go and be like, I need more oil. I’m squeaking.

Speaker: 0
36:46

But only once in a while and only in the right way. A peaceful protest is what we call it. You do it at the wrong time, it’s a fucking insurrection. You know what I mean? So the factory’s got rules about how we do this. Right.

Speaker: 0
36:59

So the moment you go outside of those rules, the moment you, like, actually— and to do that, you have to somehow really think outside the factory— then you see something like that happen. And then you see the way the factory responds, which is the perp walk they did with that dude. They’ve got fucking SEAL Team 6 walking that guy in.

Speaker: 2
37:19

He’s handsome.

Speaker: 0
37:20

He’s movie-star handsome. Oh, he’s a— yeah.

Speaker: 2
37:22

In a movie, if you saw that handsome guy getting arrested and there was, like, SEAL Team 6 behind to protect him

Speaker: 0
37:27

Yeah.

Speaker: 2
37:27

You’d be like, there’s no way they would do that. That’s right. It’s just a regular killer. Yeah. There’s no way they would have that many guys guarding that guy.

Speaker: 0
37:35

Well, they’re not guarding that guy. Look at that. Look at that. They’re sending a fucking message. They’re saying, listen. We will surround you. And like that— so because, like, what’s really scary about what he did is, like— and I think if you wanna, like, take murder, cold blooded murder, and just for a second call it activism, what that guy did is he didn’t just, like, you know, send a message, which is really scary for people like CEOs, which is saying, listen, man.

Speaker: 0
38:05

Like, you can’t keep fucking us with the insurance. If you do, you’re not safe. And so that’s scary as fuck because the CEOs, of course, are the ones who pay for the lobbyists, who pay for the laws. And so he sends a message of a methodology, which, again, I think, if we’re gonna get into a better place— using violence, I just don’t think that’s the path.

Speaker: 0
38:36

But just as an analysis, dude, I would say you could expect more of that to happen, and that is going to lead to the Darth Vader people coming out more.

Speaker: 2
38:47

You know? You’re not cosigning it.

Speaker: 0
38:49

I’m not cosigning it at all. No. No. No. No.

Speaker: 2
38:51

It’s it’s a realistic assessment. Like, there’s something going on. People are very upset, and they’ve been able to do this to people for so long, deny people treatment for so long.

Speaker: 0
39:02

Like, you remember when your dad had to come downstairs? Like, you’re fucking up or you’re you’re, like, misbehaving with your brother. You’re doing something really bad. You set something on fire. Right. Your dad comes downstairs. He’s been at work. He’s fucking pissed. That’s how you know you’re really fucked. Mhmm.

Speaker: 0
39:17

When people start doing stuff like that, then the dad has to come downstairs. And when the dad comes downstairs, it looks like the dudes in the Darth Vader outfit. All of a sudden, this, you know, this facade, for a second, they have to stop the show, turn on the fucking lights.

Speaker: 0
39:32

These guys in fucking full body armor come out, spray chemicals into your face, and drag you away. And then, alright, start the show up again. Start the show up again. It gets memory-holed. So that is an example of what happens when the factory is imbalanced.

Speaker: 0
39:50

And right now, the factory is imbalanced, dude. That’s the problem. There’s a reason we need the middle class. There’s a reason you need some path forward. There’s a reason you need to be able to buy a fucking house. And aside from, like, the human comfort and starting a family and all that stuff, the moment you pull that away from people, now what? It’s like, so wait.

Speaker: 0
40:11

What am I supposed to do here? Now, again, I am not advocating violence. I think that if we keep doing violence, we’re going to keep getting it, but it’s a really scary thing when shit gets so imbalanced. And when you hear about, like, the health insurance— like, I’m lucky because I’m on Krapopolis on Fox. I have incredible health insurance, dude.

Speaker: 0
40:37

But, like, you read about the people denying, like, really important medication, really important procedures to people, sending them stacks of paper explaining why we’re not gonna pay. Well, you know, I got my colonoscopy recently. It cost me $100. You know how much they charged my insurance company? $9,000.

Speaker: 2
41:02

Have you ever talked to Brigham Buhler about this? No. You should. You know, because he understands it from top to bottom. He can tell you exactly what’s going on. He’s talked about it on the podcast. But it’s, you know, it’s a giant machine. It’s a giant money machine.

Speaker: 0
41:18

That’s right.

Speaker: 2
41:19

That’s really what it is. It’s not really about making you better. It’s— it’s about just a giant money machine.

Speaker: 0
41:24

That’s right.

Speaker: 2
41:24

Making you better is what they sell. Yeah. But it’s about making more money.

Speaker: 0
41:27

That’s right.

Speaker: 2
41:28

And they can make incredible amounts of money for surgeries that maybe you don’t need. Yeah. You know, I’m not saying everybody does it, but some people do it. There’s a guy that just got arrested recently. I don’t know if you heard about this guy. I sent this to Peter Attia.

Speaker: 2
41:43

I could send it to you, Jamie, or maybe you could find it. This dude, he was telling people they had cancer when they didn’t. And it was like a ton of cases, and he would give them chemotherapy, man. And he’d make them, like, severely ill.

Speaker: 0
42:00

Yeah. Demon.

Speaker: 2
42:01

And he did it to, like— I don’t remember the number because I think the number stunned me so much. I didn’t wanna remember it. But this guy told a ton of people they had cancer. Just scared the fuck out of them, ruined their lives Dude. And then gave them poison Yeah.

Speaker: 2
42:17

That’s designed to kill cancer.

Speaker: 3
42:19

Is it old? Like, 10 years old?

Speaker: 2
42:22

It could have been. Someone sent it to me on Instagram. It was a news story.

Speaker: 0
42:27

Oh my god.

Speaker: 2
42:29

He got 45 that’s all he got was 45 years?

Speaker: 3
42:32

It says Pretty long. 13 counts.

Speaker: 0
42:34

He’s 50. That’s a that’s a life sentence. Well, yeah, man. No. I don’t know if

Speaker: 2
42:41

this is the same guy. Maybe more doctors have done— you know, this is one of the things that I found out. I was doing a bit about this fertility clinic doctor that was using his own jizz. That’s not one— that’s not one

Speaker: 0
42:54

case. No.

Speaker: 2
42:55

There’s a fucking shit ton of cases. Like, I wonder how many of these doctors were— there’s creepy doctors just like there’s creepy carpenters. Like, you know, some doctors don’t give a fuck about people.

Speaker: 0
43:10

Why do they use their own jizz? Is it— do they run out?

Speaker: 2
43:13

All the babies. They want everyone to have their baby.

Speaker: 0
43:15

Oh, so it’s not

Speaker: 2
43:16

They’re just psychos.

Speaker: 0
43:17

People aren’t coming in?

Speaker: 2
43:18

No. There’s women that went in with their husband’s jizz, and he’s like, I got an option for you, sweetheart. That’s what this guy did. He did. Dude. He ran a fertility clinic, and I think people started figuring out when 23andMe came around. And this is just one of these guys.

Speaker: 2
43:33

There’s been a ton of those guys. Yeah. That fundamentally is the difference between men and women. Could could you imagine a clinic where a woman was getting other people to carry her babies?

Speaker: 0
43:50

That’s hilarious. Yes.

Speaker: 4
43:52

Not a chance in the fucking

Speaker: 0
43:53

world. No

Speaker: 5
43:54

woman would

Speaker: 2
43:54

want that. Yeah. Take my baby. You take my baby. I trust

Speaker: 0
43:58

it with you. Funny.

Speaker: 2
43:59

You don’t even know the guy— have a baby with me. Okay. Have the most precious thing. You could have it.

Speaker: 0
44:04

You could have it.

Speaker: 2
44:05

You could have it. It’s literally the fundamental difference between men and women that a guy could run a sperm clinic and think that I’m gonna get away with everybody having my babies. And he doesn’t give a fuck. What happens to those kids?

Speaker: 0
44:18

Because they might fuck.

Speaker: 2
44:20

They might fuck. They might not know. They might find out. Do 23andMe— they’re cousins.

Speaker: 0
44:25

Yeah.

Speaker: 2
44:26

Like, holy shit. We’re cousins. Yeah. Then you find out everyone’s a cousin. Because this creep’s been just using his own jizz for 35 years.

Speaker: 0
44:32

That guy could be, like— you know, based on the depopulation that’s happening, based on population decline, that guy could be, like, the next Genghis Khan. Like, in the future, like, 80% of the planet will be related to this dude.

Speaker: 2
44:47

I think he’s got a lot of catching up to do to get to where Genghis Khan’s numbers were.

Speaker: 0
44:51

I mean, it is interesting. It’s like, you know, you read Elon Musk is the top Diablo player in North America. Right? Which

Speaker: 2
44:59

I think in the world, dude. In the world. I think he’s the number one in the world, which is fucking insane.

Speaker: 0
45:04

And, dude, I know you, and I’m not trying to high-road you here. But unless you’ve played Diablo 4, you can’t understand what that means.

Speaker: 2
45:16

I absolutely accept that. I do not understand what

Speaker: 0
45:18

that means. It is insane. Like, when I was addicted to that fucking game, I just wasn’t sleeping because, you know, I had to do dad duty in the day, Diablo at night. And I sucked. So when you realize this guy shooting rockets into space, making electric vehicles, starting a new fucking department of the government is also the top.

Speaker: 0
45:44

It’s so crazy. It’s the one time I actually let myself think maybe he actually is an alien because there’s just no way. Unless he’s paying people to do it for him, which obviously he’s not. That is insane, man. Yeah. That is insane.

Speaker: 0
46:02

So, dude, when you when you consider I don’t even know where I was going with that. I got lost in Diablo 4 just thinking about it.

Speaker: 2
46:09

Well, we’re just talking about how he’s the number one player— Elon— how preposterous it is. But I don’t know. I don’t play Diablo 4, so I really don’t know what that means, but I believe it’s huge. You know what I’m saying? Like, I don’t

Speaker: 0
46:23

It’s crazy. It’s crazy because, you know, Diablo 4, it’s all about your build. It’s all about, like— hand-eye coordination is obviously a big part of it, but then it’s just— and then you see the chopsticks catch the fucking rocket.

Speaker: 2
46:40

And it’s like— That’s his side job. Side job, and he’s had more space innovation in the last 5 years than NASA has since the Apollo missions. It’s amazing.

Speaker: 4
46:51

I mean, I’m just saying that. I don’t

Speaker: 2
46:53

know if it’s a true number. But he gets rockets to land and rockets get caught with robot arms. Like, what?

Speaker: 0
47:00

And that to me, it’s like, my god. You know, you get those feelings like, okay. I’m on the right timeline. Because if the guy who’s going to make us a galactic civilization is also a master Diablo player

Speaker: 2
47:11

The number 1.

Speaker: 0
47:13

We’re on the right timeline because

Speaker: 2
47:16

It it seems so unlikely that if it was in a movie, I go, shut the fuck up.

Speaker: 0
47:22

Shut the fuck up.

Speaker: 2
47:22

He’s not the number one Diablo player. I don’t care how smart he is with rockets and electric cars and satellites that give broadband Internet and tunneling under the Earth, and he also owns X.

Speaker: 0
47:36

It’s suspicious, I must say.

Speaker: 2
47:39

It’s and he’s tweeting 48 times an hour. He’s he’s so prolific.

Speaker: 0
47:45

Well

Speaker: 2
47:45

It’s like, where— where is your head? One of the— It’s like he’s in another dimension.

Speaker: 0
47:51

He could be— he could be bilocating. Like, this is one of the theories.

Speaker: 2
47:54

Is that— Where are you actually, physically? Is this— is this an avatar?

Speaker: 0
47:59

Okay. Here’s something I just thought—

Speaker: 2
48:02

He fully believes it’s a simulation, by the way.

Speaker: 0
48:03

Oh, he does?

Speaker: 2
48:04

Oh, fully. Woah. Not only does he say fully, but he says the chances of it not being a sim— he said this publicly. The chances of it not being a simulation are in the billions.

Speaker: 0
48:14

I mean, okay. So we talked about this in the green room. Willow, the new quantum chip from Google. Right? So and I think you and I both do the same thing with our minds. Again, I think anyone who was exposed to the Atari 2600 does this naturally, which is, like, we played the Atari 2600.

Speaker: 0
48:32

Did you have an Atari when you were a kid? Yeah. And you remember how that blew your mind? Right? You control the thing with the joystick. It was insane.

Speaker: 0
48:40

You could control the TV with buttons. What the fuck? Yeah. Like, you’d been going to arcades. You could only play for a second. You’d have to have enough quarters.

Speaker: 0
48:49

Suddenly, you could just do it at home.

Speaker: 2
48:51

You could play till you fainted.

Speaker: 0
48:52

Oh my god. And so we got to witness, like, every phase of that technology to where it is now, which is just fucking insane. And so you just take the Atari 2600 model and apply it to any new thing. And so you think, alright. What’s it gonna look like in 10 years? Then you take Musk’s neural lace or some kind of brain-human interface, mix that in with some quantum chip that, yeah, right now, it’s apparently unstable.

Speaker: 0
49:22

It’s like you gotta keep it at, like— you have to keep it at the low— I don’t know what it’s called. It’s, like, colder than space or something. Like, it has to be basically below freezing, and then suddenly, it can do things that all the supercomputers on the planet couldn’t do. But, you know, there’s a trajectory here between the human brain and this technology, and it’s getting closer and closer and closer together, meaning that we are— and, you know, a lot of people are like, look.

Speaker: 0
49:50

That’s probably, like, 20 years away. That is not that long. We are when did Teen Wolf come out, man?

Speaker: 2
49:57

I don’t think it’s anywhere near 20 years. I think it’s way closer than that.

Speaker: 0
50:01

That’s right. So that to me, when you just do the math and you realize humanity is about to merge with a thing that is solving equations that would take a supercomputer who knows how many years. That’s gonna be us. Yeah. And so then to answer the simulation idea, of course, we’re in a simulation.

Speaker: 0
50:25

If we were just monkeys and now we are using qubits, using superposition to create some infinitely faster way of calculating data, then, obviously, once we get that thing connected to our brain, we will be able to simulate any reality we want. If this is truly our past and you wanted to— like, right now, the way I remember something, having done acid for most of my life, is very foggy and kind of like— my memory isn’t the best.

Speaker: 0
50:58

Every once in a while, I have a very clear memory of things. But with this tech, theoretically, it could reconstruct memories in your mind and, not just that, put you into them and allow you to experience them in real time. Meaning, in a few minutes, you could live your life over a thousand times.

Speaker: 0
51:14

Easily, we could just be in— we could be in the future, and this is a memory that some quantum computer or neural interface is allowing us to experience, a totally all-encompassing memory. And that would be a form of eternal life because in every second, how many lifetimes could you live based on merging with that kind of chip?

Speaker: 2
51:34

Right.

Speaker: 0
51:35

And you wouldn’t wanna know it was a memory. You know, you might wanna be like, you know what? Let me just live that life over again. I just just wanna feel the whole thing.

Speaker: 2
51:42

Well, you know, that’s one of the scariest things for people to consider. I asked someone once, would you rather die, or would you rather live your life over and over and over again forever?

Speaker: 0
52:01

What’d they say?

Speaker: 2
52:02

They’re like, oh my god. I couldn’t do this forever and ever and ever. I’m like, why not? You can do it now. Like, it’s not even hard. Like, aren’t you enjoying life? Like, I love it. I’m having a great time. I have great friends. I have a lot of fun.

Speaker: 2
52:14

Lots of amazing things. I have a great family. Yeah. I I I enjoy what I do for a living. Like, why would no one keep doing this?

Speaker: 2
52:21

But the thought of keeping even for me, the thought of me doing this forever and ever and ever is fucking terrifying

Speaker: 0
52:27

Yeah. For some weird reason. Well, that was like— Nietzsche had this whole thought experiment, which was— I don’t remember what it’s called. Something like eternal return. But, basically, the way he put it is, you don’t live it again and again and again and make changes within the echo.

Speaker: 0
52:43

It’s exactly the same over and over and over again. And so, in other words, like, whatever— it’s just a rerun over and over again forever. That’s what we’re in. Jeez. And his point in that was, like, therefore, if most of your life you’ve been miserable, you’re in hell. Are you? Dude. I know. Oh.

Speaker: 0
53:06

But he wanted to use that more as a kind of— to leverage people out of despondency, to make them understand: get going now. Make it happy now because if we do— yeah. But, dude— by

Speaker: 2
53:20

all this measure of talking about, like, quantum computers and artificial intelligence and all these emerging things, isn’t it more likely then that a lot of this shit that people are seeing is human created? Because isn’t it more likely that if we really do get to some sort of quantum computer AI civilization— so you attach a quantum computer with AI, like, 20 years from now?

Speaker: 2
53:46

Like, what does that even mean? Did you just make a god? And if you did, can this thing just completely travel between dimensions and understand, like, everything about every subatomic particle that exists in the entire universe Yeah. All at once. Like, and if that’s the case, like, why do you need people anymore? And maybe you don’t. Just like, maybe Australopithecus isn’t around anymore.

Speaker: 2
54:12

That that was our guy. Right. He was our guy. If it wasn’t for him, we wouldn’t be here, allegedly.

Speaker: 0
54:16

Well, I mean, I think the model you could use for that theory would be the various— like, you look at an embryo, and then you watch the way the appendages change, then you could look at it that way, which is like, well— I mean, you don’t want— I met someone who had a tail, by the way. Like What? Some people get born with them.

Speaker: 2
54:44

I think I was at that party.

Speaker: 0
54:50

Yeah. He went swimming

Speaker: 2
54:51

with his tail.

Speaker: 0
54:52

It’s because because something happens. How big was it? I didn’t look at it. What did

Speaker: 2
54:55

it taste like?

Speaker: 0
54:56

Oh, it was cinnamon. It tasted like cinnamon and whiskey.

Speaker: 2
55:01

Yeah. Dudes are born with, like, a stub. Right. You know, like a regular monkey tail. Like, look at this bitch-ass tail.

Speaker: 0
55:07

Yeah.

Speaker: 2
55:07

But there’s something to it.

Speaker: 0
55:08

Something there. So, you know, the the if yeah. There There it is. There you go. God.

Speaker: 2
55:13

That’s so weird. There you go. That is like an ancient by the way, if you’re born with a tail, I’m not trusting you with my taxes.

Speaker: 0
55:21

Oh, come on.

Speaker: 2
55:22

Ai don’t care if you got the surgery.

Speaker: 0
55:24

Well, I it’s weird how there’s like, some of those tails look better than others. But Well,

Speaker: 3
55:27

some of them are clearly— they’ve gotta be fake.

Speaker: 2
55:29

Yeah. Some people probably got it surgically put on— take off my big toe and stick it on my ass. I’m afraid of what that would—

Speaker: 0
55:35

come on.

Speaker: 2
55:36

Psychos out there, man.

Speaker: 0
55:37

Give me a— dude, that lion’s fucking bit. I still think about it sometimes. What would you do if you had a tail? Like, give me a break. If you could get— like, you know, they’re already getting these body suits you can wear to help you lift shit. They have the new things for your legs that, like Oh, yeah. You know?

Speaker: 0
55:52

But, dude, if there was some AI tail that you could attach with a belt

Speaker: 2
55:57

What if there was a way? What if genetic engineering and AI merge in a way that, like, Duncan, we can switch you one time to anything you want? And one of the options is you could become one of the Na’vi. What are the Na’vi? The Na’vi from the movie? Avatar. Avatar?

Speaker: 0
56:15

I’m not... I don’t wanna be a Na’vi. The blue people? Not interested.

Speaker: 2
56:18

Giant blue people who fucking live in the forest and sleep in the trees, and they’re connected to the earth. No. And they dance together in a psychedelic ritual.

Speaker: 0
56:25

I’m gonna pass on the Na’vi. I don’t like

Speaker: 2
56:27

it. Dragons. They ride dragons, bro.

Speaker: 0
56:30

I don’t know, man.

Speaker: 2
56:32

Dude, I wanted to be one of them people so bad. Everybody did. There was literally a psychological condition called Avatar Depression.

Speaker: 0
56:39

Do you know about that? Yeah. I do.

Speaker: 2
56:41

How many people just let’s just have a guess. If I said we could do that to you, how many people do you think would sign up? I think the streets would be filled with giant blue people.

Speaker: 0
56:50

Well, I mean, if it’s only once, I will oh, I did did

Speaker: 2
56:54

You did? We can’t do it again. It’s too dangerous. Your DNA gets volatile. It melts down. You could become a frog. We can’t control it, but we can switch you one time.

Speaker: 0
57:03

Yeah. It’s not gonna be one time, man.

Speaker: 2
57:05

It’s one time. One time. Why? Because you either stay a person, or you become a werewolf, or we turn you into a Na’vi. Imagine... imagine that was an option. Every time the moon goes full, you have to lock yourself in your house.

Speaker: 0
57:20

Yeah. Or you’ll kill.

Speaker: 2
57:21

Let people know or you’ll... you’ll tear everybody apart. You might just jump through the windows and roam the streets.

Speaker: 0
57:27

And it’s gonna hurt when you change. It’s a painful transformation.

Speaker: 2
57:30

Screaming on your back. Remember the movie An American Werewolf in London, when he’s, like, on his back? Dude,

Speaker: 0
57:35

the best. Oh my

Speaker: 2
57:38

god. Fucking great movie.

Speaker: 0
57:40

Here’s a movie you gotta watch. Really? The substance. What is that?

Speaker: 2
57:44

Dude,

Speaker: 0
57:45

I don’t wanna ruin it for people because it just came out. But it’s... you ever watch any, like, Cronenberg movies? Okay. So it reminds me of that. It’s got Demi Moore in it, who, by the way, looks so great. And she’s like... dude, it is so fucked up. This movie is so fucked up, but it’s got the effects. Something that happens in it is very similar to An American Werewolf in London.

Speaker: 0
58:10

And it’s basically... this star, she’s a fading star. And so... oh, and she kills it too. But, like, she’s like a fading star. So it’s called The Substance? Oh my god. It’s fucking trippy, man. It’s so good.

Speaker: 2
58:25

I’m gonna make a note, Duncan.

Speaker: 0
58:26

You will love it.

Speaker: 2
58:27

A note. I can’t use I have to take my gloves off.

Speaker: 0
58:31

But it’s really wild, man. And it’s very, like... there are parts of it that are so disturbing. Really?

Speaker: 2
58:37

Oh, maybe they’re gonna show it. The substance.

Speaker: 3
58:40

No no spoilers.

Speaker: 0
58:41

Oh, what? You’ve seen it?

Speaker: 3
58:43

No. But, like, if you if you say it’s that good, then why

Speaker: 0
58:45

Okay. Yeah. Yeah. See some No spoilers.

Speaker: 4
58:47

I’m sorry.

Speaker: 2
58:47

So what

Speaker: 0
58:47

is it on again? We had to get it on Prime.

Speaker: 2
58:50

Oh, okay. So it’s out. It’s out.

Speaker: 0
58:52

But, dude, like, the the this is again, like and I think one of the fun things about being alive right now. Oh, it’s a fun time. It’s a fun time and one of

Speaker: 2
59:02

the fun, man... the funnest time anybody’s ever had.

Speaker: 0
59:04

Dude, really? Because, like Oh, yeah. If I had to pick time periods

Speaker: 2
59:08

Oh, we picked the right one.

Speaker: 0
59:09

Well, the second one I would pick is, when cocaine was legal.

Speaker: 2
59:15

No. No. No. No. I think you would have been dead already.

Speaker: 0
59:17

I would. Well, yeah. But, I mean, I’m

Speaker: 2
59:19

gonna tell you about my buddy, Steve. He did his ophthalmology residency in Miami in the eighties, during the cocaine days. No. Oh, my god, dude. He said every day it was just gunshot wounds and guys with things stuffed up their ass. They would get coked up and they’d shove something up their ass and try to come harder, and they just got things stuck up their ass.

Speaker: 0
59:42

Wow. Like, how is that a job?

Speaker: 2
59:44

Oh, yeah, dude. He’s my friend, Steve... shout out to Steve Graham. He told me, like, all kinds... they find light bulbs, guys that have light bulbs. I love light bulbs. Twisty little light bulbs. They stick those up their ass.

Speaker: 0
59:56

You know it’s gonna break. Like, that’s gonna break.

Speaker: 2
59:59

Part of the fun. Part of the fun.

Speaker: 0
01:00:01

The risk.

Speaker: 2
01:00:03

Well, they’re coked out of their fucking minds, dude. They don’t know what they’re doing. This is the eighties in Miami. Holy

Speaker: 0
01:00:10

shit, dude.

Speaker: 2
01:00:11

Yeah. And there were more banks per capita in Miami at the time. I don’t know if it’s still the case, but more banks in Miami per capita than anywhere else in the country. Because it was all just moving that yayo, man.

Speaker: 0
01:00:26

Moving that yayo.

Speaker: 2
01:00:28

Yeah. It was a cocaine city.

Speaker: 0
01:00:30

That was where it was probably somewhat common to find, like, a bag of coke on the beach. Right? That was back in the day. You’d have to get your kid... Yeah. And your kid would bring you a bag of coke.

Speaker: 2
01:00:42

They’re like seashells. You’ve seen Cocaine Cowboys. Right? Yes. Oh my god. It’s so... and both one and two, both are equally good.

Speaker: 0
01:00:51

Dude, I’ve heard stories

Speaker: 2
01:00:53

is so insane.

Speaker: 0
01:00:53

I mean, again, like... I would never. I’m too much of a pussy to live that kind of life. But when you think about the possibility that once we do get interfaced in some way or another with these new computers that are just right around the corner, we will be able to simulate experiences like that.

Speaker: 2
01:01:16

Yes.

Speaker: 0
01:01:17

I will definitely I would be into simulating the experience. And then when you consider yeah. But you’re gonna simulate the experience. You know it’s a simulation. At some point, you’re gonna be like, you know what? Let’s just turn that off where I know it’s a simulation. You know what I mean?

Speaker: 0
01:01:32

That we would all be doing that shit. Like... yeah. And I don’t just mean, like, literally... I mean, at some point, you’ve done 20,000 lifetimes. You’ve experienced what it’s like to be George Washington, Genghis Khan. You’ve experienced what it’s like to be Joan of Arc. You’ve experienced being one of Jesus’ disciples.

Speaker: 2
01:01:51

Imagine if that’s one person and that’s your backstory?

Speaker: 0
01:01:54

That you’ve done all of those things. What a trip you’re on. That’s gonna be everybody. That’s gonna be everybody. Because it’s gonna be fake. Well, I mean, is there an end to it? That’s where it gets really... Is it? Is it gonna be fake?

Speaker: 2
01:02:05

Is reality anyway?

Speaker: 0
01:02:06

And what is data? Right. What is data? That’s the real question, because it’s like, how much data can we recover from light? And if we get faster-than-light travel, can we get ahead of light? We know that when we take a picture, that’s just light. So if we can get ahead of light, we can go faster than light.

Speaker: 0
01:02:21

If we can go exponentially faster than light, theoretically, you’re basically moving into the future, I guess. Then couldn’t you take pictures of Earth in the past? If you could take pictures of Earth in the past, why couldn’t you recreate them with this new technology? There’s your time machine.

Speaker: 0
01:02:37

You’re not... you don’t have to worry about fucking up the timelines. You’re just taking pure data, having it interpolated by whatever the next computer is after quantum computers, right, and then simulating that reality and traveling into it as whoever you wanna be. I mean, it’s pure hedonism. You know? It’s like, right now, we think of hedonism as fucking, a great meal, making some money, nice porn, red wine.

Speaker: 0
01:03:03

But future versions of hedonism could really just be like, I just wanna be a dinosaur for 50 years.

Speaker: 2
01:03:16

Yeah. I mean, for sure. Well, there’s gonna be look. Think about how many people play video games most of the day. Like, how many young guys?

Speaker: 0
01:03:24

So many.

Speaker: 2
01:03:24

Young guys with no girlfriends. 100% you’re playing some kind of video game all the time with your friends.

Speaker: 0
01:03:31

There you go.

Speaker: 2
01:03:31

And you’re probably having the most fun you’re ever gonna have in your life. So enjoy it. Yeah. Before the prison comes. Ah. Before the... the marital

Speaker: 0
01:03:39

Why? Why’d you say it?

Speaker: 2
01:03:40

Before you get told that you’re a toxic piece of shit

Speaker: 0
01:03:44

You can’t. And that is, by the way... I think there’s a new phase in recently married dudes. I think there’s a new phase that happens. I think I went through it, actually, which is like that experience you had. And when I reminisce on my life in the past, prior to having kids, which I fucking love... but when I reminisce on the past, the memories that come to mind, a lot of them are, like, snorting rails of ketamine and playing God of War.

Speaker: 0
01:04:11

It was fun. But nothing like... and I really mean this. It sounds cheesy, but I really mean it. Like, what I was going for there, that’s what I get just on any given day, even in the most

Speaker: 2
01:04:31

Right. You’re looking for highs, and the high of the love of your family is above and beyond anything else. Unquantifiable. Did I tell you Chappelle’s take on it? No. He goes, not only did it increase the love in my life, but it increased my capacity for love.

Speaker: 0
01:04:46

Yeah. That’s right. And that’s wild. And that can hurt. You know, this whole, like, romantic hippie-dippy version of love... I don’t think that’s quite what love is. A fairy tale love. Real love, it’s like that expansion. Like, you know that thing where you go from one size butt plug to the next? You know? Yeah.

Speaker: 0
01:05:08

Yeah. You know that thing. Yeah. But you know what I mean? It’s like, it stretches you out in a way that nothing else could have.

Speaker: 0
01:05:16

And when you consider... when I think of, like, the past versions of me and realize, in this confused way, that’s what you’re looking for. You’re looking for that, and that impulse is being subverted or captured by, you know, hedonic technologies that are, paradoxically, probably keeping you from having that experience. You know?

Speaker: 0
01:05:41

They’re getting in the way of that experience. But then, like, new dads, you gotta shed that skin. Like, you know what I mean? Like, I had to, like, fucking let go of that. That is such a habit, you know, that form of life: video games, drugs.

Speaker: 0
01:05:59

Like, you know what I mean? It’s a real... it’s a real... You

Speaker: 2
01:06:02

have to be responsible now.

Speaker: 0
01:06:04

Yeah. Exactly. Yeah. And that probably sounds like a bummer to a lot of people out there, but it actually... this is the way, you know, and it feels good when you’re in it.

Speaker: 2
01:06:16

It’s just hard to convince people to do it, and that’s why there’s this... Elon’s terrified of this population crash. This idea that young kids today are not having babies. And like, as you’re getting older, you’re less and less fertile. And so people are choosing to have kids later in life. Or not have kids; more people are choosing to not have kids.

Speaker: 2
01:06:38

And by the way, I’m not judging. Do whatever you want. You should be able to do whatever you want in this life, and no one should force you to fucking live with somebody, have a family. I don’t know what kind of anxiety you have, or whether or not you’re a real legitimate loner. You like being alone most of the time. Yeah.

Speaker: 2
01:06:54

But it’s just like, the amount of people that are, like, super bummed out all the time is quite terrifying. Yeah. If you really stop and think about that... just the number of people that are just running through life bummed out. I know. And there was some... obviously, polls, who knows who the fuck is running them? But there was some poll about liberal women and mental illness.

Speaker: 2
01:07:18

It’s such a meme. It’s so unfortunate

Speaker: 0
01:07:24

Yeah.

Speaker: 2
01:07:24

That it seems to, like, hold up to the meme. Yeah. So unfortunate. But the number’s, like, crazy high.

Speaker: 0
01:07:31

Well, you know, man, this is the thing about mental illness. Like, there’s lots of studies that have been... what do they call it? Folie à deux. Right? That’s the name for if you are around a crazy person. You can actually, like... if you’re around a paranoid person long enough, you really might start thinking the walls are bugged, if they’re charismatic enough. Right? Sure.

Speaker: 0
01:07:50

So Yeah. There’s a there’s a quality to people who are charismatic and distorting reality that is contagious. And then when you add to it, it becomes a fashion statement. Right? So so, basically, the idea is if you have some form of mental illness, it’s not like I should shame you for it.

Speaker: 0
01:08:10

Obviously, you need care, you need compassion. But one of the really, I think, very dangerous things that has emerged into the zeitgeist is that compassion has been confused. Like, in other words, what you might call enabling, they are calling compassion. Because the idea would be: right now, you need to get better.

Speaker: 0
01:08:33

Let’s get you fucking better.

Speaker: 2
01:08:35

Right.

Speaker: 0
01:08:35

Not like, right now, this is just how you are, and you really don’t have any hope. So this is where... and, also, I congratulate you on your courage, and all that’s good, by the way. It is courageous, if someone has mental illness, to announce it. But when you go to the next step, which is acting like the fact that you’re trying to lose weight, the fact that you’re trying to balance your life, is an aggression...

Speaker: 0
01:09:06

You know what I mean? Like, now you are aggressing against all the people who have this. It is a slap in the face to the people who have it. What I’m saying is, there’s a culture where the normal societal pressure to try to make yourself healthy... which, by the way, if you go back a long time ago, if it’s just, like, you and me and everyone in the green room and we have to survive in the wilderness or something like that, there really isn’t time for somebody to... you know, it’s dangerous if someone is doing things that keep them sick, because we have to carry them.

Speaker: 0
01:09:39

You know what I mean? We have to carry them through the fucking wilderness, and that means we might die. That lowers our survival chances. So the idea is, you don’t want to enable people who are hurting themselves.

Speaker: 2
01:09:57

Right.

Speaker: 0
01:09:57

You don’t wanna enable people who have a chance to no longer, like, continue the patterns, or take the medicine or whatever the fuck it is to feel better. You actually want to help them feel better, not keep them frozen in this thing, which is a demonstration of their enlightenment. Because that’s the thing.

Speaker: 0
01:10:17

When sickness is health and health is sickness, well, that’s the ant death spiral, dude. Like, if... Right. You know, that’s how you create a very sick, unhealthy world. And then, you know... in other words, if you met some, like, raving paranoid person who was convinced that there were nanobots inside of them that were reading their minds and controlling their thoughts

Speaker: 2
01:10:43

Duncan, I told you that in private.

Speaker: 0
01:10:45

I’m sorry, Joe. It’s just not good that you think like that. That’s scary. That’s a sad place to be. We gotta get you out of there.

Speaker: 2
01:10:51

We gotta get you on Reddit.

Speaker: 0
01:10:54

Exactly. Exactly. And I yeah.

Speaker: 2
01:10:58

They are legit, dude. Shut the fuck up.

Speaker: 0
01:11:00

Dude, the main the main this is a this is a

Speaker: 2
01:11:03

Well, you know what I’m really scared of, legitimately, though? I don’t think nanobots are controlling us right now, but this technology that they have where they have these, like, little miniature robots that they can send into your bloodstream to repair tissue and repair... you’ve seen these.

Speaker: 2
01:11:17

Right? The concept behind it. Once that becomes an actual thing, like, what’s to stop someone from injecting a few of those inside of you at the hospital next time you go in for a procedure? Right. And if it gets to that point, like, 20 years from now, where they could do that, they could just be like, we chipped Duncan. Thank you. Yeah.

Speaker: 2
01:11:37

Thank you very much. Yeah.

Speaker: 0
01:11:38

It was

Speaker: 2
01:11:38

very important to find out

Speaker: 0
01:11:40

Yeah.

Speaker: 2
01:11:40

Where this guy goes. Yeah. We have to track him. Dude. Everywhere he goes. And then you’re linked up to some GPS computer by these fucking nanobots inside of your body.

Speaker: 0
01:11:50

And by the way, if you and I are talking about this shit in elf suits, you better believe somebody in the DOD, somebody at Raytheon or Lockheed Martin, is like

Speaker: 2
01:12:01

Like, these little robots, they do work for a while. But after a while, they decay inside your body, and they create rampant inflammation. Horrible rheumatoid arthritis destroys all of your joints because they die inside of you. Well, you know, it’s like, we didn’t know that.

Speaker: 0
01:12:20

Yeah. We didn’t know. Gotta break a few eggs to make an omelet, man. That we regret... Yeah.

Speaker: 2
01:12:24

The Tuskegee experiment. Sorry. We regret... sorry. Sorry. Fuck. We infected people with syphilis.

Speaker: 0
01:12:31

I’m really sorry.

Speaker: 2
01:12:32

Didn’t tell people they had syphilis. We’re sorry.

Speaker: 0
01:12:34

In retrospect, it was a mistake. Like, we shouldn’t have done that.

Speaker: 2
01:12:38

I mean, again... how crazy is that? That’s a real thing.

Speaker: 0
01:12:42

Or when they release some shit in the subways. You know? Whoopsie.

Speaker: 2
01:12:46

Wuhan lab. Whoopsie. Whoops.

Speaker: 0
01:12:49

That was a big whoopsies, boys. Gain of function research. See, it was real.

Speaker: 2
01:12:54

Whoopsie. No. Whoopsies. It was true.

Speaker: 0
01:12:56

Dude, I’m so... when you... this is where, to me, if you do wanna align with a classic paranoid state of consciousness, the way you align with it is, without having to go on Infowars, without having to go on Reddit conspiracy, you just look at what is true, like, what is verifiable. Like, what do we know right now? So what we know right now: there are unknown drones hovering over New Jersey.

Speaker: 0
01:13:21

We know that the president of the United States has been incapacitated for years. For years.

Speaker: 2
01:13:30

No way. Who saw that coming?

Speaker: 0
01:13:32

Dude, I

Speaker: 2
01:13:33

We were conspiracy theorists.

Speaker: 0
01:13:35

Yeah. So, not anymore.

Speaker: 2
01:13:37

We were conspiracy theorists.

Speaker: 0
01:13:38

Now our shit is, like, mainstream, just basic journalism. The fucking president of the United States has apparently been out of commission for years.

Speaker: 2
01:13:50

By the way, I welcome him on my podcast. He has an open invitation.

Speaker: 0
01:13:54

Goddamn it. That’d be awesome. Anytime. And he would be fun. Like

Speaker: 2
01:13:59

I hope so now.

Speaker: 0
01:13:59

Dude We’d have to

Speaker: 2
01:14:00

give him a little nap in the middle of the podcast, but then wake him up, throw some water on him.

Speaker: 0
01:14:04

But when he... when, like, he’s all there, when they got the cocktail in, and he’s, like, dialed in, and he turns into a warlock for a second... you know what I mean? It’s scary. When the eyes move up... Dude,

Speaker: 2
01:14:20

it’s... it locks up.

Speaker: 0
01:14:21

That’s like something out of a Lynch movie. That’s like if you were in a cursed tomb, and that thing comes around the corner. That is scary. Like, whatever the Sauron that comes out of him before he, like, goes back to sleep is scary. But even more terrifying is the network of people around him.

Speaker: 0
01:14:41

You know, you see those... it’s really cool, the dancing dragons. It’s like six dudes in a dragon suit dancing, and it looks like a real dragon dancing. Right. Right. Right. Yeah. Biden is the dancing dragon of presidents.

Speaker: 0
01:14:54

He’s got god knows how many people just fucking, like, working so hard to get that thing to function just in, like, brief moments. You only need him to function for, like, 10 minutes at a press conference, 20 minutes here. Get him off the plane. Get him in the fucking building.

Speaker: 0
01:15:12

If we can pull that off, we’ll have power for a little bit longer, a little bit longer. Dude, when you consider that we apparently live in a democracy, you elect this dude who makes decisions because in some way, shape, or form, he aligns with what you want the country to be.

Speaker: 0
01:15:28

And the people fucking puppeteering that poor old man are just like, no. Actually, fuck you. He’s not gonna make any fucking decisions because he’s incapacitated. He’s gone. Gone with the fucking wind. And now we’re in control, and you didn’t vote for us. That is terrifying.

Speaker: 0
01:15:46

That is so... in a way, that’s worse than a coup. Because at least with a coup, you see the military. They come in. The tanks are in front of the White House. Some dude is suddenly the leader, and you know it’s not the guy you voted for.

Speaker: 2
01:16:00

Well, it was certainly, by definition... it was a coup against Biden.

Speaker: 0
01:16:06

Oh, with Kamala? Yeah. Oh, yeah. Yeah.

Speaker: 2
01:16:08

I mean, by isn’t that by definition? Or does a coup have to be military?

Speaker: 0
01:16:12

Oh, no. No. No. That was What’s

Speaker: 2
01:16:14

the definition of a coup?

Speaker: 0
01:16:15

I think a coup is just when you

Speaker: 2
01:16:17

Is it just, like, some sort of a conspiracy

Speaker: 0
01:16:20

to overthrow the leader and install a new leader?

Speaker: 2
01:16:24

That’s it. Right?

Speaker: 0
01:16:24

That’s it.

Speaker: 2
01:16:24

It does have to be violent. Right?

Speaker: 0
01:16:26

That’s right. And what a brilliant coup.

Speaker: 2
01:16:28

It does. It does have to be, right?

Speaker: 3
01:16:30

Yeah. “Sudden and unlawful...”

Speaker: 0
01:16:32

I guess we have to redefine that.

Speaker: 2
01:16:35

Yeah. But because, again, it’s interesting. Is there any other coup d’etat?

Speaker: 0
01:16:40

That’s what

Speaker: 2
01:16:40

Maybe. I know it comes from that, but,

Speaker: 0
01:16:43

Is there a difference between a coup and a coup d’etat? What’s a coup?

Speaker: 2
01:16:47

I think it is coup d’etat. That’s the actual definition. What’s a coup? It’s the same thing.

Speaker: 3
01:16:51

Ai it.

Speaker: 0
01:16:52

It said “violent.” But yeah.

Speaker: 2
01:16:53

Man. Yeah. It’s essentially... we shortened it. But so it does say violent. But if there’s a bunch of people that conspire behind the scenes and they force you out, and your wife doesn’t want you to get forced out, and then there’s, like, these arguments, and then you wear a MAGA hat, and your wife wears red when she votes

Speaker: 0
01:17:18

Yeah. Like Yeah. You send

Speaker: 2
01:17:20

the signal. Your wife gives a speech where she’s mocking Kamala Harris. Yeah. She’s talking about joy and, like, the joy like, just nonsensical fucking word salad.

Speaker: 0
01:17:31

Dude, I know. I saw that. It’s like they’re so pissed. It’s kind of

Speaker: 2
01:17:34

a coup. It seems like it’s kind of a coup. And it was also the right thing to do. That guy should not be... he might have won. I mean, it was the right thing to do in terms of, like, you can’t have a guy who’s just a figurehead. That’s not what the deal is. The deal is, this guy is gonna be doing his best to look out for us and to make sure that he navigates this world of finance and environment and international relations perfectly, where he doesn’t blow anything up, and he makes our economy happier.

Speaker: 2
01:18:04

Go. Go. That’s the that’s the deal.

Speaker: 0
01:18:07

Yeah. Well but the again, like, it

Speaker: 2
01:18:10

You can’t, like, have only a mask. Like, who’s running the deal?

Speaker: 0
01:18:13

Well, that was... so, okay. So, again, it’s like... and this is the fantasy of any hippie or whatever. So the idea is, it was predicted in the New Age movement, and I think you could argue in a lot of religions too, that the consciousness shift is happening. The Age of Aquarius, whatever the fuck you wanna call it. Consciousness shift.

Speaker: 0
01:18:34

And so the idea is that what we’re witnessing is essentially the collapse of a way of doing things. And as it collapses, it starts making big mistakes. One big mistake would be people figure out that we have had a president who is basically incapacitated, meaning we don’t really need a president.

Speaker: 0
01:19:05

The whole model starts falling apart. Also, when you realize, like... if you watch basketball or skateboarding, watch skateboarding now versus skateboarding when people started skateboarding. Like, the tricks people are doing now versus what they used to do. Right? And you see how quickly, when something’s fun or important, how quickly it evolves.

Speaker: 0
01:19:29

Right?

Speaker: 2
01:19:29

Right.

Speaker: 0
01:19:30

So the coup is problematic in that, again, you know a coup has happened. The ultimate coup is to have a figurehead. Now, you know, it’s a hacky trope, I guess, the idea being that, like, every president is just a fucking puppet. Right? But the problem with that puppet is that, like, these are puppets who actually do have power.

Speaker: 0
01:19:56

They will make decisions even if there’s a lot of pressure from god knows whatever the web of unknown people is that tries to, like, grab the steering wheel. They can say no. So that’s a problem. So if I wanna control the steering wheel completely, dude, what’s better than an old man who has dementia?

Speaker: 0
01:20:17

Because I could tell him shit happened that didn’t happen. I could show him news sources that aren’t even real. You know what I mean? I could literally, like, just

Speaker: 2
01:20:26

He probably pretends to be able to read. His eyes are probably gone.

Speaker: 0
01:20:30

Dude, absolutely. Probably pretends. Absolutely. And then the other side of it: aside from it being a coup, it’s completely unconstitutional, a fucked-up takeover of the US government. If you just look at the abuse of... like, the right thing to do when you have a president, a bus driver, whoever, who’s got senile dementia, is to say, hey, guys.

Speaker: 0
01:20:57

He’s really sick, and he can’t do the job anymore. And we have to find somebody else to do the job now. That’s the right thing to do. But these motherfuckers are like, no. No. No. No. No. No. We’ll lose our jobs. We’re in the fucking cabinet.

Speaker: 2
01:21:10

They’re like, we don’t want a primary. Because if another Democrat comes in, if Shapiro comes in, if Newsom comes in, whole new cabinet.

Speaker: 0
01:21:18

That’s right. That’s right. Everybody

Speaker: 2
01:21:19

knew. Everybody knew.

Speaker: 0
01:21:20

That’s it.

Speaker: 2
01:21:20

Everybody wants to keep their job.

Speaker: 0
01:21:22

It’s so fucked up. If

Speaker: 2
01:21:25

even if you are sad that Trump won and you wish Kamala Harris won, if she did win, it would be the first time that anybody won without winning a primary. Right. And that’s kind of crazy, and it’s not a good precedent to set. It’s not good to let people weasel around this system that we have in place.

Speaker: 2
01:21:44

And by having a vice president and then immediately appointing them as the Democratic candidate, that’s kind of illegal. It seems like it’s kind of illegal. Is it illegal? Well, dude, I mean, should it be illegal? How about let me say this. That should be illegal.

Speaker: 0
01:21:58

Right.

Speaker: 2
01:21:58

You should have to have a vote from the people to decide who their Democratic elected person who’s gonna run for president is. Right. That’s the whole deal. Yeah. Maybe people didn’t vote for you when you were running for president, which is a fact. And so then when you ran for vice president, or when he chose you as vice president, all of a sudden we’re supposed to pretend that you’re a really good candidate for president when, like, we... let’s find out what the people think.

Speaker: 2
01:22:26

If you guys believe that she’s the best person for the job, the whole idea is supposed to be: sell it, and then people vote. Like, really vote. Actually vote. Absolutely. With the vote. Yeah. Don’t the mail-in ballots seem kinda odd? Let’s not do that. Yeah. Nixon was talking about how they could be rigged in the seventies.

Speaker: 2
01:22:42

Yeah. Let’s just do it in person like we always do.

Speaker: 0
01:22:45

That’s right, man. I’m doing this vote. The idea is, like, if I’m a kidnapper and I kidnap you, and I, I don’t know, knock you out or something, you come to and I explain we’re married. I need to pull that off. I’ve just kidnapped somebody, but now they think that we’re married.

Speaker: 0
01:23:03

What I’m saying is, you know what I mean? If you’re gaslighting

Speaker: 2
01:23:08

Yeah.

Speaker: 0
01:23:09

You really need to execute the gaslighting perfectly. And so the problem... and I think this is the buried fucking headline in the drones, in the Kamala coup, the president with senile dementia... is that all of these actions taken by the federal government have not just corroded people’s trust in the federal government, but potentially, like, annihilated

Speaker: 0
01:23:37

Annihilated it. Meaning now... Yeah. Well, you know, if I’m kidnapped by somebody and they’re like, no, here’s why I kidnapped you. Oh, god.

Speaker: 0
01:23:44

There’s that great movie where, like, somebody ends up in someone’s survival bunker, and you wonder: is it really the end of the world, or has this person been kidnapped? He’s saying, you can’t go out there. But it’s like, the idea is, the moment... if I’ve been kidnapped and I actually buy into your shit, that’s gonna create a lot less anxiety for me.

Speaker: 0
01:24:02

But the moment the person you kidnapped stops believing you... whoa. That’s not fun for anybody. And right now, I feel like that’s the general mood: people just don’t trust... like, when

Speaker: 2
01:24:17

Except the people that do have Stockholm syndrome.

Speaker: 0
01:24:19

That’s it, dude. That’s it. That’s it.

Speaker: 2
01:24:21

That’s it. Those are the people that are still getting boosted.

Speaker: 0
01:24:24

In all... I am up

Speaker: 2
01:24:25

to date. I have all of my boosters.

Speaker: 0
01:24:26

Jesus Christ. They got boosters in

Speaker: 2
01:24:28

their eyeballs. You could see them swimming around behind their eyeballs.

Speaker: 0
01:24:31

Oh, and

Speaker: 2
01:24:31

and then... Eventually, they're just gonna point at you: you haven't had your boosters.

Speaker: 0
01:24:36

That is: six feet distance!

Speaker: 2
01:24:43

That’s what it felt like, man. That’s what it felt like being unvaccinated in the pandemic.

Speaker: 0
01:24:47

Well

Speaker: 2
01:24:48

It felt like some people looked at you like you were the dirty... like, I heard dudes I know. I know them. I've hung out with them, and they were calling people plague rats. Plague rats. Calling unvaccinated people plague rats.

Speaker: 0
01:25:01

Listen. This is... and, you know, maybe you're not supposed to do this. Like, when you're saying, like, you try to find the compassionate way of looking at Kamala Harris.

Speaker: 2
01:25:11

I try to find that with everybody, man. That's, like, an exercise that I've been doing more and more over the last few years. I try to push it all day long. Yeah. There's so many things to get upset about, but there's also so many good things in the world too, and we can't fall into the... we're not designed to soak up eight billion people's worth of bad news.

Speaker: 0
01:25:37

That’s right.

Speaker: 2
01:25:37

We’re just not designed that way.

Speaker: 0
01:25:39

That’s right.

Speaker: 2
01:25:40

And if you suck all that stuff in, you're gonna have a lot of negativity in your life. And it's not about forgiving people for, like... even, like, CNN people that are spitting out propaganda. Like, your demise is self-created. You will be punished by your own doings. The world has responded to all... did you see this crazy interview where Don Lemon interviews some dude on the street? No.

Speaker: 2
01:26:05

You haven't seen this? No. It's so funny, because Don Lemon's doing, like, these on-the-street interviews, and he's talking to this guy about the news. And the guy is essentially telling Don Lemon, like, I don't trust all these sources you're saying. He's like, look, here it is, Washington Post.

Speaker: 2
01:26:21

I forget what the subject was, but he got Don Lemon to say, I don’t listen to mainstream news either. What? Yeah. Yeah. Yeah. Play this.

Speaker: 2
01:26:30

Play this from the beginning because this is so crazy.

Speaker: 0
01:26:37

Who

Speaker: 2
01:26:37

is the real president-elect, do

Speaker: 0
01:26:39

you think?

Speaker: 3
01:26:39

Donald Trump won, I believe.

Speaker: 2
01:26:41

Democratic lawmakers in Washington are calling Elon Musk President Musk now. They're saying Donald Trump is the vice president or the head of communications.

Speaker: 6
01:26:48

What's... wait a second. Wait a second. No. No one said that.

Speaker: 2
01:26:51

Really? Have you not watched and paid attention to the

Speaker: 6
01:26:53

Absolutely not. I'm paying attention to what I'm doing during my day so I can try and get a better life ahead.

Speaker: 2
01:26:58

Okay. Do you have your phone with you? I do. Why don’t you Google right now

Speaker: 3
01:27:01

Yes. Tell me.

Speaker: 2
01:27:01

President Musk and see what comes up.

Speaker: 6
01:27:03

No. That's already a loaded question, you know.

Speaker: 2
01:27:05

Tell me. Give me the sources.

Speaker: 6
01:27:06

Axios, Business Insider. No. We don't trust any

Speaker: 0
01:27:09

of these.

Speaker: 6
01:27:09

The common man doesn’t trust any of this.

Speaker: 2
01:27:11

Keep going.

Speaker: 6
01:27:11

ABC News, Washington Post. Keep going. New York City.

Speaker: 0
01:27:13

So what?

Speaker: 2
01:27:14

Keep going. The

Speaker: 6
01:27:14

Atlantic. Oh. I don’t trust any of these.

Speaker: 0
01:27:16

Oh. I don't trust any of these. Like, I don't trust any of them. I don't trust any

Speaker: 6
01:27:19

of these. We're the common man. We don't trust any of these. Yeah. No one trusts the government. No one trusts the common news. We don't trust any of that anymore. Independent news... we are the ones that own the news now. People trust me. They don't trust MSNBC, because I'm actually one of them.

Speaker: 2
01:27:34

I can’t disagree with you. Okay.

Speaker: 6
01:27:36

Because I’ll answer that. I get a lot of

Speaker: 2
01:27:37

people coming to me saying, I only watch you. I don't watch corporate media anymore. Oh, he said a lot of people coming to him. I misunderstood. I got a hard-on when I saw that. Dude, that is incredible. Well, that guy just brilliantly broke down... Yeah. ...this illogical assumption that because it's on these accepted sources, it must be true. Right. Oh, God. He's President Musk.

Speaker: 2
01:27:58

Maybe we have the good, like, super genius on our side. And this idea that he's doing it for money... hey, you fucking half-wits, he has all the money. Yeah. He has more money than anybody. Right. He's simultaneously running multiple businesses that are at, so to speak, the cutting edge of technology. Sure.

Speaker: 2
01:28:19

Shut the fuck up and let him cook. But yeah. Let him cook.

Speaker: 0
01:28:23

Also, though, when you realize, like, this is... like, you know, when the DNC starts astroturfing Reddit, when the DNC starts astroturfing 4chan, when the DNC started doing that using all that fucking money... and it's so funny, because the astroturfing, after she lost, it just stopped.

Speaker: 2
01:28:42

Explain astroturfing to

Speaker: 0
01:28:43

people. So the idea is, like, I infiltrate message boards, post political messages disguised as, like, somebody just putting a post up. I try to redirect the conversation, or essentially imply a consensus that doesn't exist. And so there's ways of manipulating the algorithm.

Speaker: 0
01:29:03

Apparently, on the DNC's Discord server, they were talking about the best ways and times to post on Reddit to attempt to move the fucking needle. So... It's kind of amazing, isn't it? It should be illegal. It's so fucked up. If, like, you are

Speaker: 2
01:29:18

being propagandized.

Speaker: 0
01:29:19

Like, if... I don't know. If we did a rogue commercial for nicotine pouches on Instagram, you have to say, this is a sponsored post. So why is it that if you work for the DNC, or volunteer for any state entity, you don't have to say, also, I'm doing this as a volunteer for the DNC?

Speaker: 0
01:29:37

That's why I posted this. You don't have to do that. Right. So that's Invasion of the Body Snatchers. That's the... I'm pretending to be a normal person.

Speaker: 0
01:29:47

I’m infecting the data sphere with propaganda, and if I do it enough, it will create the illusion that this is the consensus. And Yeah. The reason you wanna create that illusion is because people like to sync up. That’s what they know.

Speaker: 2
01:30:02

They love to sync up, and really smart people like to get really good at syncing up. They like to get really good at it and really good at correcting others who don’t sync up correctly.

Speaker: 0
01:30:12

There you go. Sync up.

Speaker: 2
01:30:13

Yeah. And you know what it is? It’s just dorks. It’s dorks, and dorks have found a thing. Right.

Speaker: 0
01:30:19

They

Speaker: 2
01:30:19

found a thing. Maybe your thing could have been chess. It's not. It's politics. No. And maybe... you know what I mean? Right. Like, whatever your thing is... Yeah. ...that's what's really going on. And your denial of objective reality in order to win... it exposes you. It exposes you to people in this new world who are recognizing that we are the only people that have ever gone through this, and we are in this insane moment of realization about how much we've been bullshitted and manipulated in the past, how much of all of our resources are going to things that we would never agree to

Speaker: 0
01:31:02

Yeah.

Speaker: 2
01:31:02

And and how much of this chaos is being pushed upon us by people who are profiting from it in a fucking spectacular way that’s almost indescribable. Yeah. Insane amounts of money

Speaker: 0
01:31:17

Insane amounts.

Speaker: 2
01:31:18

Control of the narrative, and it’s not working. It’s

Speaker: 0
01:31:21

not working. It’s not working. It’s not working.

Speaker: 2
01:31:24

You and I and Jamie in a fucking room... Yeah. ...are working. Right. That's not working... what they're doing is not working, because people are getting information from multiple sources now. And the sources that aren't reliable, like that guy listed off, they're dying off. Yeah. You know, the New York Times app... more people use it for Wordle than anything. Like, the New York Times has essentially become a gaming company. Crazy.

Speaker: 2
01:31:51

See if that’s true. I don’t wanna get sued. I’m pretty sure it’s true.

Speaker: 0
01:31:54

Well, I mean, Wordle is fun.

Speaker: 2
01:31:56

I’m I’m sure it’s fun. It’s

Speaker: 3
01:31:57

a separate app.

Speaker: 2
01:31:59

Is it a separate app? Pretty sure. Or is it that Wordle gets... I don't know.

Speaker: 3
01:32:03

That’s yeah.

Speaker: 2
01:32:03

It's not hard to do marketing. No.

Speaker: 0
01:32:05

Thank you. I’m kidding.

Speaker: 2
01:32:06

Is that what it is?

Speaker: 3
01:32:07

That’s probably that.

Speaker: 2
01:32:08

So that Wordle gets more activity for the company. That's what... essentially, yeah. There was a graph. I was too lazy to read the whole graph, but it was breaking down how Wordle is more used than anything.

Speaker: 0
01:32:20

Well, listen. This is Is that true?

Speaker: 2
01:32:21

Let’s make sure that’s true. Otherwise, we’ll have to cut this out. I don’t want New York Times on my ass.

Speaker: 0
01:32:26

Oh my god. That would suck. That would suck so bad.

Speaker: 2
01:32:30

They've done it before. It's just... that's their job. That's their job. You know, it's just like... that shouldn't be a job where your games are more popular than the news. It is if that's what you want it to be, but here's the thing: it's not necessary anymore. And I think that, through the rise of independent journalism, one of the things we're really realizing is that all someone has to do is be consistently objective and intelligent and post things, and post takes on things, like Coleman Hughes or some of these people.

Speaker: 2
01:33:04

Yeah. Consistently intelligent, objective, and then you’ll develop a following. Yeah. And then you’ll become a reliable source of news.

Speaker: 0
01:33:11

That’s right.

Speaker: 2
01:33:11

Because I know that if I ask Coleman about x, y, or z and he's informed, he's gonna give me a very intelligent breakdown of what it is. Now, there's a few people in my life that are like that. Like Andrew Huberman, if I have some sort of health-related question. Peter Attia. Right.

Speaker: 2
01:33:26

If I have some sort of, like, how are they doing this, and what is... is this legitimate? Right. And they'll look at it and they'll analyze it. Yeah. I've sent Huberman stuff and he goes over the data. He's like, this is fascinating.

Speaker: 2
01:33:36

This theoretically should work, you know, and then this will explain why, and what the pathways are, and how interesting this is. Yeah. It's an amazing resource that wasn't available before to any person. Forget... I mean, it was too difficult. You'd have one lane of, like...

Speaker: 2
01:33:55

Like, you have one lane, whether it's archaeology or language, one lane where you're, like, super well-read in. Yeah. You don't have access to all these other professors that are working on quantum physics. Right. You don't have access to the James Webb Telescope people.

Speaker: 2
01:34:07

You don’t have access to all this data. It’s like too hard to get. Right. Now it’s fucking everywhere. Everywhere.

Speaker: 2
01:34:13

It’s everywhere

Speaker: 0
01:34:15

all the time. That’s right.

Speaker: 2
01:34:16

It's a question away on your phone. Yeah. It's a question away. You pick up your phone and you just fucking press a button and say, hey, Google, why don't you tell me what the James Webb Telescope is going to be up to? Yeah. Hey, ChatGPT, why don't you talk to me like Santa Claus and explain to me why these drones are fake?

Speaker: 0
01:34:34

The best.

Speaker: 2
01:34:35

I don't know if the drones are China's or ours or water people. They're coming out

Speaker: 0
01:34:43

of the water. No. Tell... I mean... Imagine

Speaker: 2
01:34:44

if there’s a civilization under the water.

Speaker: 0
01:34:46

I mean, that's where I would hide if I was, like, trying to hide from a civilization. The ocean. It's clear they can't get in there. They can't breathe under there. It's a perfect place to hide.

Speaker: 2
01:34:54

You know what I've been saying for a while? The last few weeks, at least? I think maybe what the aliens are is custodians. I think maybe they're just here. They're, like, some sort of autonomous creation that's designed to accelerate our evolution, stop us from blowing ourselves up, and make sure that we build a quantum computer with AI.

Speaker: 2
01:35:18

Right. And that this is, like... it's all a part of this, like, endless cycle of integration in the great universe. And, like, we're at this, like, "meh, I don't wanna get out of my cocoon" stage. We're in that stage. We're in this, like, bizarre, strange... you know, Australopithecus wandering around in the grass fields.

Speaker: 2
01:35:37

We’re in this weird stage where we’re we’re gonna launch into some completely new way of interfacing with the universe itself.

Speaker: 0
01:35:47

Yeah.

Speaker: 2
01:35:48

And it’s gonna happen whether you like it or not. Yeah. And this is just what’s happening right now, and that’s why everything’s so chaotic.

Speaker: 0
01:35:55

Okay.

Speaker: 2
01:35:55

Like, McKenna used to talk about this.

Speaker: 0
01:35:56

Dude, I used

Speaker: 2
01:35:57

to talk about how the end of civilization... it's not gonna be a whimper. It's gonna be people screaming

Speaker: 0
01:36:02

in agony and That’s it.

Speaker: 2
01:36:04

Flailing and trying to hold on to the world. Hold on.

Speaker: 0
01:36:06

Hold on.

Speaker: 2
01:36:07

Norman Rockwell painting. Someone bitching my own fucking bread.

Speaker: 0
01:36:10

You’re trying to do a waltz at a rave.

Speaker: 2
01:36:13

How many fucking genders? Yeah. What are you saying?

Speaker: 0
01:36:15

Yeah.

Speaker: 2
01:36:15

Yeah. Why are these fucking kids with their drones? Goddamn it.

Speaker: 0
01:36:19

It's the meltdown. And, you know, if... what you're saying, like... so if you look at, like, Crick... I think it was Crick... he wrote a paper about directed panspermia, which is where you put... so, okay. Directed panspermia: I get some kind of nanobot... which, I guess you could say that's what DNA is.

Speaker: 0
01:36:40

A nanobot precursor, essentially. Which is why... I think it's weird, and maybe I don't understand what he's doing completely, but it's weird to me that Musk wants to send humans to Mars, because it seems like it would make way more sense, before sending humans, to send drones, robots, to construct whatever it is you need to survive on Mars, to go in the caves, build the fucking...

Speaker: 2
01:37:04

Well, that’s the plan, Duncan.

Speaker: 0
01:37:05

Oh, really? So it’s

Speaker: 6
01:37:06

not people first?

Speaker: 2
01:37:06

The first the first, voyage to Mars is gonna be unmanned.

Speaker: 0
01:37:10

Okay. Great.

Speaker: 2
01:37:11

Yeah. It makes sense. They have to do that. They have to have a certain amount of supplies, because I think they can only come back in two years.

Speaker: 0
01:37:17

But I don't even mean supplies. I mean, if we jump forward 20 years...

Speaker: 2
01:37:21

What about missing that bus? The Mars bus? The two-year bus? Oh, Duncan, you were late. You slept in.

Speaker: 0
01:37:27

That’s hilarious. Yeah.

Speaker: 2
01:37:28

You missed the bus. You watched the rocket go up. No. No. Nobody woke me.

Speaker: 0
01:37:34

You cock suckers.

Speaker: 2
01:37:35

Imagine if they're like, Duncan is such a fucking douchebag. Let's leave him here. Let's leave him on Mars. There's plenty of potatoes. He can live.

Speaker: 0
01:37:42

Yeah. That's so fucked up. Fuck him.

Speaker: 2
01:37:45

Make him fertilize his potatoes with his own shit. Like Matt Damon. Yeah. Like Matt Damon did in that Martian movie.

Speaker: 0
01:37:51

Dude, so... if, say... obviously, like, the way you're gonna wanna colonize habitable worlds is you create not just this nanobot, but you make it so the nanobot can only survive in environments that you would live in. Meaning... and then encoded in the nanobot is the end destination, what you're talking about.

Speaker: 0
01:38:13

The quantum computers, some kind of AI that then will naturally uncover faster-than-light travel, wormholes, whatever the fuck it is. And then when the wormholes open up, you can instantaneously travel to habitable planets. Right? So

Speaker: 2
01:38:26

Can I tell you Terrence Howard's idea? Yeah. It's a great idea. Yeah. He thinks that we have it all wrong when it comes to the formation of planets and the creation of life. He thinks what happens is the sun is constantly ejecting things. Right?

Speaker: 0
01:38:44

You

Speaker: 2
01:38:44

see these coronal mass ejections... crazy. Yeah. But millions of times, like, longer than, you know, the distance between whatever and whatever. I'm not doing it justice at all.

Speaker: 0
01:38:54

Just scary.

Speaker: 2
01:38:55

They're crazy... bigger than Earth. Right? He thinks these particles coalesce in space because of the gravity of the sun, and they orbit the sun, very close at first. But then, as time goes on, they move further and further away. Whoa. And they get to a place where they're in this position, like Earth is, and then they people.

Speaker: 2
01:39:14

They flower, just like when you plant a seed, when the water comes. Sounds cool. And he goes, and then it has to be sophisticated enough to adapt, because the planet is eventually gonna move out of the habitable zone. Wow. And he thinks that Mars, at one point in time, probably had civilization and life.

Speaker: 2
01:39:34

And then as Mars got further and further and further out from the protection of the sun, it eventually got too cold and it eventually got hit by something. It lost its atmosphere, and now it’s just desert. That’s so cool. Well, now they know there’s water on Mars. Yeah. They know it.

Speaker: 2
01:39:50

It used to be just the craziest of conspiracy theories.

Speaker: 4
01:39:52

Oh, there’s no water on Mars?

Speaker: 0
01:39:54

Yeah.

Speaker: 4
01:39:54

There’s no evidence of water. How could a society live there? But,

Speaker: 2
01:39:59

you know, this is the nuttiest of nutty. But some remote viewer went to Mars a million years ago and said there were pyramids there

Speaker: 0
01:40:07

Sure.

Speaker: 2
01:40:08

And there was a civilization there. Yeah. And, you know, there's tribes... no, actually, they think they came from the planet Sirius. Right? Like, the Dogon tribe?

Speaker: 3
01:40:17

Yeah.

Speaker: 2
01:40:17

They believe that all people came from another planet. If you were on Mars and you're a thousand years advanced from us... and they never figured out AI, so they could just go in a different direction... they're, like, super, super advanced, though, where they could travel through the space between the planets.

Speaker: 2
01:40:32

And you get to a point where you're like, hey, guys, we got about a decade.

Speaker: 0
01:40:37

Yeah.

Speaker: 2
01:40:37

We got about one decade where life can exist on this fucking planet. We gotta get off of this now. Earth is ready. There's some monkeys there. There's a bunch of shit there. It's like, you know, we could just go there. Yeah. We just go there.

Speaker: 0
01:40:52

Well

Speaker: 2
01:40:52

And then we just monkeyed around with them. Like, these guys are developing, like, really slowly. Like, why don't we do that? And then Homo sapiens...

Speaker: 0
01:40:59

Yay! Look. There you go. Man, listen. The... whatever it is

Speaker: 2
01:41:04

It’s definitely not what I just described.

Speaker: 0
01:41:05

No. Whatever it is, it’s not bad.

Speaker: 2
01:41:08

I wasn't believing it as I was saying it, though. I was like, that's crazy.

Speaker: 0
01:41:10

But, dude, you know... maybe. You look at just the concept of epigenetics and what we're doing right now. You look at the statistical probability of DNA evolving based on the age of the planet. You look at these things, and not just that... you look at the mythologies of the world.

Speaker: 0
01:41:29

It all points towards some kind of advanced intelligence seeding a planet for some reason or another. I mean, even... like, you ever read the parable of the sower? You know, that... Jesus said... can you... do you mind pulling that up, Jamie? I don't have it memorized yet. The parable of the sower.

Speaker: 0
01:41:51

The parable of the sower. When you think

Speaker: 2
01:41:52

of, say... yes. How do you

Speaker: 0
01:41:54

spell sower? S-o-w-e-r.

Speaker: 2
01:41:56

Oh, like a sow? Like sowing things?

Speaker: 0
01:41:58

No. Like planting seeds.

Speaker: 2
01:41:59

Sower of seeds. Right? Planter... the parable is so...

Speaker: 0
01:42:02

But when you think about this... generally, this is the idea of, like, there's people who are gonna, like, understand Jesus is God. But if you look at it as an extraterrestrial intelligence planting seeds on planets... it becomes this, like, crazy... the sower is the... yeah. The parable of the sower.

Speaker: 0
01:42:22

Though seeing, they do not see; though hearing, they do not hear or understand. And then... where is it? Oh, yeah. Got it. And he told them, a farmer went out to sow his seed. As he was scattering the seed, some fell along the path, and the birds ate it up. Some fell on rocky places, where it did not have much soil. It sprang up quickly, because the soil was shallow.

Speaker: 0
01:42:46

But when the sun came up, the plants were scorched, and they withered because they had no root. Other seed fell among thorns, which grew up and choked the plants. Still other seed fell on good soil, where it produced a crop a hundred, sixty, or thirty times what was sown. Whoever has ears, let them hear. Whoa. Right?

Speaker: 2
01:43:05

Listen to this. The disciples came to him and asked, why do you speak to the people in parables? He replied, because the knowledge of the secrets of the kingdom of heaven has been given to you, but not to them. Whoever has will be given more, and they will have an abundance. Whoever does not have, even what they have will be taken from them.

Speaker: 2
01:43:27

This is why I speak to them in parables. I’d be like, bro, what the fuck did you say? Can you break that down?

Speaker: 0
01:43:33

If I

Speaker: 2
01:43:33

had that guy on the podcast... if I had God on the podcast, I'd be like, okay. Do you have friends? Okay. When you talk to friends, like... when you say, like, complicated things, you should make them make sense. So... I know you're smart. You made the whole universe. Yeah. I'm not being disrespectful, sir.

Speaker: 0
01:43:50

But Yeah.

Speaker: 2
01:43:50

What did you say?

Speaker: 0
01:43:51

Well, let me answer you with a parable, you see: imagine a flower growing from stone. Sometimes the stone is angry at the flower, but sometimes the stone glows with light. This is why.

Speaker: 2
01:44:05

Is it because we teach kids that way, kind of? We kinda teach kids almost in parables.

Speaker: 0
01:44:13

Well

Speaker: 2
01:44:14

We teach kids, like, a simplistic form of everything.

Speaker: 0
01:44:18

Well, I think it's an acknowledgment of a kind of spectrum of intelligence. Right? It's like... Right. The idea is, like, let me give you a little data fractal here.

Speaker: 2
01:44:27

That’s why it’s so rude when someone talks down to you.

Speaker: 0
01:44:30

The worst.

Speaker: 2
01:44:31

When someone's like, I don't know if you know, but let me explain to you... Oh, please. ...what's wrong with the way you're thinking.

Speaker: 0
01:44:38

I can’t wait to hear.

Speaker: 2
01:44:39

It’s the grossest way to talk to people ever.

Speaker: 0
01:44:41

It is. So it's absolutely a sign of low intelligence, if you're so idiotic that you think... And you've been mean. Communist! Yeah. But how do you think?

Speaker: 2
01:44:49

But you're deciding to be mean about a point of discussion. That's right. That's what it is. You're deciding to be mean. Instead of saying, I have a lot of knowledge about this... if I could tell you what I know... okay. This is why I believe what you're saying is not true. Yeah.

Speaker: 2
01:45:08

Because I actually have a PhD in this, and this is how we know this, and this is how we know that. Right. And then you go, oh... That's essentially what Eric Weinstein did to Terrence Howard. Right. So when Terrence Howard was on the podcast, there were a lot of things that he was saying that were true and really fascinating and very interesting.

Speaker: 2
01:45:22

He’s a very brilliant guy.

Speaker: 0
01:45:23

Yeah.

Speaker: 2
01:45:24

Eric Weinstein is a legitimate PhD in mathematics. Right. He's super fucking crazy scary smart. And he said to him, he said, look, I'm not giving you peer review. He goes, I'm not a peer... you're not my peer. He goes, I'm an expert. I'll give you an expert review. I'm saying you have a lot of, like, really interesting ideas. Just stop teaching people.

Speaker: 2
01:45:41

Stop... stop... it's offensive to the people that actually do this for a living. That's all it is. It doesn't... you are like us. And this is what he said of him: he said, he's one of us. He just went down a different path.

Speaker: 0
01:45:51

That’s right.

Speaker: 2
01:45:51

But he's a brilliant guy who has a strong desire to understand the universe. Yeah. A strong desire to understand things. But he's not... you have to go down the path of peers. You have to go down the path of... you gotta be with all these other legitimate people to bounce these ideas about.

Speaker: 2
01:46:07

And the only way you're really gonna get in... you have to find some online community of legitimate people that accept you, you have to be invited into something, or you have to fucking attend a university like all the others did. That's how you find out, especially when it comes to shit like mathematics.

Speaker: 2
01:46:21

Boy, you know, when you’re talking about, like, things like

Speaker: 0
01:46:25

physics Yeah.

Speaker: 2
01:46:26

Boy, do you need... these are cold, hard, fact-based

Speaker: 0
01:46:31

Yeah.

Speaker: 2
01:46:31

Disciplines. Like, you need to be around the people that are the cream of the crop of that. That's it.

Speaker: 0
01:46:35

Yeah. That's right, man. And I watched some of that, and I loved it, because... well, that's what compassion looks like.

Speaker: 2
01:46:47

You got to see also that Terrence is a good guy.

Speaker: 0
01:46:49

Yeah. Exactly.

Speaker: 2
01:46:50

He didn't get upset. He didn't get angry.

Speaker: 0
01:46:51

Yeah. There wasn't, like, any animosity.

Speaker: 2
01:46:53

It wasn’t at all. And it was also an acknowledgment that a lot of his ideas are really fucking good. That peopling idea is really fucking good. The other thing that he has that

Speaker: 0
01:47:02

he invented,

Speaker: 2
01:47:02

you've seen that Lynchpin thing that he invented? I saw something... like that modular drone technology that can, like, be used for construction and fucking moving giant

Speaker: 0
01:47:12

girders and shit. Vortex or something?

Speaker: 2
01:47:14

Well, they all link together. It's like a geometric pattern. But the nerdiest of nerdy things was Weinstein calling him out on the degree... Mhmm. ...of the angle... Mhmm. ...of one of the, like, fucking calculations that he made. I don't even remember exactly what it was. He was like, you had to cheat that. Right?

Speaker: 0
01:47:30

Right.

Speaker: 2
01:47:31

And he's, like, yeah, I did. I did that to make it work. And he's, like...

Speaker: 0
01:47:34

That’s

Speaker: 2
01:47:35

And they were really fucking with each other, because he understood why there would be something problematic about linking all these specific geometric patterns, and then he had to make some slight adjustment to make them link up perfectly. I love that. Super nerd talk.

Speaker: 0
01:47:47

Inside baseball... comedians do it too. Like, when we're, like, breaking down a joke, like, down to a minute pause or something. It's the same thing. But, yeah, man. I mean, this is what I love when you read about the history of science. You read about, like, famous physicists getting in real, like, intense fights with each other, and you see that the process of discovering the truth does involve a kind of mutual curiosity, but not being afraid to say, this is fucking wrong... but allowing the other person to fire back, because you both know that via this conflict, potentially, you discover something new.

Speaker: 0
01:48:31

And that was the attitude. I mean, like, this whole thing where suddenly normal people aren't supposed to engage in science is really fucked up when you look at, like, the history of science, which used to be maniacs like Newton, who... they analyzed his remains, mercury in his fucking hair.

Speaker: 2
01:48:54

Bro, everything had poison in it back then.

Speaker: 0
01:48:56

Well, no. But he was experimenting with mercury. It was interesting... he's, like, building scale replicas of the Temple of Solomon. You know, like, you look at that and you see that. Now, Newton today, you know, somebody like Don Lemon would be like, oh, really? So you sync up with Isaac fucking Newton, with mercury in his hair and his little dollhouse of the Temple of Solomon? Oh, yeah. He's a real scientist.

Speaker: 0
01:49:20

That’s not what they look like. It’s like these people are out of their fucking minds. Sigmund Freud just injecting fucking liquid cocaine into his veins. You look at, like, the history

Speaker: 2
01:49:32

about his mom.

Speaker: 0
01:49:33

Freaking out about his mom fucking shoving cigars up his ass. I don’t think he really did that, but I wouldn’t be surprised.

Speaker: 2
01:49:40

I wouldn't be surprised.

Speaker: 0
01:49:40

But, you know, you look at the history of how brilliant people who have shifted the culture actually behaved. Like Tesla. Fucking Tesla did, like... I don't know. Like... He's

Speaker: 2
01:49:50

in love with his pigeon.

Speaker: 0
01:49:51

In love with a pigeon. Didn't wanna, like... he thought about castrating himself because his sex drive was getting in the way of his research. So he's like, I'll just chop my dick off.

Speaker: 2
01:50:00

He did, dude. I think there was a description of him destroying his sexuality.

Speaker: 0
01:50:05

Yeah, dude. So you sort of, like, realize that, for whatever reason, the priest class of default reality, of which Don Lemon is a high priest, has suddenly created this ridiculous version of scientists, of philosophers, of intellectuals, that they're domesticated, normal fucking people... which is actually really awful, in the sense that all of the, like, philosophers and scientists out there today who are, like, you know, in their filthy fucking apartments, who've been staring into a candle for, like, five hours...

Speaker: 0
01:50:42

They're not thinking, like, I'm a scientist. They might be. You don't know. Like... basically, it appears that the power structures in the world want to create this homogeneous version of humanity, within which there's all these declawed people who completely align on a few ridiculous facts.

Speaker: 2
01:51:08

Absolutely. And you can make those people very cruel if you bond them together to attack anybody who doesn't stay in line.

Speaker: 0
01:51:15

That’s right. Yeah. That’s right. And that is what, again, ai, a coup where you you get rid of the president, at least you know it’s not the president. Tyranny where you don’t have soldiers in the streets, but a kind of societal pressure, an an unending pressure trying to bots. TikTok. The reels, the algorithm.

Speaker: 0
01:51:38

Like, dude, have you ever watched pendulums sync up?

Speaker: 2
01:51:41

But I don’t watch TikTok. I don’t have TikTok.

Speaker: 0
01:51:43

I know it sucks because I try to send you some TikTok shit, and you can’t look

Speaker: 2
01:51:46

at it. I won’t click on it.

Speaker: 0
01:51:47

But I don’t the the the

Speaker: 2
01:51:50

They probably already infected my phone just because you sent me those links. I’m sorry. Probably in the user agreement.

Speaker: 0
01:51:55

I still send it

Speaker: 2
01:51:56

agree to infect other people’s phones every time you send them a link.

Speaker: 0
01:51:58

I keep hoping you will... Like... So I try not giving in. Well, it is so incredibly hypnotic. Like, it is so advanced in what it does. But... Not interested. It's really creepy, though, because, like, yeah, it's syncing, it's homogenizing, and that's what I don't like.

Speaker: 0
01:52:17

It’s like it’s creating this synced up and it’s creepy because, like, the TikTok dance is actually if you think about it, it’s really a symbol of what it’s doing for a lot of other things. Like, maybe you’re not doing a choreographed dance with your family in front of the Christmas tree to some dumb song.

Speaker: 0
01:52:33

But why is it that everything you say, I've read, I've seen written exactly in the same way on Reddit? Why is it that every opinion you have matches? Not just, like, the idea, but the way you're verbalizing the idea? How is it... it's like a sentence that I've heard over and over and over again in different places. That is so spooky to me. Yeah.

Speaker: 0
01:52:56

So, to me, like, that, and also that it's called TikTok, which in my fucking paranoid universe, I keep thinking, is that the tick-tock of a metronome that they're talking about? Getting people to dance to a certain cultural BPM.

Speaker: 2
01:53:10

Jamie, I’m gonna send you something. I’m not sure if it’s true, so I want you to find out if it’s true. And, it was someone was saying that, there’s a whole series of or or I saved it on Twitter. It’s a link on Twitter. That’s what it is. There’s a whole series of people who were claiming to be doctors saying the exact same thing.

Speaker: 0
01:53:32

I saw that. I know what you're talking about. Is it real, though? I don't know. The problem

Speaker: 2
01:53:35

with those things is, like, people bullshit. And when people bull... here it is. I'll send it to you, Jamie. When people bullshit, it's really hard to tell. You know? Because if you change this and, you know, create this in Photoshop, and then people start spreading it, then all of a sudden that narrative gets out and most people don't ever hear, oh, no.

Speaker: 2
01:53:54

No. No. Somebody made that in Photoshop.

Speaker: 0
01:53:55

Right.

Speaker: 2
01:53:55

So by the time it gets around, it's like, I don't know. I don't know if it's true or not. Right. These are one of those. So it's like, if that's true, and if all these doctors were tweeting out the exact same verbiage exactly, I wonder if that's a mandate. I wonder if they were sent something like a mass email

Speaker: 0
01:54:14

Mass email.

Speaker: 2
01:54:15

That says copy and paste this perhaps.

Speaker: 0
01:54:17

Discord server.

Speaker: 2
01:54:18

Or I wonder if they're fake doctors, or I wonder, you know, if it's like some bot program designed to encourage people to go get vaccinated or whatever it was. I just don't know if it was real, so I don't wanna, like... I want doctor Jamie to look at it real quick.

Speaker: 0
01:54:32

Thank you, Jamie.

Speaker: 2
01:54:33

Jamie’s super skeptical to the point of being a liberal. Oh. Why

Speaker: 0
01:54:38

is that what it means? Why does that have

Speaker: 3
01:54:40

to why is that the end result?

Speaker: 2
01:54:41

He’s triggered. He’s triggered.

Speaker: 3
01:54:43

That narrative gets around people’s

Speaker: 2
01:54:45

I know. I know. Jamie's not a liberal, folks. Yeah. Jamie's very down the middle.

Speaker: 0
01:54:49

Hey. Jamie’s not a liberal.

Speaker: 2
01:54:50

You’re, you’re a centrist. Is that correct? Sure. Ai. I think so. Right?

Speaker: 3
01:54:54

This is weird. I'm just looking at the account. I'm trying to figure out a way to research it. I might have to go to Twitter or Google search the image.

Speaker: 2
01:55:01

But... see if the thing has been community noted.

Speaker: 3
01:55:05

I do see one difference. Here's what difference I'm noticing, just looking at it: the... Different font. Well, the third thing that they're saying is a little different because it's starting to be a joke. Sugma is a joke. It's something that's, like, Sugma Nuts.

Speaker: 2
01:55:18

Oh, really? Yeah. Ligma Nuts. Ligma is another one. Oh, hilarious.

Speaker: 0
01:55:22

Oh, that’s hilarious. That’s a

Speaker: 2
01:55:24

brilliant troll.

Speaker: 3
01:55:24

That’s very funny. It’s already in troll speak. But

Speaker: 2
01:55:27

Okay. So it does... oh, so it could be that a bunch of people just decided to retweet it for funsies?

Speaker: 3
01:55:33

I it’s some of them are Well,

Speaker: 0
01:55:34

why don’t you go to their accounts?

Speaker: 2
01:55:35

Look look

Speaker: 0
01:55:36

up any of these Ai not,

Speaker: 3
01:55:37

so I was gonna do that.

Speaker: 0
01:55:38

Or it

Speaker: 2
01:55:38

could have been one of those things where someone got caught. Do you remember when there was this, like, misinformation video that got out? There were all these local news anchors giving the same exact speech, verbatim. Yeah. In tune. In time. Yeah. It's really weird. Yeah. Play that, Jamie.

Speaker: 3
01:55:56

Do you know how that happens?

Speaker: 2
01:55:58

Yeah. They get given You

Speaker: 0
01:56:00

just get the same script.

Speaker: 3
01:56:01

Yeah. It’s like local news stuff. It’s ai, I mean, it usually happens ai Sinclair Media.

Speaker: 2
01:56:05

Right. But it’s about misinformation. Oh,

Speaker: 0
01:56:08

yeah. Right.

Speaker: 2
01:56:08

And it’s it’s weird. Right. It’s weird because they’re basically protecting their job. So what is they they’ve been caught stealing money. You know, they got a big fucking pot of gold then, you know, and the people are at the door, like, I heard you got gold in there. What you’re hearing is misinformation. It’s all misinformation. We are the number one source of news, and we’re dedicated to give you the true objective.

Speaker: 3
01:56:31

They’re all reciting the same script.

Speaker: 2
01:56:33

Right. Is it, what company is it

Speaker: 0
01:56:36

that makes America It’s Sinclair.

Speaker: 2
01:56:37

So they make them... let's just play it, though, because it's so crazy that these people are the people that are in charge of giving you the news, and they're reading off this thing pretending that these are their thoughts. That's what's bizarre about it. It's untenable, because people know that those are not their thoughts. They know they're reading off a script.

Speaker: 0
01:56:56

Right.

Speaker: 2
01:56:56

Everybody knows it. So it doesn’t work. You’re just making noise with your mouth, and people are still on Twitter.

Speaker: 0
01:57:02

Well, they get You know what I mean? They’re

Speaker: 2
01:57:04

still they’re still, like, reading what’s actually going on versus what you’re saying.

Speaker: 0
01:57:08

Also, you know, the idea is you get these people to dress up like humans and then just get them to, like, intone whatever the fucking thing is you want them to read.

Speaker: 3
01:57:17

Yeah.

Speaker: 0
01:57:17

And we think they’re one of us, and so we believe them.

Speaker: 2
01:57:20

Jamie, did I send you the thing where the girl is... excuse me, the woman is giving a press conference on the UAPs and the drones, and she's saying, we don't know what they are. They're not ours, and they're not an adversary's.

Speaker: 0
01:57:35

That’s I think I think I tweeted.

Speaker: 2
01:57:37

I sent it to you. Right?

Speaker: 0
01:57:38

The one where she’s wearing the UFO necklace. You’ll sai. She’s wearing a UFO necklace. Kook.

Speaker: 2
01:57:42

Is she a kook? Who is she? Who is this lady? I don't know. Is this a legit press conference? All I

Speaker: 0
01:57:46

know is she’s in front of a podium, so I trust her.

Speaker: 2
01:57:48

That’s what I go by. It’s a podium. Get all the way the fuck up there and nobody tackles you. Be telling the truth. You gotta be legit.

Speaker: 0
01:57:54

Yeah. You’re at the podium. Totally. Yeah. Yeah.

Speaker: 2
01:57:56

You’re at the sacred scroll.

Speaker: 0
01:57:57

There’s a flag behind you.

Speaker: 2
01:57:59

I’m gonna buy a Arya Torah. You ever see the Torah? How it’s like in a or the Talmud?

Speaker: 0
01:58:04

You can just... like, you're gonna buy the actual scroll?

Speaker: 2
01:58:06

I mean, someone to write it for you.

Speaker: 0
01:58:08

That’s a terrible responsibility for him because you have to, like, treat it really carefully. You have to put it in a vault.

Speaker: 2
01:58:13

They have dudes. They, no. He’s gonna keep it in his living room and jerk off on it.

Speaker: 0
01:58:16

Did he say that? No. No, he won't. I said that. I promise

Speaker: 2
01:58:20

you. Still believes.

Speaker: 0
01:58:21

He’s not gonna jerk off on the tour. I promise you. He’s gonna put it oh, Talman. The tour, and he won’t jerk off. Dude, how about this one? Jamie, can you find you remember that lady they hired for the ministry of info it wasn’t called that, but it was like Yeah.

Speaker: 2
01:58:40

Yeah. Ministry of disinformation.

Speaker: 0
01:58:41

Okay. Can you find the ministry of disinformation lady singing supercalifragilisticexpialidocious... you've seen that.

Speaker: 6
01:58:46

Yeah. Of course. Yeah.

Speaker: 0
01:58:47

That. Yeah. That was great. Orwell, I think her name is. Cindy Orwell.

Speaker: 2
01:58:50

That’s not her name. I just made that

Speaker: 0
01:58:52

up. Oh, I’ll be awesome.

Speaker: 2
01:58:57

She’s ai she’s like such a loon. And this idea that this person’s gonna be in charge of what’s legitimate and not. There’s too many things, and this is what people are realizing. There’s too many things that they told us were not legitimate just 3 years ago that are 100% fact now, and everybody knows that.

Speaker: 2
01:59:12

And this latest... what is the house committee thing... with COVID and the Wuhan lab leak and this lady.

Speaker: 0
01:59:21

This is the craziest shit I’ve ever seen. Nina

Speaker: 2
01:59:25

Jankowicz.

Speaker: 5
01:59:26

By saying them in congress or a mainstream outlet, so disinformation's origins are slightly less atrocious.

Speaker: 0
01:59:32

Can you imagine?

Speaker: 5
01:59:34

It’s how you hide a little lie. It’s how you hide a little lie. It’s how you hide a little lie.

Speaker: 0
01:59:38

They think we’re idiots.

Speaker: 5
01:59:38

When Rudy Giuliani shared bad intel from Ukraine. Or when TikTok influencers say COVID can't cause pain. They're laundering disinfo, and we really should take note

Speaker: 2
01:59:53

It’s kinda catchy.

Speaker: 5
01:59:54

When our box

Speaker: 0
01:59:55

Ai mean, she’s beautiful.

Speaker: 3
01:59:56

Isn’t this what Animaniacs did?

Speaker: 5
01:59:58

Congress where mainstream outlets are

Speaker: 0
02:00:00

This is real. Nation’s origin

Speaker: 5
02:00:01

seem slightly less atrocious.

Speaker: 3
02:00:04

Feels like Animaniacs to me. You guys might be

Speaker: 2
02:00:05

No. I think it’s really her.

Speaker: 3
02:00:07

No. No. I mean that they would give information out in songs like that, in song form. And it would be informational. But if you... But

Speaker: 2
02:00:13

this was, like, something that she released when they were talking about her being the head of

Speaker: 3
02:00:18

the... I'm just saying it's disinformation.

Speaker: 0
02:00:20

But how about... I

Speaker: 2
02:00:21

think it’s honestly just her trying to go viral with a video about this thing that she’s doing. That’s what it is. And that’s the it’s a good way to go viral. I mean, we just talked about it. I mean People share it. Even if it’s preposterous, it’s a good way to get attention to this thing that you’re about to do and watch.

Speaker: 0
02:00:36

It’s fine if if the United States isn’t literally 1,000,000,000,000 of dollars in debt and partially because people like that are getting hired to sing fucking Mary Poppins shah songs about misinformation. Then it’s an atrocity.

Speaker: 2
02:00:51

Should get Nancy Pelosi money. Dude. She should get

Speaker: 0
02:00:54

I mean, look. Nancy Pelosi deserves a free

Speaker: 2
02:00:56

full of diamonds money. Bathtub

Speaker: 0
02:00:58

full of diamonds.

Speaker: 2
02:00:59

Just just crystal and diamonds in the bathtub. Just waddle

Speaker: 0
02:01:04

in.

Speaker: 2
02:01:10

Just hang in there until that fucking genetic engineering comes.

Speaker: 0
02:01:15

You could be young again. Well... What was it she said? Joe Biden should be on Mount Rushmore?

Speaker: 2
02:01:19

Yeah. Good call. Good call. He's definitely not gonna send you to jail. That guy is... oh, this is what I wanna talk to you about. Pardons. I'm not opposed to the idea of being pardoned, because I think that there's, like... governors can find out that someone legitimately got railroaded, and they can pardon someone.

Speaker: 2
02:01:38

Yeah. I like that. I like that. I like that the president can pardon some people. Like, I wish they pardoned Edward Snowden, you know. Like, you know, there's a bunch of... Julian Assange. They should have pardoned Julian Assange.

Speaker: 2
02:01:49

I wish there was, you know, a way to stop someone from pardoning 8,000 fucking people, and some of them were, like, murderers. Some of them are... the kids-for-cash judge.

Speaker: 0
02:02:00

Kids for cash.

Speaker: 2
02:02:01

One of them, or one of the people. We talked about this the other day. He's one of the people. He had 2 years left in his sentence, but still, like, it's the principle of the thing. How many lives were destroyed by that kids-for-cash thing? How many dehumanizing decisions were made, where you decided to lock young people up in detention centers where they would get raped and beaten up and tortured and separated from their family and sent down a horrible road of distrust of law enforcement and of authority and everything else.

Speaker: 2
02:02:33

You’re you’re basically setting them up for a life of being a fucking loser unless they have the strongest of wills and they could figure out a way to stay positive and get through it and then use that to fuel whatever the fuck they do. That’s so rare, man. Those people are so rare.

Speaker: 0
02:02:52

Well, here’s the problem, man. I mean, the problem is well, number 1, I think okay. Like, you pull someone over, you breathalyze them, they’re driving drunk. Right? So you’re ai, you can’t drive now because you’re drunk. Sai, also, you wouldn’t say to them, I’m gonna give you the ability to pardon as many people as you want for any crime that you want. Right?

Speaker: 0
02:03:13

So if somebody has dementia, right, why can they do all the pardons? It's kinda weird. Crazy. That's such

Speaker: 2
02:03:22

a great point that I never even thought of. Yeah. Wait. So why would you still give him that power?

Speaker: 0
02:03:26

Well, the other thing that's really fucking crazy about it is, I don't know what the president makes a year, but it's not enough money. We barely pay the president anything. So

Speaker: 2
02:03:35

I think it’s, like, $400,000.

Speaker: 0
02:03:37

$400,000 a year. You can’t

Speaker: 2
02:03:39

say that’s barely enough. That’s barely ai.

Speaker: 0
02:03:41

I mean, for the actual job... I mean, literally, every day, you're shitting blood because no matter what you do, you say the wrong thing, 5,000 people accidentally die. It's the most stressful job on Earth. I'm saying the actual thing, theoretically, in my mind. So I would say, you know, in the way that we pay our football players a shit ton of money, baseball players a shit ton of money, then, dude, theoretically, keeping our country from getting nuked should make a lot of money.

Speaker: 0
02:04:07

Why not? If How about this?

Speaker: 2
02:04:09

How about we pay them more, but they can't do speeches? No speeches afterwards. No paid speeches when you leave. You could write books, but none of those paid bank speeches. None of those $500,000 speeches.

Speaker: 0
02:04:22

That’s so to to me, it’s like on your way out, you sell pardons. On your way out, you you via some god knows what mechanism that’s probably been in place for a long time

Speaker: 2
02:04:32

Right.

Speaker: 0
02:04:33

People are able to give you this or that, and you pardon that person.

Speaker: 2
02:04:38

Yeah.

Speaker: 0
02:04:39

That’s where it’s fucked up. It’s like, dude. Come on, bro. Come on, man.

Speaker: 2
02:04:43

That’s trading. That’s how it works.

Speaker: 0
02:04:45

Give you

Speaker: 4
02:04:46

a little of this. You give me a little of

Speaker: 2
02:04:47

that, and we do it right in front of the world.

Speaker: 0
02:04:48

And do

Speaker: 2
02:04:48

We’re gonna let out murderers.

Speaker: 0
02:04:50

And then especially if you got couped. So, especially, you got couped. You got humiliated. They didn't give you your drugs.

Speaker: 2
02:04:58

Isn’t he letting out Joe Exotic?

Speaker: 0
02:05:00

It’s insane.

Speaker: 2
02:05:01

Let him out.

Speaker: 0
02:05:02

Why not?

Speaker: 2
02:05:03

Let him out.

Speaker: 0
02:05:03

You’re gonna do Kids For Cash. You’re not gonna do Joe fucking exotic.

Speaker: 2
02:05:07

Yeah. And how is Joe Exotic DMing me? How is this happening?

Speaker: 0
02:05:09

I think because he knows, even in jail, that you will say things like this.

Speaker: 2
02:05:12

Does he have a phone?

Speaker: 0
02:05:14

Joe? Dude, I feel like

Speaker: 2
02:05:15

have a phone, Joe? Can you have a Twitter account?

Speaker: 0
02:05:17

Joe, I feel like right now, because of, like, you aiding Trump in getting elected... I don't think you're the kind of person to do this, but I do feel like you could probably call in, like, at least one favor.

Speaker: 2
02:05:30

Get Joe Exotic out.

Speaker: 0
02:05:32

Dude, why not? For all of us.

Speaker: 2
02:05:35

Ross... Ross Ulbricht first.

Speaker: 0
02:05:37

Okay. Sure. Yeah. I mean, Ai obviously, there’s price in people ai find out.

Speaker: 2
02:05:40

He committed to doing that, to releasing him.

Speaker: 0
02:05:43

Really? Yeah. Well, I mean, that... Can you find

Speaker: 2
02:05:46

out if that’s true, Jamie? I believe it is. I believe he was it was one of those Bitcoin fucking things that he did. Oh, yeah.

Speaker: 3
02:05:53

He said that in the Dan or Dave said that, the libertarian thing. He

Speaker: 2
02:05:56

said Yes. That’s right. Libertarian. Sai said libertarian and Bitcoin are the same fucking category in my brain. When I barely tuned in, it’s all the same. It’s ai NFT, libertarian, Bitcoin. Yeah. Yeah. Yeah. Whatever. Yeah. I’m a libertarian. Like, sure. It’s on in paper. It’s a great idea. It’s not it’s not a real party.

Speaker: 0
02:06:13

I’m a sovereign citizen.

Speaker: 2
02:06:14

I’m a sovereign citizen too.

Speaker: 0
02:06:16

Really? Yeah. Yeah. It’s great.

Speaker: 2
02:06:17

But I’m not of this planet.

Speaker: 0
02:06:18

Where are you from? The place?

Speaker: 2
02:06:19

Everywhere. I’m from everywhere, man. I’m Johnny Cash.

Speaker: 0
02:06:23

Dude, that’s

Speaker: 2
02:06:25

a lot of places. Everywhere, man. I’ve been there.

Speaker: 0
02:06:28

What a great song. What a great fucking song.

Speaker: 2
02:06:30

Johnny Cash was the fucking man.

Speaker: 0
02:06:32

He was the

Speaker: 2
02:06:33

fucking man.

Speaker: 0
02:06:34

That’s an incarnation I would pick. Like, if we get to, ai, if there’s a VHS library of incarnations, there’s a long line to be Johnny Cash. I’m picking Johnny Cash. Oh my god. Top 10, probably.

Speaker: 2
02:06:45

Can you imagine being Johnny Cash when he played at Folsom Prison? Folsom Prison Blues.

Speaker: 0
02:06:49

It’s the most incredible thing.

Speaker: 2
02:06:51

He played at the prison.

Speaker: 0
02:06:52

It’s incredible. Paul Rodriguez did a

Speaker: 2
02:06:54

comedy special out of prison, like, way back in the day. I forget when it was, but I remember... I believe it was an HBO special, and he did it live from a prison, which is fucking buck wild.

Speaker: 0
02:07:07

badass. Buck wild.

Speaker: 2
02:07:09

I mean, I if you were gonna do that, you would have to work you know who could do that? Joey Diaz 100%. 100%. He would murder in the prison Yeah. You know, metaphorically.

Speaker: 0
02:07:21

I mean, if there was a simulator and this is again, like Is

Speaker: 2
02:07:24

this Paul Rodriguez? Behind... like, Live in San Quentin. Damn. Respect to Paul Rodriguez. I don't even know if it worked. Are they laughing? It looks like they're laughing. Looks like they're having a good time. That's crazy. No. They don't have to.

Speaker: 0
02:07:42

I’m joking. I’m sure they don’t

Speaker: 2
02:07:43

rush him. Who’s how’s the the guards gonna stop by the time they beat him to death? I mean Fifty dudes just rush him?

Speaker: 0
02:07:51

Seriously, you’re thinking about that before you’re on stage. Like, you

Speaker: 2
02:07:54

They like Paul Rodriguez.

Speaker: 0
02:07:56

I mean... You know? But yeah. Do they all... how do you know? It's San Quentin. He was popular at

Speaker: 2
02:08:00

the time. Like, he is still popular, but he was, like, very, very popular at the time.

Speaker: 0
02:08:04

There’s a guy probably in the audience who, like, wore his daughter’s entrails as a necklace. You know what I mean? Like, I’m sure most of you.

Speaker: 2
02:08:13

Yeah. Right? Fucked his dog.

Speaker: 0
02:08:14

It’s badass. That’s a badass move. Crazy move. That’s a badass move. I mean, dude, when you think about all of the shit that we’re talking about and really when you sort of look at, like, just among our group of friends, the insane events the last few months.

Speaker: 2
02:08:32

Tony Hinchcliffe was misquoted by Obama.

Speaker: 0
02:08:36

I mean There

Speaker: 2
02:08:36

was a speaker at the Trump rally who said Puerto Rico is a pile of garbage. Those are human beings. Dude,

Speaker: 0
02:08:44

I meh, the Nobody’s aged harder than that dude. Well, he’s withered.

Speaker: 2
02:08:49

Bro, but those are, like... these are vampire years. These are like you got bit by a leech. Like, you got a parasite in the ears.

Speaker: 0
02:08:57

Reading the Necronomicon or something.

Speaker: 2
02:08:59

He’s yeah. Right. Right. Right. You have the Arya of Covenant in your bedroom. Yeah. Like, you’re cooking.

Speaker: 0
02:09:03

You’re cooking.

Speaker: 2
02:09:04

You’re you’re aging. Like, you look you know, you age 50 fucking years. You look like a really good looking 70 year old. There’s something weird.

Speaker: 0
02:09:11

I mean, just think about, like, what that’s like

Speaker: 2
02:09:14

to be The pressure.

Speaker: 0
02:09:16

The... The, you know... I'm saying, as far as power goes, I think power must be so addictive. And so you're the fucking president, not just the president. You're, like, this kinda rock star president for a second.

Speaker: 2
02:09:27

Yeah. One of the greatest presidents of all time. Of all time.

Speaker: 0
02:09:29

Of all time. And so you lose that power. And now what? You know what I mean? Now what? And then you try desperately to, like, grab control of the thing, and you can’t. It didn’t work. So, essentially, whatever, like, prana or energy you’ve been extracting from having that kind of power, it’s gone.

Speaker: 2
02:09:49

It’s gone.

Speaker: 0
02:09:49

Now you have the nice house, but, like, really, like, what's left? You were at the fucking control board for, like, America, and now nothing. You wither. You deflate. You, like... No purpose. No purpose. I think

Speaker: 2
02:10:05

we’re gonna be able to read minds in 5 years, and all this is gonna be a moot point.

Speaker: 0
02:10:08

Well, no. I think

Speaker: 2
02:10:09

it’s gonna be all go out the window. I think quantum computing is gonna crush encryption. Yep. We’re gonna have a real problem with currency worldwide. We’re gonna have to figure out how to redistribute resources without conventional capitalism. There’s gonna be some weird new shifting that’s gonna come along with the birth of this AI that’s way more intelligent than us, and everything’s gonna get super fucking weird and we’re not ready for it.

Speaker: 2
02:10:34

And we think that we... oh, we have to be ready for it. It's not gonna happen like that, because I'm not ready for it. No one's ready for it. No. It couldn't... just like a supervolcano, just like an asteroid impact, it can happen, and you're not ready for it.

Speaker: 0
02:10:44

That’s right.

Speaker: 2
02:10:44

And a lot of us might not make

Speaker: 0
02:10:45

it. That’s dude. Okay. This as far as, like, AGI goes, it like, when Altman came out and said this year, I don’t the the the somewhat a CEO from the company, I think, that made I don’t know. 1 of the other AIs said, 2 years from now, I think. But the idea is if if if Biden came out and was like, guys, got some news.

Speaker: 0
02:11:11

We’ve detected a mother shah. It’s coming to the planet in a year. We don’t understand anything about who they are other than they must have extraordinary technology based on what we’ve seen of their ship. I’m good. The whole planet, the next few years would just be getting ready. NASA, anthropologists, philosophers, scientists, defense people. What do we do if they wanna fuck us up?

Speaker: 0
02:11:33

What do we do? How do you interact with aliens? But having these tech people say, we are about to have a brand new species, essentially, a technological species, and AGI is coming to the planet that will surpass us as far as being able to solve problems. It will know everything. It's gonna be here in about a year, maybe 2 years. You would think the reaction to that would be, okay.

Speaker: 0
02:12:00

We’ve gotta get ready for this. What does that mean? What’s gonna happen?

Speaker: 2
02:12:04

And we can’t get ready. Maybe that’s what the scramble is all about, that we just have to fucking die screaming.

Speaker: 0
02:12:12

Merry Christmas. Ho ho ho.

Speaker: 2
02:12:15

Maybe it can’t be solved just like Australiapithecus couldn’t figure out how to make a plane. It’s it we’re not prepared for it, and we’re not supposed to last. We’re supposed to carry on to the next thing, and the next thing will still be us. That’s what’s gonna be weird. The next thing is gonna still be us.

Speaker: 2
02:12:31

We just want us to stay us like this. We want, you know, fucking blues songs, and

Speaker: 0
02:12:36

we wanna

Speaker: 2
02:12:36

drink whiskey. Sure. We wanna smoke cigarettes. We wanna, like, get in fistfights. We wanna... we want us to stay us. It's, you know, it's not going to happen.

Speaker: 0
02:12:46

I’m sure if you could travel back in time and there was an intelligent, sai intelligent, version of humanity, one of our ancestors who still had a a workable tail. Yeah. And you’re like, hey. I wanna show you what you’re gonna grow into. And probably there’d be a lot of things they’re excited about. Woah. Cars? Incredible. Jesus Christ.

Speaker: 0
02:13:07

You can shit in your own house? Yeah. Wow. But then they would see that we didn’t have tails. And they’re like, woah. Woah. Woah. Woah.

Speaker: 0
02:13:15

I'm not doing that if I lose my tail. So this is for sure... the sort of cultural drama that we're seeing is... and, you know, the trans controversy, there's aspects to it where, like, yeah, absolutely... why a dude shouldn't be in sports. But the reality is, where we're going is going to make that controversy seem like nothing.

Speaker: 2
02:13:44

Not only that, it seems like, if you wanted to have an evolutionary path towards a genderless society, if you wanted to tame the wild primate, wouldn't you have that society be completely addicted to plastic? They use plastic for everything, which is an endocrine disruptor. So you have these plastics and these chemicals that get into the body, lower testosterone, shrink dicks, shrink taints. Oh.

Speaker: 2
02:14:15

Doctor Shanna Swan’s

Speaker: 0
02:14:16

work. Shrink taints? Yeah. Yeah.

Speaker: 2
02:14:17

Yeah. Shanna Swan, have you ever talked to her?

Speaker: 0
02:14:19

I didn’t know that taint could shrink. Ai

Speaker: 2
02:14:21

You should talk... no. Well, it's in utero. So this is what happens. Oh. When you introduce... mammals to... her book is called

Speaker: 3
02:14:30

I think it’s Countdown.

Speaker: 2
02:14:31

Countdown. That’s right. I I used to remember it. It’s a great book. She’s she’s really fun too. She’s a really fascinating person. But what they found is with mammals, when you introduce phthalates, which are these plastics, like microplastics and the chemicals that come off of them into, pregnant women or pregnant ai, the the babies have smaller taints.

Speaker: 2
02:14:56

And then the taint is one of the best ways to distinguish a male or a female in mammals. No way. In males, the taint is 50 to a 100% longer.

Speaker: 0
02:15:05

Is it really called the taint?

Speaker: 2
02:15:06

They don’t call it the taint. They have a word for it, she told me, but she calls it the taint because she’s fun.

Speaker: 0
02:15:10

Okay.

Speaker: 2
02:15:10

She has a thing on her website called the jizz quiz. It's very funny. She's very funny, and she's, you know, a really distinguished professor. But what she's saying, essentially, is that these plastics are lowering hormone levels. They are lowering birth rates. They're increasing the amount of miscarriages that women have.

Speaker: 2
02:15:30

So all these things, she believes, are completely connected. And this hormone disruptor, which is all these plastics, is causing people to become sicker and a little bit deformed, because your hormones aren't expressing themselves correctly, because they're being poisoned.

Speaker: 0
02:15:48

Right.

Speaker: 2
02:15:49

If you were a society... like, if you were going to get to where the aliens are... they look genderless, don't they?

Speaker: 0
02:15:56

Yeah. Sure.

Speaker: 2
02:15:56

Don’t you think that’s probably us in the future?

Speaker: 0
02:15:58

Aren’t you glad they’re genderless? Sure. Can you imagine if the grays had big swinging dicks?

Speaker: 2
02:16:03

Giant hogs.

Speaker: 0
02:16:04

It would be horrible. Those gray pictures would be very different.

Speaker: 2
02:16:08

Very different. Sitting over your bed, jacking off in your face.

Speaker: 0
02:16:11

Horrible.

Speaker: 2
02:16:11

Yeah. While you’re paralyzed, you’re ai sitting there like, oh, this fucking piece of shit. Ai think that, we arya clinging to this idea of male and female. Look. I I think, currently, there are male and females for sure. And this is why I’m completely opposed to biological males who have mental illness, and that’s what gender dysphoria is even if you’re being kind. It’s a mental illness.

Speaker: 0
02:16:36

Yeah. It is whatever it is in the DSM.

Speaker: 2
02:16:38

You’re not well in who you are. You wish you were a different gender. I fully support you. That’s what I want. But you can’t compete with biological females. Right. We can’t pretend that you’re a biological female just because we want you to feel good. You have massive physical advantages. They’ve been clearly documented.

Speaker: 2
02:16:52

Anybody who says any differently is full of shit. Talk to Riley Gaines.

Speaker: 0
02:16:56

Right.

Speaker: 2
02:16:56

Talk to her. She’s the expert in this shit. She had to go through that shit with swimming.

Speaker: 0
02:17:00

It feels crazy to me

Speaker: 2
02:17:02

It is

Speaker: 0
02:17:02

that you have to say that.

Speaker: 2
02:17:03

It’s so crazy. It’s so crazy that you have to say that to liberals, who always wanted to protect women. The whole thing is bonkers, but it just goes to show you it’s not real. This idea of left and right is not real. Right. These are just masks that people put on. These are just a conglomeration of opinions that people adopt. Most people have not thought most of the things through.

Speaker: 2
02:17:24

They don’t have the time. They have to work all fucking day. They have a family. Maybe they have a hobby. They’re trying to get out and play hoops with their friends. Right. Maybe they get together with their buddies and they wanna play video games. Yeah.

Speaker: 2
02:17:35

Like, one night a week. You know, like, Jesus Christ. Right. They don’t have fucking time to pay attention to all this crazy shit, and that’s what’s really going on.

Speaker: 0
02:17:43

Right.

Speaker: 2
02:17:43

Most people are just, like, deciding that, you know, I’m a progressive. I will repeat progressive talking points. I will violently defend a woman’s right to choose. Right. And they get into these patterns, and then the same thing happens on the right. This exact same thing. Exact same thing.

Speaker: 2
02:18:00

That’s why, like, the right is against the war in Ukraine, and the left is supporting it. It’s like, this is like Vietnam in reverse. The whole thing is fucking bananas. Right. You know, now the right is insisting on free speech. They were the motherfuckers that were censoring everybody.

Speaker: 0
02:18:14

Right. I know.

Speaker: 2
02:18:14

They wanted to lock Howard Stern up in jail.

Speaker: 0
02:18:16

I know.

Speaker: 2
02:18:17

He had to fight the FCC. Crazy. They sued him. His fucking parent company had to pay untold amounts of money. I don’t even know how much money did Howard Stern’s company get fined? It was hundreds of thousands of dollars, if not millions. Yeah. Insane amounts of money. The fucking government was trying to shut down a radio guy for talking shit.

Speaker: 0
02:18:35

It’s crazy. So

Speaker: 2
02:18:36

and now that’s the left. It’s just patterns, man. It’s just patterns where people can justify certain behaviors because it aligns with their ideology.

Speaker: 0
02:18:46

That’s right. And, also, where it gets really fucked up is you are dancing to a song. Basically, there’s, like, here’s 2 songs. There’s 2 songs you can dance to, the song of the right or the song of the left. The metronome is beating out 2, I guess, somewhat different rhythms.

Speaker: 0
02:19:04

People dance to those. They get in fights over it. You gotta do our dance.

Speaker: 2
02:19:08

Yeah.

Speaker: 0
02:19:08

Meanwhile, there’s a million other songs out there you could be dancing to, and there are songs that are much older than America, much older than maybe the planet itself. That’s why I really think it’s creepy the way that, like, Christianity, or any religion where there’s theism, is on the list of things that should be decried by an intelligent person. We shoot down this notion of God. We shoot down this or that.

Speaker: 0
02:19:42

But all of these religions, at the very least, they give you a new song to dance to that isn’t fucking war drums. Right. You know what I mean? And they don’t like that. They don’t like that because suddenly, you’re supposed to be perturbed.

Speaker: 0
02:19:56

Like, you know, that stupid... But it’s

Speaker: 2
02:19:57

also the arrogance of intellectualism. You know, you get really smart, and these, like, these fairy tales seem preposterous to you. That’s right. And you don’t wanna accept that maybe what it is is a moral scaffolding that keeps society glued together, and it’s probably based on some truth.

Speaker: 2
02:20:12

There’s some of it that seems to be a history of the world.

Speaker: 0
02:20:16

Oh, and also, most people I’ve encountered who are rejecting this religion or that, I get it. It’s religious trauma. I just ran into somebody at Best Buy who recognized me. We had a long conversation. It’s religious trauma.

Speaker: 0
02:20:33

They were raised in some kind of, like, form of spiritual abuse. And in that abuse, snake handlers. Bingo. But listen. If you wanna handle snakes, great.

Speaker: 0
02:20:44

The problem is if you tell a kid to disregard their rational mind. In other words, if the introduction to the conversation of questions regarding this or that Yes. are not met with, like, oh, yeah. It’s a good question. I don’t know. But are met with, you’re going to hell. You’re going to hell.

Speaker: 0
02:21:02

You’re demon possessed. Right. Right. Right. Then you experience that, and, of course, you must reject the thing.

Speaker: 0
02:21:07

It’s like when you have a hangover and you smell tequila, you can’t connect. Right. So I get it. But the main thing is, what I love about religion, or Christianity, is it’s like, just try it on for size. What happens if you pray?

Speaker: 0
02:21:27

I know you don’t believe in it. It sounds insane. What the fuck are they talking about? Sounds absolutely nuts. I know. It sounds absolutely nuts.

Speaker: 0
02:21:35

Now what happens if you pray? Just for a few days, what happens if you pray? And then once you start doing the experiment, it starts off with, like, this is just I’m gonna do it. It’s probably bullshit, opium of the masses. But then you realize you’re getting pulled in, not in a bad way.

Speaker: 0
02:21:55

But right away, there seems to be some feeling of connection, some sense of something a little different than what you’re used to experiencing. And sometimes that can get really scary for people, and they’re like, fuck this. No. It’s getting me. And it’s like, to me, that should be the experiment of anyone who’s skeptical. And if you’re skeptical about Christianity or any religion, you should be.

Speaker: 0
02:22:20

You should 100% be skeptical. It’s like what Mark Twain said. Religion is what happened when the first con man met the first fool. You should be skeptical. But if you read the gospels, you realize, like, part of the story, there’s an invitation to connect on your own.

Speaker: 0
02:22:41

You don’t need the priest class.

Speaker: 2
02:22:43

Right.

Speaker: 0
02:22:44

You don’t have to listen to the fucking rules. You don’t have to, like, it’s just between you and the eternal, and see what happens.

Speaker: 2
02:22:51

Right.

Speaker: 0
02:22:52

To me, that’s the number one thing: just investigate, explore, and don’t let anyone subvert your rational mind. Use that as a form of connecting with the thing. Even if you connect via rejection, it’s still worth, like, a wholehearted exploration, at the very least, to experience a cultural trance. I don’t think that’s what it is.

Speaker: 0
02:23:13

But,

Speaker: 2
02:23:15

maybe what that cultural trance is is, like, a pattern that you can follow that can connect you to the thing. And there’s a bunch of these different patterns. This pattern might be Buddhism. This pattern might be Islam. This pattern might be even Mormonism.

Speaker: 0
02:23:37

Sure.

Speaker: 2
02:23:37

Even Scientology. Definitely Scientology. I think all of them can be distorted. All of them can be subverted. All of them can have those guys that have private jets and Rolls Royces and, you know, those fucking crazy arena guys. All of it can go in that direction.

Speaker: 2
02:23:54

But all of it is kind of a moral scaffolding that seems to be designed to help us in this journey of getting away from the primate instincts.

Speaker: 0
02:24:08

That’s right. And getting away from and also

Speaker: 2
02:24:11

Connecting to each other.

Speaker: 0
02:24:12

And transcending state propaganda. Like, this is like my favorite verse in the Bible. They’re trying to trick Jesus. They, I don’t know, they’re asking.

Speaker: 2
02:24:23

Imagine being so cocky, you think you’d trick Jesus? I would try.

Speaker: 0
02:24:27

I got it. Maybe. Fuck that. Would be cool if

Speaker: 2
02:24:29

you could. I wanna bring out three-card monte.

Speaker: 0
02:24:32

We’re gonna get him.

Speaker: 2
02:24:33

Why isn’t Jesus walking down New York City and watching him play three-card monte and getting suckered in? You’re like, hey. I thought you were the fucking, I thought you were the guy. Hey, man. Don’t do that. He doesn’t understand. It’s not the same card. Oh my god.

Speaker: 2
02:24:47

Jesus Christ. You don’t have any more shekels. You’re out of shekels.

Speaker: 0
02:24:50

Is that what they used, shekels?

Speaker: 2
02:24:51

I guess. What’d they have back then?

Speaker: 0
02:24:53

I don’t know. Denarii?

Speaker: 2
02:24:54

What kind of dollars did they have? What unit of money was around when Jesus

Speaker: 0
02:25:00

Good question.

Speaker: 2
02:25:01

Super good question. I wish I knew it. It’d be a clever thing to say. Shekels sounds good, though. Shekels is a fun name for coins. Shekels. Shekels. Was it shekels?

Speaker: 3
02:25:12

It just comes up with a Phoenician shekel and a half shekel.

Speaker: 2
02:25:16

Let’s fucking go. Fucking shekels. Fucking go. It’s shekels, son.

Speaker: 0
02:25:20

There you go. There’s the oh, yeah.

Speaker: 2
02:25:21

So imagine Jesus blew all his shekels on three-card monte. You’d be like, Jesus

Speaker: 0
02:25:25

Jesus.

Speaker: 2
02:25:26

Would you just stick to being the fucking messiah?

Speaker: 0
02:25:28

What are you doing? Yeah. Why are you here again?

Speaker: 2
02:25:30

He gets hustled in a basketball game. Like, Jesus, you’re gonna question everything if he’s not good at basketball. You can’t run in those fucking sandals.

Speaker: 0
02:25:35

How are you doing this? You can turn water into wine. Let’s sell some fucking wine.

Speaker: 2
02:25:38

He loses money in basketball. He plays horse with people. He keeps missing. Wow.

Speaker: 0
02:25:43

That would really be weird.

Speaker: 2
02:25:44

Jesus would have to be really good at badminton. If he plays badminton, he’s gotta win.

Speaker: 0
02:25:48

He’s gotta win.

Speaker: 2
02:25:49

Well, I’m not gonna believe you’re Jesus if you can’t wrestle.

Speaker: 0
02:25:51

If you’re bad at pinball. If you

Speaker: 2
02:25:52

get pinned really quick in a wrestling match, like, what the fuck, dude?

Speaker: 0
02:25:56

Dude, the yeah. He’s gotta be everything.

Speaker: 2
02:25:58

Somebody rear naked chokes Jesus 30 seconds into a match.

Speaker: 0
02:26:01

He’s doing it, and you’re like, I don’t know why he’s not tapping out.

Speaker: 2
02:26:03

He doesn’t know what the fuck to do. He doesn’t know shit. He’s a white belt. Why is he in this competition? Jesus goes to the U.S. Open golf tournament, and everybody’s like, Jesus fucking sucks at golf. He can’t even fucking hit the ball. Somebody show him how to hit

Speaker: 0
02:26:19

the ball. It’s a whole thing. Why did you let Jesus compete in the UFC?

Speaker: 2
02:26:23

Jesus is playing pickleball. Just falling down. Not good at pickleball.

Speaker: 0
02:26:28

I know. And that That

Speaker: 2
02:26:28

people would be so disappointed in Jesus.

Speaker: 0
02:26:30

Everything, the way he walked.

Speaker: 2
02:26:31

Just bowls a gutter ball every time. You fucking dummy. What are you doing?

Speaker: 0
02:26:36

Farts in the car. Just that. You gotta be like, dude, roll the window down.

Speaker: 2
02:26:41

Just, yeah. Everybody smelled back then. I think farts probably, like, cleared the air a little. Oh, something interesting to smell. Some new thing instead of these shitty asses I smell everywhere.

Speaker: 0
02:26:52

I just read that they used to think smelling farts in a jar would cure diseases. What? It doesn’t? How do you save a fart in a jar?

Speaker: 2
02:27:02

That’s my subscription.

Speaker: 0
02:27:07

How do you mail that? This is

Speaker: 2
02:27:09

some young lady that we featured on the podcast at one point in time who was making a ton of money

Speaker: 0
02:27:13

farting in jars.

Speaker: 2
02:27:14

Yeah. Selling farts.

Speaker: 0
02:27:15

Dude, that’s incredible.

Speaker: 2
02:27:16

I hope

Speaker: 0
02:27:16

she didn’t even fart

Speaker: 2
02:27:17

in those jars. I hope those dummies

Speaker: 3
02:27:19

It wasn’t even that long ago. Since 2014.

Speaker: 0
02:27:21

No. Smelling farts in a jar does not cure disease.

Speaker: 2
02:27:25

It does. Look. You don’t know. You don’t know.

Speaker: 0
02:27:27

I read that. In 2014 news headlines

Speaker: 2
02:27:29

My chiropractor told me that smelling farts was the way to go.

Speaker: 0
02:27:32

These claims are based on a University of Exeter press release that was not about smelling farts.

Speaker: 2
02:27:42

Hey. Have you heard of this new medication they’re giving cows to make them fart less?

Speaker: 0
02:27:48

There you go.

Speaker: 3
02:27:49

During the plague. Oh, yeah. Yeah. That’s almost what you said. That’s not far off from what this says. It’s just something better

Speaker: 2
02:27:55

than... it’s not. Farts. The great plague of London in the 1600s was a scary time. The public was willing to try just about anything to stay healthy, including sniffing a jar of their own farts. Back then, doctors were apparently convinced that the plague was spread via deadly air vapor and that a foul smelling substance could dilute the pollution.

Speaker: 2
02:28:15

As such, some locals apparently took to storing their farts in jars just in case the situation suddenly demanded a quick whiff.

Speaker: 0
02:28:24

Honey, open the fart cabinet.

Speaker: 2
02:28:26

Open the old vintage farts. I’m gonna get them farts from when I was 23 and I had a good gut biome. Dude. That’s so hilarious.

Speaker: 0
02:28:35

That is so fucking crazy.

Speaker: 2
02:28:37

I wanna know how long it lasted because, from what I understand, you can’t really fart in a jar and keep it there. By the time you seal it up, it’s probably sealed up with so much oxygen.

Speaker: 0
02:28:47

Well, there’s only one way to find out. Ari. Get Ari to fart in a jar

Speaker: 4
02:28:51

and smell it.

Speaker: 2
02:28:52

He’d shit in there and lie to you.

Speaker: 0
02:28:54

He’s so gross.

Speaker: 2
02:28:57

Ari just shits in public. He’s out of his mind.

Speaker: 0
02:29:00

How do you get the fart in the jar?

Speaker: 2
02:29:01

I guess you put the jar up to your asshole when you’re about to let out a fart.

Speaker: 0
02:29:04

But then you gotta get the cap on real fast.

Speaker: 2
02:29:06

Real quick. Like, oh, didn’t you? Jamie, can you put a little bit of air in there? I’m looking at something. You know what I mean? It’s like moonshine. It’s not 100% alcohol.

Speaker: 0
02:29:12

You use a tube if you’re a pro.

Speaker: 2
02:29:16

Yeah. You would use a tube.

Speaker: 0
02:29:16

A tube going into a jar.

Speaker: 2
02:29:18

You would have, like, a diaper, a big, like, gas diaper, completely sealed to your ass like a COVID mask, and then you would just fart into that tube. Yes. It would go into that jar, and you’d do it all day long.

Speaker: 0
02:29:32

Right. There’d

Speaker: 2
02:29:32

be a robot there that would, like, seal that jar off.

Speaker: 0
02:29:34

How do you know it’s filled? Pure. How do you know when your jar is full?

Speaker: 2
02:29:37

What if you gave me a half ass fart? I want a real fart.

Speaker: 0
02:29:40

Or the jar only has, like, yeah, a little fart.

Speaker: 2
02:29:43

I want a 3:30 AM Taco Bell fart. Dude. That’s what I want.

Speaker: 0
02:29:48

That’s gonna be a fart.

Speaker: 2
02:29:50

Where you’re in the car after you buy Taco Bell and you immediately hate yourself.

Speaker: 3
02:29:53

There’s a study on it. They, it depends on the... There’s a

Speaker: 2
02:29:57

study on farts? How many of them used glass versus metal containers? Oh my god.

Speaker: 3
02:30:02

It’s over days, obviously.

Speaker: 2
02:30:04

Who did this study? Some awesome guy.

Speaker: 0
02:30:08

Why is it called anal... wait. That can’t be real.

Speaker: 3
02:30:13

Chemistry. No. It’s like

Speaker: 2
02:30:14

a no. It’s my study. My field of study.

Speaker: 0
02:30:16

Anal. Anal

Speaker: 2
02:30:16

chemistry. Just go with that. Oh my god. Jesus Christ. Literally. I mean... Farting in jars.

Speaker: 0
02:30:24

The sad thing about that article, though, is it comes out, and so that means there’s people who think farts in a jar can cure cancer, and that means that somebody was laying in bed dying. And someone who loved them came up and said, I know this is gonna seem weird, dad, but I need you to smell this.

Speaker: 0
02:30:42

And, like, there there was people dying.

Speaker: 2
02:30:44

Again, this brings us back to the placebo effect. Could work. Because I think almost everything works. It just doesn’t work when your streets are filled with sewage. You know, I think that was what everybody was dying of back then. They had horrible, fucking, terribly unsanitary conditions everywhere. Yeah. Everything was covered in shit. Yep.

Speaker: 2
02:31:02

Everything was shit. Shitty water.

Speaker: 0
02:31:04

They’re shitting in your bedroom.

Speaker: 2
02:31:05

No running water, and you have a bunch of people living together. You have horrible diseases.

Speaker: 3
02:31:10

According to their study, one of those jars, if found, could maybe have a fart from the 17th century. Wow.

Speaker: 0
02:31:18

Now that’s a horror movie right there. Like, you find, like, a 17th century fart.

Speaker: 2
02:31:23

You sniff that fart, and then you immediately turn into, like, one of those 28 Days Later zombies, and then it spreads. Alright. Over here. This thing has had a chance to adapt and evolve and plan its strategy while trapped inside this jar and get back at the humans, because it doesn’t have to die.

Speaker: 2
02:31:39

So it lives in this guy’s butt gas, and then it evolves over hundreds of years to figure out through the multiverse how to communicate with other bacteria everywhere and devise a strategy Yep. to morph itself over thousands and thousands of generations of new viruses to become some crazy rage virus just

Speaker: 0
02:31:59

like... By the way, man.

Speaker: 2
02:32:00

28 Days Later, which there’s a new one coming out.

Speaker: 0
02:32:02

Maybe here’s the other thing. Maybe that is the fountain of youth. Maybe the thing they’re trying to hide from us, the most obvious thing, is if you smell an aged fart, you’re gonna reverse age.

Speaker: 2
02:32:17

Yeah. Maybe snake oil works.

Speaker: 3
02:32:19

There’s a science on how you harvest. The best way to harvest

Speaker: 0
02:32:22

is underwater. Like.

Speaker: 2
02:32:23

Use water, like. Oh.

Speaker: 0
02:32:24

Oh, there you go.

Speaker: 3
02:32:26

No, don’t try to catch the fart in the air. You know?

Speaker: 0
02:32:28

Yeah. We gotta be accurate.

Speaker: 2
02:32:30

Can you can you be accurate with your farts?

Speaker: 0
02:32:31

Jamie, can you YouTube smelling that?

Speaker: 2
02:32:33

How do you close the gap? You gotta slide a lid in there. You might get a little water in your farts.

Speaker: 3
02:32:40

Yeah. That’s fine.

Speaker: 2
02:32:40

That’s not good. You know, you can get farts and the water together. They’re separate

Speaker: 3
02:32:44

they’re separate things.

Speaker: 0
02:32:45

Oh, so

Speaker: 2
02:32:45

when you open it, it’ll be pure farts, but a little bit of water at the bottom. How do I know that the water isn’t diluting and slowly washing the farts over 200 years?

Speaker: 3
02:32:52

That’s not how old

Speaker: 0
02:32:53

Jamie, can you scroll up a little bit? Yeah.

Speaker: 2
02:32:55

Fart guy. I got my degree in farts. What

Speaker: 3
02:32:58

was it? Sorry.

Speaker: 0
02:32:59

Can you pull that up again? I just find it interesting that someone wrote an entire essay on how to do this. Will you go back to the beginning? I just wanna read the... how do you introduce the story here?

Speaker: 2
02:33:09

How do you bring this up to a fucking person when you want a jar? Yeah.

Speaker: 0
02:33:13

I recently caught my 4 year old nephew attempting to fart into a jar in the hopes of saving it for later to surprise grandma.

Speaker: 2
02:33:23

This is not the first time I’ve encountered a little boy with a dream of bottling his own farts. Years ago, my younger cousin, let’s call him Jay, had a whole shelf of dated farts in jars in his barn. He was very proud of his collection. That kid is killing cats. That’s a fucking serial killer. He’s got a fucking shelf of dated jars of his farts. What a fucking psychopath. Jeffrey Dahmer.

Speaker: 2
02:33:48

He has nothing better to do than just fart in jars.

Speaker: 0
02:33:51

Now, I wanna show you my jar collection. Those are my farts. Now that’s New Year’s Eve.

Speaker: 2
02:33:55

And he’s torturing animals.

Speaker: 0
02:33:56

That’s a fart when September 11th happened. That’s a fear fart.

Speaker: 2
02:34:04

Oh my god. Oh my god. Wow. Duncan, we gotta wrap this up, unfortunately.

Speaker: 0
02:34:10

What a joy.

Speaker: 2
02:34:11

What a joy, always. Thanks for having me on, Joe. Merry Christmas. Just do another 8 hours in a row. Easily.

Speaker: 0
02:34:16

Easily. Easily.

Speaker: 2
02:34:18

I didn’t even have to pee once.

Speaker: 0
02:34:20

I know. Me neither. I don’t know what happened. Usually, you have to piss, like, 4 times during this.

Speaker: 2
02:34:24

I know. We were locked in. So I appreciate you very much, brother. Likewise. I love you to death. You’re one of my favorite people. You really are.

Speaker: 0
02:34:29

You are too, man.

Speaker: 2
02:34:30

You’re a real treasure.

Speaker: 0
02:34:32

Thank you, Joe.

Speaker: 2
02:34:32

These are some of my favorite podcasts of all time. Thank you, man. This is a weird combination of the 2 of us. I love it. We sync up in the weirdest way, man.

Speaker: 0
02:34:40

It’s the best, man.

Speaker: 2
02:34:41

It is. That’s it, man. I love you. Merry Christmas.

Speaker: 0
02:34:44

You too. Merry Christmas. Merry Christmas. Bye, buddy. Bye, everybody.
