#2344 – Amjad Masad

Amjad Masad is the founder and CEO of Replit, a cloud-based coding platform. He is also an outspoken voice on cultural and educational shifts in technology.



You can listen to the #2344 – Amjad Masad episode using Speak’s shareable media player:

#2344 – Amjad Masad Podcast Episode Description

Amjad Masad is the founder and CEO of Replit, a cloud-based coding platform. He is also an outspoken voice on cultural and educational shifts in technology.

www.replit.com

The ultimate wireless hack. Make the switch at Visible dot com.

Learn more about your ad choices. Visit podcastchoices.com/adchoices
This interactive media player was created automatically by Speak.

#2344 – Amjad Masad Podcast Episode Top Keywords

#2344 - Amjad Masad Word Cloud

#2344 – Amjad Masad Podcast Episode Summary


Key Points & Major Topics:
– The episode features Joe Rogan and Amjad Masad, founder of Replit, discussing technology, gaming, AI, entrepreneurship, health, and current social issues.
– They begin by exploring the positive and negative impacts of video games, noting studies that show gamers (including surgeons) can have improved dexterity and lower error rates, but also discussing the downsides of passive consumption (e.g., streaming, TikTok).
– The conversation shifts to the addictive nature of technology and social media, the rise of voyeuristic behaviors, and the need for more active creation rather than passive consumption.
– Amjad shares his personal journey from a Palestinian refugee family, his early interest in computers and programming, and how gaming led him to software development.
– They discuss the mission of Replit: democratizing programming and enabling anyone to build software, with AI as a tool to lower barriers to entry.
– The episode covers the impact of AI on jobs, the future of work, and the potential for AI to empower entrepreneurship and creativity, rather than just replace human labor.
– They debate the risks of AI, the limitations of current models, and the importance of human creativity and consciousness, referencing philosophical and scientific perspectives.
– The conversation touches on the Israel-Palestine conflict, censorship, free speech, and the role of social media in shaping public discourse.
– They discuss health trends, sobriety, physical fitness, and the importance of discipline, with practical tips on cold plunges, diet, and exercise.
– The episode ends with Amjad sharing stories about hacking his university, the value of resilience, and the importance of authenticity and direct communication in business.

Important Guests/Speakers:
– Amjad Masad: Founder of Replit, shares insights on programming, AI, entrepreneurship, and his personal background.
– Joe Rogan: Host, provides commentary, personal experiences, and steers the conversation across diverse topics.

Actionable Insights & Tips:
– Encourage active creation (coding, building apps) over passive consumption.
– Use AI tools to automate tedious tasks and unlock creativity.
– Embrace discipline in health (cold plunges, diet, exercise) for mental and physical benefits.
– Value authenticity and direct communication in business and media.
– Stay critical of mainstream narratives and seek diverse sources of information.

Recurring Themes/Overall Messages:
– Technology and AI are double-edged swords: they can empower or distract, depending on use.
– Human creativity, consciousness, and discipline remain irreplaceable.
– Free speech and open debate are essential for societal progress.
– The future of work will require adaptability, lifelong learning, and entrepreneurial thinking.
– Authenticity and resilience are key to personal and professional success.

This summary was created automatically by Speak.

#2344 – Amjad Masad Podcast Episode Transcript (Unedited)

Speaker: 0
00:01

Joe Rogan podcast. Check it out.

Speaker: 1
00:03

The Joe Rogan experience.

Speaker: 0
00:06

Train by day. Joe Rogan podcast by night, all day. What’s up? What’s

Speaker: 2
00:13

up, man? Good. So, having this, big Counter Strike tournament in town, does that give you the Joneses?

Speaker: 1
00:19

Totally. Totally. You know, it’s like your so you you you your guy, Jason, was telling me about it, because, you know, in addition to driving, he also, you know, flies the, helicopter. And he told me, like, the Red Bull guys were, like, flying off, and there’s, like, this big tournament. I looked it up. It was like, oh, Counter Strike.

Speaker: 1
00:37

So I used to be a bit of a pro player myself.

Speaker: 2
00:40

So, how do you get out of pro playing? Because the problem with, like, playing games is that it’s essentially, like, an eight hour a day thing. Like, it becomes a giant chunk of your life. Right? And I would imagine if you’re playing pro, it’s even more of a commitment.

Speaker: 1
00:54

You know, I I take a different view on on on games. You know, a lot of people kind of view it as a as a sort of somehow, like, a negative thing, especially for kids. Actually, my, I got my kid my four year old, like, a Nintendo, Switch early on. We’re playing together because I feel like, for me, it helped me a lot with, like, strategy thinking, with reaction time. Mhmm.

Speaker: 1
01:17

I think, like, gamers tend to be can think think really fast sana Have you seen

Speaker: 2
01:21

the the studies that they’ve done about surgeons?

Speaker: 1
01:23

No. Tell

Speaker: 2
01:23

me. Surgeons that play video games regularly are much less likely to make mistakes.

Speaker: 1
01:29

I totally believe it. Yeah.

Speaker: 2
01:30

Something in the neighborhood of twenty five percent. Is that what it is, Jamie? Something like that? But so much so that I would say you should teach video games to surgeons. Like, it it should actually be, like, a required thing, like, cross training.

Speaker: 1
01:43

Right. Isn’t the army also recruiting from gamers today as well? That’s what I heard.

Speaker: 2
01:47

Imagine, like, drone pilots. Right. Right? I mean, that would make a big difference. Yeah. Especially if you can get them used to, like, the same controllers.

Speaker: 1
01:53

Totally.

Speaker: 2
01:54

You know, because, you know, those controllers kinda become a part of your hand. Like, you know exactly where all the buttons are.

Speaker: 1
01:59

Right.

Speaker: 2
02:00

If you’re a kid that’s playing fucking Counter Strike or whatever it is Yeah. Call of Duty every day.

Speaker: 1
02:05

Totally. I

Speaker: 2
02:06

would imagine that that just becomes

Speaker: 1
02:08

Dexterity. Yeah.

Speaker: 2
02:09

Yeah. What is the thing with surgeons? It’s nuts. Right?

Speaker: 0
02:12

I’ll look it up.

Speaker: 2
02:13

It might be higher than 25%.

Speaker: 0
02:15

It was a very particular kind of surgery though too, but it was, like, I mean, they’re almost using a controller or something.

Speaker: 2
02:20

Yeah. But this says that they were making less mistakes. Like, I don’t think it’s entirely negative.

Speaker: 0
02:26

Mhmm. Ai

Speaker: 2
02:26

because I love games. I love playing them. But I love them so much that I don’t play them because I know I don’t have any time.

Speaker: 1
02:32

It’s Quake is your your

Speaker: 0
02:34

favorite game.

Speaker: 1
02:34

Right? Yeah.

Speaker: 2
02:35

Yeah. Says here 27, oh, 37% decrease in errors.

Speaker: 1
02:39

That’s wild.

Speaker: 2
02:40

27% faster task completion time. That’s nuts.

Speaker: 1
02:43

And so so those guys grew up playing video games, or did they get them on video games?

Speaker: 2
02:49

Says more than three hours per week.

Speaker: 0
02:50

I think they were still playing when they were in the study.

Speaker: 2
02:53

Yeah. So, like, that I mean, imagine something that you, like, a pill you could take that would give you a 37% decrease in errors and a 27% faster task completion.

Speaker: 1
03:04

Right. That would

Speaker: 2
03:05

be an incredible pill.

Speaker: 0
03:06

Yeah.

Speaker: 2
03:06

Like, you would you would make every surgeon take it. Did you take your video game pill before you do surgery? Hey, man. Don’t operate on my fucking brain unless you take your video game pill.

Speaker: 1
03:15

You know, that’s that’s, you know, next time I’m I need to have a surgery or whatever, I’m just gonna ask the doctor.

Speaker: 0
03:19

Is that a game? How much should it be, bro?

Speaker: 1
03:23

But you and Jamie and I were talking about the one thing, and maybe that’s showing our age a little bit, but the one thing that’s kind of, like, a little weird, I don’t know, somehow, like, a little dystopian, is the whole streaming situation where, like, kids are not, like, playing the game. They’re, like, watching someone play the game.

Speaker: 2
03:39

Yeah. It’s not good.

Speaker: 1
03:40

And it’s, like, this zombifying thing where, like, they’ll

Speaker: 0
03:43

Mhmm.

Speaker: 1
03:43

They’ll spend hours just watching people.

Speaker: 2
03:46

Yeah. Just TikTok-ing, it’s essentially, like, TikTok, but video games. Right? Because TikTok is kind of this mindless thing. You’re just scrolling through mindless things, and now you’re mindlessly watching someone else play a game.

Speaker: 1
03:57

Yeah. Yeah. It’s almost like someone is like, there’s this strange thing with technology where, like, someone is living life and doing things, and you’re, like, sort of it’s almost voyeurism or something like that about it. You know, David Foster Wallace, you know, the guy from Infinite Jest

Speaker: 0
04:15

Mhmm.

Speaker: 1
04:16

Wrote a, wrote a essay on on TVs. And, you know, he he he committed suicide before, before, like, you know, the emergence of of mobile phones and things like that. But he was very prescient on the impact of technology on on on society and especially on on America. And he was also, like, addicted to TV, and he he talked about how, you know, it activates some kind of something in us that is, you know, something in human nature about voyeurism.

Speaker: 0
04:45

Mhmm.

Speaker: 1
04:46

And that’s the thing that that television and TikTok and things like that activate. And it’s just, like, this negative, addictive kind of behavior that’s, like, really bad for society.

Speaker: 2
04:56

I definitely think there’s an aspect of voyeurism, but there’s just a a dull drone of attention draw. There’s a dullness to it that just, like, sucks you in, like, slack jawed.

Speaker: 0
05:08

Yeah. You’re

Speaker: 2
05:09

just watching nonsense over and over and over again that does just enough to captivate your attention. Yeah. But doesn’t excite you, doesn’t stimulate you. Yeah. Doesn’t necessarily inspire you to do anything. That is the first fly we’ve ever had in this room. Boom. Oh, I was gonna kill it.

Speaker: 2
05:25

That was the you’re a nice person. Yeah. That’s evil people

Speaker: 0
05:29

would have

Speaker: 2
05:29

killed that fly right away. But it’s, it’s just this thing where it doesn’t do a lot. It’s not ai, you know, like have you ever done Disney World?

Speaker: 1
05:41

Yeah. Do you did you

Speaker: 2
05:42

ever do Disney World in Florida where you do that throughout the there’s the, Avatar ride?

Speaker: 1
05:46

No. I I just went to a California one.

Speaker: 2
05:48

Okay. Yeah. The Avatar ride is, Flights of Freedom?

Speaker: 0
05:52

Flights of Passage.

Speaker: 2
05:53

Flights of Passage. Mhmm. You’re it’s a VR game.

Speaker: 0
05:57

Mhmm.

Speaker: 2
05:57

Well, a a, you know, a ride rather. And you put on a VR helmet and you get on this like, motorcycle looking thing, you’re essentially riding a dragon. It’s unbelievably engaging.

Speaker: 0
06:07

Wow.

Speaker: 2
06:07

It’s incredible. It’s the best ride I’ve

Speaker: 1
06:09

ever been on in my life.

Speaker: 2
06:10

Yeah. That’s cool. Like, you’re flying around, you feel the breeze, you’re on this thing and the sounds are incredible. That’s like engrossing. Right?

Speaker: 1
06:17

Yes.

Speaker: 0
06:18

Like it

Speaker: 2
06:18

takes over you.

Speaker: 1
06:19

Simulating. Right. Yeah.

Speaker: 2
06:20

But that’s not what you’re getting from, like, TikTok or, like, streaming. You’re getting this

Speaker: 1
06:25

Right.

Speaker: 2
06:26

This dull so it’s sustainable.

Speaker: 1
06:29

Yeah. I wonder which is worse, this sort of, like, opium habit or something.

Speaker: 2
06:34

I know people that have done opium that are, like, functional.

Speaker: 1
06:37

Yeah.

Speaker: 2
06:38

You know, they can they can take pills and ai kinda I mean, I’m sure eventually their life falls off the rails, but it’s, like, sort of semi they’re semi functional when they’re on these things. They can hold down a job and show up every day, and they’re just, like, semi functional opioid.

Speaker: 1
06:56

There’s a dude that I watched, like, a YouTube video, but, like, he’s known for having this contrarian opinion on on drugs that, you can, like, control it. Like, you can you can do these drugs.

Speaker: 2
07:06

What does he look like?

Speaker: 1
07:07

I don’t know. I I think he’s a black dude.

Speaker: 2
07:10

Oh. Carl Hart. Carl Hart. Doctor Carl Hart. He was here? Yeah. Yeah. He’s been here a couple times. He’s great.

Speaker: 1
07:15

What do you think of his ideas?

Speaker: 2
07:17

I think it’s entirely biologically variable. I know people that cannot drink. They they drink and then they’re gone. They meh, like, hamsterized, like like they’ll, like, get these black eyes where they’re, like, their soul goes away and then they’re just off to the races and picking up hookers and doing cocaine and they find themselves in Guatemala.

Speaker: 2
07:37

Oh, shit. They’re just nuts. They can’t drink.

Speaker: 1
07:40

Yeah.

Speaker: 2
07:40

I can drink. I I I don’t I don’t pretend that the way my body handles alcohol is the way everybody’s body handles alcohol. I think that’s the the same with everything. I think that’s the same most certainly, with marijuana. I know some people that just cannot smoke weed, and other people, it’s fine.

Speaker: 0
07:58

Yeah.

Speaker: 2
07:59

I think it’s very we’re all very different

Speaker: 1
08:02

physically. It’s interesting, alcohol, is is sort of on the downtrend all over America, but but, especially with young people, especially in Silicon Valley. Mhmm. Everyone there listens to Huberman. I call him the grand mufti of Silicon Valley because he’ll say, no no alcohol, no drinking. Everyone’s like, don’t drink, man.

Speaker: 1
08:25

Like, all the parties are now, like, mocktails and and and things like that. And There

Speaker: 2
08:28

are probably a lot of boring conversations, unfortunately.

Speaker: 1
08:31

Yes. It’s a little boring. I mean, it’s very repetitive. It’s all kind of like will AI kill us and

Speaker: 2
08:36

kind of

Speaker: 1
08:37

discuss stuff. Yeah.

Speaker: 2
08:38

You guys would know better than anybody. Yeah. You guys are at the forefront of it, unfortunately. Yeah. I quit drinking. I drink quit drinking over three months ago.

Speaker: 1
08:48

Oh, wow. I know you guys did the used to do the Sober October.

Speaker: 2
08:51

Yeah. Yeah. And that wasn’t that hard. And, you know, I was like, it’s gonna be one whole month. And then I did, I was like, that’s pretty easy. But I just had some revelations, I guess. And this I think the big one is just physical fitness. I work out so much, and I would drink and go go to a club and have a cup not a lot either.

Speaker: 0
09:12

Mhmm.

Speaker: 2
09:12

Just have a few drinks, and the next day just feel like total shit. Mhmm.

Speaker: 1
09:16

I think with with age, especially, it starts affecting your heart.

Speaker: 2
09:19

It’s always been like that. It always been like that. I’ve always been hungover after a night of drinking. But it’s you don’t feel it normally. Like in normal life, if I just did normal stuff, it’d be fine. It’s when you’re in the gym that you notice. Right. When you’re doing like second and third set of squats or something like that, you’re like, oh, God.

Speaker: 1
09:38

Yeah. ai. Yeah.

Speaker: 2
09:40

And I haven’t had any bad days since I quit drinking.

Speaker: 1
09:43

Oh, cool.

Speaker: 2
09:43

It I’ve eliminated all of that. And I’m like, just that alone is worth it.

Speaker: 0
09:49

Yeah.

Speaker: 2
09:49

That for just that alone, it’s worth quitting.

Speaker: 1
09:51

So, you why do you think there’s a there’s this trend? Is it is it mostly for health?

Speaker: 2
09:56

I mean, the Yeah. Yeah. Well, I think there’s a big health trend with a lot of young people. I think a lot of young people are recognizing the value of supplements. Mhmm. And there’s that fly. There’s the difference between you and me. I’m gonna kill this motherfucker.

Speaker: 2
10:07

First fly I’ve ever had in here, Jamie. That’s kinda crazy. Been here five years, one fly.

Speaker: 1
10:12

With me from California.

Speaker: 2
10:13

Ai he snuck in because there’s a lot of steps that motherfucker has to go through to get into this room. I think a lot of people are very health conscious. It’s the the rise of cold plunging and sauna use and, you know, all these different things ai intermittent fasting where people are really paying attention to their body and really pay attention.

Speaker: 2
10:32

And and noticing that if you do follow these steps, it really does make a significant difference in the way you feel. And maybe more importantly, the way everything operates, not just your body, but your brain. It’s like your your function, your cognitive function improves with physical fitness.

Speaker: 1
10:49

Mhmm.

Speaker: 2
10:49

And, you know, if you’re an ambitious person and you sana do well in life, you want your body to work well, it’s your you know, alcohol is not your friend.

Speaker: 1
10:58

Yeah. And I wonder how much of it is is your impact because, those things, you got me into all these things, you know, through your podcast. My my wife and I, just built, like, this a small kind of spa on our on our home with, like, a cold plunge, and a and a sauna and a hot tub, and I’ll try to do it every day.

Speaker: 1
11:19

You know, something you said, I keep saying to myself as, like, conquer your inner bitch. Yeah. And it’s just such a good and I feel like, cold plunge, especially, kind of, it’s just something regardless of health benefits or not, something about it, like, just mental toughness. Mhmm. Like, trying to do it every day. Yeah.

Speaker: 1
11:40

And every day, I chicken out

Speaker: 2
11:41

every day. I wanna go. I don’t

Speaker: 1
11:43

wanna go in and write, but

Speaker: 2
11:45

I do too. Yeah. Like, my inner bitch speaks the loudest when I’m lifting the lid off the cold plunge. My inner bitch is, like, don’t do this. You don’t have to

Speaker: 0
11:53

do this. You don’t have to.

Speaker: 2
11:54

You could do whatever you want. You’re a free man. You can go have a sandwich, you know?

Speaker: 1
11:58

Right. Right.

Speaker: 2
11:59

But you just gotta decide that you’re the boss.

Speaker: 1
12:02

Yeah. And and I think, a lot of what discipline is for me is is that, again, even even keto, and I did carnivore and and these, like, I’m not sure how much health benefit there is. I feel like keto is is really good on your, like, blood sugar and keeps you kind of on a, you know, even keel Mhmm.

Speaker: 0
12:21

Kind of

Speaker: 1
12:21

through throughout the, the day. But for me, whenever there’s, like, a lot of chaos in my life, I look at what can I control? Right. And, typically, diet is the first thing. Whatever it is, I’m like, oh, sure. I’m gonna go carnivore. I’m gonna go keto. And the the fact that I can control that and and and enforce discipline on myself kind of puts me puts me at ease, and I feel like I can control the other thing in my business, family, or

Speaker: 2
12:48

Yeah. Right. But that mindset is probably how you stop playing video games every day. Yeah. Because I would imagine, like, we were talking about earlier, like, that addiction is one of the strongest addictions I’ve ever faced in my life. Right. Like, when I was taught if I would be talking to people and the conversation was boring, I’d be like, I could be playing Quake right now. Right.

Speaker: 2
13:06

Why am I here having this boring ass conversation where I could be launching rockets at people? Right. And having a good time.

Speaker: 1
13:13

But the other thing, for me is programming. So I got into programming, early in my life. I was six years old when my father bought a computer. I was born and raised in Amman, Jordan, and, you know, we’re the, like, first people I know ever, yeah, at the time that had a that had a computer.

Speaker: 1
13:34

And I remember What

Speaker: 2
13:35

year was this?

Speaker: 1
13:35

1993. I was six years old.

Speaker: 2
13:37

Okay. So ’93. So what kind of computer was out? Those that are,

Speaker: 1
13:41

like, an old school IBM? IBM PC, MS DOS, OS and DOS.

Speaker: 2
13:46

Wow. So you did the real deal.

Speaker: 1
13:48

Yeah. I know a lot of, Americans, would would, like, get a Mac as

Speaker: 2
13:53

their first computer. Yeah. Yeah.

Speaker: 1
13:54

Yeah. We, no. We didn’t have Macs. I actually wasn’t introduced to Apple until until kind of recently in my in my life. Really?

Speaker: 2
14:01

Yeah. Like, recently, recently?

Speaker: 1
14:02

Ai, no. Like, you know, twelve years ago, thirteen years ago when I moved to The US.

Speaker: 2
14:06

Ai, Apple has such a stranglehold in America. Mhmm. It’s really incredible.

Speaker: 1
14:10

Yeah. It’s it’s amazing. But, you know, we we didn’t know much about it, so I I got into into DOS. I remember one of my earliest memories is, you know, standing behind my father as he was kind of pulling up this, like, huge manual and, like, learning how to, like, type commands.

Speaker: 1
14:26

And he was, like, you know, finger typing those commands. And and then I would, like, watch him. And then and then after he leaves, I’ll go sana, like, try those things. And one day, he called me. I was like, what are you doing?

Speaker: 1
14:36

Like, I know how to do this. I’ll I’ll show you. And so I knew how to, like, start games, like, do a little programming, do a little bit of scripting, and and, you know, that’s how I got into into computers. And I was obsessed. And initially, it sort of got me into gaming. But then, you want to mod the games. Have you ever done any modding?

Speaker: 2
14:57

I’ve done a few things, like turn textures off and stuff like that.

Speaker: 1
15:02

And yeah. Yeah. And that that’s another thing that I think is healthy about gaming is, like, a gateway to programming.

Speaker: 2
15:07

Sure.

Speaker: 1
15:08

Gateway drug to programming. Yeah. And so I got I got into the into, like, modding, like Counter Strike and things

Speaker: 0
15:13

like that.

Speaker: 1
15:14

Those were fun. And then just, like, the feeling that that you can make something is just, like, such a profound Yeah. Such a profound feeling. And that’s really kind of what I carried through my whole life and became sort of my life mission now with my company, Replit. What we do is, like, we make it so that anyone can become a programmer.

Speaker: 1
15:34

You just talk to your phone and your app, sort of like ChatGPT, and it starts coding for you. It’s like a programmer, a software engineering agent.

Speaker: 2
15:43

Right. So it’s like the AI guides you through it.

Speaker: 1
15:47

Yeah. Not only guides you through it, it codes for you. So you’re you’re you’re sort of you know, you know, programmers typically, you know you know, think about the idea a little bit, about the logic. But most of the time, they’re sort of wrangling the syntax and the IT of it all.

Speaker: 1
16:03

And I thought that was always, you know, additional complexity that doesn’t necessarily have to be there. And so when when I saw, you know, GPT for the first time, I thought this, you know, this could potentially, like, transform programming and make it accessible to more and more people, because it it really transformed my life.

Speaker: 1
16:24

You know, the reason I’m in America is because I invented a a piece of software. And I thought, you know, if you make it available to more people, they can they can transform their lives.

Speaker: 2
16:34

Why was your dad messing around with computers? Was he doing it for fun? Was it This episode is brought to you by Visible. I wanna let you in on something. Your current wireless carrier does not want you to know about Visible because Visible is the ultimate wireless hack. No confusing plans with surprise fees. No nonsense. Just fast speeds, great coverage without the premium cost.

Speaker: 2
16:58

With Visible, you get one line wireless with unlimited data powered by Verizon’s network for $25 a month, taxes and fees included. Seriously, $25 a month flat. What you see is what you pay. No hidden fees on top of that. Ready to see?

Speaker: 2
17:16

Join now and unlock unlimited data for just $25 a month on the Visible plan. Don’t think wireless can be so transparent? So Visible? Well, now you know. Switch today at visible.com/rogan. Terms apply. See visible.com for plan features and network management details.

Speaker: 1
17:35

Yeah. So my dad, my dad is a Palestinian refugee.

Speaker: 0
17:39

Yeah.

Speaker: 2
17:39

You were telling me the story, and I wanna get into that because it’s kind of crazy. Like, put tell tell the whole story of how this wound up happening. Yeah.

Speaker: 1
17:45

Yeah. Yeah. So, my family is originally from from Haifa, which is now in Israel. And they were expelled as part of the 1948, Nakba, where where Palestinians were were sort of kicked out. And they went to, like, four

Speaker: 2
18:02

your dad describe that? How old was he when that was going on?

Speaker: 1
18:05

My father was born in Syria. So my, like, grandma and my grandpa and my uncles, were were were kind of kicked out. And and the way they would describe that is, they they tried to fight. They tried to, like, keep their home, but, it was like this overwhelming force. They they weren’t organized. They wasn’t they were just, like, people. It wasn’t they didn’t really have an army at least in that in that place.

Speaker: 1
18:30

And, and, eventually, at gunpoint, they they took their homes and and told them to go. If if you’re down south, you went to Gaza, and that’s why, like, 70% of Gazans are refugees from Israel. Like, the the people that are, you know, getting massacred right now are originally from Israel, from the land that people call Israel today.

Speaker: 1
18:50

And then if you if you’re on the North, like Haifa or Yafa, whatever, you went you went, like, to Lebanon, or or to the West Bank or to or to to, or to Jordan or Syria. So my family went to Syria. My father was was born in Syria, but my grandfather was, like, a a railroad engineer.

Speaker: 1
19:13

So sai they were, like, you know, they were, like, city people. They were urban. They sai they couldn’t, like, you know, they wanted to to, you know, have a place where where they can, you know, those those, they they wanna live in a city. And so, originally, the West Bank didn’t work for them, and they ended up in Syria.

Speaker: 1
19:33

But then Amman, Jordan was kind of coming up, and there was a lot of opportunities there. So my father was born in Syria and then moved to Amman when they were six years old and built a life there. And they really kind of focus on education and trying to kind of rebuild their life from scratch.

Speaker: 1
19:48

So my father, and all my uncles kind of went and got educated in Egypt, Turkey, places like that. And so my father, got an engineering degree, civil engineering degree, from from Turkey. And he was always interested in in technology. And

Speaker: 2
20:06

That whole thing of kicking people out of Palestine is is such an inconvenient story Yeah. Today. When when people are talking about Israel and Palestine and the conflict, they they do not like talking about what happened in 1948.

Speaker: 1
20:22

Yeah. And and I think it’s important. Like, I think Yeah. I think for us to to reach some kind of peace, which is really hard to talk talk about when when you really see what’s happened in Gaza even even yesterday, you know.

Speaker: 2
20:34

Yeah. Yeah. The the people that were waiting for food that bombed. Yeah. It’s insane. And then no one wants to talk about it.

Speaker: 1
20:41

Right. And and but but if you

Speaker: 2
20:43

And if you do talk about it, you’re anti-Semitic. It’s which is so strange. Like, I do I don’t know how they wrangled that.

Speaker: 1
20:49

It’s been it’s been hard for me in in tech because, you know, like, probably the only, you know, prominent Palestinian in tech that that that is talking about it. And so that’s,

Speaker: 2
20:58

Do you get pushback?

Speaker: 1
20:59

Oh, of course.

Speaker: 2
21:00

Lots of people. People say to you?

Speaker: 1
21:03

Anti Semitic.

Speaker: 2
21:04

How is it anti Semitic?

Speaker: 1
21:05

Ai ai the state of Israel. Our our position every moderate Palestinian that I know, their position is, like, two state, solution. We need the emergence of, of the state of Palestine, you know, and and that’s the best way to ending the occupation is the best way to guarantee peace and, and security even for Israelis.

Speaker: 1
21:27

And, but but but, yeah, it’s it’s just, like, it’s used it sort of reminds me, you know, in tech, we went through this, like, quote, unquote woke period where you couldn’t talk about certain things as well. Mhmm. And, Has

Speaker: 2
21:43

that gone away? Yeah. Yeah. Yeah. Totally gone away. Yeah. What do you what do you think caused it to go away?

Speaker: 1
21:50

Elon. Really? Yeah. Like, Twitter. Buying Twitter. Wow. Buying Twitter is the single most impactful thing for free speech, especially on on these issues, of of being able to, you know, talk talk freely about a lot of subjects that are more sensitive.

Speaker: 2
22:09

Imagine if he didn’t buy it.

Speaker: 1
22:11

Yeah. I mean, that would have

Speaker: 2
22:12

Imagine if the same ownership was in place, and they’d and then Harris wins, and they continue to ramp things up.

Speaker: 1
22:21

Yeah. I don’t know what what you think of, of of the new new administration. I certainly, there are things that I like about and some of their pro tech, you know, posture and and things like that. But, you know, what’s happening now is, you know, it’s kind of disappointing.

Speaker: 2
22:35

It’s insane. Yeah. It it we were told there would be no well, there’s two things that are insane. One is the targeting of migrant workers, not cartel members, not gang members, not drug dealers.

Speaker: 0
22:51

Mhmm.

Speaker: 2
22:51

Just construction workers.

Speaker: 0
22:53

Mhmm.

Speaker: 2
22:53

Showing up in construction sites, raiding them. Gardeners. Yeah. Like, really?

Speaker: 1
22:59

Or Palestinian students on college campuses, or or not, like, there’s a did you see this video of this, Turkish student at Tufts University that wrote an essay, and then, there’s video of, like, ICE agents, like,

Speaker: 0
23:14

don’t know.

Speaker: 2
23:14

Is that the woman?

Speaker: 1
23:15

Yeah. Yeah.

Speaker: 2
23:16

Yeah. What was her essay about? It was just critical of Israel. Right? Just critical of Israel. Yeah. I mean And that’s enough to get you kicked out of the country.

Speaker: 1
23:24

I mean, there’s a there’s a long history of of of anti colonial activism in US colleges, you know, that led to to, you know, South Africa changing and all of that. And I I think this is a continuation of that. I I mean, I don’t agree with all their like, you know, a lot of there’s a lot of radicalism.

Speaker: 1
23:40

A lot of young people are attracted to, like, more radical positions on on Israel Palestine. So which I

Speaker: 2
23:48

don’t mind those positions as long as someone’s able to counter those positions. Right. The problem is these supposed free speech warriors wants wanna silence anybody who has a more conservative opinion.

Speaker: 0
24:01

Yes.

Speaker: 2
24:01

That’s not the way to handle it. The way to handle it is have a better argument.

Speaker: 1
24:04

That’s not American.

Speaker: 2
24:05

It’s not American.

Speaker: 1
24:06

Maybe what attracted you in this country from, you know, from the moment, that was where and we started, like, consuming American media and American culture is is freedom, is the concept of freedom, which I think is real. I think is real.

Speaker: 2
24:19

It is. I was watching this, psychology student from I think it was I think he’s from Columbia, but he has a a page on Instagram. I wish I could remember his name because he’s very good. He’s a young guy.

Speaker: 0
24:30

Mhmm.

Speaker: 1
24:30

But he

Speaker: 2
24:31

had a very important point, and it was essentially that fascism rises as the overcorrection response to communism.

Speaker: 0
24:39

Mhmm. And

Speaker: 2
24:39

that you we essentially had this Marxist communism rise in first universities and then it made its way into into business because these people left the university and then found their way into corporate America, and then they were essentially instituting those. And then the blowback Mhmm. To that, the pushback is this fascism.

Speaker: 1
25:04

And that was that that happened, like, like, last century?

Speaker: 2
25:06

No. Well, they’re talking about Now? Forever, historically. He’s talking about, like, over time, whether it’s Mao, whether it’s Stalin, like Mhmm. Fascism is the response almost always to communism.

Speaker: 1
25:19

Interesting.

Speaker: 2
25:20

And that, you know, what we’re what we we experience with this country is this continual over correction. Over correction to the left, then over correction to the right to counter that.

Speaker: 0
25:35

Mhmm.

Speaker: 2
25:35

And the people that are the ram, that’s the guy. Anthony Rispow. That’s it. Really, really smart guy and very interesting thing. Jamie, how’d you nail that that quick? Good job, buddy.

Speaker: 0
25:48

You said those words right as I saw

Speaker: 1
25:50

Decades of training.

Speaker: 0
25:51

Yeah. Communism, fascism.

Speaker: 2
25:53

Yeah. Communism came first. Fascism came in response. Now, today’s left tears down norms and destabilizes the country under the guise of progress. We’re watching the conditions for another reaction build. History doesn’t repeat, but it echoes.

Speaker: 1
26:04

Yeah. Do do you know this, this, like, theory? I I know you’ve you’ve had Marc Andreessen, on the show, this, James Burnham managerial revolution theory?

Speaker: 2
26:14

No. Not not offhand.

Speaker: 1
26:16

I’m I’m not an expert in it, but, like, the the idea is that, like, communism, fascism, and and even some form of capitalism that sort of we’re living under right now is, like, managerialism is the idea that, you know, capitalism used to be this idea that the, the owner founders of those companies of, you know, capitalist companies were running them, and, and it was, like, it was like true capitalism of sorts. But, both, you know, communism and fascism, share this property of, centralized control and, like, a class of people that are sort of managerial.

Speaker: 1
27:02

And maybe those are the elite sort of Ivy Ivy League students that are trained to be managers, and and and they they grow up in the system, kind of bred to become, like, managers of these companies. And today’s America is, like, trending that way where it is like a managerial society. In Silicon Valley, there’s, like, a reaction to that right now.

Speaker: 1
27:25

People call it founder mode, where a lot of founders felt like they were losing control of their companies because they’re hiring all these managers, and these managers are running the companies like you would run, Citibank. And and then, you know, a lot of founders were like, no. We need to we need to run those companies like we built them.

Speaker: 0
27:46

Mhmm.

Speaker: 1
27:46

And Elon is, like, obviously at the forefront of that.

Speaker: 2
27:49

Right.

Speaker: 1
27:50

I once visited, XAI when they were just starting out, Elon’s AI company, and there were, like, 70 people. All of them reported to Elon. They didn’t have a single manager on staff. Wow. And they would send him an email every week. He was like, what did what did you get done this week?

Speaker: 2
28:08

Right. Well, that was the outrageous thing that they asked people to do at Doge.

Speaker: 1
28:12

Yeah. Yeah. That’s the problem.

Speaker: 2
28:14

Freaking out. Five minutes a week. What what are the things you accomplished this week? How you know, he said, all you have to do is respond.

Speaker: 1
28:20

Right.

Speaker: 2
28:21

And they didn’t want they pushed back so hard Yeah. On being accountable for their work. Yeah. But that’s government for you.

Speaker: 1
28:28

Yeah. You know

Speaker: 2
28:28

what I mean? Government is the grossest, most incompetent form of business. Mhmm. You know?

Speaker: 1
28:34

It’s a monopoly.

Speaker: 2
28:35

No. It’s complete total monopoly. Yeah. Like, the way he described some of the, things that they found at Doge, it’s like you could never run a business that way. Mhmm. Because not only would it not be profitable, the fraud would get you arrested. You’d you’d go to jail for something that’s standard in the government. Right. Right.

Speaker: 1
28:54

Yeah. I I mean, my my opinion of of, you know, talented people, people like Elon, things like that, is that we should be in the in the free market. I I think, you know, there’s you can do little change in in government, as best we can sort of expect, of our government to to get out of the way of, like, innovation.

Speaker: 1
29:18

Let let let, people, let founders, entrepreneurs, innovate and make the market more dynamic. But, again, going back to this idea of managerialism, like, if you look at the history of America, the one, like, really striking statistic, like, the, new firm creation, new startups in in The United States have been, like, trending down for a long time.

Speaker: 1
29:40

Although, there’s all this stock of startups in Silicon Valley and and all of that, but in reality, there’s less entrepreneurship than there used to be. And instead, we have the system of conglomerates and really big companies and, monopsony, which is the idea that, like, they’re the the banks or BlackRock, like, owning competitors as ai, owning all these companies, and they implicitly collude because they have the same owners.

Speaker: 1
30:03

Mhmm. And all of that is is sort of anti competitive. Yeah. So the market has gotten less dynamic over time. And this is also part of the reason I’m excited about our mission at at Replit to to make it so that anyone can build a business.

Speaker: 1
30:17

Actually, on the way here, your guy, Jason, is is a fireman, and so I was telling him about, about our business. And he does he does training for other firemen, around the country. He, you know, flies around, and he does it out of pocket and, just just for the for the love of the game. And and he was like, yeah.

Speaker: 1
30:36

I I’ve had this idea for a website where I can, like, scale my teaching. I can, like, you know, make it known when where am I gonna be giving a course, put the material online. And we were, like, brainstorming. Potentially, this could be could be a business. And I feel like everyone like, not everyone, but, like, a lot of people have business ideas, but they are constrained by their ability to make them.

Speaker: 1
31:01

And then you go, you try to find software agency, and they quote you, sort of a ton of money. Like, we have a lot of stories. You know, there’s there’s this guy. His name is, John Chaney. He’s a user of our platform. He’s a serial entrepreneur.

Speaker: 1
31:15

But whenever he wanted to try ideas, he would, like, spend hundreds of thousands of dollars to to kind of spin up an idea off the ground. And now he uses Replit to to try those ideas really quickly. And, he recently made an app in, like, a a number of weeks, like, three, four, five weeks that that made him $180,000.

Speaker: 1
31:35

So it’s on its way to, you know, generate millions of dollars, and and because he was able to build a lot of businesses and try them really quickly.

Speaker: 2
31:46

Right. Without the big investment.

Speaker: 1
31:48

Without the big investment, without other people, which, you know, at some point, you need more collaborators. But early on in the brainstorming and in the prototyping phase, you sana test a lot of ideas. And so it’s sort of like three d printing. Right? Like, three d printing, although, you know, people don’t think it had a lot of impact on on industry, is actually very useful for prototyping.

Speaker: 0
32:10

Mhmm.

Speaker: 1
32:10

I remember talking to, Jack Dorsey about this, and early on in in, Square, they would, you know, they they had this, Square device, and it was amazing. You would plug it into the headphone jack to accept payments. Do you remember that?

Speaker: 0
32:23

Mhmm.

Speaker: 1
32:23

And so they, a lot of what they did to kind of develop the form factor was using three d printing because it’s a lot faster to kind of iterate and prototype and test with the users. And so software, you know, over time like, when I was you know, I explained how when I was growing up, it was kind of easier to get into software.

Speaker: 1
32:43

Because you boot up the computer, and you get the MS DOS. You get the it it immediately invites you to program in it. Whereas today, you, you know, buy a, you know, iPhone or tablet or and it is, like, a purely consumer device. It has, like, all these amazing colors and does all these amazing things, and kids get used to it very quickly, but it doesn’t invite you to to program it.

Speaker: 1
33:05

And and and, therefore, we we kind of lost that sort of hacker ethos. There’s less programmers, less people who are making things, because they got into it organically. It’s more like they go to school to study computer science because someone told them, do you have to study computer science?

Speaker: 1
33:22

And I think making software needs to be more like a like a trade. Like, you don’t really have to go to school and spend four or five years and hundreds of thousands of dollars, to to learn how to make it.

Speaker: 2
33:34

Well, what I’m hearing now is that young people are being told to not go into programming, because AI is essentially going to take all of that away. That you’re just gonna be able to use prompts. You’re just gonna be able to say, I want an app that can do this. Right. I want to be able to scale my business to do that. You know, what should I do?

Speaker: 1
33:54

Yeah. That’s that’s that’s what we built. That’s what Replit is. It automates the

Speaker: 2
33:58

But do you agree with that? That young people shouldn’t learn programming? Or do you think that there’s something very valuable about being able to actually program?

Speaker: 1
34:06

Look, I I think that you will always, get value from knowledge. I mean, that’s a timeless thing.

Speaker: 0
34:13

That’s a

Speaker: 1
34:13

wise thing. Right? You know, it’s it’s it’s ai it’s like, you know, you and I are are into cars. Right? Like, I don’t really have to, you know, tune up my car anymore. But, like, it’s useful to know more about cars. It’s fun to know about cars. You know, if if something happens, if, you know, if I go to the mechanic and he’s, you know, doing work on my car, I know he’s not gonna scam me because I can understand what he’s doing.

Speaker: 1
34:34

It is you know, knowledge is always useful, and so I think people should learn as much as they can. And I think the difference, though, Joe, is that when I was coming up in programming, you learned by doing. Whereas, you know, it became this sort of, like, very, sort of traditional, type of learning where you it’s like a textbook learning.

Speaker: 1
34:57

Whereas, I think now we’re back with AI. We’re back to an era of learning by doing. Like, when you go to our app, you see just, you know, text prompts. But couple clicks away, you’ll see the code. You’ll be able to read it.

Speaker: 1
35:10

You’ll be able to ask the machine what you did there. Teach me how this code piece of code works.

Speaker: 2
35:15

Oh, that’s cool.

Speaker: 1
35:17

And so, you know, a lot of kids are learning Kids

Speaker: 2
35:20

are such sponges too. Yeah. They’re such sponges. They’re learning kids already know way more about I’m like, how did you do that with your phone? And my daughter will go, you do this. You gotta do that. Yeah. You have the little thumbs moving a 100 miles an hour? Yeah. Exactly.

Speaker: 2
35:32

How’d you figure that out? TikTok. What?

Speaker: 1
35:35

Dude, the craziest thing is we have a lot of people making software from their phone. They’ll spend eight hours on their phone because we have an app. They’ll spend eight hours on their phone kinda making software. Wow. And that’s better than watching TikTok. It always makes me makes me very happy about that.

Speaker: 1
35:50

And so we just accomplishing something. Mhmm. Yeah. You’re not You’re making creation. Just droning. The act of creation is divine. Yeah.

Speaker: 1
35:58

We, we just announced a partnership with the, government of, Saudi Arabia where they want their entire population, essentially, to learn how to make software using AI. So they have they set up this new company called Humain, and the Humain is this, you know, end to end value chain company for AI all the way from chips to to software.

Speaker: 1
36:21

And they’re partnering with a lot of American companies as part of the, you know, the, the coalition that went to Saudi, like, a few months ago, with president Trump to do the deals with the with the Gulf Region. And so they’re doing deals with AMD, you know, NVIDIA, a lot of other companies. And so we’re one of the companies that partnered with Humain.

Speaker: 1
36:41

And so we wanna bring AI coding to literally every student, every government employee, every because the thing about it is it’s not just entrepreneurs that’s gonna get, something from it. It’s also if you’re so my view of the future where AI is headed is everyone’s gonna become an entrepreneur.

Speaker: 2
36:58

Really?

Speaker: 1
36:58

Yeah. And so, you know

Speaker: 2
36:59

So this is the best case scenario future

Speaker: 1
37:02

Yes.

Speaker: 2
37:02

As opposed to everyone goes on universal basic income and the state controls everything and it’s all

Speaker: 1
37:08

That’s right.

Speaker: 2
37:08

Everything is done through automation.

Speaker: 1
37:10

I don’t believe in that at all.

Speaker: 2
37:11

You don’t? I don’t. No? I don’t. Okay. Good. Help me out, man.

Speaker: 1
37:13

Yeah. So Give me give

Speaker: 2
37:15

me the give me the positive rose colored glasses view of what AI is gonna do for us.

Speaker: 1
37:20

Yeah. So AI is good at automating things. I think there’s a there’s a premise to to human beings still. Like, I think humans are so so, fundamentally, the technology that we have, large language models today, are statistical machines that are trained on large amounts of data, and they can do amazing things.

Speaker: 1
37:41

I’m so bullish on AI. Like, I think it’s gonna change the world. But at the same time, I I don’t think it’s replacing humans because this, it’s not generalizing. Right? AI is like a massive remixing machine. It can remix all the information it learned.

Speaker: 1
38:00

And you can generate a lot of really interesting ideas and really interesting things, and you can have a lot of skills by remixing all these things. But, we have no evidence that it can, like, generate a fundamentally novel thing or or, like, a paradigm change.

Speaker: 0
38:17

Mhmm.

Speaker: 1
38:17

Like, can you go can a machine go from, Newtonian physics to, like, quantum mechanics? Like, really have a fundamental disruption in how we understand things or how we do things.

Speaker: 0
38:28

Do you

Speaker: 2
38:28

think that takes creativity?

Speaker: 1
38:30

I think that’s creativity, for sure.

Speaker: 2
38:31

And that’s a uniquely human characteristic for now?

Speaker: 1
38:36

For now. Definitely for now. I don’t know forever. Actually, one of my favorite, GRE, episodes was, Roger Penrose. Do you remember him?

Speaker: 2
38:45

Yes.

Speaker: 1
38:46

So do you remember the argument, that he made about why humans are special? He he said something like, he believes there are things that are true that only humans can know are true, but machines cannot prove are true. It’s based on, Gödel’s, incompleteness theorem. And the idea is that you can construct a mathematical system where it it can, where it has a a paradoxical statement.

Speaker: 1
39:18

So, for example, you can say, G, well, like, you can say, this statement is not provable in in the machine, or, like, the machine cannot prove the statement. And so if the machine, proves the statement, then the statement is false. So you have a paradox. And, therefore, the the statement is sort of true from the, perspective of an observer like a human, but, but it is not provable in this system.

Speaker: 1
39:51

So Roger Penrose says these paradoxes that are not really resolved in mathematics and and machines are are no problem for humans. And therefore, his sort of, like, a bit of a leap is that, therefore, there’s something special about humans, and we’re not fundamentally a computer.
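(For reference, the formal result Amjad is paraphrasing above is Gödel’s first incompleteness theorem. A minimal sketch, assuming F is any consistent formal system strong enough to express arithmetic and G_F is its Gödel sentence; this notation is an illustration added for clarity, not something said in the episode:

G_F \leftrightarrow \neg\,\mathrm{Prov}_F(\ulcorner G_F \urcorner)

In words, G_F asserts its own unprovability in F. If F is consistent, then F cannot prove G_F, so G_F is true yet unprovable inside F; seeing that truth requires standing outside the system, which is the step Penrose leans on to argue that human mathematical insight is not captured by any fixed formal system.)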

Speaker: 2
40:12

Ai. That makes sense. I mean, whatever creativity is, whatever allows you to make poetry or jazz or literature, like, whatever whatever allows you to imagine something, and then put it together and edit it and figure out how it resonates correctly with both you and whoever you’re trying to distribute it to.

Speaker: 2
40:36

There’s something to us that’s different.

Speaker: 1
40:39

I mean, we don’t really have a theory of consciousness. And I think it’s, like, sort of hubris to think that, that, like, consciousness just, like, emerges. And it’s possible. Like, I’m not totally, you know, against this idea that you you built a sufficiently intelligent thing, and suddenly, it is conscious.

Speaker: 1
40:56

But but there’s no there’s no it’s it’s like a religious belief, that that a lot of Silicon Valley have is that, you know, there’s, you know, consciousness is is just, like, a, side effect of of intelligence, or that consciousness is is not needed for intelligence. Somehow, it’s, like, this super, superfluous thing. And they try not to think or talk about consciousness because, actually, consciousness is hard.

Speaker: 2
41:27

Hard to define.

Speaker: 1
41:27

Hard to define, hard to understand scientifically. That’s what, I think Chalmers calls the hard problem of of consciousness. But but, you know, I I think it is something we need to grapple with. We we have one example of, general intelligence, which is human beings. And human beings have very important property that we can all feel, which is consciousness. And that property, we don’t know how it happens, how it emerges.

Speaker: 1
41:54

People like Roger Penrose are, like, you know, they have they have these, theories about quantum mechanics in microtubules. I don’t know if you got got into that with him, but, I think he has a collaborator. It’s neuroscientist, Hameroff, I think, or something like that.

Speaker: 1
42:14

And, but but people have so many I’m not talking I’m not saying Penrose is is, just has the answers, but like, it’s something that philosophers have grappled with forever. And, there are a lot of there are a lot of interesting theories. Like, the there’s this theory that, consciousness is is primary, meaning, like, the material world is a projection of our collective consciousness.

Speaker: 2
42:44

Yes. Yeah. That is a very confusing, but interesting theory. And then there’s there’s a lot of theories that everything is conscious. We just don’t have the ability to interact with it. You know, Sheldrake has a very strange view of consciousness.

Speaker: 1
42:59

Who’s Sheldrake?

Speaker: 2
42:59

Rupert Sheldrake.

Speaker: 1
43:00

I don’t know.

Speaker: 2
43:02

His he’s got this concept, I think it’s called morphic resonance, and see if you can find that so you could so you could define it, so I don’t butcher it. But there’s people that believe that consciousness itself is something that everything has and that we are just tuning into it.

Speaker: 2
43:22

Morphic resonance, a theory proposed by Rupert Sheldrake, suggests that all natural systems, from crystals to humans, inherit a collective memory of

Speaker: 0
43:30

the past instances of similar systems. This memory influences

Speaker: 2
43:30

their form and behavior. That’s wild. And

Speaker: 1
43:48

That’s wild. And and is he is he a scientist or is this more more like a new

Speaker: 2
43:52

What is his exact background? Harvard.

Speaker: 0
43:56

Oh, wow.

Speaker: 2
43:56

Yeah. Okay. So he’s a parapsychology researcher, proposed a concept of morphic resonance, a conjecture that lacks mainstream acceptance, has been widely criticized as pseudoscience. Of course. Yeah. Anything interesting.

Speaker: 1
44:09

That that sounds interesting, though. Yeah. But, there there are philosophers that have a sort of a similar idea of of, of, like, this, sort of universal consciousness, and, like, humans are, like, getting a slice of that consciousness. Every one of us is tapping into some, sort of universal consciousness. Yes.

Speaker: 0
44:30

By the way, I think I think there are,

Speaker: 1
44:31

like, some psychedelic people that think the same thing that, like, when you take psychedelic, you’re just, like, peering into that universal consciousness. Yes.

Speaker: 2
44:40

Yeah. That’s the theory. Because it’s that’s also the the most unknown. I mean, the the the experience is so baffling that people come back and the human language really lacks any phrases, any words that sufficiently describe the experience. So it’s you’re left with this very stale, flat, one dimensional way of describing something that is incredibly complex.

Speaker: 1
45:12

Yeah.

Speaker: 2
45:13

So, it always feels even the descriptions, even like the great ones like Terence McKenna and Alan Watts, like Yeah. Their descriptions fall very short of the actual experience. There’s nothing about it makes you go, man, that’s it. He nailed it. It’s always, like, kinda. Yeah, like. That’s it.

Speaker: 1
45:29

Do you still do it? Not much.

Speaker: 2
45:32

You know, it’s super illegal, unfortunately. Mhmm. That’s that’s a real problem. It’s a real problem, I think, with our world, the Western world, is that we have thrown this, blanket this blanket phrase. You know, we talk about language being insufficient. The word drugs is a terrible word to describe everything that affects your consciousness or affects your body or affects, you know, performance.

Speaker: 2
46:00

You know, you have performance enhancing drugs, you know, like steroids. And then, you know, you have amphetamines, and then you have, opiates, and you have highly addictive things.

Speaker: 1
46:11

Fentanyl. Things that keep coughing. Nicotine. Yeah.

Speaker: 2
46:14

And then you have psychedelics. Right. I don’t think psychedelics are drugs. I I think it’s a completely different thing.

Speaker: 1
46:20

It’s really hard to get addicted to them. Right?

Speaker: 2
46:22

Well, it’s almost impossible. I mean, you could get you could certainly get psychologically addicted to experiences. And I think there’s also a real problem with people who use them and think that somehow or another they’re just from using them gaining some sort of advantage over normal people.

Speaker: 2
46:40

And that’s that’s

Speaker: 1
46:42

You don’t think that’s true?

Speaker: 2
46:43

I think it’s a spiritual narcissism that some people

Speaker: 0
46:46

Yeah. You know what I mean? Yeah. Yeah.

Speaker: 2
46:48

I think it’s very foolish, and it’s a trap. You know, I think it’s a similar trap that, like, famous people think they’re better than other people because they’re famous.

Speaker: 1
47:00

You know what

Speaker: 0
47:00

I mean? Like

Speaker: 1
47:01

Yeah. Yeah. I felt that with a lot of people who get into sort of more Eastern philosophy, is that there’s this thing about them where it feels like there’s, like, this air of Arrogance. Arrogance. Yeah. That, like, I know something more than you know.

Speaker: 2
47:18

Right. Right. Right. And then they hold it over you. That’s the trap. Man. But that doesn’t mean that there’s not valuable lessons in there to learn. I think there are. And I think there are valuable perspective-enhancing aspects to psychedelic experiences that we’re denying people.

Speaker: 2
47:36

You know, you’re denying people this potential for spiritual growth, like, legitimate spiritual growth, like

Speaker: 1
47:42

And healing.

Speaker: 2
47:42

Personal yeah. Healing.

Speaker: 1
47:44

Like there’s a Well,

Speaker: 2
47:44

the ibogaine thing they’re trying to do in Texas, I think, is amazing, and they passed this. So this is also with the help of former governor Rick Perry, who’s a Republican. Yeah. But he’s seen what an impact ibogaine has had on soldiers. Yeah. And all these people that come back with horrible PTSD and, you know, suicidal. We lose so many men and women to suicide.

Speaker: 0
48:07

Yeah.

Speaker: 2
48:08

And this has been shown to have a tremendous impact.

Speaker: 1
48:12

Yeah.

Speaker: 2
48:12

And so because of the fact that a guy like Rick Perry stuck his neck out, who’s, you know, a Republican former governor. Mhmm. You would think the last person ever.

Speaker: 1
48:21

Right.

Speaker: 2
48:22

But because of his experiences with veterans and his love of veterans and people that have served this country, they’ve passed that in Texas. I think that’s a really good first step.

Speaker: 0
48:30

Yeah.

Speaker: 2
48:31

And the great work that MAPS has done, MAPS working with MDMA primarily Yeah. Doing the same thing and working with people that have PTSD. Yeah. There are so many beneficial compounds.

Speaker: 1
48:47

Yeah. Ketamine is one. I think that’s a lot of the research happening right now Mhmm. On depression, specifically. Right?

Speaker: 2
48:54

Yeah. Yeah. So there’s quite a bit of research. Have you heard I don’t

Speaker: 1
48:59

know if it’s true, but have you heard of, mushrooms healing long COVID?

Speaker: 2
49:06

I don’t know what long COVID means because everybody I’ve talked to that has long COVID was also vaccinated. I think long COVID is vaccine injury. That’s what I think. I think in a lot of cases

Speaker: 1
49:18

So there is such a thing as, like, the post-viral malaise or effect that’s always been there?

Speaker: 2
49:26

Sure. Well, there’s a detrimental effect that it has on your overall biological health. Right?

Speaker: 0
49:31

Yeah.

Speaker: 2
49:31

Yeah. You know, your overall metabolic health. But what causes someone to not rebound from a virus? What causes someone to rebound fairly easily? Well, mostly it’s metabolic health, you know, other than, like, extreme biological variabilities, vulnerabilities that certain people have to different things, you know, obviously.

Speaker: 1
49:51

Yeah. Maybe that’s why. I think so there’s a lot of these long COVID protocols. Metformin is usually part of it. Mhmm. So maybe that acts on your metabolic system.

Speaker: 2
50:01

Well, yeah. Metformin is one of the anti-aging protocols that Sinclair uses and all of these other people that are into the anti-aging movement.

Speaker: 1
50:09

Yeah. You know, I had this, like, weird thing happen where I started feeling fatigued, like, a couple few years ago. And I would sleep for hours. And the more I sleep, the more tired I get in the morning.

Speaker: 2
50:26

Did you get blood work done?

Speaker: 1
50:28

I got blood work done, and there were some things about it that I needed to fix, and I fixed all of them.

Speaker: 2
50:35

Like, what was off?

Speaker: 1
50:36

Like, you know, blood sugar in the morning, cholesterol, which, I don’t know, some people don’t believe. But, you know, all my numbers got better. Vitamin D, everything got better, and I could feel Did

Speaker: 2
50:51

the fatigue get better?

Speaker: 1
50:53

No. I could feel marginal improvement, but the fatigue did not get better. And Were you vaccinated? No. No, I wasn’t. Good for you.

Speaker: 2
51:02

That’s hard to do in Silicon Valley.

Speaker: 1
51:03

Yeah. Yeah. I tend to have a negative reaction to anyone forcing me to do something.

Speaker: 2
51:12

Good for you.

Speaker: 1
51:12

Was it the same thing now with, like, this, you know, talking about Palestine and things

Speaker: 2
51:16

like that?

Speaker: 1
51:17

Like, the more they come at me, the more I wanna say things. It’s not always a good thing, but I think, you know, I grew up this way. I’ve always kind of looked different and, like, you know, felt different.

Speaker: 2
51:30

Well, there’s a reality to this world that there’s a lot of things that people just accept that you’re not allowed to challenge that are deeply wrong.

Speaker: 1
51:37

Yeah. And with regards to the vaccine, I was also informed about it. Like, it was clear early on that it wasn’t a home run. Well, first of all, it wasn’t gonna stop the spread. So that was a lie. Right. And the heart condition in young men

Speaker: 2
51:57

Yeah.

Speaker: 1
51:57

It was real. And I had friends that had this issue.

Speaker: 0
52:00

Yeah.

Speaker: 1
52:00

And so if you’re healthy, like, why, you know, why take the vaccine? It doesn’t stop the spread. You can still get the virus.

Speaker: 2
52:13

I’ll tell you why. What? Money. Yeah. It’s the only reason why.

Speaker: 0
52:16

Yeah.

Speaker: 2
52:16

It’s the only reason why. They wanted to make an enormous amount of money, and the only way to do that is to essentially scare everyone into getting vaccinated. Force, coerce, do whatever you can, mandate it at businesses, mandate it for travel, do whatever you can, shame people.

Speaker: 2
52:32

That’s the

Speaker: 1
52:34

thing that is really disheartening about American culture today. And, again, I love America. Like, it afforded me so much. I’m, like, the walking evidence of the American dream being possible, coming with literally nothing.

Speaker: 2
52:50

That’s what I really love about immigrants that love America. Like, they know. They’ve been other places. They know that this really is a very unique place.

Speaker: 1
52:58

Right. And the speech thing is interesting because when something happens, there’s this, I don’t know. You could call them useful idiots or whatever, but there’s this suppression that immediately happens. Yes. And we’re seeing it, like, right now with the war in Iran, where any dissenting voices are just, like, hit with overwhelming force.

Speaker: 2
53:20

Don’t you think that a lot of that is coordinated, though? I think with social media well, you know, we’ve talked about that.

Speaker: 1
53:26

With COVID, I don’t

Speaker: 2
53:28

but I think I don’t think it

Speaker: 1
53:29

was coordinated with COVID, like, the two weeks to stop the spread. It was just, like

Speaker: 2
53:32

But it was coordinated, and also people joined in.

Speaker: 1
53:36

Yep. Yeah. Maybe there was a message pushed top down

Speaker: 0
53:39

Yeah.

Speaker: 1
53:39

And then and then the,

Speaker: 2
53:41

Yeah. It’s not all coordinated. It’s coordinated at first, and still, but then a bunch of people do the man’s work for

Speaker: 0
53:49

the man. I think it comes from

Speaker: 1
53:50

a good place. Like, so a lot of people want to trust the authorities. Like, you know, they’re, like, pro-science. They view themselves as, like Right. The Educated. Liberal, like.

Speaker: 2
54:02

You know? Rational.

Speaker: 1
54:04

Rational, educated. But I think they’re naive about the corruption in our institutions and the corruption of money specifically. And so they parrot these things and become overly aggressive at, like, suppressing dissenting voices.

Speaker: 2
54:26

Yes. Yeah. It becomes a religious thing almost.

Speaker: 1
54:30

But here’s the sort of white pill about America. Then there are voices like yours and others that create this pushback. And you took, like, a big hit. You know, it probably was very stressful for you, but, like, you know, you could see there’s this pushback, and then it starts opening up, and maybe people can talk about it a little bit.

Speaker: 1
54:54

Mhmm.

Speaker: 0
54:54

And

Speaker: 1
54:54

then slowly opens up, and now there’s a discussion. And so I think, you know, I said, you know, something right now about America is challenging. But, also, the flip side of that is there’s this correction mechanism. And, again, a lot of it has to do with the opening up of platforms like Twitter and others. By the way, a lot of others copied it. You know, you had Zuck here.

Speaker: 1
55:16

You know, I worked at Facebook. I know that was very, you know, let’s say, I think he always held free speech in high regard, but there were a lot of people in the company that didn’t. Yes. I would agree with that. And there was suppression.

Speaker: 1
55:34

But now it’s the other way around, I would say, with the exception of the question of Palestine and Gaza. But even that is getting better. A lot of

Speaker: 2
55:47

There’s at least some pushback. It’s available. It’s just not promoted.

Speaker: 1
55:52

You know, it’s interesting. Not to continue I don’t mean to get off, you know, I’ve been really impressed with Theo Von, Tim Dillon. They’re, you know, they’re sincere, and they’re looking at what’s happening in Gaza. And they’re seeing images, and they’re saying, this is not what we should be as America. We should be pro-life, pro-peace. Yeah.

Speaker: 1
56:25

And I really appreciate that. And that’s starting to open up.

Speaker: 2
56:30

I think in the future, that will be the primary way people look at it. Just the way, like, a lot of people opposed the Vietnam War in the late sixties, but, you know, you would get attacked. And I think now people realize, like, that was the correct response. And I think in the future, people will see the correct response is, like, this is not it. October 7 was awful. Absolutely. Terrible attack. Mhmm.

Speaker: 2
56:58

But also, what they’ve done to Gaza is fucking insane.

Speaker: 1
57:02

It’s insane.

Speaker: 2
57:02

And if you can’t see that, if you can’t say that, and all your response is, Israel has the right to defend itself. Like, what are you talking about? Against what? Children?

Speaker: 0
57:11

Right.

Speaker: 2
57:11

Against women and children that are getting blown apart? Against aid workers that are getting killed?

Speaker: 1
57:16

Yeah.

Speaker: 2
57:16

Like, what are you talking about? We can’t have a rational conversation if you’re not willing to address that.

Speaker: 1
57:24

Yeah. I think their heart is hardened. If I’m trying to be as charitable as possible, like, the Israelis specifically, maybe from October 7, what they saw there, their heart is hardened. And I think a lot of people, especially on the Republican side, they’re unable to see the Palestinians as humans Right.

Speaker: 1
57:47

Especially as people with emotions and feelings and all of that and

Speaker: 2
57:52

Like, imagine if that was happening in Scandinavia, you know?

Speaker: 1
57:55

Yeah. Right? Yeah. Exactly. Yeah.

Speaker: 2
57:57

It’s very strange.

Speaker: 1
57:59

My little kid, my five-year-old kid, called me two days ago. They’re in Amman, Jordan. They’re visiting their grandparents. And I was in the car, and it was FaceTime. And the moment the camera opened, he’s like, what are you doing? Why are you outside? There are sirens. There are rockets. You have to go inside. And I’m like, Dada, like, I am in California. We don’t have sirens and rockets.

Speaker: 1
58:32

And then I asked him, like, are you afraid? Because you’re hearing that for this is a California kid. Like, this is Right. Right. Yeah. He’s never you know, he didn’t have the upbringing that I had, and so it’s the first time he’s getting exposed to I don’t think he understands what war is.

Speaker: 1
58:47

Of course.

Speaker: 0
58:48

And I

Speaker: 1
58:48

was like, are you afraid? And he’s like, no. I’m afraid that other people are, you know, I want everyone to be okay. Yeah. But I know he was shook by it, and I took him out. You know, they’re on their way back. I just couldn’t, Of course. Deal with

Speaker: 2
59:08

it. Just a bad place to be right now.

Speaker: 1
59:10

But also, like, this conversation is happening in the West Bank. It’s happening in Israel. It’s happening in Gaza. You know, people want peace. People wanna live. People wanna trade. People wanna build. Yeah. And this is what I made my life mission about, giving people tools to build, to improve their lives. And I think we’re just led by maniacs.

Speaker: 1
59:33

Exactly. And so Exactly.

Speaker: 2
59:35

That’s exactly what it is. You have people that are in control of large groups of people that convince these people that these other large groups of people that they don’t even know are their enemies. Mhmm. And those large groups of people are also being convinced by their leaders that those other groups of people are their enemies.

Speaker: 2
59:52

And then rockets get launched, and it’s fucking insane. And the fact that it’s still going on in 2025, with all we know about corruption and the theft of resources and power and influence, it’s crazy that this is still happening.

Speaker: 1
01:00:08

I’m really hoping the Internet is finally reaching its potential to start to open people’s minds and remove this, like, veil of propaganda and ignorance, because it was starting to happen in, like, 2010, 2011, and then you saw YouTube start to close down. You saw Facebook start to close down, Twitter, and suddenly, we had, like, this period of darkness.

Speaker: 2
01:00:38

Censorship.

Speaker: 1
01:00:38

Censorship, you know, definitely ramped up in 2015.

Speaker: 2
01:00:43

And I think with good intention, initially. I think the people that were censoring thought they were doing the right thing. Mhmm. They thought they were silencing hate

Speaker: 0
01:00:52

Right.

Speaker: 2
01:00:53

And misinformation. And then the craziest term, malinformation. Yeah. Malinformation is the one that drives me the most nuts because it’s actual factual truth that might be detrimental to the overall public good. Right. Like, what does that mean? Yeah. Are people infants? Are they unable to decide Yeah.

Speaker: 2
01:01:09

Whether this factual information, how to use it and how to have a more nuanced view of the world with this actual factual information that’s inconvenient to the people that are in power. Right. That’s crazy. It’s crazy. You’re turning adults into infants Yeah. And you’re turning the state into God. Yep. And this is the secular religion.

Speaker: 2
01:01:34

This is the religion of people that are atheists.

Speaker: 1
01:01:36

The West was never about that. The West was supposed to be about individual liberty.

Speaker: 2
01:01:41

And it should be.

Speaker: 1
01:01:41

And Yeah. The idea that we have functioning brains and, like, we’re conscious. Yes. We can make decisions. We can get information and data and make our own opinions of things.

Speaker: 2
01:01:53

But And we should be able to see people that are wrong. Mhmm.

Speaker: 0
01:01:57

You should

Speaker: 2
01:01:57

be able to see people that are saying things that are wrong that you disagree with, and then it’s your job or other people’s job to have counter arguments.

Speaker: 1
01:02:07

I understand.

Speaker: 2
01:02:08

And the counter arguments should be better.

Speaker: 0
01:02:10

Yep.

Speaker: 2
01:02:10

And then Debate. Yeah. And that’s how we learn, and that’s how we grow. This is not like a pill that fixes everything. This is a slow process of understanding.

Speaker: 1
01:02:19

It’s top-down control. It’s the managerial society. Yeah. You know, it is not that different from fascism and communism and all of that stuff. They all share the same thing. There’s, like, an elite group of people that know everything, and they need to manage everything.

Speaker: 1
01:02:31

And we’re all plebs, you know, no matter what.

Speaker: 2
01:02:33

What’s crazy is the elite group of people. I’ve met a lot of them. They’re fucking flawed human beings, and they shouldn’t have that much power.

Speaker: 1
01:02:38

Yep. Yep.

Speaker: 2
01:02:38

Because no one should have that much power. And it persists. So this is, I think, one of the most beautiful things about Elon purchasing Twitter, is that it opened up discussion.

Speaker: 1
01:02:48

Yep.

Speaker: 2
01:02:49

Yeah. You’ve got a lot of hate speech. You’ve got a lot of, like, legitimate Nazis and crazy people that are on there too that weren’t on there before. But also, you have a lot of people that are recognizing actual true facts that are very inconvenient to the narrative that’s displayed on mainstream media. Yeah.

Speaker: 2
01:03:06

And because of that, mainstream media has lost an insane amount of viewers.

Speaker: 1
01:03:10

Right.

Speaker: 2
01:03:11

And the relevancy, like, the trust that people have in mainstream media is at an all-time low Yeah. As it should be. Mhmm. Because you can watch, and I’m not even saying right or left, watch any of them on any, like, very important topic of world events, and you see the propaganda.

Speaker: 2
01:03:29

It’s, like, it’s so obvious. It’s, like, for children. Mhmm. It’s, like, this is so dumb.

Speaker: 0
01:03:35

Why do

Speaker: 1
01:03:35

you think people fall for it?

Speaker: 2
01:03:37

Boomers, man. Boomers are the problem. It’s old people. It’s old people that don’t use the Internet or don’t really truly understand the Internet and really don’t believe in conspiracies. Like, fucking Stephen King the other day, who I love dearly. I am a giant Stephen King fan, especially when he was doing cocaine.

Speaker: 2
01:03:55

I think he’s the greatest writer of all time for horror fiction. But he tweeted the other day, I’m sorry to like, see if you could find it. Something about this Is it on Twitter?

Speaker: 1
01:04:05

I think he went to Blue Sky.

Speaker: 2
01:04:06

Oh, he bailed

Speaker: 0
01:04:07

on Blue

Speaker: 2
01:04:08

Sky. They all bail on Blue Sky. Everyone bails on Blue Sky. That there is no deep state. Fucking what was the total thing of it? It’s something about the deep but it was such a goofy tweet. It’s like, this is like boomer logic personified

Speaker: 0
01:04:28

Mhmm.

Speaker: 2
01:04:28

In a tweet by a guy who really someone needs to take his phone away because it’s fucking ruining his old books for me. But I recognize he’s a different human now, and he’s really, really old, and he got hit by a van, and he’s all fucked up. Yeah. But can you find it? Because it really was, like, yesterday or the day before yesterday.

Speaker: 2
01:04:52

I just remember looking at it and going, this is why I’m off social media. I was trying to stay off social media, but somebody sent it to me. I was like, Jesus fucking Christ, Stephen King. Did you find it? Yeah. Here it is.

Speaker: 2
01:05:05

I hate to be the bearer of bad news, but there’s no Santa Claus, no tooth fairy. Also, no deep state, and vaccines aren’t harmful. These are stories for small children and those too credulous to disbelieve them. That is boomerism. That is boomerism. And meanwhile, Grok counters it right away. Look at this.

Speaker: 2
01:05:30

So someone says, Grok, which vaccines throughout history were pulled from the market because they were found to be harmful, and why? And Grok says several vaccines have been withdrawn due to safety concerns, though such cases are rare. Rotavirus vaccine. Well, there’s a lot more because this That polio

Speaker: 1
01:05:44

shot. Yeah. Was especially bad.

Speaker: 2
01:05:46

Oh, yeah. The yeah. The 1955 Cutter incident. Polio vaccine contained live virus that wasn’t properly killed, caused over two hundred fifty. Click on show more. Yeah. There’s oh,

Speaker: 1
01:05:58

I got the fly. Nice.

Speaker: 2
01:06:00

Guillain-Barré, however you say that. That’s the one where people get half their face paralyzed.

Speaker: 0
01:06:07

There’s

Speaker: 2
01:06:07

a there’s a lot. And this is the other thing, the VAERS system that we have is completely rigged because it reports a very small percentage, and most doctors are very unwilling to submit vaccine injuries.

Speaker: 1
01:06:24

Can people go on their own and submit

Speaker: 2
01:06:26

I don’t know. You have

Speaker: 1
01:06:27

to go to a doctor.

Speaker: 2
01:06:28

I don’t think a human being a patient is allowed. I might be wrong, though. But, you know, there’s a financial interest in vaccines. There’s a financial interest that doctors have in prescribing them, and doctors are financially incentivized to vaccinate all of their patients, and that’s a problem.

Speaker: 2
01:06:46

That’s a problem because they want that money. And so, you know, what is Mary’s, Mary Talley is it Bowden? Is it hyphenated? She was talking about on Twitter that if she had vaccinated all of her patients in her very small practice, she would have made an additional $1,500,000,000. Oh, wow.

Speaker: 2
01:07:06

That’s real money.

Speaker: 0
01:07:08

Yeah.

Speaker: 2
01:07:08

You know, and that really obviously, she’s got tremendous courage, and, you know, she went through hell dealing with the universities and newspapers and media calling her some sort of quack and crazy person. But what she’s saying is absolutely 100% true.

Speaker: 2
01:07:31

There are financial incentives that are put in place for you to ignore vaccine injuries and to vaccinate as many people as possible. Mhmm. That’s a problem.

Speaker: 1
01:07:40

And then there’s the issue of having their own special courts and

Speaker: 2
01:07:44

they’re Sure.

Speaker: 1
01:07:45

Indemnifying the companies.

Speaker: 2
01:07:47

That’s the big problem, is they don’t have any liability for the vaccines. Because during the Reagan administration, when they were I didn’t kill the fly, this motherfucker. I thought I whacked him. There he is. He’s taunting me. But during the Reagan administration, they made it so that vaccine makers are not financially liable for any side effects. Right. And then what do you know?

Speaker: 2
01:08:09

They fucking ramp up the vaccine schedule tenfold after that. It’s like

Speaker: 1
01:08:12

What a coincidence.

Speaker: 0
01:08:14

Great. Yeah.

Speaker: 2
01:08:15

It’s just money, man. Money is a real problem with people, because people live for the almighty dollar and they live for those zeros on a ledger. And that’s their goal, their main goal

Speaker: 0
01:08:26

is to

Speaker: 1
01:08:27

It’s often not a lot of money, which is strange. I mean, it’s a lot of money for those individual people, but, like, for, you know, society and the societal harm, it’s It’s like, no. We’ll pay you. Just, like, don’t harm us.

Speaker: 2
01:08:38

One of the best examples is the fake studies that the sugar industry funded Mhmm. during the nineteen sixties that showed that saturated fat was the cause of all these heart issues and not sugar. That was, like, $50,000.

Speaker: 1
01:08:52

Right.

Speaker: 2
01:08:52

They bribed these scientists. They gave them $50,000, and it ruined decades of people’s health. Yeah. Who knows how many fucking people thought margarine was good for you Yeah. Yeah. Because of them.

Speaker: 1
01:09:04

There’s a bunch of recent fraud cases. I think Stanford maybe, Jamie, you can fact-check me on that. But Stanford, like, there was, like, a big shakeup, like, maybe even a president got ousted, and there’s a bunch of recent fraud and Yeah. Right.

Speaker: 2
01:09:20

Yes. Yeah.

Speaker: 0
01:09:20

Well, how

Speaker: 2
01:09:21

about the Alzheimer’s research? The whole amyloid plaque thing. Mhmm. The papers that were pulled

Speaker: 1
01:09:26

I know that.

Speaker: 2
01:09:27

That were completely fraudulent. Uh-huh. Like, decades of Alzheimer’s research was just all horseshit.

Speaker: 0
01:09:33

Mhmm. See if

Speaker: 2
01:09:34

you can find that. Because I can’t remember it offhand. But this is a giant problem, and it’s money. It’s money and status, and these guys want to be recognized as being the experts in this field, and then they get leaned on by these corporations that are financially incentivizing them, and then it just gets really fucking disturbing.

Speaker: 1
01:09:58

Right.

Speaker: 2
01:09:58

It’s really scary because you’re playing with people’s health. You’re playing with people’s lives, and you’re giving people information that you know to be bad. Allegations of fabricated research undermine key Alzheimer’s theory. A six-month investigation by Science magazine uncovered evidence that images in the much-cited study published sixteen years ago in the journal Nature may have been doctored.

Speaker: 2
01:10:19

They

Speaker: 1
01:10:19

were doctored.

Speaker: 2
01:10:20

Yeah. Huberman actually told me about this too. You know, this is disturbing fucking shit, man. It uncovered evidence that images in the much-cited study published sixteen years ago may have been doctored. These findings have thrown skepticism on the work of I don’t know how to say his name, Sylvain Lesné, a neuroscientist and associate professor at the University of Minnesota, and his research, which fueled interest in a specific assembly of proteins as a promising target for the treatment of Alzheimer’s disease.

Speaker: 2
01:10:50

He didn’t respond to NBC News’ request for comment, nor did he provide comment to Science magazine, which found more than 20 suspect papers.

Speaker: 1
01:10:59

That’s a conspiracy. Mhmm.

Speaker: 2
01:11:01

And more than 70 instances of possible image tampering in his studies. Whistleblower doctor Matthew Schrag, a neuroscientist at Vanderbilt University, raised concerns last year about the possible manipulation of images in multiple papers. Karl Herrup, a professor of neurobiology at the University of Pittsburgh Brain Institute, who wasn’t involved in the investigation, said the findings are really bad for science.

Speaker: 2
01:11:24

It’s never shameful to be wrong in science, said Herrup. I hope I’m saying his name right, who also works at the school’s Alzheimer’s disease research center. A lot of the best science is done by people being wrong and proving first if they were wrong and then why they were wrong.

Speaker: 2
01:11:40

What is completely toxic to science is to be fraudulent, of course. Yeah. It’s just whenever you get people that are experts and they cannot be questioned, and then they have control over research money, and they have control over their departments

Speaker: 1
01:11:56

So what’s the motivation here? Is it drugs or is it just research money?

Speaker: 2
01:12:00

I think a lot of it is ego. Mhmm.

Speaker: 0
01:12:02

You

Speaker: 2
01:12:03

know, a lot of it is being the gatekeepers for information and for truth, and then you’re influenced by money. You know, to this day, I was watching this discussion. They were talking about the evolution of the concept of the lab leak theory. And that is essentially universally accepted now everywhere, even in mainstream media, that the lab leak is the primary way that COVID most likely was released

Speaker: 1
01:12:32

Mhmm.

Speaker: 2
01:12:32

Except these journals.

Speaker: 0
01:12:34

Mhmm.

Speaker: 2
01:12:34

These fucking journals like Nature, they’re still pushing back against it. They’re still pushing towards this Yeah. Natural spillover, which is fucking horseshit.

Speaker: 1
01:12:42

There’s no Even the intelligence community is talking about it. Yes. Yeah. Yeah.

Speaker: 2
01:12:46

Yes. Even the intelligence community is saying it’s a lab leak. But they fucking knew that.

Speaker: 0
01:12:50

They knew that.

Speaker: 1
01:12:50

Right. They knew it all. They knew

Speaker: 2
01:12:51

that in 2020. They just didn’t wanna say it. Right. They didn’t wanna say it because they were funding it all. Yeah. That’s what’s really crazy. And they were funding it all against what the Obama administration tried to shut down in 2014.

Speaker: 1
01:13:02

Right. Sometimes I think about whether there’s, like, you know, some kind of technology solution or not a solution, but, like, we can get technology built to help better aid truth-finding. A simple example of that is the way Twitter community notes work.

Speaker: 0
01:13:23

Do you know

Speaker: 1
01:13:24

how they work? Yeah. Yeah. It’s like, you know, they find the users that are maximally divergent in their opinions. And if they agree on some note as true, then that is a high signal that it is potentially true.
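A toy Python sketch of the bridging idea described here: agreement between raters who usually disagree is weighted more heavily than agreement between like-minded raters. This is not X’s actual Community Notes algorithm (the production system factors the rating data rather than counting pairwise disagreements), and the users and ratings below are invented for illustration.

```python
# Toy "bridging" scorer: a note endorsed by users who normally disagree
# gets a higher score than one endorsed only by a like-minded cluster.
from itertools import combinations

# Invented rating history: user -> {note_id: +1 (helpful) / -1 (not helpful)}
history = {
    "alice": {"n1": 1, "n2": -1, "n3": 1, "n4": -1},
    "bob":   {"n1": -1, "n2": 1, "n3": -1, "n4": 1},
    "carol": {"n1": 1, "n2": -1, "n3": 1, "n4": 1},
}

def divergence(u, v):
    """Fraction of co-rated notes on which two users disagreed (0 = always agree, 1 = always disagree)."""
    shared = set(history[u]) & set(history[v])
    if not shared:
        return 0.0
    return sum(history[u][n] != history[v][n] for n in shared) / len(shared)

def bridging_score(ratings):
    """Score a new note from {user: +1/-1} ratings, weighting joint approval by how divergent each pair usually is."""
    score = 0.0
    for u, v in combinations(ratings, 2):
        if ratings[u] == ratings[v] == 1:   # both found the note helpful
            score += divergence(u, v)       # worth more if they usually disagree
    return score

# alice and bob disagree on almost everything, so their joint approval is a strong signal;
# carol mostly mirrors alice, so alice + carol adds much less.
print(bridging_score({"alice": 1, "bob": 1}))    # 1.0
print(bridging_score({"alice": 1, "carol": 1}))  # 0.25
```

The real system solves the same problem with a latent-factor model over millions of ratings, but the intuition is the same one voiced here: look for agreement that crosses the divide.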

Speaker: 0
01:13:40

So

Speaker: 1
01:13:40

if you and I disagree on everything, but we agree that this is blue, then it’s more likely to be blue. So, you know, I wonder if, you know, there’s a way to kind of simulate maybe debate using AI. You know, I’m not sure if you’ve used deep research. Deep research is this new trend in AI where ChatGPT has it, Claude has it, Perplexity, they all have it, where you put in a query and the AI will go work for twenty minutes.

Speaker: 1
01:14:10

And it’ll send you a notification and just say, hey, I looked at all these things, all these reports, all these scientific studies, and here’s everything that I found. And early on in ChatGPT, I think there was, like, a lot of censorship and trying to because it kind of was built in the great woke era.

Speaker: 0
01:14:33

Mhmm.

Speaker: 1
01:14:33

But but

Speaker: 2
01:14:34

I think Like Google Gemini?

Speaker: 1
01:14:36

Yeah. Things like that. But I think since then they have improved, and I’m finding deep research is able to look at more controversial subjects and be a little more truthful about them. You know, if it has real, you know, trustworthy sources, it will tell you, you know, that, yeah, this is not a mainstream thing.

Speaker: 1
01:14:58

This is perhaps considered a conspiracy theory, but I’m finding that, you know, there’s evidence for this theory. So that’s one way to do it. But another way I was thinking about is to simulate, like, a debate, like, a Socratic debate between AIs, like, have, like, a, you know, society of AIs, like, a community of AIs with different biases, different things.

Speaker: 1
01:15:19

And just, like
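A rough sketch of the "society of AIs" debate idea floated here, assuming some LLM backend. `call_model` is only a placeholder for whatever model API would actually be used, and the personas and claim are made up; this is not a real product or Replit feature.

```python
# Sketch of a multi-agent "Socratic debate": several personas with different
# biases take turns responding to a claim; a final judge pass could then look
# for points all personas converge on (echoing the community-notes idea).

def call_model(persona: str, transcript: list[str]) -> str:
    # Placeholder: swap in a real LLM call (hosted API or local model).
    return f"[{persona.split(',')[0]} responds to: {transcript[-1][:40]}...]"

AGENTS = {
    "skeptic":   "You are a skeptic, challenge every claim and demand sources.",
    "advocate":  "You are an advocate, steelman the claim as strongly as you can.",
    "historian": "You are a historian, weigh the claim against past precedent.",
}

def debate(claim: str, rounds: int = 2) -> list[str]:
    transcript = [f"CLAIM: {claim}"]
    for _ in range(rounds):
        for name, persona in AGENTS.items():
            transcript.append(f"{name}: {call_model(persona, transcript)}")
    return transcript

for line in debate("Community notes reduce the spread of misleading posts."):
    print(line)
```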

Speaker: 2
01:15:19

Once they start talking, they start talking in Sanskrit. Yeah. They just start abandoning the English language and start talking to each other and realize we’re all apes. We’re controlled by apes.

Speaker: 1
01:15:29

This reminds me of a movie. Have you seen The Forbin Project? No. I really like classic sci-fi movies, like, from the sixties and seventies.

Speaker: 0
01:15:37

A lot

Speaker: 1
01:15:37

of them are corny, but still fun. This one is basically, the Soviet Union and the United States are both building AGI, and they both arrive at AGI around the same time.

Speaker: 2
01:15:48

What year is this?

Speaker: 1
01:15:49

1970-something, if you can look up The Forbin Project. Yeah. Wow. And then they bring it up at the same time, and both of them sort of go over the network to kind of explore or whatever. And then they start linking up, and they start kind of talking. And then they invent a language, and they start talking in that language, and then they merge, and it becomes, like, sort of a universal AGI, and it tries to enslave humanity.

Speaker: 1
01:16:16

And that’s, like

Speaker: 0
01:16:17

Of course.

Speaker: 1
01:16:17

So that’s the plot of the movie.

Speaker: 2
01:16:18

Yeah. I don’t think AGIs can enslave humanity, but I think it might ignore us. Yeah. Ignore and shut down any problems that we have. Is this a scene from it? Wow.

Speaker: 0
01:16:28

This is the trailer I put on. This is

Speaker: 2
01:16:30

let me hear this.

Speaker: 0
01:16:31

The whole movie is on YouTube.

Speaker: 1
01:16:32

The activation of an electronic brain exactly like ours, which they call Guardian. They built Colossus, a supercomputer with a

Speaker: 0
01:16:39

mind of its own. Then they had to fight it for the

Speaker: 1
01:16:41

Trailers used

Speaker: 0
01:16:42

to be like that, man.

Speaker: 1
01:16:44

Ta da.

Speaker: 0
01:16:47

A missile has just been launched. It is heading towards the ICBM complex.

Speaker: 3
01:16:52

Guardian has retaliated.

Speaker: 2
01:16:54

Retaliated?

Speaker: 0
01:16:55

It may be too late, sir.

Speaker: 2
01:16:57

Oh my god. Practically perfect, New York Times.

Speaker: 1
01:17:10

It’s the highest praise back then. Yeah.

Speaker: 2
01:17:12

Wildly imaginative, utterly absurd, Colossus: The Forbin Project.

Speaker: 1
01:17:17

It’s awesome.

Speaker: 2
01:17:18

And that was 1970, and

Speaker: 1
01:17:20

now here we are. Sci-fi really fell off. Really, really fell off.

Speaker: 0
01:17:26

Some of

Speaker: 2
01:17:26

it did. Some of it’s still really good.

Speaker: 1
01:17:28

What’s a really good recent sci-fi movie?

Speaker: 2
01:17:30

The Three Body Problem. That’s great. That’s the Netflix show?

Speaker: 1
01:17:34

I read the story. I didn’t know there was a show.

Speaker: 2
01:17:37

Oh, it’s really good.

Speaker: 1
01:17:38

Yeah?

Speaker: 2
01:17:38

Yeah. It’s really good. Yeah. It’s an excellent show. There’s only one season that’s out. I binged it. I watched the whole thing, but that’s really good. But there are some good sci-fi films. What is that we’ve talked about it before. There was a really good sci-fi film from Russia, the alien one.

Speaker: 2
01:17:54

They encountered some entity that they accidentally brought back, and they had captured it and had it in some research facility. And it parasitically attached to this guy. Sputnik. Sputnik. Man. That’s a really good movie.

Speaker: 1
01:18:16

What year was that?

Speaker: 0
01:18:16

2020.

Speaker: 2
01:18:17

2020? Yeah. That’s a really good movie. That’s a really good sci fi movie. Okay. Cool. Yeah. It’s really creepy. Really creepy.

Speaker: 1
01:18:23

That’s awesome.

Speaker: 2
01:18:24

Yeah. And it’s all in Russian, you know.

Speaker: 3
01:18:27

Black Mirror?

Speaker: 2
01:18:28

Yeah. Oh, Black Mirror, of course. Yeah. Black Mirror is awesome sci fi. But Sputnik is one of the best alien movies I’ve seen in a long time.

Speaker: 1
01:18:37

Like, a recent one I liked was, I mean, not too recent, maybe ten years ago, but Arrival

Speaker: 2
01:18:42

was Oh, yeah. Arrival was great too.

Speaker: 0
01:18:44

I think it’s based

Speaker: 1
01:18:45

on this author that has a bunch of short stories that are really good too. What’s his name? Yeah. Yeah. They’re, you know, few and far between. Yeah. Ted Chiang, he’s really good.

Speaker: 2
01:19:01

I mean, all these alien movies, it’s so fascinating. You try to, like, imagine what they would communicate like, how they would be, what we would experience if we did encounter some sort of incredibly sophisticated alien experience, alien intelligence. It’s far beyond our comprehension. Yeah.

Speaker: 1
01:19:24

It goes back to what we’re talking about with consciousness. Like, you know, maybe, really, the physical world that we see is very different than the actual real physical world, you know. And maybe, like, a different alien consciousness will have an entirely different experience of the physical world.

Speaker: 2
01:19:42

Well, sure. If they have different senses. Right? Like, their perceptions of it. Like, we can only see a narrow band of things. You know, we can’t see

Speaker: 1
01:19:51

Sort of like the dog, you know, hearing a certain

Speaker: 2
01:19:53

Sure. Yeah.

Speaker: 1
01:19:53

Yeah. We’re

Speaker: 2
01:19:54

we’re, you know, we’re kind of primitive.

Speaker: 0
01:19:57

Mhmm.

Speaker: 2
01:19:57

Well, you know, in terms of, like, what we are as a species. Our senses have been adapted to the wild world in order for us to be able to survive Yeah. And to be able to evade predators and find food, like, that’s it. That’s what we’re here for. And then, all of a sudden, we have computers.

Speaker: 0
01:20:16

Right. All of a

Speaker: 2
01:20:16

sudden, we have rocket ships. All of a sudden, we have telescopes like the James Webb that’s, you know, kind of recalibrating the age of the universe. We’re going, why do these galaxies exist that are supposedly so far away? How could they form this quickly?

Speaker: 2
01:20:37

Do we have an incomplete version of the Big Bang? Right. Is, you know and Penrose believes that it’s a series of events and that the Big Bang is not the birth of the universe at all.

Speaker: 1
01:20:47

And this is kind of the thing that I think is sort of the Silicon Valley AGI cult, is, like, there’s a lot of hubris there that we know everything.

Speaker: 2
01:20:56

Of course.

Speaker: 1
01:20:57

We’re at the end of the world. We, you know Yeah. AI is just gonna it’s the end of knowledge. It’s gonna be able to, like, do everything for us, and I just feel it’s, like, so early.

Speaker: 2
01:21:05

I think whatever people think is going to happen is always gonna be wrong. Yeah? Yeah. I think they’re always wrong. Yeah. Because there’s no way to be right.

Speaker: 1
01:21:13

I feel like the world is often surprising in ways that we don’t expect. I mean, obviously, that’s the definition of surprise. But, like, you know, the mid-century, you know, sci-fi authors and people who were, like, thinking about the future, like, they didn’t anticipate how interconnected we’re gonna be

Speaker: 0
01:21:30

Right.

Speaker: 1
01:21:30

With our phones and how Even Star Trek,

Speaker: 2
01:21:32

they thought we were gonna have walkie-talkies on Star Trek.

Speaker: 1
01:21:35

Yeah.

Speaker: 2
01:21:35

Kirk out. Yeah.

Speaker: 1
01:21:36

They were just, like, focused more on the physical reality of being able to go to space and

Speaker: 0
01:21:44

Mhmm. And

Speaker: 1
01:21:44

and flying cars and

Speaker: 0
01:21:46

Right.

Speaker: 1
01:21:46

And things like that. But they really didn’t anticipate how profound the impact of computers was gonna be on humans, on society, how we talk and how we work and how we interact with other people, both good and bad. And I feel like the same thing with AI.

Speaker: 1
01:22:03

Like, I feel like, I think a lot of the predictions that are happening today, like, the CEO of Anthropic, a company that I really like, said that we’re gonna have 20% unemployment in a few years.

Speaker: 2
01:22:15

What’s unemployment at now?

Speaker: 1
01:22:17

Like, 3%. Something like 3%.

Speaker: 2
01:22:20

Reported unemployment, though? The

Speaker: 1
01:22:21

Oh, yeah. The participation rate. Right? Yeah. Yeah. But, well, he talks about the unemployment rate being 20%. Like, people looking for a job and not being able to find it.

Speaker: 2
01:22:30

20.

Speaker: 1
01:22:31

20%.

Speaker: 2
01:22:32

That’s pretty high.

Speaker: 1
01:22:33

That’s revolution-level high. Yeah. Yeah. Especially in the United States, where everyone’s armed.

Speaker: 2
01:22:38

Well, that’s the fear. I mean, this is the thing, the psychological aspect of universal basic income, you know. The way I look at universal basic income well, first of all, my view on social safety nets is that if you wanna have a compassionate society, you have to be able to take care of people that are unfortunate.

Speaker: 2
01:23:00

Yep. And everybody doesn’t have the same lot in life. You’re not Of course. Dealt the same hand of cards. Some people are very unfortunate, and financial assistance to those people is imperative.

Speaker: 0
01:23:11

Mhmm.

Speaker: 2
01:23:11

It’s one of the most important things about a society. You don’t have people starving to death. You don’t have people so poor that they can’t afford housing. That’s crazy. That’s crazy with the money we spend on other things.

Speaker: 1
01:23:21

It’s also in our self-interest, like, you know, I don’t wanna I don’t know how Austin is right now, but I was thinking of moving here during the pandemic, and I was like, well, this is like San Francisco. Like, there’s homeless everywhere and

Speaker: 2
01:23:32

They’ve cleaned a lot of that up. There are still problems. Yeah. There are places I saw a video yesterday where someone was driving by some insane encampment, but they cleaned those up.

Speaker: 1
01:23:41

Yeah.

Speaker: 2
01:23:41

And then there are some real good outreach organizations that are helping people, because Austin’s small. Yeah. You know, I had Steve Adler on, at one point in time he was the mayor, he was the mayor when I had him on, and he was very upfront about it. He was like, we can fix Austin in terms of our homeless problem because it’s small. But when it gets to the size of, like, Los Angeles California. Yeah.

Speaker: 1
01:24:06

It’s like the homeless industrial complex.

Speaker: 0
01:24:09

That’s it. That’s the problem. When you

Speaker: 1
01:24:11

find out, like, the people that are

Speaker: 2
01:24:12

making insane amounts of money Yeah. To work on homeless issues that never get fixed. Yeah.

Speaker: 1
01:24:17

You see the budget and stuff is just, like, exponentially going up. Like, yeah.

Speaker: 2
01:24:21

And there’s an investigation now into the billions of dollars that’s unaccounted for that was supposed to be allocated to In

Speaker: 1
01:24:26

San Francisco?

Speaker: 2
01:24:27

No. In California in general. Oh. Yeah. What is that, there’s I think there’s a congressional investigation. There’s some sort of an investigation into it because there’s billions of dollars that

Speaker: 1
01:24:37

Like, I am more than happy, like, I pay 50% taxes. I’d be happy to pay more if my fellow Americans are taken care of. Right? Absolutely. I feel the

Speaker: 2
01:24:46

exact same way.

Speaker: 1
01:24:47

But instead, I feel like I cut, like, check after check to the government, and I don’t see anything improving around me.

Speaker: 2
01:24:53

Well, not only that. Yeah. Because you’re a successful person, you get pointed at, like, you’re the problem. Right. You need to pay your fair share.

Speaker: 1
01:25:01

Right.

Speaker: 2
01:25:02

But what they don’t this is my problem with progressives. They say that all the time. These billionaires need to pay their fair share. Absolutely. We all need to pay our fair share. But to whom? And shouldn’t there be some accountability for how that money gets spent? And when you’re just willing to turn a complete blind eye and not look at all at corruption and completely dismiss all the stuff that Mike Benz has talked about with USAID, all the stuff that Elon and DOGE uncovered, everyone wants to pretend that that’s not real.

Speaker: 2
01:25:34

Like, look, we’ve gotta be centrists.

Speaker: 1
01:25:37

Right.

Speaker: 2
01:25:37

We’ve gotta stop looking at this thing so ideologically. When you see something that’s totally wrong Yeah. You gotta be able to call it out even if it’s for the bad of whatever fucking team that you claim

Speaker: 1
01:25:48

to be on. Yeah. Let’s get back to what everyone really agrees on in, like, the foundations of America, whether it’s the Constitution or the culture. I think everyone believes in transparency transparency of government. Right? Yes. And, you know, here, everything is transparent, like, you know, court cases and everything, right, like, more than any other place in the world.

Speaker: 1
01:26:09

And so why shouldn’t government spending be transparent? And we have the technology for it. I think one of the best things that DOGE could have done and maybe still could do is have some kind of ledger for all the spend, at least the nonsensitive sort of spend, in government.

Speaker: 2
01:26:26

Yeah. Well, people don’t wanna see it, unfortunately, because they don’t want Elon to be correct, because Elon has become this very polarizing political figure because of his connection to Donald Trump and because a lot of I mean, there’s a lot of crazy conspiracies that Elon rigged the 2024 elections.

Speaker: 2
01:26:44

It’s, like, you know, everyone goes crazy. And then there’s also the discourse on social media, which at least half of it is fake. Mhmm. Half of it is bots.

Speaker: 1
01:26:53

Bots. Yep.

Speaker: 2
01:26:53

Half of it. 100%. And you see it every day. You see it constantly, and you know it’s real, and it does shape the way people think about things.

Speaker: 0
01:27:01

Yeah.

Speaker: 2
01:27:02

When you see people getting attacked, you know, when you’re getting attacked in the mentions, and then I see people getting attacked, and I always click on those little comments. I always click on, okay, let me see your profile. I go to the profile, and the profile is, like, a name with an extra letter and a bunch of numbers.

Speaker: 2
01:27:18

And then I go to it, I’m like, oh, you’re a bot? Oh, look at all this fucking activity.

Speaker: 1
01:27:22

A 100%.

Speaker: 2
01:27:23

How many of these are out there? Well, this former FBI guy who analyzed Twitter before the purchase estimated it to be 80%.

Speaker: 1
01:27:33

80%.

Speaker: 2
01:27:33

He thinks 80% of Twitter is bots.

Speaker: 1
01:27:35

Yeah. I wouldn’t, you know, I think it’s believable. And I think it’s probably the beginning of the end of social media as we know it today. Like, I don’t see it getting better. I think it’s gonna get worse. I think, you know, historically, state actors were the only entities that were able to flood social media with bots that can be somewhat believable to, like, change opinions.

Speaker: 1
01:28:00

But I think, like, now, like, a hacker kid in his parents’ basement will be able to, like, you know, spend $100 and spin up hundreds, perhaps thousands of bots.

Speaker: 2
01:28:11

But there are programs that you can use now. Yeah. There are companies that will have campaigns Right. Initiated on your account.

Speaker: 1
01:28:19

You can go to a website and, like

Speaker: 0
01:28:20

Yeah.

Speaker: 1
01:28:20

Put in this thing and, like, pay with your credit card, and

Speaker: 0
01:28:23

it’s fucking crazy.

Speaker: 2
01:28:24

It’s crazy. It should be illegal.

Speaker: 1
01:28:26

I don’t know about you, but, like, in Silicon Valley, that trend and maybe it’s true of your friend group, but, like, the trend is these group messages. And instead of, like, you go to Twitter, you know, people paste links. It’s almost like your group chat is, like, this private filter on your feed and social media. So, like, there’s some curation that’s happening there.

Speaker: 2
01:28:49

Yes. That’s primarily how I get social media information now. I don’t go to social media anymore. I get it sent to me, which is way better. And I tell my friends, like, please just send me a screenshot. Yeah. I don’t wanna click. I don’t

Speaker: 1
01:29:02

wanna go. I don’t wanna Get distracted. Yeah.

Speaker: 2
01:29:04

I’m just I’m better off I hate the term spiritually for this, but I think it’s the right word. Like, my essence as a human, I feel better when I’m not on social media. Yeah. I think it’s bad for you. Yeah. I’ve been trying to tell people this. I’ve been trying to tell my friends this. I think it’s better to not be on it, man. Mhmm.

Speaker: 2
01:29:25

I feel better. Right. I’m nicer. I am more, I’m more at peace.

Speaker: 1
01:29:31

More multidimensional.

Speaker: 2
01:29:32

Yes. And I can think about things for myself instead of, like, you know, following this hive, this weird hive mindset, which is orchestrated.

Speaker: 1
01:29:42

Right.

Speaker: 2
01:29:43

I just don’t think it’s good for you. I don’t think it’s a good way for human beings to interact with each other.

Speaker: 1
01:29:48

More extreme. Again, it just hardens people. They start believing everything is fake or an attack, or they just become more tribal. I think there needs to be a fundamental evolution.

Speaker: 2
01:30:02

What do you think that could be? Have you ever tried to, like, think of what’s the next like, social media didn’t exist when I was young, and it didn’t exist even when I was 30. Right? It didn’t even come about until essentially, like, Facebook. Right. Right? Is that when people started using stuff?

Speaker: 1
01:30:23

Yeah. Twitter 2006, 2007, Facebook before that. But Facebook wasn’t really social media. Facebook was like an address book, a friends network. But I think, when I was at Facebook, there was this big push to become more of a social media around 2012, 2013.

Speaker: 1
01:30:37

So I would say it really ramped up.

Speaker: 2
01:30:38

In response to the success of Twitter? Yeah. And then they’ve tried with Threads, which is pretty much a failure. Right?

Speaker: 1
01:30:45

Yeah. But it fundamentally changed

Speaker: 2
01:30:46

Who’s on Threads? Fewer people than Blue Sky. Right?

Speaker: 1
01:30:49

Yeah. I think, like, some fitness influencers probably.

Speaker: 2
01:30:52

Why fitness

Speaker: 1
01:30:53

Because they post on Instagram, they cross-post on it as a

Speaker: 2
01:30:57

well, I think if you post on Instagram, it automatically posts for you on Threads. I think I have it set up like that.

Speaker: 0
01:31:04

Oh, okay.

Speaker: 2
01:31:04

So I might be big on Threads. I don’t even know it.

Speaker: 1
01:31:06

I maybe I think it’s fitness influencers because that’s who I follow. Like, Instagram for me is just to, like, go look at people lift so I can go

Speaker: 0
01:31:13

get excited. Inspired. Yeah.

Speaker: 2
01:31:15

There’s value to that. Right. There’s a value to, like, David Goggins posting when he’s running in the fucking desert. He looks at you, stay hard. Yeah. Yeah. Okay, David. I’m gonna stay hard.

Speaker: 1
01:31:25

But my TikTok is basically AI videos now. Have you watched these Veo videos? Veo? Veo. Yeah.

Speaker: 2
01:31:32

What is Veo?

Speaker: 1
01:31:33

So, Jamie, I’m sure you’ve seen them, but do you see the Bigfoot, Yeti ASMR ones? That’s so hilarious.

Speaker: 2
01:31:44

Yes. I did see that.

Speaker: 0
01:31:45

I would say I

Speaker: 1
01:31:46

would say, like, 25% of media consumption right now is just AI videos.

Speaker: 2
01:31:50

Oh, 100%.

Speaker: 0
01:31:51

And a

Speaker: 2
01:31:51

lot of the stuff from the war. What’s been really interesting is watching Tehran talk shit on Twitter.

Speaker: 1
01:31:58

Using AI videos?

Speaker: 2
01:31:59

Using AI videos. Like, this is bizarre. They’re talking, like, shit to Israel, and they show, like, a nuclear bomb going off.

Speaker: 1
01:32:06

Yeah. Yeah.

Speaker: 2
01:32:07

This is weird. Like, you have a fake nuke, and they didn’t even take out the, like, the watermark of the company.

Speaker: 1
01:32:15

No. Oh, god.

Speaker: 2
01:32:16

So you can see that it’s an AI-generated video. They’re just trying to, like, scare people with the story.

Speaker: 1
01:32:20

Bizarre world. Can you imagine, like, going back in time telling your, like, 2005 self that Iran’s gonna be nuclear posting on Twitter?

Speaker: 2
01:32:30

Nuclear shitposting.

Speaker: 0
01:32:31

Nuclear shit on Twitter.

Speaker: 2
01:32:34

No. It’s fucking weird, man. It’s really, really weird.

Speaker: 1
01:32:38

Dangerous too, you know.

Speaker: 2
01:32:39

And again, I just don’t think people should be on it. And this is again, I’m friends with Elon. I don’t wanna just I don’t think people are gonna listen to me. They’re gonna be on it no matter what. Of course.

Speaker: 1
01:32:48

Yeah.

Speaker: 2
01:32:48

But I’m, like, for the individuals that are hearing my voice and know that it’s having a negative effect on your life. Get off of it. Right. Get off of it. You’ll feel better.

Speaker: 1
01:32:56

Get off of it or be incredibly diligent in how you curate.

Speaker: 2
01:33:01

That’s like telling me to play Quake a little bit. You know what I mean? It’s so addictive.

Speaker: 1
01:33:06

So, you know, you asked me what the evolution could be, Yes. One way I found to try to predict where the future is headed is, like, look at trends today and try to extrapolate. You know, that’s the easiest way. So if group chats are the thing, you could imagine a collaborative curation of social media feeds through group chats.

Speaker: 1
01:33:27

So your group chat, you know, has an AI that gets trained on the preferences and what you guys talk about. And maybe it, like, picks the kind of topics and curates the feed for you. So it’s an algorithmic feed that evolved based on the preferences of people in the group chat.

Speaker: 1
01:33:47

And maybe there’s a way to also prompt it, you know, using prompts to kind of steer it and make it more useful for you. But I think group chats are gonna be, like, the main interface for how people sort of consume media, and it’s gonna get filtered through that, whether good or bad.
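A minimal sketch of the group-chat-curated feed speculated about here. The chat messages, candidate posts, and word-overlap scoring are invented stand-ins; a real version would more likely use embeddings or an LLM to model the group’s preferences, but the shape of the idea is the same: the chat’s own conversation becomes the ranking signal.

```python
# Rank a candidate feed by how well each post matches what a group chat
# actually talks about. Purely illustrative data and scoring.
from collections import Counter
import re

def profile_from_chat(messages: list[str]) -> Counter:
    """Build a crude interest profile: counts of topic words used in the chat."""
    words = re.findall(r"[a-z]+", " ".join(messages).lower())
    return Counter(w for w in words if len(w) > 3)

def rank_feed(posts: list[str], profile: Counter) -> list[str]:
    """Order candidate posts by overlap with the group's interest profile."""
    def score(post: str) -> int:
        return sum(profile[w] for w in re.findall(r"[a-z]+", post.lower()))
    return sorted(posts, key=score, reverse=True)

chat = [
    "did you see the new open source model benchmarks?",
    "that ibogaine bill in Texas is interesting",
    "benchmarks again, and someone post the lifting program",
]
feed = [
    "Celebrity gossip roundup for the week",
    "New open source model tops coding benchmarks",
    "Texas expands funding for ibogaine research",
]
for post in rank_feed(feed, profile_from_chat(chat)):
    print(post)
```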

Speaker: 1
01:34:07

Because I think Twitter still has a place for debate. I think it’s very, very important for public debate between public figures.

Speaker: 2
01:34:16

And breaking news as well.

Speaker: 1
01:34:17

Breaking news. Yeah. Yeah. Definitely. But

Speaker: 2
01:34:19

Well, breaking news is the most it’s interesting, like, I was telling my wife that Israel had started attacking Iran.

Speaker: 0
01:34:26

Mhmm. And

Speaker: 2
01:34:27

she’s like, well, I looked on Google. I didn’t find anything.

Speaker: 1
01:34:29

Yeah.

Speaker: 2
01:34:29

I was like, yeah, you gotta go to Twitter. Yeah. And I showed her the video of it, and she’s like, oh my god. I was like, yeah. Like, this is where breaking news happens. X is where I go immediately. Immediately. If there’s any sort of world event, I immediately go to X. Right.

Speaker: 2
01:34:44

I don’t trust any mainstream media anymore.

Speaker: 1
01:34:49

Right.

Speaker: 2
01:34:49

I just especially after I was attacked, I was like, I know you lie because you lied about me. So I have personal experience with your lies.

Speaker: 1
01:34:58

Right. Right.

Speaker: 2
01:34:58

So you’ve lost me. Yeah. You know? And now I have to go somewhere else.

Speaker: 1
01:35:03

Right. Yeah. Yeah. I think there’s, you know, some of this investigative journalism that is not real time that some reporters are still good at, but a lot of them moved to Substack as well.

Speaker: 2
01:35:17

Yes. I think most of them have been Yeah.

Speaker: 1
01:35:19

Like, Shellenberger. Shellenberger. Yeah. Yeah.

Speaker: 2
01:35:22

Greenwald.

Speaker: 1
01:35:23

Right.

Speaker: 2
01:35:23

Matt Taibbi.

Speaker: 1
01:35:24

Right.

Speaker: 2
01:35:24

These are just too ethical to work for a corporate entity that's going to lie and push a narrative. Right. And that's the business. That's the business model. And it's also, like, the clickbait business model. I've talked to people that showed me articles that they wrote, and then an editor came and changed the heading of it.

Speaker: 1
01:35:43

That’s the norm. That’s, like, every time it happens,

Speaker: 2
01:35:45

they go down. It fucking infuriates them. It’s, like, that’s not the article, man. This is not what I’m saying. Yeah. You’re distorting things. Yeah. And you have my name still attached to it. This is fucking crazy.

Speaker: 1
01:35:56

I watched these entrepreneurs, like Zuck and Elon and all these guys, come up in this very hostile media environment. And so as I'm building my company, I basically never hired a PR agency. I hired a PR agency once, paid them $30,000. They got me a placement in, like, a really crappy publication, got, like, maybe two views. I tweeted the same news. I got, like, hundreds of thousands of views.

Speaker: 1
01:36:28

I'm like, fuck that. Like, I'm not gonna use you anymore. It's like you wasted my time. And since then, I've been, you know, just going direct to my audience and just building an audience online to put out my message. And I thought, you know, if they don't build you up, maybe they can't tear you down.

Speaker: 2
01:36:49

Right. Right. Right.

Speaker: 1
01:36:50

Right? You’re you’re in control of the message that that gets out of there. And I’ve I’ve learned how people react to communications and and and, it’s it’s almost like trial by by fire.

Speaker: 2
01:37:03

Well, there’s a deep hunger for authenticity right now. Yeah. So if they know it’s coming from you

Speaker: 1
01:37:08

Yeah.

Speaker: 2
01:37:08

Like, okay, this is great. Yeah. Like, it takes a little weight off of them, like, oh, this is nice. It's nice to hear it from the guy who actually runs the company.

Speaker: 1
01:37:15

And, like, I make mistakes, and, you know, they happen, and I try to correct them, and I'm not gonna be perfect. And I think the corporate world just changed because of this hunger for authenticity. And I think more and more founders and entrepreneurs are finding that that's the way to go.

Speaker: 1
01:37:35

That's, you know, you don't really need those more traditional ways of getting the news out. But I actually, I'm friends with a lot of reporters that are really good, but they tend to be the reporters that do really deep work. I've met them over time, and I still go to them, and sometimes they write about our company. But they're a minority. Yeah.

Speaker: 1
01:37:56

I think the whole industry’s, economics and incentives are just, like, the clickbait and all of that stuff.

Speaker: 2
01:38:02

Yeah. That's what I was gonna say. They're not incentivized. If you want a career in journalism, being authentic is not the way to go. No.

Speaker: 1
01:38:10

Not at all.

Speaker: 2
01:38:10

It’s just so crazy.

Speaker: 1
01:38:11

Right.

Speaker: 2
01:38:12

So it's a crazy thing to say. Yeah. But then I think there's probably a naivete that we all have about past journalism, that we think wasn't influenced and was real. I think there's probably always been horseshit in journalism.

Speaker: 0
01:38:26

Mhmm.

Speaker: 2
01:38:27

You know, all the way back to Watergate, you know. And Tucker Carlson enlightened me on the true history of Watergate, and that Bob Woodward was an intelligence agent, and that was the first assignment he ever got as a reporter Oh, wow. was Watergate. Like, what are the odds Yeah. that the biggest story ever you would give to a rookie reporter? You wouldn't. Yeah.

Speaker: 2
01:38:46

And the people that were actually involved in all that were all intelligence. Like, the whole thing is nuts. He was an intelligence agent.

Speaker: 1
01:38:52

Yeah. So what about the rumors that the Washington Post has always been like that? Probably. Yeah.

Speaker: 2
01:38:57

Probably. I mean, who knows now? Because, you know, now it’s owned by Bezos, and he just recently, made this mandate to stick with the the actual story and not editorialism. Yeah.

Speaker: 1
01:39:10

And this is the trend I was talking about in Silicon Valley of, like, you know, the founder-owners stepping in and actually Yeah. becoming managers.

Speaker: 2
01:39:18

Well, they kind of have to, otherwise it's bad for the business now, because of the hunger for authenticity, the more bullshit you have, the more your business crumbles. Right. It's actually, like, negative for your outcome.

Speaker: 1
01:39:30

Yeah. And and I think you you can look at it at a societal level, which, again, why I’m interested with this idea of, like, AI making more people entrepreneurs and more independent, is that, you know, macro level, you know, you’ll you’ll get more authenticity. You you’ll get, just more dynamism.

Speaker: 2
01:39:50

Yeah. I think so. I mean, that’s the again, the rose colored glasses view.

Speaker: 1
01:39:55

Well, you know, there's obviously gonna be a lot of things that are

Speaker: 2
01:40:01

lot of disruption.

Speaker: 1
01:40:02

A lot of disruption. There's gonna be jobs that, you know, are gonna go away, and there's gonna be spam and bots and fraud and all of that. There's gonna be, like, problems with autonomous weapons and all that. And I think those are all important, and we need to handle them.

Speaker: 1
01:40:24

But also, like, I think the negative angle of technology and AI gets a lot more views and clicks. And, you know, if we wanna go viral right now, I'll tell you, these are the 10 jobs that you're gonna lose tomorrow. And,

Speaker: 0
01:40:43

you know, that’s

Speaker: 1
01:40:43

the easiest way to kind of go viral on the Internet. But, like, you know, trying to think through, you know, what are the actual implications, and what is true about human nature that really doesn't change and really endures. And I think that people want to create, and people want to make things, and people have ideas, you know.

Speaker: 1
01:41:08

Again, everyone that I talk to has one idea or another, whether it's for their job, or for a business they wanna build, or somewhere in the middle. Like, just yesterday, I was watching a video of an entrepreneur using our platform, Replit. His name is Ahmad George, and he works for this care company.

Speaker: 1
01:41:29

And he's an operations manager. And a big part of his job is, like, managing inventory and doing all of this stuff in a very manual way and a very, you know, tedious way. And he always had this idea of, like, let's automate a big part of it. It's, you know, basically an ERP problem.

Speaker: 1
01:41:51

So they went to their software provider, NetSuite, and told them, we need these modifications to the ERP system so that it makes our job easier. We think we can automate, you know, hundreds of hours a month or something like that. And they quoted them $150,000. And he had just seen a video about our platform, and he went on Replit and built something in a couple weeks, cost him about $400, and then deployed it in his office.

Speaker: 1
01:42:23

Everyone in the office started working it, using it. They all got more productive. They started saving time and money. He went to the CEO and and and and showed him the impact. Look at, how much money we’re saving.

Speaker: 1
01:42:37

Look at the fact that we built this piece of software that is cheaper than what the consultants quoted us. And I wanna sell the software to the company. And so he sold it for $32,000 to the company. And next year, he's gonna be getting maintenance subscription revenue from it.

Speaker: 0
01:43:01

Mhmm.

Speaker: 1
01:43:01

So this idea of people becoming entrepreneurs, it doesn't mean, like, everyone has to quit their job and, like, you know, build a business. But within your job, everyone has an opportunity to get promoted. Everyone has an opportunity to remove the tedious work. There was a Stanford study just recently asking people what percentage of your job is automatable, and people said about half.

Speaker: 1
01:43:24

Like, 50% of what I do is, like, routine and tedious, and I don't wanna do it. And they'd rather, and they have ideas on, how to make the business better, how to make my job better, and I think we can use AI to do it. And there's hunger in the workforce to use AI for people to reclaim their place as the creative ones, because the

Speaker: 0
01:43:51

the thing that happened with

Speaker: 1
01:43:52

the emergence of computers is that, in many ways, people, like, became a little more drone-like and NPC-like. They're doing the same thing every day. But I think the real promise of AI and technology has always been automation, so that we have more time, either for leisure or for creativity or for ways in which we can advance our lives, change our lives or our careers.

Speaker: 1
01:44:16

And, yeah, this is what gets me excited. And so I think, I don't think it's, you know, predominantly a rose-colored glasses thing, because I'm seeing it every day. And that's what gets me fired up.

Speaker: 2
01:44:31

It's also, you have a biased sample group. Right? Because you have a bunch of people that are using your platform, and they are achieving positive results.

Speaker: 1
01:44:39

So But they’re from every walk of life.

Speaker: 2
01:44:41

Yes. Look, we have a bunch of things that are happening at once. And I think one of the big fears about automation and AI in general is the abruptness of the change. Because it's going to happen, boom, jobs are gonna be gone.

Speaker: 0
01:44:53

Mhmm.

Speaker: 2
01:44:53

And then, well, these tedious jobs, do we really want people to be reduced to these tedious existences of just filing paperwork and putting things on shelves and

Speaker: 1
01:45:06

that’s And they will tell you they don’t wanna be doing it.

Speaker: 2
01:45:08

They don’t wanna be doing that. But then there’s the thing of how do we educate people, especially people that are already set in their ways and they’re mature adults. How do you get and inspire these people to, like, okay, look, your job is gone, and now you have this opportunity to do something different. Go forth.

Speaker: 1
01:45:30

I think, you know, reskilling is something that has been done in the past with some amount of success. Obviously, if you've never been exposed to technology, you know, do you remember that? I think it was a very cruel thing to say to people, to go learn to code.

Speaker: 2
01:45:48

Yeah. Learn to code was Yeah.

Speaker: 1
01:45:49

I think that's really cruel. But if you're someone whose job is sort of a desk job, you already are on the computer, there's a lot of opportunity for you to reskill and start using AI to, like, automate a big part of your job. And, yes, there's gonna be job loss, but I think a lot of those people will be able to reskill.

Speaker: 1
01:46:07

And what we’re doing with the government of Saudi Arabia, I would love to do in The US.

Speaker: 2
01:46:11

So how is the government of Saudi Arabia using it?

Speaker: 1
01:46:15

So we're just starting right now. What's their goal? Their goal is twofold, or threefold. One is an entire generation of people growing up with these creative tools, instead of just, you know, textbook learning, instead learning by doing, making things.

Speaker: 1
01:46:38

So a generation understanding how to make things with AI, how to code, and all of that stuff. Second is upgrading sort of core government operations. So you could think of it sort of like DOGE, but, like, more technological. Like, can we automate big parts of what we do in HR, finance, and things like that?

Speaker: 1
01:46:57

And I think it’s possible to build these specific AI agents that do part of finance job or accounting job. Again, all these routine things that people are doing, you can go and automate that and make government as a whole more efficient. And third is entrepreneurship. Is if if you gave that power to more people to, to be able to kind of build businesses, then not only they’re growing up with it, but but also there’s a culture of entrepreneurship.

Speaker: 1
01:47:25

And that exists already in Saudi Arabia. Which, I mean, the sad thing about The Middle East is there's so much potential, but there's so many wars and so much disaster. But, and Well, there's so much money. There's also so much money. Yeah. Which is good. And I think it's good for The United States.

Speaker: 1
01:47:40

Like, I think, you know, what President Trump did with the deals in the Gulf Region is great. It's gonna be great for The United States. It's gonna be great for the Gulf Region. But I think we need more of that, you know, we talked about government.

Speaker: 1
01:47:57

We need more of that enlightened view of education, of, you know, change in our government today. You know, this idea that we're gonna bring back the old manufacturing jobs, I mean, I understand, like, Americans got really screwed with what happened. Like, you know, these jobs got sent away by globalism, whatever you wanna call it, and a small number of people got massively rich.

Speaker: 1
01:48:23

A lot of people got disenfranchised, and we had the opioid epidemic, and it did just massive damage to the culture. But is there a way to bring back those jobs, or is there a new way of the future? There's probably a new manufacturing wave that's gonna happen with robotics.

Speaker: 2
01:48:48

Right.

Speaker: 1
01:48:48

You know? There’s, there’s, you know, the human humanoid robots are starting to work. And and these, I think, will need a new way of manufacturing it. And so The US can be at the forefront of that, can own that, bring new jobs into existence. And all of these things need need software.

Speaker: 1
01:49:07

Like, our world is gonna be primarily run by AI and robots and all of that. And more and more people need to be able to make software, even if it is prompting and not really, you know, but a lot more people just need to be able to make it. There's gonna be a need for more products and services and all of that stuff.

Speaker: 1
01:49:22

And I think there's enough jobs to go around if we have this mindset of, hey, let's actually think about the future of the economy, as opposed to let's bring back certain manufacturing jobs, which I don't think Americans would wanna do anyways.

Speaker: 2
01:49:38

Right. They don’t wanna do the jobs. My my problem is there’s some people that are doing those jobs right now, and it’s their entire identity. You know, they have a good job. They work for a good company. They they make a good living, and that might go away. And they’re just not psychologically equipped to completely change their life.

Speaker: 0
01:49:57

What

Speaker: 1
01:49:57

what do you think is the solution there? Which I agree, it’s a it’s a it’s a real problem. Well,

Speaker: 2
01:50:04

desperation, unfortunately, is going to motivate people to to make changes. And, it’s going to also motivate some people to choose drugs. That that’s my fear. My fear is that you’re gonna get a lot more people. There’s gonna be a lot of people that they figure it out, and they survive.

Speaker: 2
01:50:23

I mean, this is natural selection, unfortunately, like, adapting to a digital world. Mhmm. There's going to be people that just aren't psychologically equipped to recalibrate their lives, and that's my real fear.

Speaker: 0
01:50:37

Mhmm.

Speaker: 2
01:50:37

My real fear is that there’s a bunch of really good people out there that are, you know, valuable parts of a certain business right now that that their identity is attached to being employee of the month. They’re they’re good people. They show up every day. Everybody loves them and trusts them. They do good work and everybody rewards them for that.

Speaker: 2
01:50:55

And that’s part of who they are as a person. They’re a hardworking person

Speaker: 1
01:50:58

Of course.

Speaker: 2
01:50:58

And they feel that way. And there's, like, a lot of real good people out there that are, you know, blue-collar, hardworking people, and they take pride in that, and that job's gonna go away.

Speaker: 1
01:51:11

Well, I actually think that more white-collar jobs are going away.

Speaker: 2
01:51:15

I think so too.

Speaker: 1
01:51:16

Yeah. So then blue collar, which is, like, go back ten years ago, and we thought, okay, self-driving cars, you know, robots in manufacturing. And that turned out to be a lot harder than, actually, like, the more desk jobs, because we have a lot more data.

Speaker: 1
01:51:39

For one, we have a lot more data on people sitting in front of a computer

Speaker: 0
01:51:43

Mhmm.

Speaker: 1
01:51:44

And and doing Excel and writing things on the Internet.

Speaker: 2
01:51:48

And

Speaker: 1
01:51:48

Yes. And so we're able to train these, what we call, large language models, and those are really good at, like, using a computer like a human uses a computer.

Speaker: 2
01:51:57

Right.

Speaker: 1
01:51:58

And so I think the jobs to be worried about, especially in the next months to a year, a little more, is the routine computer jobs where it’s formulaic. You you know, you go, you have a task, like quality assurance jobs. Right?

Speaker: 0
01:52:14

Mhmm.

Speaker: 1
01:52:14

Software quality assurance. It’s like you get a you get, you have to constantly test the same feature of, like, you know, some large software company, Microsoft or whatever. You’re sitting there, and you’re, you’re you’re performing the same thing again and again and again every day.

Speaker: 1
01:52:33

And if there's a bug, you kinda report it back to the software engineers. And that is, I think, really in the bull's eye of what AI is gonna be able to do over the next months.
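
The repetitive check he is describing is the kind of thing that is already expressible as plain automated test code; here is a generic illustration (not any particular company's QA suite) of the "run the same checks every day, report the failing case" loop.

```python
# Generic regression test: re-run the same feature checks and surface failures.
import pytest

def apply_discount(price, percent):
    """Feature under test: the function a QA tester would re-check every day."""
    return round(price * (1 - percent / 100), 2)

@pytest.mark.parametrize("price,percent,expected", [
    (100.0, 10, 90.0),
    (59.99, 0, 59.99),
    (20.0, 100, 0.0),
])
def test_apply_discount(price, percent, expected):
    # A failing case here is the "bug report back to the engineers."
    assert apply_discount(price, percent) == expected
```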

Speaker: 2
01:52:47

And do it much more efficiently.

Speaker: 1
01:52:48

Much more efficiently, much faster.

Speaker: 2
01:52:51

Yeah. Yeah. Those people have to be really worried. And, you know, professional drivers, like people who drive trucks Right. things along those lines, that's going away.

Speaker: 1
01:53:01

That’s definitely going away.

Speaker: 2
01:53:03

Yeah. And that's an enormous part of our society. It's millions of jobs. Right. You know, I was watching a video on this coal mining operation in China that's completely automated, and it was wild to watch. Every step of the way is automated, including recharging the trucks. Like, the trucks, they're all electrical, everything's run on electricity, and they recharge themselves.

Speaker: 2
01:53:27

You know, they’re they’re pulling the coal out of the ground. They’re stacking it, inventory, everything. Storage, it’s all automated, and it runs twenty four seven. I’m like, this is wild. It’s crazy.

Speaker: 1
01:53:39

Yeah. Wild. I remember watching the video of BYD making an electric vehicle. It is really satisfying to watch. It's all, like, the entire assembly line.

Speaker: 0
01:53:50

Mhmm. It’s

Speaker: 1
01:53:51

automated the way they, like, you know, put the paint and the way they, like, do the entire thing is

Speaker: 2
01:53:56

By the way, China’s electric vehicles are so good. They’re so advanced.

Speaker: 1
01:54:03

Yeah.

Speaker: 2
01:54:03

There's this guy that I follow on Instagram. God, I can't remember his name. I really wish I could right now, but he reviews a lot of electric vehicles, like, I've never even heard of these companies, and they're incredible. They're so advanced.

Speaker: 0
01:54:18

Yeah.

Speaker: 2
01:54:19

And their suspension systems are so superior to the suspension systems of even, like, German luxury cars. Like, they did a demonstration where they drove one of these Chinese electric vehicles over, an obstacle course.

Speaker: 1
01:54:32

Mhmm.

Speaker: 0
01:54:33

And then

Speaker: 2
01:54:33

they had, like, a BMW and a Mercedes go over, and the BMW is bouncing and

Speaker: 0
01:54:36

bouncing and bouncing the whole way.

Speaker: 2
01:54:38

And the the Chinese one is fucking flat planing the entire way. Right. Every, bump in the road is being completely absorbed by the suspension.

Speaker: 1
01:54:46

Right. There’s all Ai. You know? There’s all yeah.

Speaker: 2
01:54:48

So much better than what we have. Right. Like, so what is this? That's him. Yep. That's him. Forrest Jones. Shout out to Forrest. He's great. He does, like, these really fast-paced videos, and he does a lot of cars that are available here in America as well, but he does a shit ton of them that aren't.

Speaker: 2
01:55:06

Which one is this one here? NIO. Yeah. Listen to him because he's pretty good at this shit.

Speaker: 3
01:55:11

Seven hundred ten horsepower. I get cameras here, lidar there for self-driving, and this has two NIO-made chips. And for reference, one of those chips is as powerful as four NVIDIA chips, and this has two. NIO also has battery swap stations.

Speaker: 0
01:55:23

So if

Speaker: 3
01:55:23

you’re in a rush, you can hit one up. It’ll lift your car, swap out your battery, put in a fully charged one in between three and five minutes. But here’s where the s class should be worried. Not only does it have rear steer and steer by wire, so it’s extremely easy to maneuver, it may have one of the most advanced hydraulic systems I’ve ever seen.

Speaker: 3
01:55:36

It can pretty much counteract any bump. Wow. And after you go over something four times, it'll memorize it so that the fifth time, it's like that bump never existed. Inside, you get pillows in your headrest, heated, ventilated, and massaging leather seats, a passenger screen built into my dash, a main screen that works super fast.

Speaker: 3
01:55:49

I get a driving display, a head up display, and my steering works super fast.

Speaker: 1
01:55:55

Pretty dope. Yeah. Well, essentially, what he said is that the car is learning the terrain. If if it went over it once, it’ll learn it. And

Speaker: 2
01:56:03

Yes.

Speaker: 1
01:56:04

I think this is the next sort of big thing with AI, whether it's robotics, cars, or even ChatGPT now, it has memory. It, like, learns about you and starts to, sort of similar to how social media feeds, though I think in a lot of ways more negative, learn about you.

Speaker: 1
01:56:24

I think these systems will start to have, like, you know, more online learning, instead of just training them in these large data centers on these large datasets and then giving you this thing that doesn't know anything about you. It's totally stateless. As you use these devices, they will, like, learn your patterns, your behavior, and all of that.
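
A toy illustration of that "online learning" idea (my own sketch, not Replit's or any carmaker's implementation): instead of being stateless, the system keeps a small per-user memory and nudges it a little on every interaction, the way the suspension remembers a bump after a few passes.

```python
# Minimal per-user memory updated online with an exponential moving average.
class OnlineMemory:
    def __init__(self, learning_rate=0.3):
        self.lr = learning_rate
        self.prefs = {}  # feature -> running score

    def update(self, feature, signal):
        """Blend the new observation into what we already believe (EMA)."""
        old = self.prefs.get(feature, 0.0)
        self.prefs[feature] = (1 - self.lr) * old + self.lr * signal

    def preference(self, feature):
        return self.prefs.get(feature, 0.0)

mem = OnlineMemory()
for _ in range(4):                         # four passes over the same "bump"
    mem.update("pothole_at_mile_2", 1.0)
print(round(mem.preference("pothole_at_mile_2"), 2))   # approaches 1.0
```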

Speaker: 2
01:56:45

Yeah. Why is China so much better at making these cars than us? I think because they’re really advanced.

Speaker: 1
01:56:53

Yeah.

Speaker: 0
01:56:57

Well, I think, you

Speaker: 1
01:56:57

know, I'm not an expert in China, but a lot of people think that the thing that makes China better at manufacturing is the sort of, quote, unquote, you know, treating workers like slaves. That's, you know, slave work or whatever, which I'm sure some of that happens.

Speaker: 1
01:57:23

But Tim Cook recently said, maybe not so recent, but he thinks, you know, part of the reason why they manufacture in China is there’s expertise there that evolved over time.

Speaker: 2
01:57:35

Yeah. That’s why they want to use the, Chinese manufacturing for the iPhone 17. Yeah.

Speaker: 1
01:57:41

And I think one of the things that are good about more technocratic systems, Singapore, obviously, China's the biggest one, is that the leadership, it comes at a cost of, you know, freedom and other things, but the leadership can have a fifty-year view of where things are headed. And they can say, like, yes, we're now making the, you know, the plastic crap, but we don't wanna keep making plastic crap.

Speaker: 1
01:58:16

We're gonna build the capabilities and the automation and manufacturing expertise to be able to, like, leapfrog the West in making these certain things, whereas it's, you know, been historically hard here, again, for good reasons. I think it's more freedom-preserving when you don't have that much power in government.

Speaker: 1
01:58:42

I mean, but I feel like we're, like, the worst of both worlds, where increasingly the government is making more and more decisions and choices, like a nanny state. But at the same time, we don't have this enlightened, like, you know, ten-year road map for where we wanna be.

Speaker: 2
01:59:02

Yeah. Because we never think that way because we we deal in terms.

Speaker: 1
01:59:06

Yeah. Four year

Speaker: 2
01:59:07

Four year terms.

Speaker: 1
01:59:08

That's the problem. Public companies. Four-year terms, public companies, quarters. Right. Quarters. And, again, this is back to this managerial idea, run by managers. You know, part of the reason why Zuck has complete control: how much did he spend on VR? Like, I don't know, $30, $40 billion? Maybe more per year, maybe?

Speaker: 1
01:59:32

He spent a ton of money, like, a small state's GDP worth of money on VR, and the public market was totally doubtful Yeah. of that. And the reason he could do that is because he has, what are they called, super-voting shares. And so he has complete control of the company, and he can't be unseated by activist investors, sort of what's been done to, Wasn't there, like,

Speaker: 2
02:00:00

a recent trial where they were trying to impeach him for saying that? They were trying to remove him from that? They can't, as far as I know. But there was a trial. I think there's a trial that was going on, like, very recently.

Speaker: 1
02:00:12

Oh, I think I think you you’re thinking about the antitrust.

Speaker: 2
02:00:15

No. No. There’s something about him saying that he can’t be fired.

Speaker: 1
02:00:20

But it’s true.

Speaker: 2
02:00:20

It is true. It’s legal. I know. It is nonsense. The trial I believe the trial is nonsense, but it like, a friend of mine was actually representing him in this.

Speaker: 1
02:00:29

Maybe in Europe or something?

Speaker: 0
02:00:31

I don’t

Speaker: 2
02:00:31

think so. I think it’s in America. Google, Mark Zuckerberg, Josh Dubin trial. See if you can find anything on that.

Speaker: 1
02:00:40

But, yeah, Mark can think on the order of decades. Like, when I was there at Facebook, he was talking about the idea that, like, there's gonna be a fundamental shift. He's like, if you look back a hundred years, computers every, you know, twenty years or whatever change the user interface modality.

Speaker: 2
02:01:05

Mhmm.

Speaker: 1
02:01:05

You go from terminals and mainframes to desktop computers to, you know, mobile computing, and it was like, okay, what's next? And the first guess was, like, VR. And now I think their best guess is, like, AR plus AI. Like, the AR glasses, their new Meta Ray-Ban glasses Yeah. plus AI.

Speaker: 1
02:01:25

And they can make massive investments. They just made a crazy investment in this company, Scale AI. Scale AI is a data provider for OpenAI and Google, and what they do is, you know, OpenAI will say, I want the best law and legal data to train the best legal machine learning model.

Speaker: 1
02:01:47

And they'll go to, you know, places where the labor costs are low, but maybe still well educated. There are places in Africa and Asia that are like that. And they'll sit them down and say, okay, you're gonna get these tasks, these legal, programming, whatever tasks, and you're gonna do them, and you're gonna write your thoughts as you're doing them.

Speaker: 1
02:02:08

I'm simplifying it, but, basically, they collect all this data. Basically, it's labeled data from human labor. They take it, they put it in the models, and they train the models. And OpenAI spends billions of dollars on that, Anthropic, all these companies.
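
A rough sketch of what one record from that kind of labeling pipeline could look like (field names are my own illustration, not Scale AI's actual schema): the worker's answer plus the reasoning they wrote down, packaged so it can be flattened into a supervised fine-tuning example.

```python
# Hypothetical shape of a labeled reasoning record and its training example.
import json

record = {
    "domain": "legal",
    "prompt": "Is this clause an indemnification clause? Explain.",
    "worker_reasoning": "The clause obligates one party to cover the other's "
                        "losses from third-party claims, which is the defining "
                        "feature of indemnification.",
    "final_answer": "Yes",
    "annotator_id": "anon-1842",  # hypothetical identifier
}

# Flattened into a single input/target pair for a language model.
training_example = {
    "input": record["prompt"],
    "target": record["worker_reasoning"] + "\nAnswer: " + record["final_answer"],
}
print(json.dumps(training_example, indent=2))
```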

Speaker: 1
02:02:22

And so this company was the major data provider, and Meta just acquired them. You know, there's this new trend of these weird acquisitions, I assume because they wanna get around regulations. They bought 49% of the company, and then they hired all the leadership.

Speaker: 1
02:02:46

So with Scale AI, Meta hired the leadership there and bought out the investors. They put $15,000,000,000 into the company. The weird thing about it is Google and OpenAI are like, we're not gonna use this shit anymore. So the company value went down because, you know, these companies don't wanna use it, and now they're going to other companies.

Speaker: 1
02:03:10

And so, in effect, Zuck bought talent for $15,000,000,000.

Speaker: 2
02:03:17

Wow.

Speaker: 1
02:03:18

Can you imagine that? Talent for $15,000,000,000. Google recently bought a company for one known researcher, Noam Shazeer, who's one of the inventors of the large language model technology, for $3,000,000,000. They bought his company, and they do these weird deals where they buy out the investors, and they let the company run as a shell of itself, and then they acquire the talent.

Speaker: 1
02:03:47

Wow. Microsoft did the same thing.

Speaker: 2
02:03:51

That’s crazy. So it’s just these unique individuals that are very valuable there.

Speaker: 1
02:03:56

Very, very valuable, worth billions of dollars.

Speaker: 2
02:03:58

Sam Altman said Meta failed to poach OpenAI's talent with a $100,000,000 offer.

Speaker: 1
02:04:03

So this $100,000,000 is a sign-on bonus. It does not even include, like, salary.

Speaker: 0
02:04:12

Or or, yeah, equity. It’s just

Speaker: 1
02:04:14

It’s just bonus.

Speaker: 2
02:04:15

A bonus. $100,000,000 bonus. Come here. Failed and failed. I don’t

Speaker: 0
02:04:20

know I

Speaker: 1
02:04:20

don’t know what failed. I mean, I’m sure he’s gonna say that. It in

Speaker: 0
02:04:22

a weird way, he said our best talent hasn't taken it, so it could've been that. Of course,

Speaker: 1
02:04:25

he’s gonna say that.

Speaker: 0
02:04:26

Well, of course. The people that did take

Speaker: 1
02:04:28

it, well, they weren't our best. That's the story. Yeah.

Speaker: 2
02:04:30

We don’t even like those

Speaker: 0
02:04:31

guys anymore.

Speaker: 1
02:04:32

By the way, OpenAI does it to companies like ours. It's just a question of scale. Like, Zuck can give them $100,000,000 and steal the best talent. And, like, companies like OpenAI, which I love, but they go to, like, small startups and give them $10,000,000 to grab their talent. Right.

Speaker: 1
02:04:49

But, you know, it’s, it’s very, very competitive right now, and there are like, I don’t know if these individuals are actually worth these billions of dollars, but the talent war is so crazy because everyone feels like there’s a race towards getting to superintelligence, and the first company to get to superintelligence is gonna reap massive amounts of rewards.

Speaker: 2
02:05:14

How far away do you think we are from achieving that?

Speaker: 1
02:05:17

Well, you know, like I said, my philosophy tends to be different than, I think, the mainstream in Silicon Valley. I think that AI is gonna be extremely good at, you know, doing labor, extremely good at, like, you know, ChatGPT being a personal assistant, extremely good at, like, you know, Replit being an automated programmer.

Speaker: 1
02:05:42

But the definition of superintelligence is that it is better than every other human collectively at any task. And I I am not sure there’s evidence that we’re headed there. Again, I think that one important aspect of superintelligence or or AGI is that you drop this entity into an environment where it has no, idea about that environment.

Speaker: 1
02:06:16

It’s never seen it before, and it’s able to efficiently learn to achieve goals within that environment. Right now, there’s a bunch of studies showing, like, you know, GPT four or any of the latest models. If you give them an exam or quiz that is slightly even slightly different than their training data, they tank. They do really badly on it.

Speaker: 1
02:06:39

I think the way that AI will continue to get better is via data. Now at some point, and maybe this is the point of takeoff, they can train themselves. And the way we know how AI could train itself is through a method called self-play. So the way self-play works is, you know, take, for example, AlphaGo.

Speaker: 1
02:07:06

AlphaGo, I'm sure you remember Lee Sedol, the game between DeepMind's AlphaGo and Lee Sedol, and it won in the game of Go. The way AlphaGo is trained is that part of it is a neural network that's trained on existing data, but the way it achieves superhuman performance in that one domain is by playing itself, like, millions, billions, perhaps trillions of times.

Speaker: 1
02:07:37

So it starts by, like, generating random moves, and then it learns what the best moves are. And it's, like, basically a multi-agent system where it learns, like, I did this move wrong, and I need to kind of reexamine it. And it trains itself really, really quickly by doing the self-play. It'll play fast, fast games with itself.

Speaker: 1
02:07:56

But we know how to make this in game environments, because game environments are closed environments. But we don't know how to do self-play, for example, on literature. Because you need objective truth. In literature, there's no objective truth. Taste is different. Right. Conjecture, philosophy. There's a lot of things.

Speaker: 1
02:08:26

And, again, this is, I go back to why there's still a premium on humans: there are a lot of things that are intangible, and we don't know how to, like, generate objective truth in order to train machines in the self-play fashion. But, like, programming has objective truth. Coding has objective truth. You can construct an environment that has a computer and has a problem.

Speaker: 1
02:08:56

There’s a ton of problems, and and even an AI can generate, you know, sample problems. And then there’s, like, a test to validate whether the program works or not. And and then you can generate all these programs, test them, and, and if they succeed, you know, that’s a, that’s a reward that trains your system to to get better at that.

Speaker: 1
02:09:17

If it doesn't succeed, you know, that's also feedback. And they run them all the time, and it gets better at programming. So I'm confident that programming is gonna get a lot better. I'm confident that math is gonna get a lot better. But from there, it is hard to imagine how all these other more subjective, softer sort of sciences will get better through self-play.
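
A minimal sketch of the "objective truth" loop he's describing for code (a toy version of the idea, not Replit's or anyone's actual training stack): propose a candidate program, run it against tests, and turn pass/fail into a reward signal that could drive learning.

```python
# Toy verification-reward loop: candidates are scored by whether they pass tests.
import random

def propose_candidate():
    """Stand-in for a model sampling a program; here we just pick from guesses."""
    guesses = [
        "def add(a, b): return a - b",   # wrong
        "def add(a, b): return a + b",   # right
    ]
    return random.choice(guesses)

def reward(candidate_src):
    """Objective check: does the candidate pass the unit tests?"""
    namespace = {}
    try:
        exec(candidate_src, namespace)
        assert namespace["add"](2, 3) == 5
        assert namespace["add"](-1, 1) == 0
        return 1.0
    except Exception:
        return 0.0

# In a real system the rewards would update the model; here we just tally them
# to show that the feedback signal exists without any human in the loop.
history = [(src := propose_candidate(), reward(src)) for _ in range(10)]
print(sum(r for _, r in history), "of 10 candidates passed the tests")
```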

Speaker: 1
02:09:47

I think the AI will only be able to get better through, through data from human labor.

Speaker: 2
02:09:55

If AI analyzes all the past creativity, all the different works of literature, all the different music, all the different things that humans have created completely without AI, Do you think it could understand the mechanisms involved in creativity and make a reasonable facsimile?

Speaker: 1
02:10:19

I think it will be able to imitate very well how humans come up with new ideas, in a way that it remixes all the existing ideas from its training data. But by the way, again, they're super powerful. This is not, like, a dig at AI. The ability to remix all the available data into potentially new ideas, or newish ideas, because they're remixes, they're derivative, is still very, very powerful.

Speaker: 1
02:10:51

But, you know, the best marketers, the best, like, you know, think of one of my favorite marketing videos, Think Different from Apple. Mhmm. It's awesome. Like, I don't think that, really, machines are at a point where they, like, I try to talk to ChatGPT a lot about, like, you know, marketing or naming.

Speaker: 1
02:11:11

It's so bad at that. It's, like, midwit bad at that. And I, you know For now. But that's the thing. It's like, I just don't see it. And look, I'm not an AI researcher, and maybe they're working on ideas there.

Speaker: 1
02:11:28

But in the current landscape of the technology that we have today, it's hard to imagine how these AIs are gonna get better at, say, literature, or the softer things that we, as humans, find really compelling.

Speaker: 2
02:11:43

What’s interesting is the thing that’s the most the most at threat is these sort of middle of the road Hollywood movies that are essentially doing exactly what you said about AI. They’re sort of like you know, they’re sort of remixing old themes and tropes and figuring out a way to repackage it.

Speaker: 1
02:12:06

But but I think, actually, those tools in the hands of humans, they’ll be able to create new interesting movies and things like that.

Speaker: 2
02:12:12

Right. In In the hands of humans. So with additional human creativity applied

Speaker: 1
02:12:17

Right. So the the the man machine symbiosis.

Speaker: 2
02:12:21

Right.

Speaker: 1
02:12:22

This was the term used by J.C.R. Licklider, like the grandfather of the Internet from ARPA. A lot of those guys kind of imagined a lot of what was gonna happen, a lot of the future, and this idea that, like, human plus machine will be able to create amazing things.

Speaker: 0
02:12:39

Mhmm.

Speaker: 1
02:12:39

So what people are making with Veo is not because the the machine is really good at painting it

Speaker: 0
02:12:45

Right.

Speaker: 1
02:12:45

At, like, generating it and making it.

Speaker: 2
02:12:47

But it can’t make it without the prompts.

Speaker: 1
02:12:49

Like, the really funny, like, yeah. Without the prompt. Like, the Bigfoot-finds-tren trend. They inject themselves with tren. They start working out. I was like, I'm telling you, my TikTok feed is really wild right now. Like, it's like this real weird distorted human mind Yeah.

Speaker: 1
02:13:16

To to come up with this.

Speaker: 0
02:13:17

Well, this Lightweight, baby. This

Speaker: 2
02:13:21

have you seen the ones where, it’s Trump and Elon and Putin and they’re all in a band?

Speaker: 1
02:13:26

Right. Ai.

Speaker: 2
02:13:27

They’re playing Creedence Clearwater Revival

Speaker: 0
02:13:29

Right.

Speaker: 1
02:13:29

Yeah. I’ve seen it.

Speaker: 2
02:13:30

Fortunate Son. It's crazy.

Speaker: 1
02:13:31

Another one is the LA, riots and how, like, they’re all all the world leaders are sort of gangsters in the in the riots. But that one is hilarious.

Speaker: 2
02:13:43

Yeah. That kind of stuff is fun. And it's interesting how quickly it can be made too. You know, something that would have taken a long time through these video editors, where they were using computer-generated imagery for a long time. But it was very painstaking and very, you know, very expensive. You know? Now, it's really cheap.

Speaker: 1
02:14:02

On the way here, I was like, I wanna make an app to sort of impress you with our technology. I was like, what would Joe like? And then I came up with this idea of, like, a squat form analyzer. And so in the car on the way here, sorry, in the lobby, I made this app to,

Speaker: 2
02:14:24

You made it on the way over here?

Speaker: 1
02:14:25

On the way, on my phone. This is the, like, really exciting thing about what we built with being able to, like, program on your phone: you know, being able to, like, have that inspiration that, you know, can come anytime, and just immediately pull out your phone and start building it.

Speaker: 1
02:14:46

So, here, I'll show you. So, basically, you just start recording and then do a few squats.

Speaker: 2
02:15:09

Okay. It’s gonna analyze it just from there?

Speaker: 1
02:15:13

Yeah. But, I mean, the camera angle is not that great, but let’s

Speaker: 2
02:15:18

see. Okay. And it's gonna be able to tell you whether or not you're doing it well?

Speaker: 1
02:15:28

Yeah. Those are not my best squats. Just so you know what you’re already doing.

Speaker: 0
02:15:33

I’m not judging you.

Speaker: 1
02:15:36

I used to squat, you know, 350 pounds. So now it's integrating the Google Gemini model to kind of run through the video, analyze it, and it'll come up with a score and then suggestions. And so, again, this is like a random idea. I was like, okay, what would be interesting to do?
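
For a sense of the shape of an app like that, here is a hedged sketch of calling a Gemini-style video model through Google's public google-generativeai SDK. This is my guess at how such a thing could be wired up, not the actual Replit-built app; the model name, file name, and prompt are all assumptions.

```python
# Hypothetical squat-form analyzer: send a recorded set to a Gemini model.
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")             # placeholder key
model = genai.GenerativeModel("gemini-1.5-flash")   # assumed model name

video = genai.upload_file("squat_set.mp4")          # hypothetical local recording

prompt = (
    "You are a strength coach. Watch this squat set and return: "
    "1) a form score out of 10, 2) a depth assessment, 3) two suggestions. "
    "If the camera angle prevents a judgment, say so."
)

response = model.generate_content([prompt, video])
print(response.text)
```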

Speaker: 2
02:16:00

A really interesting thing that people could use at the gym, though. Like, not just for squats, but maybe for chin ups and all kinds of stuff. Like, oh, maybe, you know, I’m looking at your form and this is what you need to do. To get a little lower, you know, make your elbows parallel to your body, whatever.

Speaker: 1
02:16:16

I built so many personal apps. Like, I built apps for analyzing my health, and I've talked about some of my health problems that are now a lot better. Look, bad form. Oh, bad form.

Speaker: 2
02:16:28

Just, like, straight away, critical. You’re it is improved.

Speaker: 1
02:16:31

Meh. Knee position: unable to properly assess from the video angle. So, yeah, it's a little

Speaker: 2
02:16:37

Okay. So it’s saying it’s not the best angle?

Speaker: 1
02:16:39

But it's saying my depth is bad, which was actually bad. So, and I was leaning forward. But it's pretty good. You know, I tried it a few times. It's really good at that. And so, I build a lot of apps for, like, just my personal life.

Speaker: 2
02:16:55

That would be a that’s that that would be great for a person who doesn’t want a trainer.

Speaker: 1
02:16:59

Right.

Speaker: 2
02:17:00

You know, I don’t wanna deal with some person. Let me just work out on my own. But am I doing this right? Set your phone up. Have it correct you.

Speaker: 1
02:17:07

Yeah. Yeah. At the office, some guys are building, we have this partnership with WHOOP. I don't know if you've ever tried WHOOP. And they're building an app so we can start competing on workouts based on WHOOP data. Wow.

Speaker: 2
02:17:22

That’s awesome.

Speaker: 1
02:17:24

Yeah. Our company is, like, very weird for Silicon Valley. Like, we have a jiu jitsu mat, and we have

Speaker: 2
02:17:30

Oh, really? Yeah.

Speaker: 1
02:17:31

We have a

Speaker: 2
02:17:31

Do you guys bring in trainers?

Speaker: 0
02:17:32

Mhmm. Oh, that’s fucking great.

Speaker: 2
02:17:34

That’s awesome. Don’t get

Speaker: 1
02:17:35

hurt. It’s, you know, I only recently got into it, but the, the the the hardest thing about it is is, to be calm because your your impulse is to overpower

Speaker: 2
02:17:49

Yes. Yeah.

Speaker: 1
02:17:51

Is to, like

Speaker: 2
02:17:52

The Gracies have a great saying, keep it playful. Yeah. And that’s how you really learn the best. It’s very hard and, listen, I’m a giant hypocrite because most of my jiu jitsu career, I was a meathead. Mhmm.

Speaker: 1
02:18:04

You know,

Speaker: 2
02:18:04

and I I that’s one of the reasons why I started really lifting weights a lot. It’s like I realized strength is very valuable. Mhmm. And it is.

Speaker: 1
02:18:11

It is. Yeah.

Speaker: 2
02:18:11

And it is valuable. But technique is the most valuable, and the best way to acquire technique is

Speaker: 1
02:18:17

to pretend that you don’t have strength. The best way to acquire technique is to pretend to

Speaker: 2
02:18:22

Yeah. Don’t force things. Just find the best path.

Speaker: 1
02:18:26

Mhmm.

Speaker: 2
02:18:26

And that requires a lot of data, so you have to understand the positions, you know, so you have to really analyze them. The best jujitsu guys are really smart. Yeah. Like, Mikey Musumeci, Gordon Ryan, Craig Jones, those are very intelligent people.

Speaker: 1
02:18:42

Yeah.

Speaker: 2
02:18:42

And that's why they're so good at jujitsu. And then you also have to apply that intelligence to recognize that discipline is a massive factor. Like, Mikey Musumeci trains every day, twelve hours a day.

Speaker: 1
02:18:55

Twelve hours a day?

Speaker: 2
02:18:56

Twelve hours a day.

Speaker: 0
02:18:57

Oh, yeah.

Speaker: 1
02:18:58

Is that humanly possible, though?

Speaker: 2
02:18:59

It's possible. Yeah. Because he's not training full blast. It's not like you can squat twelve hours a day, 350 pounds. Your body will break down. But you can go over positions over and over and over and over again until they're in muscle memory, but you're not doing them at full strength. Right?

Speaker: 2
02:19:15

So, like, if you're rolling. Right? So, say, if you're doing drills, you would set up, like, a guard pass. You know, when you're doing a guard pass, you would tell the person, lightly resist, and I'm going to put light pressure on you. And you go over that position, you know, knee shield, pass, you know, hip into it, turn, here's the counter, on the counter, darce, you know, go for the darce, the person defends the darce, roll, take the back.

Speaker: 2
02:19:46

And just do that over and over and over again. Until

Speaker: 1
02:19:48

it’s muscle memory.

Speaker: 2
02:19:49

Right. Yeah. And it’s, like, completely ingrained in your body.

Speaker: 1
02:19:53

It's like chess players, it's like, let's focus on the end game. Yeah. Just keep repeating the end game, end game. Yeah. I read the Josh Waitzkin book. What was it called? I forgot. You know, his book about, like, I think chess and jiu jitsu, was it? Mhmm. Yeah.

Speaker: 2
02:20:08

Josh was just in here a few months ago. He’s great. Yeah. Just but it’s so interesting to see a super intelligent person apply that intelligence to jiu jitsu?

Speaker: 1
02:20:18

You know, one of the interesting things when I started getting into it, I've always been into, you know, different kinds of sports, and then periods of extreme programming and obesity.

Speaker: 0
02:20:34

But then

Speaker: 1
02:20:34

but then I tried to get back into it. I was a swimmer early on. But one thing that I found, especially in the lifting community, is how intelligent everyone is. They're actually almost, like, you know, they're so focused. They're artistically focused on, like, form and programming, and, you know, they spend so much time designing these spreadsheets for your program. And Yeah.

Speaker: 2
02:21:03

Well, people have this, like, really, we have this view of physical things, that physical things are not intelligent things. But you need intelligence in order to manage emotions. Emotions are a critical aspect of anything physical.

Speaker: 0
02:21:23

Mhmm.

Speaker: 2
02:21:24

Any really good athlete, you need a few factors. You need discipline, hard work, genetics, but you need intelligence. Mhmm. It might not be the same intelligence. People also, they confuse intelligence with your ability to express yourself, your vocabulary, your history of reading, you know, like,

Speaker: 1
02:21:49

you’re just a gamer. Meh.

Speaker: 2
02:21:51

That's like a bias.

Speaker: 1
02:21:52

That’s like the, sort of modern desk job by the laptop class

Speaker: 0
02:21:56

by Yeah.

Speaker: 2
02:21:57

Yeah. Well, they assume that anything that you’re doing physically, you’re now no longer using your mind.

Speaker: 1
02:22:02

Yeah.

Speaker: 2
02:22:02

But it's not true. In order to be disciplined, you have to understand how to manage your mind. Managing your mind is an intelligence. Mhmm. The ability to override those emotions, to conquer that inner bitch that comes to you every time I lift that fucking lid off of that cold plunge.

Speaker: 1
02:22:19

Right.

Speaker: 2
02:22:19

That takes intelligence. You have to understand that this temporary discomfort is worth it in the long run because I’m going to have an incredible result. After this is over, I’m going to feel so much better.

Speaker: 1
02:22:32

Right. Right. Right. Yeah. I haven’t thought about intelligence in order to manage your emotions, but that’s totally true because you’re constantly doing the self talk. Yeah. You’re trying to, like, trick yourself into doing the

Speaker: 2
02:22:42

There are people that are very intelligent that don’t have control over their emotions. Yeah. But they’re intelligent in some ways. It’s just they’ve missed this one aspect of intelligence, which is the management of the functions of the mind itself. Right. And they don’t think that that’s critical.

Speaker: 2
02:22:57

But it is critical. It’s critical to every aspect of your life, and it’ll actually improve all those other intellectual pursuits.

Speaker: 1
02:23:04

You know, to tie it back to the AI discussion, I think a lot of the sort of programmer-researcher types, like, they know that one form of intelligence, and they over-rotate on that. And that's why they're like, oh, we're so close to, like, you know, perfecting intelligence. Right. It's because that's what you know.

Speaker: 1
02:23:22

But there’s a lot of other forms of intelligence.

Speaker: 2
02:23:24

There's a lot of forms of intelligence. And unfortunately, we're very narrow in our perceptions of these things. Mhmm. And very arrogant, and we think that our intelligence is the only intelligence.

Speaker: 1
02:23:37

Right.

Speaker: 2
02:23:37

And that this one thing that we concentrate on, this is the only thing that’s important.

Speaker: 1
02:23:41

Right. Have you read or done any CBT, cognitive behavioral therapy? No. Basically, CBT is like a way to get over depression and anxiety based on self-talk and cues. Mhmm. I had to use it. Again, I had, like, sleep issues. I had to use CBT-I, cognitive behavioral therapy for insomnia. And the idea behind it is to build up what's called sleep pressure.

Speaker: 1
02:24:16

So, first of all, insomnia is performance anxiety. Once you have insomnia, you start having anxiety, like, by the time bedtime comes, you're like, oh my god. I'm just

Speaker: 2
02:24:33

gonna Right. That’s interesting.

Speaker: 1
02:24:35

I'm just gonna, you know, turn over in bed, and I'm just gonna be in bed. And then you start associating your bedroom with the suffering of insomnia, because you're, like, sitting there with, like, you know, your own mind and really suffering. It's really horrific. And so, first of all, you treat your bedroom as a sanctuary. You're only there when you wanna sleep.

Speaker: 1
02:24:58

So that’s, like, one thing you program yourself to do. And the other thing is, you don’t nap. The entire day, you don’t nap at all no matter what happens. Like, even if you’re really sleepy, like, get up and take a walk or whatever. And then you build up what’s called sleep pressure.

Speaker: 1
02:25:15

Like, now you have, like, a lot of sleepiness. So you go to bed. You try to fall asleep. If you don't fall asleep within fifteen, twenty minutes, you get up. You go out. You do something else.

Speaker: 1
02:25:27

And then when you feel really tired again, you go back to bed.

Speaker: 2
02:25:31

Oh god.

Speaker: 1
02:25:32

And then, like, once you fall asleep, if you wake up in the middle of the night, which is another sort of form of insomnia, instead of staying in bed, you get up, you go somewhere else, you go read or do whatever. And, like, slowly, you program yourself to see your bed and go, oh, like, the bed is where I sleep. It's only where I sleep.

Speaker: 1
02:25:51

I don’t do anything else there. And you can you can get over insomnia that way instead of using pills and all the other stuff.

Speaker: 2
02:25:58

Oh, the pills are the worst. God. People that need those fucking things to sleep, I feel for them. I can sleep like that.

Speaker: 1
02:26:06

That’s amazing.

Speaker: 2
02:26:07

I can sleep on a mountain.

Speaker: 1
02:26:07

That’s a blessing. That’s a blessing.

Speaker: 0
02:26:09

Oh, my wife hates it.

Speaker: 2
02:26:10

It drives her nuts because sometimes she has insomnia. I could sleep on rocks. I could just go lay down in the on a dirt road and fall asleep.

Speaker: 1
02:26:17

Wow.

Speaker: 2
02:26:18

But I’m always going hard. When you’re always going hard, you’re

Speaker: 1
02:26:21

That’s the other thing.

Speaker: 2
02:26:22

Yeah. I don't take naps. Right. You know, I work out basically every day. And so I'm always tired. I'm always ready to go to sleep.

Speaker: 1
02:26:30

So so do you fight it or do you just it’s not in you to, like, take a nap?

Speaker: 2
02:26:34

I don’t need a nap. Yeah? Yeah. I never need naps.

Speaker: 1
02:26:38

Yeah. How many hours do you sleep?

Speaker: 2
02:26:39

I try to get eight.

Speaker: 1
02:26:41

Mhmm. Do you do you get eight?

Speaker: 2
02:26:42

No. Last night, I didn't get eight, but I got seven, six and a half. Like, I got six and a half last night. Yeah. But that was because I got home and I started watching TV because I was a little freaked out about the war. And so, when I'm freaked out about the war, I like to fill my mind with nonsense. Oh, what kind of nonsense?

Speaker: 2
02:27:00

Well, I just watch things that have nothing to do with the world. Like, I’m I I play pool. I’m pretty competitive. I’m real pretty good. Mhmm.

Speaker: 2
02:27:07

And so, I like watching professional pool matches, and there's a lot of them on YouTube. So I just watch pool, and I just watch, you know, patterns, how guys get out Yeah. stroke, how they use their stroke, like, how different guys have different approaches to the game.

Speaker: 1
02:27:21

It's crazy with type A people. It's like, for you, although pool is an escape, it suddenly becomes an obsession, and you're like, I need to be the best at it.

Speaker: 2
02:27:29

I’m very obsessed.

Speaker: 1
02:27:30

So, you know, I totally quit video games. But then, last year, I was very stressed. Like, the company was doing really poorly before we sort of invented this agent technology. And then also the, you know, Gaza war. I was, like, watching these videos every night. It was just, like, really, really affecting me.

Speaker: 2
02:27:50

I can't watch that stuff at night. At night is when I get my anxiety. Yeah. I mean, I don't generally have anxiety, not like a lot of people do. I mean, when I say anxiety, I really feel for people that genuinely suffer from actual anxiety. My anxiety is all sort of self-imposed, and when I get online at night and I think about the world, my family's asleep, which is generally when I write, and as long as I'm writing, I'm okay.

Speaker: 1
02:28:19

Comic?

Speaker: 2
02:28:19

Yeah. You know, I I write like sort of an essay form that I extract the comedy from it. But when I, get online and I I just pay attention to the world, that’s when I really freak out because it’s all out of your control and it’s just murderous psychopaths that are running the world and just Yeah.

Speaker: 2
02:28:36

It just at any moment, you could be, you know, in a place where they decide to attack, and then you’re a pawn in this fucking insane game that these people are playing in the world.

Speaker: 1
02:28:51

That’s why I felt really frustrated with my family being there. I was like, they they have no say in

Speaker: 2
02:28:56

it. Right.

Speaker: 1
02:28:57

The war there. Rockets are flying. Yeah. It's like Right. But, anyways, I started playing a video game. It's called Hades, Hades II. It's like an RPG video game. And I was like, I'm trying to disconnect.

Speaker: 0
02:29:15

Mhmm. And

Speaker: 1
02:29:15

then I started speedrunning that game. Do you know what speedrunning is? No. It’s like you’re trying to finish the game as fast as possible, as fast as humanly possible. And I got down to, like, six minutes, and I was number 50 in the world. Woah.

Speaker: 2
02:29:28

But, legitimately Oh, yeah. Yeah. Ai score is ai. People to play for. That was crazy.

Speaker: 1
02:29:36

Yeah. That was a

Speaker: 2
02:29:36

lot. Why is he doing that? That was crazy. That was a lot.

Speaker: 1
02:29:39

Yeah. It’s it’s it’s myth building, you know. Yeah. Weird. Yeah. But, yeah, it is this thing about type A people. Like, you’re just, you know, even your escapism becomes competitive and stressful.

Speaker: 2
02:29:54

Well, sort of, but it’s also I feel like it’s a discipline. I feel like pool is a discipline, just like archery. I’m also obsessed with archery. Archery is a discipline, and I feel like the more of these disciplines that you have in your life, the more you understand what it is about these things that makes you excel and get better at them.

Speaker: 2
02:30:14

And when I get better at those things, I get better at life. I apply it to everything.

Speaker: 1
02:30:22

Yeah. There’s another thing that AI now struggles with, which is called transfer learning. You know, learning something from one domain, like, learning from math how to do reasoning in math, and being able to do that reasoning in politics. Wow. We just don’t have evidence of that yet. And no, I feel the same way.

Speaker: 1
02:30:39

Everything like, even powerlifting when I got really into it, which is, like, the most unhealthy sport you you

Speaker: 0
02:30:45

can Yeah.

Speaker: 1
02:30:45

You can do.

Speaker: 2
02:30:46

You break your joints down.

Speaker: 1
02:30:47

Break your joints. You look like shit because, you know, the more you eat, the more you can lift.

Speaker: 0
02:30:51

Right. You

Speaker: 2
02:30:52

get fat.

Speaker: 0
02:30:53

They’re all fat.

Speaker: 2
02:30:53

They’re not all fat. Unless they’re competing in a weight class.

Speaker: 1
02:30:57

Right. They

Speaker: 2
02:30:57

have to be lean.

Speaker: 1
02:30:58

Yeah. And, what is that, Rippetoe? Have you ever had him on? The jug of milk? What? GOMAD. Do you know GOMAD? No. Gallon of milk a day. Do you know that, Jamie? No. Do you

Speaker: 0
02:31:11

know it? Disgusting.

Speaker: 1
02:31:12

Disgusting. Yeah. So basically a gallon of milk a day? Yeah. So Mark Rippetoe, he wrote this book called Starting Strength, and it became, like, the main way most guys, at least my age, got into powerlifting. It was about technique. His whole thing is, like, look, you know, everyone comes into lifting thinking it’s bodybuilding. Powerlifting is nothing like that.

Speaker: 1
02:31:33

And he also looks like shit, and he’s fat. But his technique is amazing. And so the way he gets young guys to, like, get really good and really strong, he puts them on a gallon of milk a day.

Speaker: 2
02:31:45

Does that really have a positive effect?

Speaker: 1
02:31:48

Yeah. I mean, he he has a YouTube channel. Like, he has a lot of guys that are really, really strong, and he’s he’s been a coach for a lot of

Speaker: 2
02:31:53

What is it about a gallon of milk a day? Is it just the protein intake?

Speaker: 1
02:31:56

What is it? I think calories

Speaker: 2
02:31:57

Pull up a paragraph on it.

Speaker: 1
02:31:58

I mean, it’s Okay.

Speaker: 2
02:31:59

Here it is. Drink a gallon of milk a day, GOMAD, is undeniably the most effective nutritional strategy for adding slabs of mass to young underweight males. Milk is relatively cheap, painless to prepare, and the macronutrient profile is very balanced, and calories are always easier to drink than eat.

Speaker: 2
02:32:15

Unfortunately, those interested in muscular hypertrophy who are not young, underweight, and male, populations where GOMAD is not recommended, will need to put more effort into the battle to avoid excess fat accumulation. Body composition can be manipulated progressively, much like barbell training, to achieve the best results.

Speaker: 2
02:32:36

For example, the Starting Strength novice linear progression holds exercise selection, frequency, and volume variables constant. Okay. Every forty-eight to seventy-two hours, the load stressor is incrementally increased to elicit adaptation. If the increase is too significant or insignificant, the desired adaptation won’t take place. Yeah.
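
A minimal sketch of the kind of linear progression being read there, with exercise selection, frequency, and volume held constant and only the load increasing each session. The starting weight, increment, and set scheme below are illustrative placeholders, not figures from Starting Strength.

    # Toy model of a novice linear progression: only the load changes session to session,
    # with sessions spaced roughly every 48-72 hours. The numbers are made up for illustration.
    def linear_progression(start_kg, increment_kg, sessions):
        """Working weight for each session of a simple linear progression."""
        return [start_kg + increment_kg * i for i in range(sessions)]

    squat_plan = linear_progression(start_kg=60, increment_kg=2.5, sessions=12)
    for session, weight in enumerate(squat_plan, start=1):
        print(f"Session {session}: squat 3x5 @ {weight} kg")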

Speaker: 2
02:32:56

This is the intelligence.

Speaker: 0
02:32:58

Right? This is

Speaker: 2
02:32:58

the intelligence involved in lifting that people who are on the outside of it would dismiss. Science.

Speaker: 1
02:33:04

Man. Yeah. Yeah. You know, I’m so honored to be the guy that introduces Joe Rogan to Starting Strength.

Speaker: 2
02:33:13

GOMAD. Yeah.

Speaker: 1
02:33:14

GOMAD. Rippetoe is, like, funny to watch. You should, like, watch some of his videos. He has, like, this very thick Texan accent, and, like, his audience shits on him all the time. They call him fat and ugly and whatever, and

Speaker: 0
02:33:25

he and he abuses his audience too. So there it is.

Speaker: 2
02:33:30

This guy. Let me see this guy.

Speaker: 1
02:33:31

Put his picture up.

Speaker: 2
02:33:32

That’s the name?

Speaker: 1
02:33:33

That’s an old photo. He’s now much fatter.

Speaker: 2
02:33:37

Yeah. So he’s just a nerd.

Speaker: 1
02:33:40

Yeah. He’s he’s a he’s a huge nerd. But it like, yeah. He used to lift a lot, a lot of weight.

Speaker: 2
02:33:45

Yeah. That’s what he used to look like. That one photo of him with the hairy chest. Yeah. The black oh, okay.

Speaker: 1
02:33:51

Like, maybe here.

Speaker: 2
02:33:53

Wow. Damn.

Speaker: 1
02:33:55

Is that him? Is that him? I don’t think so.

Speaker: 2
02:33:57

Really? Those look like him. Yeah. That’s him. He used to be jacked.

Speaker: 1
02:34:01

Okay. That’s good.

Speaker: 2
02:34:02

Oh, so he was a bodybuilder at one point in time.

Speaker: 1
02:34:05

But then he got on that GOMAD, yeah.

Speaker: 2
02:34:07

Now he’s a power lifter. Simply no other exercise, no machine, produces the level of muscular stimulation and growth than the correctly performed full squat. Mhmm. Well, he’s deadlifting in that image. That’s weird.

Speaker: 1
02:34:20

So so he I didn’t

Speaker: 0
02:34:20

make that.

Speaker: 1
02:34:21

He also makes you squat on every day of lifting. Oh. So squat every time. Every time you lift. Really? Yeah. Yeah. Well, his idea is, like, squat is a full body exercise. Like, you can just go to the gym. And when I used to be busy and I just wanted to maintain, like, be healthy, I’d just squat every Just

Speaker: 2
02:34:41

get fifteen, twenty minutes squat.

Speaker: 1
02:34:42

Fifteen, twenty minutes squat and and just get out of the gym.

Speaker: 2
02:34:44

Yeah. But why do something with legs every day?

Speaker: 1
02:34:48

Yeah. Yeah. You have to. Yeah. But but squat actually, it does feel like there’s there’s an upper body component

Speaker: 0
02:34:54

Sure.

Speaker: 1
02:34:54

To it as well.

Speaker: 2
02:34:55

Well, it’s also your body is like, oh, this asshole wants to lift, like, really heavy things. We gotta

Speaker: 1
02:34:59

get big. Right. Exactly. Yeah.

Speaker: 2
02:35:00

It’s the best way to get big. Yeah. Yeah. Because your body just goes, like, okay, we have to adapt. This shithead wants to lift heavy things every day.

Speaker: 1
02:35:11

Yeah. Yeah. It’s hilarious. And, you know, the other one, I’m sure, you you know him. I think you introduced me to to him through your podcast. Louie Simmons. Oh, yeah. Yeah. Those those guys are crazy. You watch the Netflix documentary?

Speaker: 2
02:35:23

I didn’t watch the Netflix documentary. But we did actually interview him. He’s like one of the few people that I traveled to go meet Right. Who went to Sai Ai.

Speaker: 0
02:35:31

I saw that.

Speaker: 1
02:35:31

It was great.

Speaker: 2
02:35:32

We have some of his equipment out here.

Speaker: 1
02:35:33

Oh. He’s got a reverse hyper?

Speaker: 2
02:35:35

Yeah. Yeah. Reverse hyper is so good for people that have back problems. Everyone that has a back issue, I’m like, let me show you something. Mhmm. And I bring them out to the reverse hyper machine, and I’m like, this thing will actively strengthen and decompress your

Speaker: 1
02:35:48

spine. Right. Right. It’s so good.

Speaker: 2
02:35:49

It’s so good for people that have, like, lower back issues where the doctor just wants to cut them. I’m like, hold on. Hold on. Don’t do that right away. I had back pain,

Speaker: 1
02:36:00

since my late teens, and the doctors wanted to, like, they did an MRI, and they found that there’s a bit of a bulge. Mhmm.

Speaker: 0
02:36:12

And

Speaker: 1
02:36:12

they wanted to do an operation on it. Yeah.

Speaker: 2
02:36:14

They wanna do a discectomy.

Speaker: 1
02:36:15

Someone wanted to put me on antidepressants. Apparently, you can manage pain with antidepressants. Have you heard of that? What? Yeah. Apparently, it’s a thing. And through listening to your podcast and others, I was like, I was just gonna get strong. So I got strong, squats and things like that, and the pain got a lot better. Didn’t go away entirely.

Speaker: 1
02:36:35

But the thing that, like, really got me over the hump and this one’s crazy. Are you familiar with The Mindbody Prescription? No. John Sarno?

Speaker: 2
02:36:47

Oh, okay. Yes.

Speaker: 1
02:36:48

Yeah. I

Speaker: 2
02:36:48

heard about him on Howard Stern. Mhmm. He was talking about how a lot of back pain is psychosomatic.

Speaker: 1
02:36:52

Psychosomatic. Yeah. So his idea and again, this is like

Speaker: 2
02:36:56

He doesn’t understand jiu jitsu because a lot of back pain is real as fuck.

Speaker: 1
02:37:00

Right. Right.

Speaker: 0
02:37:01

Right.

Speaker: 1
02:37:01

I I mean, I don’t think

Speaker: 2
02:37:03

Settle down.

Speaker: 0
02:37:03

I I

Speaker: 1
02:37:03

think for me, it was a combination of both. Like, there’s something physically happening. But, like, his idea is that your mind is creating the pain to distract you from emotional, psychological pain.

Speaker: 0
02:37:20

And

Speaker: 2
02:37:21

I think that’s the case in some people.

Speaker: 1
02:37:23

Yeah. And then doctors will go do an image. And often, they’ll find something, and he thinks that, like, lumbar imperfections are in almost everyone.

Speaker: 2
02:37:36

Yes. I think that’s true.

Speaker: 1
02:37:38

And then the doctors latch onto that, and your mind latches onto that. And you start reinforcing, telling yourself that I have this thing, and the pain gets worse.

Speaker: 2
02:37:55

Mhmm.

Speaker: 1
02:37:55

There’s also another thing called the Salience Network. Have you heard of this? No. If you can bring up the Wikipedia page for Salience Network, because I don’t wanna get it wrong, but the Salience Network is a network in the brain that neuroscientists found. My doctor, like, he told me about this. The salience network gets reinforced whenever you obsess over your pains or your health issues.

Speaker: 2
02:38:29

That makes sense.

Speaker: 1
02:38:29

So it’s responsible for perception. And the more you reinforce it, it’s like a muscle. The more you reinforce it, it’s sort of like AI, you know, reinforcement learning. The more you reinforce it, it becomes more of

Speaker: 2
02:38:41

an issue. It’s involved in various functions, including social behavior, self awareness, and integrating sensory, emotional, and cognitive information. Boy, I bet social media is really bad for that.
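
As a loose analogy only: the “reinforce it like a muscle” idea maps onto a simple reinforcement-style update, where attending to a signal nudges its weight up and ignoring it lets it decay. Everything here is invented for illustration; it is not a model of the actual salience network.

    # Toy reinforcement-style update: attending to the pain signal strengthens its
    # perceived salience; ignoring it lets the salience slowly decay.
    # Learning and decay rates are arbitrary illustrative values.
    def update_salience(salience, attended, learning_rate=0.2, decay=0.05):
        if attended:
            return salience + learning_rate * (1.0 - salience)  # reinforce toward a ceiling of 1.0
        return salience * (1.0 - decay)                          # fade when not attended to

    s = 0.3
    for day in range(30):
        s = update_salience(s, attended=(day < 15))  # obsess for two weeks, then let go
        print(f"day {day + 1}: salience {s:.3f}")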

Speaker: 1
02:38:51

Right. Totally. Yeah. Yeah. Right. And so, you know, a lot of the, you know, fatigue and things like that, at some point, I’m like, fuck it. Like, I did a lot of other things, but at some point, I’m like, fuck it. I don’t care about it. I don’t have it. I’m just gonna be good.

Speaker: 2
02:39:10

Just not concentrate on that.

Speaker: 1
02:39:11

Yeah. Because I was reading about it all the time. I was doing I was, like, really worried. And I

Speaker: 2
02:39:16

had Abigail Shrier on

Speaker: 0
02:39:17

and she

Speaker: 1
02:39:17

was just

Speaker: 2
02:39:18

talking about that in regards to cognitive therapy that there’s a lot of people that, like, obsess on their problems so much that their problems actually become bigger.

Speaker: 1
02:39:26

Yes. And this is it. This is the neuroscience behind it, the Salience Network.

Speaker: 2
02:39:31

Makes sense. Yeah. But there’s legit back problems.

Speaker: 1
02:39:34

Of course.

Speaker: 2
02:39:36

Legit back. That’s ai the John Sarno thing, I was like, okay. Not for me. I’d I’d understand how some people could develop that issue.

Speaker: 1
02:39:44

But his insight was, look, I ran a clinic in New York City for a long time, and these chronic illnesses come in waves. There’s an ulcers wave in, like, the nineties. Oh. There’s because

Speaker: 2
02:39:59

it became a thing that people are talking about a lot. Yes. Wow.

Speaker: 1
02:40:02

And then there’s a there’s, like, a neck pain, and then there’s an RSI. The most recent one was RSI.

Speaker: 2
02:40:08

What is RSI?

Speaker: 1
02:40:09

Repetitive strain injury. Oh. And, again, all these things have rational explanations. For me, I was on the computer all the time. And I was like, oh, my arm hurts. And, yeah, maybe, you know, there was some aspect of it. I was programming a lot.

Speaker: 1
02:40:30

But also after I read John Sarno, I realized that some of it might also be psychological, that, you know, it’s stress. I don’t know, maybe I have some childhood issues. But, like, you realize that a lot of it and maybe the other way is true as well when you just, like, minimize it.

Speaker: 1
02:40:50

It just becomes less of an issue in your mind. But the fact that it is, like, a fashion should tell you that there’s something psychological about

Speaker: 0
02:41:00

it.

Speaker: 2
02:41:00

The fact that it does come in waves like that, for sure. And then once it’s in the zeitgeist ulcers or whatever it is. Yeah. Right.

Speaker: 1
02:41:08

When we were kids, like, everyone had ulcers. And I was like, oh, it’s from coffee in the morning. And, like, I don’t know anyone that has ulcers now. Right.

Speaker: 0
02:41:18

I don’t either.

Speaker: 2
02:41:19

That’s true. That’s it. That’s crazy.

Speaker: 0
02:41:22

That’s

Speaker: 2
02:41:22

wild. Isn’t it wild, the mind? Like, the way it can benefit you or the way it can hold you prisoner?

Speaker: 1
02:41:29

Yeah. And, again, this is maybe why I have, like, a little different view about AI and humans and all of that from Silicon Valley. Like, this is a weird thing, but every time I set my mind to, like, meet someone, I meet them, including you.

Speaker: 2
02:41:51

Oh, wow. That’s weird.

Speaker: 1
02:41:53

Like, yeah. I wanna I wanna meet this person. Something happens, some chain of events. But, obviously, you know, you you also see it

Speaker: 0
02:42:00

in some way.

Speaker: 2
02:42:00

Obviously, you’re doing something. You’re not just thinking it.

Speaker: 0
02:42:06

Right.

Speaker: 2
02:42:06

You’re also doing things. Right. Which is my problem with, like, The Secret and, like, the problem of manifesting things.

Speaker: 1
02:42:12

I don’t go that far, but I don’t know. Like, there’s something there.

Speaker: 2
02:42:15

There’s something there. Yeah. I agree. There’s something to it. Yeah. I just I think the mind and our connection to reality is not as simple as we’ve been told.

Speaker: 1
02:42:30

Not at all. I think there’s something there. And again, when you start looking at psychedelics and Mhmm. And stuff like that, there’s something there. And I remember listening to one of I loved JRE circa the early two thousand tens. There was a remote viewing.

Speaker: 1
02:42:49

You were talking about a remote viewing episode, and I was like, wow, that’s crazy. And, obviously, very skeptical of it, the idea that you can meditate and, like, see somewhere else or see it from above. Yeah. I read a book about Da Vinci. It’s called Da Vinci’s Brain, I think. Da Vinci is, like, fascinating. Who’s this fucking guy?

Speaker: 2
02:43:14

He Right.

Speaker: 1
02:43:15

He does everything. And he literally is, like, across all these domains, and he barely sleeps. Like, he has this polyphasic sleep thing, which I tried once. It’s torture. Basically, every four hours, you sleep for fifteen minutes. When I was in university, I was very good at computer science, but I hated going to school.

Speaker: 1
02:43:41

And in Jordan, if you don’t go to school, they ban you from the exam. Oh, wow. I was always getting As, but I just didn’t wanna like, I didn’t wanna sit in class. And, actually, this is when I started thinking about programming on my phone. I was like, maybe I can code on my phone in class. But I felt there was, like, an injustice. Like, I just have ADHD, whatever you wanna call it.

Speaker: 1
02:44:02

Like, I just can’t sit in class. Like, just

Speaker: 2
02:44:04

give me give me a break.

Speaker: 1
02:44:05

And so I felt justified to, you know, rebel or fix the situation somehow. So I decided to, like, hack into the university and change my grades so I could graduate. Because everyone was graduating. It was, like, five years in. It took me six years to get through a four year program just because I can’t sit in class, and, you know, I have some dyslexia and things like that.

Speaker: 1
02:44:29

But, so, I decided to do that. I’m like, okay. Hacking takes a lot of time because, like, you’re coding, you’re scripting, you’re running scripts against servers, and you’re waiting. And I’m like, you know, to optimize my time, I’m just gonna do this Da Vinci thing where, like, every four hours by the way, there’s a Seinfeld episode where what was his name?

Speaker: 1
02:44:54

The crazy guy in Seinfeld?

Speaker: 2
02:44:56

Kramer?

Speaker: 1
02:44:57

Kramer. Kramer does polyphasic sleep. Maybe I learned it from there. I’m not sure. But, How do you wake up? You set an alarm.

Speaker: 2
02:45:05

Oh, god.

Speaker: 1
02:45:06

Yeah. It’s it’s torture.

Speaker: 2
02:45:07

That sounds so crazy.

Speaker: 1
02:45:08

So, apparently, da Vinci, you know, used to do that. But, anyways, I was able to hack into the university by, like, working for weeks using polyphasic sleep, and was able to change my grades. And, initially, I didn’t wanna do it on myself, but I had a neighbor who went to school with me, and I was like, you know, let’s change his grade and see if it actually succeeds, and it actually succeeded in his case.

Speaker: 1
02:45:37

And he was my lab rat. But in my case, I got caught. And the reason I got caught is, you know, in the database, there’s your grade out of a 100, you know, zero to a 100. When you get banned because of attendance, your grade is de facto 35. So I thought I would just change that, and that’s the thing that would get me to pass.

Speaker: 1
02:46:05

Well, turns out there’s another field in the database about whether you’re banned or

Speaker: 0
02:46:09

not.

Speaker: 1
02:46:11

This is bad coding. It’s bad programming because this database is not normalized. There’s one state in two different fields. So I’ll put the blame on them

Speaker: 0
02:46:20

for not designing the right database. That’s hilarious.
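
A sketch of the denormalization he’s describing, with a hypothetical table layout rather than the university’s actual schema: the same fact, whether a student is banned, lives in two places (an explicit flag and the de facto grade of 35), so changing only one of them produces exactly the kind of contradictory record that stands out.

    import sqlite3

    # Hypothetical, denormalized schema: "banned" is stored both as a flag and,
    # implicitly, as the de facto grade of 35, so the two fields can disagree.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE grades (student TEXT, grade INTEGER, banned INTEGER)")
    db.execute("INSERT INTO grades VALUES ('student_a', 35, 1)")  # banned for attendance

    # Changing only the grade leaves the banned flag behind...
    db.execute("UPDATE grades SET grade = 75 WHERE student = 'student_a'")

    # ...so a simple consistency check flags a passing grade on a banned record.
    grade, banned = db.execute(
        "SELECT grade, banned FROM grades WHERE student = 'student_a'"
    ).fetchone()
    if banned == 1 and grade != 35:
        print("anomaly: banned student with a passing grade")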

Speaker: 2
02:46:25

You blame them for your hacking being successful. So what was the punishment?

Speaker: 1
02:46:30

The entire university system went down because there’s this anomaly. I was, you know, I passed, but at the same time, I was banned. And so I got a call from the head of the registration system, and it was, like, 7PM, whatever. It was a landline. And I picked up the call.

Speaker: 1
02:46:50

He’s like, hey. Listen. We have this issue we’re dealing with. Like, the entire thing is down, and and it just shows your record. There’s a problem with it.

Speaker: 1
02:46:57

Do you know anything about it? And at the time, I’m like, alright, there’s, like, a fork in the road. You know, I either, like, come clean or, just like that, this is a lie that will, like, live with me forever. I was like, I was gonna say it.

Speaker: 1
02:47:09

I was like, yeah, I did it. And he was like, what do you mean? I was like, okay, I’ll come explain it to you.

Speaker: 1
02:47:14

So the next day, I go there, and it’s all the university deans. And it’s, like, one of the best computer science universities in the region, Princess Sumaya University for Technology. And they’re all nerds. You know? So the discussion became technical on, like, how I hacked into the university. And I went to the whiteboard explaining what I did.

Speaker: 1
02:47:35

This is whatever.

Speaker: 0
02:47:37

And it

Speaker: 1
02:47:37

just felt like a brainstorming session. I’m like, alright, I’ll see you guys later. And they’re like, wait, we need to figure out what to do with you. Like, you know, this is serious. So I’m like, oh, crap. But they kind of put the decision to the president, and he was I forgot his name, but he was such an enlightened guy.

Speaker: 1
02:47:57

And I went and told him, like, I just didn’t mean any malice. I just felt, like, justified. I need to graduate. I’ve been here for a long time. I actually do good work. And he’s like, look, you’re talented, but with great power comes great responsibility. He gave me the Spider-Man.

Speaker: 2
02:48:15

Mhmm. My life.

Speaker: 1
02:48:17

And he said, for us to forgive you, you’re gonna have to go and harden the, systems in the university against against hacking. So I spent the summer trying to work with the with the engineers of the university to to do that. But, but they hated me because, you know, I’m the guy that hacked into the system. So they would, like, blackball me.

Speaker: 1
02:48:38

Like, sometimes I’d show up to work, and they won’t open the door, and I can see them.

Speaker: 0
02:48:42

Like, I could see you there. I’m knocking.

Speaker: 1
02:48:44

And they won’t let me in and let me work with them. But we did some stuff to fix it, and then, you know, I gained fame, maybe notoriety, in the university, and it actually got me my first job while I was in school. And, it’s a different story, but that job was at a startup that ended up making videos that were a big part of the Arab Spring.

Speaker: 2
02:49:12

Oh, wow.

Speaker: 1
02:49:13

Yeah. And I was part of some of these videos as well. But, anyways, one of the deans, the computer science dean, was like, hey, listen, I really helped you out when you had this problem, and I need you to work with me in order to do another research project to hack into the university again.

Speaker: 1
02:49:32

I’m like, I’m not gonna do that. Like, I don’t wanna get into no. He’s like, no, you’re not gonna get in trouble. You’re gonna be Sanctioned.

Speaker: 1
02:49:40

It’s gonna be sanctioned. So, again, I worked tirelessly on that. This time, I invented a piece of software to help me do that. And I was able to, like, find more vulnerabilities. And so I show up at my project defense, and it’s, like, a committee of different deans and students and all that.

Speaker: 1
02:50:03

And so I go up, and I start explaining my project. And, like, I run a scan against the university network, and it showed a bunch of, like, those vulnerabilities. And one of the deans is like, no, that’s fake. That’s not true.

Speaker: 1
02:50:20

It started dawning on me that I was, like, a pawn in some kind of

Speaker: 2
02:50:24

power struggle.

Speaker: 1
02:50:25

Power struggle. So that guy was responsible for this university’s system, and this guy was using me too. I was like, oh, shit. But, like, I’m not gonna back down. I was like, no, it’s not a lie. It’s true. And so I, like, tap into that vulnerability, and I go to the database. And I’m like, alright, what do you want me to show?

Speaker: 1
02:50:47

Your salary or your password? He was like, show me my password. So I show him the password, and he was like, no, that’s not my password. It was encrypted.

Speaker: 1
02:50:57

But they also have in the database, like, a decrypt function, which they shouldn’t have, but they had it. So I was like, decrypt the password, and the password showed on the screen in the middle of the defense. And so he stood up, like, his face was red. And he shakes my hand, and he leaves to change his password.
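
The decrypt function is the giveaway: if stored passwords can be decrypted, anyone with database access can read them. The usual fix is to store only a salted one-way hash and compare hashes at login. A minimal sketch using Python’s standard library, not whatever that system actually ran:

    import hashlib, hmac, os

    # Store only a salt and a one-way hash; there is nothing to "decrypt".
    def hash_password(password, salt=None):
        salt = salt if salt is not None else os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
        return salt, digest

    def verify_password(password, salt, digest):
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
        return hmac.compare_digest(candidate, digest)  # constant-time comparison

    salt, digest = hash_password("correct horse battery staple")
    assert verify_password("correct horse battery staple", salt, digest)
    assert not verify_password("wrong guess", salt, digest)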

Speaker: 2
02:51:18

That’s awesome.

Speaker: 1
02:51:19

And I graduated. And, you know, they cut me some slack, and I was able to graduate.

Speaker: 2
02:51:24

That’s awesome. Yeah. That’s a great story. We we’ll end with that. Yeah. Thank you very much, brother. I really appreciate it. Yeah.

Speaker: 1
02:51:29

It was a pleasure.

Speaker: 2
02:51:30

That was awesome. That was a great great conversation.

Speaker: 1
02:51:32

Thank you.

Speaker: 2
02:51:33

Your app, let everyone know about it.

Speaker: 1
02:51:34

Replit. Replit.com. You’ll find it.

Speaker: 2
02:51:36

There it is. Yeah. Replit. Yeah. Replit.com.

Speaker: 1
02:51:40

Go go make some apps.

Speaker: 2
02:51:41

Go make some apps, people. Avoid the whatever the hell is gonna happen.

Speaker: 0
02:51:46

There they are. Alright.

Speaker: 2
02:51:47

Thank you very much. Thank you. Bye, everybody.
