#458 – Marc Andreessen: Trump, Power, Tech, AI, Immigration & Future of America




#458 – Marc Andreessen: Trump, Power, Tech, AI, Immigration & Future of America Podcast Episode Description

Marc Andreessen is an entrepreneur, investor, co-creator of Mosaic, co-founder of Netscape, and co-founder of the venture capital firm Andreessen Horowitz.

Thank you for listening ❤ Check out our sponsors: https://lexfridman.com/sponsors/ep458-sc

See below for timestamps, transcript, and to give feedback, submit questions, contact Lex, etc.

Transcript:

Transcript for Marc Andreessen: Trump, Power, Tech, AI, Immigration & Future of America | Lex Fridman Podcast #458

CONTACT LEX:

Feedback – give feedback to Lex: https://lexfridman.com/survey

AMA – submit questions, videos or call-in: https://lexfridman.com/ama

Hiring – join our team: https://lexfridman.com/hiring

Other – other ways to get in touch: https://lexfridman.com/contact

EPISODE LINKS:

Marc’s X: https://x.com/pmarca

Marc’s Substack: https://pmarca.substack.com

Marc’s YouTube: https://www.youtube.com/@a16z

Andreessen Horowitz: https://a16z.com

SPONSORS:

To support this podcast, check out our sponsors & get discounts:

Encord: AI tooling for annotation & data management.

Go to https://encord.com/lex

GitHub: Developer platform and AI code editor.

Go to https://gh.io/copilot

Notion: Note-taking and team collaboration.

Go to https://notion.com/lex

Shopify: Sell stuff online.

Go to https://shopify.com/lex

LMNT: Zero-sugar electrolyte drink mix.

Go to https://drinkLMNT.com/lex

OUTLINE:

(00:00) – Introduction

(12:46) – Best possible future

(22:09) – History of Western Civilization

(31:28) – Trump in 2025

(39:09) – TDS in tech

(51:56) – Preference falsification

(1:07:52) – Self-censorship

(1:22:55) – Censorship

(1:31:34) – Jon Stewart

(1:34:20) – Mark Zuckerberg on Joe Rogan

(1:43:09) – Government pressure

(1:53:57) – Nature of power

(2:06:45) – Journalism

(2:12:20) – Bill Ackman

(2:17:17) – Trump administration

(2:24:56) – DOGE

(2:38:48) – H1B and immigration

(3:16:42) – Little tech

(3:29:02) – AI race

(3:37:52) – X

(3:41:24) – Yann LeCun

(3:44:59) – Andrew Huberman

(3:46:30) – Success

(3:49:26) – God and humanity

PODCAST LINKS:

– Podcast Website: https://lexfridman.com/podcast

– Apple Podcasts: https://apple.co/2lwqZIr

– Spotify: https://spoti.fi/2nEwCF8

– RSS: https://lexfridman.com/feed/podcast/

– Podcast Playlist: https://www.youtube.com/playlist?list=PLrAXtmErZgOdP_8GztsuKi9nrraNbKKp4

– Clips Channel: https://www.youtube.com/lexclips

#458 – Marc Andreessen: Trump, Power, Tech, AI, Immigration & Future of America Podcast Episode Summary

In this episode of the Lex Fridman podcast, Marc Andreessen, a prominent venture capitalist and entrepreneur, shares his insights on the future of technology and society. The conversation begins with a discussion on optimism and Andreessen’s vision for the best possible scenario for big and small tech in the next few years. He emphasizes the importance of innovation and the potential for technology to drive positive change in America.

Andreessen highlights the role of capitalism and individual freedom as key drivers of progress, drawing parallels to historical figures like Milton Friedman. He discusses the importance of maintaining a balance between public and private interests, referencing the Twitter files and the transparency they bring to corporate practices.

Throughout the episode, Andreessen is praised for his communication skills and ability to articulate complex ideas clearly. He reflects on his personal growth and learning journey, acknowledging how his views have evolved over time.

The episode also touches on the influence of sponsors on the podcast, with Lex Fridman clarifying that sponsorships do not affect guest selection or content. Andreessen’s admiration for tools like Notion, which integrate AI effectively for team collaboration, is mentioned.

A recurring theme is the importance of open dialogue and the exchange of ideas, even when they challenge the status quo. Andreessen shares a personal anecdote about a historian who declined a podcast invitation due to potential backlash, highlighting the tension between academic freedom and public perception.

Overall, the episode underscores the transformative power of technology and the need for thoughtful discourse in shaping the future. Andreessen’s insights offer actionable advice for embracing innovation while navigating the complexities of modern society.

This summary was created automatically by Speak.

#458 – Marc Andreessen: Trump, Power, Tech, AI, Immigration & Future of America Podcast Episode Transcript (Unedited)

Speaker: 0
00:00

The following is a conversation with Marc Andreessen, his second time on the podcast. Marc is a visionary tech leader and investor who fundamentally shaped the development of the Internet and the tech industry in general over the past 30 years. He’s the co-creator of Mosaic, the first widely used web browser, co-founder of Netscape, co-founder of the legendary Silicon Valley venture capital firm Andreessen Horowitz, and is one of the most influential voices in the tech world, including at the intersection of technology and politics.

Speaker: 1
00:36

And now a quick few-second mention of each sponsor. Check them out in the description. It’s the best way to support this podcast. We got Encord for unifying your AI stack, GitHub for programming, Notion for team projects and collaboration, Shopify for merch, and Element for hydration. Choose wisely, my friends.

Speaker: 1
00:56

Also, if you want to get in touch with me for whatever reason, go to lexfridman.com/contact. And now onto the full ad reads, no ads in the middle. I try to make this interesting, but if you skip them, please still check out the sponsors. I enjoy their stuff. Maybe you will too.

Speaker: 1
01:12

This episode is brought to you by Encord, a platform that provides data-focused AI tooling for data annotation, curation, and management, and for model evaluation once you train up the model on the data that you curate. In this conversation with Marc Andreessen, we actually discuss what he calls kind of the trillion-dollar questions. And one of them for AI is how effective will synthetic data be?

Speaker: 1
01:42

It really is an open question. What fraction of the intelligence of future models will be based on training on synthetic data? At the top AI labs, I’m hearing a lot of optimism. As far as I can tell, that optimism is not currently, at least in the general case, based on any real evidence.

Speaker: 1
02:04

So I do think synthetic data will play a part. But how big a part? There’s still going to be some curation from humans. There’s still gonna need to be a human in the loop. I think the real question is, how do you effectively integrate the human in the loop so that the synthetic data, sort of 99% synthetic, 1% human, that combination can be most effective.

Speaker: 1
02:33

That’s a real question. And companies like Encord are trying to solve that very problem. First of all, they wanna provide the tooling for the annotation, for the actual human collaboration, but also asking and answering the research question of, like, how do you pull it all off and make the resulting model more intelligent for very specific applications rather than for the general applications.

Speaker: 1
02:56

Yeah. So Encord does a really good job on the tooling side. Go try them out to curate, annotate, and manage your AI data at encord.com/lex. That’s encord.com/lex. This episode is brought to you by GitHub and GitHub Copilot. If you don’t know what that is, my friends, you’re in for a joyous, beautiful surprise.

Speaker: 1
03:22

I think a lot of people that program regularly know and love GitHub and know and love Copilot. It’s the OG AI programming assistant, and it’s the one that’s really trying to win this very competitive space. It is not easy. If you’re somebody that uses VS Code, obviously, well, maybe not obviously, but you can use GitHub Copilot in VS Code. But you can use it also in other IDEs.

Speaker: 1
03:54

I’m gonna be honest with you. It’s a very competitive space. I’m trying all the different tools in the space, and I really love how much GitHub and GitHub Copilot want to win in this competitive space. So I’m excitedly sort of sitting back and just eating popcorn like that Michael Jackson meme and just enjoying the hell out of it.

Speaker: 1
04:19

And, like I said, I’m going to be doing a bunch of programming episodes, including with a guest who, I think, has a love-hate relationship with AI and with AI agents and with the role of AI in the programming experience. And he is really at the forefront of people that are playing with all these languages, with all these different applications, with all the different use cases of code.

Speaker: 1
04:46

And he is a Neovim user, so he’s going to be skeptical in general of new technology. He’s a curmudgeon sitting on a porch in a rocking chair, screaming at the kids, throwing stuff at them. But at the same time, he’s able to play with the kids as well. So I am more on the kids’ side, with the childlike joy, enjoying the new technology.

Speaker: 1
05:08

For me, basically, everything I do, programming wise, has the possibility of AI either reviewing it or assisting it. It’s constantly in the loop. Even if I’m writing stuff from scratch, I’m always just kind of, one second away from asking a question about the code or asking it to generate or rewrite a certain line or to add a few more lines, all that kind of stuff.

Speaker: 1
05:35

So I’m constantly, constantly using it. If you’re learning to code or if you’re an advanced programmer, it is really important that you get better and better at using AI as an assistant programmer. Get started with GitHub Copilot for free today at gh.io/copilot. This episode is also brought to you by Notion, a note-taking and team collaboration tool that Marc Andreessen, on this very episode, sings a lot of praises to.

Speaker: 1
06:05

I believe he sings, was it on mic or off mic? I don’t remember. But, anyway, he loves it. It’s one of the tools, one of the companies, one of the ecosystems that integrate AI really effectively for team applications. When you have, let’s see, like, docs and wikis and projects and all that kind of stuff, you can have the AI load all of that in and answer questions based

Speaker: 0
06:30

on that. You can connect

Speaker: 1
06:31

a bunch of apps. Like, you could connect Slack. You can connect Google Drive. I think in the context, we were talking about something like Notion for email, for, like, Gmail. I don’t know if Notion integrates email yet. But they’re just like this machine that’s constantly increasing the productivity of every aspect of your life, so I’m sure they’re going to start integrating more and more apps.

Speaker: 1
06:56

I use it for Slack and Google Drive, but I use it primarily at the individual level for note taking. And even at the individual level, it’s just incredible what Notion AI can do. Try it out for free when you go to notion.com/lex. That’s all lowercase, notion.com/lex, to try the power of Notion AI today.

Speaker: 1
07:18

This episode is also brought to you by Shopify, a platform designed for anyone to sell anywhere with a great-looking online store. There are few people who embody the joy and the power of capitalism more than Marc Andreessen. I believe Marc and Toby are friends. I was at a thing where Marc and Toby were both there, and they were chatting, and they were very friendly.

Speaker: 1
07:45

So I think they’re friends, and I got to hang out with Toby. And he’s, again, an incredible person. I’ve said it again and again, it’s almost becoming funny that, eventually, we’ll do a podcast. I don’t know why we haven’t done the podcast. There’s a few people in my life where it’s like, Geoffrey Hinton is one of those people.

Speaker: 1
08:05

It’s like, we’ve agreed to do a podcast for so long, and we’ve just been kinda lazy about it. And Toby is the same. Anyway, he’s the CEO of Shopify. I don’t even know if he knows that Shopify sponsors this podcast. It doesn’t matter.

Speaker: 1
08:22

It goes without saying, it should be obvious to everybody, that one doesn’t affect the other. I am very fortunate to have way more sponsors than we could possibly fit, so I could pick whoever the hell I want. And whatever guest I choose will never have anything to do with the companies that sponsor the podcast. There’s not even, like, a tinge of influence.

Speaker: 1
08:46

In fact, if there’s anything, it’ll be the opposite direction, but I also try to avoid that. You know? It’s possible I talk to the CEO of GitHub, for example, on this podcast, and GitHub sponsors this podcast. It’s possible I talk to the CEO of Shopify, Toby, and Shopify sponsors this podcast.

Speaker: 1
09:03

One doesn’t affect the other, and, obviously, again, it goes without saying, but let me say it, make it explicit: nobody can buy their way onto the podcast, whether through sponsorships or buying me dinner or whatever. I don’t know. It’s just impossible. And most likely, if that’s attempted, it’s going to backfire.

Speaker: 1
09:27

So I think people intuitively know not to attempt it because it would really piss me off. Anyway, this is a detour. We’re supposed to talk about Shopify. I have a Shopify store, lexfridman.com/store, that sells T-shirts, but you can sell more sophisticated stuff, make a lot of money, and participate in this beautiful machinery of capitalism.

Speaker: 1
09:50

Sign up for a $1 per month trial period at shopify.com/lex. That’s all lowercase. Go to shopify.com/lex to take your business to the next level today. This episode is also brought to you by Element, my daily zero-sugar and delicious electrolyte mix, of which I consume ridiculously large amounts.

Speaker: 1
10:12

You know, salt used to be currency in the ancient world. How silly are humans? We’re not silly. How sort of surprising the things we converge on as being the store of value. Just value in general, the kind of things we assign value to together.

Speaker: 1
10:31

We just kind of all agree that this item, this material, this idea, this building is extremely valuable. And then we compete over that resource or that idea or that building, and we fight, and sometimes there’s wars and sometimes there’s complete destruction and the rise and fall of empires, all over some resource.

Speaker: 1
10:58

What a funny, strange little world. Mostly harmless, says Hitchhiker’s Guide to the Galaxy about humans. For some reason, instead of that book, I was gonna say Catcher in the Rye. In my exhausted brain, the books kind of all morphed together. The Catcher in the Rye is a really damn good book.

Speaker: 1
11:23

All of the classics I return to often. The simple books are just, even, like, the first book I read in English, a trivial book, trivial book called The Giver. It’s like I’ve returned to it in its simplicity. Maybe it has sentimental value. Maybe that’s what it is. But just simplicity of words.

Speaker: 1
11:45

Animal Farm, I’ve read, I don’t know how many times, probably over 50 times. I returned to it over and over and over, the simplicity, the poetry of that simplicity. It’s something that just resonates with my brain. Maybe it’s a peculiar kind of brain. It is a peculiar kind of brain.

Speaker: 1
12:01

And I have to thank you for being patient with this peculiar kind of brain. Get a sample pack for free with any purchase of whatever the thing I was talking about, which I think is Element. Try it at drinkLMNT.com/lex.

Speaker: 0
12:20

This is the Lex Fridman Podcast. To support it, please check out our sponsors in the description. And now, dear friends, here’s Marc Andreessen. Alright. Let’s start with optimism. If you were to imagine the best possible one to two years, 2025, ’26 for America, for big tech and small tech, what would it be?

Speaker: 0
12:59

What would it look like? Lay out your vision for the best possible scenario trajectory for America.

Speaker: 2
13:05

The roaring twenties.

Speaker: 0
13:06

Roaring twenties.

Speaker: 2
13:07

The roaring twenties. I mean, look, a couple things. It is remarkable over the last several years, with all of the issues, including, you know, not just everything in politics, but also COVID and every other thing that’s happened. It’s really amazing. The United States just kept growing. If you just look at economic growth charts, the US just kept growing, and very significantly, many other countries stopped growing.

Speaker: 2
13:23

So Canada stopped growing. The UK has stopped growing. Germany has stopped growing, and, you know, some of those countries may be actually going backwards at this point. And there’s a very long discussion to be had about what’s wrong with those countries. And there’s, of course, plenty of things that are wrong with our country, but, the US is just flat out primed for growth.

Speaker: 2
13:40

And I think that’s a consequence of many factors, you know, some of which are luck and some of which come through hard work. And so the lucky part is just, you know, number one, we just have, like, incredible physical security by being our own continent. You know, we have incredible natural resources. Right?

Speaker: 2
13:56

There’s this running joke now that, like, whenever it looks like the US is gonna run out of some, like, rare earth material, you know, some farmer in North Dakota, like, kicks over a hay bale and finds, like, a $2,000,000,000,000 deposit. Mhmm. Right? I mean, we’re just, like, blessed, you know, with geography and the natural resources. Energy, you know, we can be energy independent anytime we want.

Speaker: 2
14:13

This last administration decided they didn’t wanna be. They wanted to turn off American energy. This new administration has declared that they have a goal of turning it on in a dramatic way. There’s no question we can be energy independent. We can be a giant net energy exporter. It’s purely a question of choice, and I think the new administration is going to do that.

Speaker: 2
14:30

And so, oh, and then I would say two other things. One is, you know, we are the beneficiaries, and, you know, you’re an example of this. We’re the beneficiary of, you know, 50, 100, 200 years of, like, basically the most aggressive and smartest people in the world, most capable people, you know, moving to the US and raising their kids here.

Speaker: 2
14:46

And so we just have, you know, by far the most dynamic population, most aggressive, you know, we’re the most aggressive set of characters in certainly any Western country and have been for a long time and certainly are today.

Speaker: 2
14:59

And then finally, I would just say, look, we are overwhelmingly the advanced technology leader. You know, we have our issues, and we have, I would say, a particular issue with manufacturing, which we could talk about. But for, you know, anything in software, anything in AI, anything in, you know, all these advanced areas of technology, like, we’re by far the leader.

Speaker: 2
15:17

Again, in part, because many of the best scientists and engineers in those fields, you know, come to the US. And so we just have all of the preconditions for just a monster boom. You know, I could see economic growth going way up. I could see productivity growth going way up, rate of technology adoption going way up.

Speaker: 2
15:34

And then we can do a global tour if you like, but, like, basically all of our competitors have, like, profound issues, and, you know, we could kinda go through them one by one, but the competitive landscape, it’s just remarkable how much better positioned we are for growth.

Speaker: 0
15:50

What about the humans themselves? Almost a philosophical question. You know, I travel across the world, and there’s something about the American spirit, the entrepreneurial spirit, that’s uniquely intense in America. I don’t know what that is. I’ve talked to someone who claims it might be the Scots Irish blood that runs through the history of America.

Speaker: 0
16:11

What is it? You’re at the heart of Silicon Valley. Is there something in the water? Why is there this entrepreneurial spirit?

Speaker: 2
16:19

Yeah. So is this a family show, or am I allowed to swear?

Speaker: 0
16:21

You could say whatever the fuck you want.

Speaker: 2
16:23

Okay. So the great TV show Succession. The show, of course, in which you were intended to root for exactly zero of the characters.

Speaker: 0
16:30

Yes.

Speaker: 2
16:30

The best line from Succession was in the final episode of the first season when the whole family’s over in Logan Roy’s ancestral homeland of Scotland, and they’re at this castle, you know, for some wedding. And Logan is, like, completely miserable, you know, because he’s been in New York for 50 years.

Speaker: 2
16:44

He’s totally miserable being back in Scotland, and he gets in some argument with somebody, and finally he just says, my god, I cannot wait to get out of here and go back to America where we can fuck without condoms.

Speaker: 0
16:58

Was that a metaphor or okay.

Speaker: 2
17:00

Exactly. Right? And so, no. But it’s exactly the thing. And then everybody instantly knows what that means. Yeah. Everybody watching that instantly starts laughing because you know what it means, which is, it’s exactly this. I think there’s, like, an ethnographic, you know, way of looking at it.

Speaker: 2
17:09

There’s a bunch of books on, like, all, like you said, the Scots Irish, like, all the different derivations of all the different ethnic groups that have come to the US over the course of the last 400 years. Right? And what we have is this sort of amalgamation of, like, you know, the Northeast Yankees, who were, like, super tough and hardcore.

Speaker: 2
17:24

Yeah. The Scots Irish are super aggressive. You know, we’ve got the southerners and the Texans, you know, and the sort of, you know, whole kind of blended Anglo-Hispanic thing with, you know, super incredibly tough, strong, driven, you know, capable characters, you know, the Texas Rangers.

Speaker: 2
17:38

You know, we’ve got California. You know, we’ve got the wild, we’ve got the incredibly, you know, inventive hippies, but we also have the hardcore engineers. We’ve got, you know, the best rocket scientists in the world. We’ve got the best, you know, artists in the world, you know, creative professionals, you know, the best movies.

Speaker: 2
17:53

And so, yeah, there is, you know, let’s say all of our problems, I think, are basically, you know, in my view, to some extent, attempts to basically sand all that off and make everything basically boring and mediocre. But there is something in the national spirit that basically keeps bouncing back.

Speaker: 2
18:11

And basically, what we discover over time is we basically just need people to stand up at a certain point and say, you know, it’s time to build. It’s time to grow. You know, it’s time to do things. And so there’s something in the American spirit that just, like, rears right back to life.

Speaker: 2
18:23

And I’ve seen it before. I actually saw it as a kid here in the early eighties. You know, because the seventies were, like, horribly depressing, right, in the US. Like, they were a nightmare on many fronts.

Speaker: 2
18:35

And in a lot of ways, the last decade to me has felt a lot like the seventies. Just being mired in misery, and just this self-defeating, you know, negative attitude, and everybody’s upset about everything, and, you know, and then, by the way, like, energy crisis and hostage crisis and foreign wars and just demoralization.

Speaker: 2
18:52

Right? You know, the low point in the seventies was, you know, Jimmy Carter, who just passed away. He went on TV, and he gave this speech known as the malaise speech, and it was, like, the weakest possible attempt to, like, rouse people back to a sense of, like, passion. Completely failed. And, you know, we had the hostages in, you know, Iran for, I think, 440 days.

Speaker: 2
19:10

And every night on the nightly news, it was, you know, lines around the block, energy crisis, depression, inflation. And then, you know, Reagan came in, and, you know, Reagan was a very controversial character at the time. And, you know, he came in and he’s like, yep, nope, it’s morning in America, and we’re the shining city on the hill, and we’re gonna do it. And he did it, and we did it.

Speaker: 2
19:26

And the national spirit came roaring back, and, you know, we worked really hard for a full decade. And I think that’s exactly what, well, we’ll see, but I think that’s what could happen here.

Speaker: 0
19:34

And I just did a super long podcast on Milton Friedman with Jennifer Burns, who’s this incredible professor at Stanford, and he was part of the Reagan era. So there’s a bunch of components to that, one of which is economic. Yes. And one of which, maybe you can put a word on it, of, not to be romantic or anything, but freedom.

Speaker: 0
19:53

Individual freedom, economic freedom, political freedom, and just in general individualism.

Speaker: 2
19:59

Yeah. That’s right. Yeah. Yeah. And as you know, America has this incredible streak of individualism, you know. Individualism in America probably peaked, I think, between roughly, call it, the end of the Civil War, 1865, to probably, call it, 1931 or something. You know, and there was this, like, incredible run.

Speaker: 2
20:14

I mean, that period, you know, we now know that period as the Second Industrial Revolution, and it’s when the United States basically assumed global leadership and basically took over technological and economic leadership from England. And then, you know, that led to, you know, ultimately being able to, you know, not only industrialize the world, but also win World War II and then win the Cold War.

Speaker: 2
20:31

And, yeah, there’s a massive, you know, massive individualistic streak. By the way, you know, Milton Friedman’s old videos are all on YouTube. They are every bit as compelling and inspiring Yep. As they were then. You know, he’s a singular figure.

Speaker: 2
20:46

Many of us, you know, I never knew him, but he was actually at Stanford for many years at the Hoover Institution, but I never met him. But I know a lot of people who worked with him, and, you know, he was a singular figure, but all of his lessons, you know, live on and are fully available.

Speaker: 2
21:01

But I would also say it’s not just individualism. And this is one of the big things that’s, like, playing out in a lot of our culture and kind of political fights right now, which is, you know, basically this feeling, you know, certainly that I have and I share with a lot of people, which is it’s not enough for America to just be an economic zone, and it’s not enough for us to just be individuals, and it’s not enough to just have line go up, and it’s not enough to just have economic success.

Speaker: 2
21:23

There are deeper questions at play, and also, you know, there’s more to a country than just that. And, you know, quite frankly, a lot of it is intangible. A lot of it, you know, involves spirit and passion. And, you know, like I said, we have more of it than anybody else, but, you know, we have to choose to want it.

Speaker: 2
21:41

The way I look at it is, like, all of our problems are self-inflicted. Like, you know, decline is a choice. You know, all of our problems are basically demoralization campaigns. You know, basically people in positions of authority telling us that, you know, we shouldn’t stand out. We shouldn’t be adventurous.

Speaker: 2
21:56

We shouldn’t be exciting. We shouldn’t be exploratory. You know, we shouldn’t, you know, this, that, and the other thing, and we should feel bad about everything that we do. And I think we’ve lived through a decade where that’s been the prevailing theme, and I think, quite honestly, as of November, I think people are done with it.

Speaker: 0
22:09

If we could go on a tangent of a tangent, since we’re talking about individualism and that’s not all that it takes. You’ve mentioned in the past the book The Ancient City —

Speaker: 1
22:18

Yes.

Speaker: 0
22:19

Yeah, if I could only pronounce the name. French historian, Numa Denis Fustel de Coulanges. I don’t know.

Speaker: 2
22:25

That was amazing.

Speaker: 0
22:26

Okay. Alright. From the 19th century. Anyway, you said this is an important book to understand who we are and where we come from.

Speaker: 2
22:31

So what that book does — it’s actually quite a striking book. So that book is written by this guy — as a professor — you say... I’m gonna let Lex do the foreign-language pronunciations for the day. He was a professor of classics at the Sorbonne in Paris, you know, the top university, actually in the 1860s.

Speaker: 2
22:51

So actually, it came out right around after the US Civil War. And he was a savant of a particular kind, which is — and you can see this in the book — he had apparently read and, you know, sort of absorbed and memorized every possible scrap of Greek and Roman literature. And this was, like, a walking index on basically everything we know about Greek and Roman culture. And that’s significant.

Speaker: 2
23:10

The reason this matters is because basically none of that has changed. Right? And so he had access to the exact same written materials that we have access to. And so, you know, we’ve learned nothing new since. And then specifically, what he did is he talked about the Greeks and the Romans, but he went back further.

Speaker: 2
23:23

He reconstructed the people who came before the Greeks and the Romans and what their life and society was like, and these were the people who are now known as the Indo-Europeans. And you may have heard of these — these are the people who came down from the steppes, and so they came out of what’s now, like, Eastern Europe, like around sort of the outskirts of what’s now Russia, and then they sort of swept through Europe.

Speaker: 2
23:40

They ultimately took over all of Europe. By the way, you know, many of the ethnicities in the hundreds of years to follow, you know, are Indo-European. And so, like, you know, they were basically this warrior class that, like, came down and swept through and, you know, essentially populated much of the world.

Speaker: 2
23:55

And there’s a whole interesting saga there. And then from there came basically what we know as the Greeks and the Romans, which were kind of evolutions off of that. And so what he reconstructs is sort of what life was like, at least in the West, for people in their kind of original social state.

Speaker: 2
24:11

And the significance of that is, the original social state is living in the state of the absolute imperative for survival with absolutely no technology. Right? Like, no modern systems. No nothing. Right? You’ve got the clothing on your back, you’ve got, you know, whatever you can build with your bare hands, right?

Speaker: 2
24:26

This, you know, predates basically all concepts of technology as we understand it today. And so these are people under, like, maximum levels of physical pressure, and so what social patterns did they evolve to be able to deal with that? And the social pattern basically was as follows.

Speaker: 2
24:40

It’s a 3-part social structure — family, tribe, and city — and zero concept of individual rights, and essentially no concept of individualism. And so you were not an individual. You were a member of your family, and then a set of families would aggregate into a tribe, and then a set of tribes would aggregate into a city.

Speaker: 2
25:00

And then the morality was completely — it was actually what Nietzsche talks about. The morality was entirely master morality, not slave morality. And so in their morality, anything that was strong was good, and anything that was weak was bad. And it’s very clear why that is. Right?

Speaker: 2
25:14

It’s because strong equals good equals live. Weak equals bad equals die. And that led to what became known later as the master-slave dialectic, which is: is it more important for you to live on your feet as a master, even at the risk of death? Or are you willing to, you know, live as a slave on your knees in order to not die?

Speaker: 2
25:28

And this is sort of the derivation of that moral framework. Christianity later inverted that moral framework, but, you know, the original framework lasted for, you know, many, many thousands of years. No concept of individualism. The head of the family had total life-and-death control over the family. The head of the tribe, same thing. Head of the city, same thing.

Speaker: 2
25:44

And then you were morally obligated to kill members of the other cities on contact. Right? You were morally required to. Like, if you didn’t do it, you were a bad person. And then the form of the society was basically maximum fascism combined with maximum communism. Right?

Speaker: 2
25:59

And so it was maximum fascism in the form of this, like, absolute top-down control, where the head of the family, tribe, or city could kill other members of the community at any time with no repercussions at all. So, like, maximum fascism, but combined with maximum communism, which is no market economy, and so everything gets shared. Right?

Speaker: 2
26:16

And sort of the point of being in one of these collectives is that, you know, people are sharing. And, of course, that limited how big they could get, because, you know, the problem with communism is it doesn’t scale. Right? It works at the level of a family.

Speaker: 2
26:27

It’s much harder to make it work at the level of a country. Impossible. Maximum fascism, maximum communism. And then it was all intricately tied into their religion. And their religion was in 2 parts. It was veneration of ancestors, and it was veneration of nature.

Speaker: 2
26:43

And the veneration of ancestors is extremely important, because basically the ancestors were the people who got you to where you were. The ancestors were the people who had everything to teach you. Right? And then it was veneration of nature because, of course, nature is the thing that’s trying to kill you.

Speaker: 2
26:56

And then every family, tribe, or city had their ancestor gods, and then they had their nature gods. Okay. So fast-forward to today: like, we live in a world that is, like, radically different — and the book takes you through kind of what happened from that, through the Greeks and Romans, through to Christianity.

Speaker: 2
27:10

But it’s very helpful to kind of think in these terms, because the conventional view of the progress through time is that — you know, the cliche is the arc of the moral universe, you know, bends towards justice, right, or so-called Whig history, which is, you know, that the arc of progress is positive.

Speaker: 2
27:25

Right? And so, you know, what you hear all the time, what you’re taught in school and everything, is, you know, every year that goes by, we get better and better and more and more moral and more and more pure and a better version of ourselves. Our Indo-European ancestors would say, oh no, like, you people have, like, fallen to shit.

Speaker: 2
27:39

Like, you people took all of the principles of basically your civilization, and you have diluted them down to the point where they barely even matter. You know? And you’re having, you know, children out of wedlock, and you, you know, regularly encounter people of other cities, and you don’t try to kill them.

Speaker: 2
27:52

And, like, how crazy is that? And they would basically consider us to be living in an incredibly diluted version of this sort of highly religious, highly cultish, right, highly organized, highly fascist-communist society. I can’t resist noting that as a consequence of basically going through all the transitions we’ve been through — going all the way through Christianity, coming out the other end of Christianity — Nietzsche declares God is dead.

Speaker: 2
28:15

We’re in a secular society, you know, that still has, you know, tinges of Christianity, but, you know, largely prides itself on no longer being religious in that way. You know, we — being the sort of most fully evolved modern secular, you know, expert scientists and so forth — have basically re-evolved, or fallen back on, the exact same religious structure that the Indo-Europeans had: specifically, ancestor worship, which is identity politics, and nature worship, which is environmentalism.

Speaker: 2
28:40

And so we have actually, like, worked our way all the way back to their cult religions without realizing it. And it just goes to show that, like, you know, in some ways we have fallen far from the family tree, but in some cases, we’re exactly the same.

Speaker: 0
28:53

You kind of described this progressive idea of wokeism and so on as worshiping ancestors.

Speaker: 2
29:00

Identity politics is worshiping ancestors. Right? It’s tagging newborn infants with either, you know, benefits or responsibilities or, you know, levels of condemnation based on who their ancestors were. The Indo-Europeans would have recognized it on sight. We somehow think it’s, like, super socially progressive.

Speaker: 1
29:16

Yeah. And it is not.

Speaker: 2
29:17

I mean, I would say obviously not. Let’s, you know, get to the nuances, which is where I think you’re headed, which is: look, like, the idea that you can, like, completely reinvent society every generation and have no regard whatsoever for what came before you — that seems like a really bad idea. Right? That’s, like, the Cambodians with Year Zero under Pol Pot, and, you know, death, you know, follows.

Speaker: 2
29:34

The Soviets obviously tried that. You know, the utopian fantasists who think that they can just rip up everything that came before and create something new in the human condition and human society have a very bad history of, you know, causing enormous destruction.

Speaker: 2
29:47

So on the one hand, it’s, like, okay: there is a deeply important role for tradition. And the way I think about that is, it’s the process of evolutionary learning. Right? Which is, what tradition ought to be is the distilled wisdom of all — and, you know, this is how the Indo-Europeans thought about it.

Speaker: 2
30:01

It should be the distilled wisdom of everybody who came

Speaker: 0
30:03

before you.

Speaker: 2
30:03

Right? All those important and powerful lessons learned. And that’s why I think it’s fascinating to go back and study how these people lived, because that’s part of the history and, you know, part of the learning that got us to where we are today. Having said that, there are many around the world that are, you know, mired in tradition to the point of not being able to progress.

Speaker: 2
30:19

And in fact, you might even say globally that’s the default human condition, which is, you know, a lot of people are in societies in which, you know, there’s, like, absolute seniority by age. You know, kids are completely — you know, like, in the US, for some reason, we decided kids are in charge of everything. Right?

Speaker: 2
30:32

And, like, you know, they’re the trendsetters, and they’re allowed to, like, set all the agendas and, like, set the politics and set the culture, and maybe that’s a little bit crazy. But, like, in a lot of other cultures, kids have no voice at all. No role at all, because it’s the old people who are in charge of everything.

Speaker: 2
30:44

You know, they’re gerontocracies, and it’s all a bunch of 80-year-olds running everything — which, by the way, we have a little bit of that too. Right? And so what I would say is, like, there’s a real tension there. You know, full traditionalism is communitarianism. You know, it’s ethnic particularism.

Speaker: 2
30:59

You know, it’s ethnic chauvinism. It’s, you know, this incredible level of resistance to change. You know, that’s — I mean, it just doesn’t get you anywhere. Like, it may be good and fine at the level of an individual tribe, but as a society, like, living in the modern world, you can’t evolve.

Speaker: 2
31:13

You can’t advance. You can’t participate in all the good things that, you know, have happened. And so, you know, I think probably this is one of those things where extremism on either side is probably a bad idea. But, you know, this needs to be approached in a sophisticated and nuanced way.

Speaker: 0
31:29

So the beautiful picture you painted of the roaring twenties, how can the Trump administration play a part in making that future happen?

Speaker: 2
31:37

Yeah. So, look, a big part of this is getting the government boot off the neck of the American economy, the American technology industry, the American people. You know, and again, this is a replay of what happened in the sixties and seventies, which is, you know — for what started out looking like, you know, I’m sure, good and virtuous purposes — we ended up, both then and now, with, you know, what I describe as sort of a form of soft authoritarianism.

Speaker: 2
31:58

You know, the good news is it’s not like a military dictatorship. It’s not like, you know, you get thrown into Lubyanka, you know, for the most part. They’re not coming at 4 in the morning; you’re not getting dragged off to a cell. So it’s not hard authoritarianism, but it is soft authoritarianism, and so it’s this, you know, incredible suppressive blanket of regulation, rules — you know, this concept of a vetocracy. Right?

Speaker: 2
32:17

What’s required to get anything done? You know, you need to get 40 people to sign off on anything. Any one of them can veto it. You know, it’s a lot of how our political system now works. And then, you know, just this general idea of, you know, progress is bad, and technology is bad, and capitalism is bad, and building businesses is bad, and success is bad.

Speaker: 2
32:35

You know, tall poppy syndrome — you know, basically anybody who sticks their head up, you know, deserves to get it chopped off. Anybody who’s wrong about anything deserves to get condemned forever. You know, just this very kind of, you know, grinding, you know, repression, and then coupled with specific government actions such as censorship regimes, right, and debanking. Right?

Speaker: 2
32:55

And, you know, draconian — you know, deliberately kneecapping, you know, critical American industries. And then, you know, congratulating yourselves for doing it, or, you know, having these horrible social policies, like, let’s let all the criminals out of jail and see what happens.

Speaker: 2
33:07

Right? And so, like, we’ve just been through this period — you know, I call it a demoralization campaign. Like, we’ve just been through this period where, you know, whether it started that way or not, it ended up basically being this comprehensive message that says: you’re terrible, and if you try to do anything, you’re terrible, and so, fuck you.

Speaker: 2
33:20

And the Biden administration reached kind of the full pinnacle of that in our time. They got really bad on many fronts at the same time. And so just, like, relieving that, and getting kind of back to a reasonably, you know, kind of optimistic, constructive, you know, pro-growth frame of mind —

Speaker: 2
33:37

There’s just so much pent-up energy and potential in the American system that that alone is gonna, I think, cause, you know, growth and spirit to take off. And then there’s a lot of things that could be done proactively.

Speaker: 0
33:50

So how do you relieve that? To what degree has the thing you described ideologically permeated government and permeated big companies?

Speaker: 2
34:00

Disclaimer at first, which is: I don’t wanna predict anything on any of this stuff, because I’ve learned the hard way that I can’t predict politics or Washington at all. But I would just say that the plans and intentions are clear, and the staffing supports it, and all the conversations are consistent, with the new administration, and that they plan to take, you know, very rapid action on a lot of these fronts very quickly.

Speaker: 2
34:18

They’re gonna do as much as they can through executive orders, and then they’re gonna do legislation and regulatory changes for the rest. And so they’re gonna move, I think, quickly on a whole bunch of stuff. You can already feel, I think, a shift in the national spirit — or at least, let’s put it this way: I feel it for sure in Silicon Valley.

Speaker: 2
34:31

I mean, we just saw a great example of this with what Mark Zuckerberg is doing. You know, and, you know, I’m involved with his company, but we just saw, kind of in public — the scope and speed of the changes, you know, are reflective of a lot of these shifts.

Speaker: 2
34:45

But I would say that that same conversation, those same kinds of things, are happening throughout the industry. Right? And so in the tech industry itself, whether people were pro-Trump or anti-Trump, like, there’s just, like, a giant vibe shift, mood shift, that’s, like, kicked in already.

Speaker: 2
34:57

And then I was with a group of Hollywood people about 2 weeks ago, and they were still — you know, people who at least vocally were still very anti-Trump. But I said, you know, has anything changed since November 6th? And they immediately said, oh, it’s completely different.

Speaker: 2
35:10

It feels like the ice has thawed, you know, woke is over. You know, they said that all kinds of projects are gonna be able to get made now that couldn’t before, that, you know, Hollywood’s gonna start making comedies again. You know, like, it’s just, like, an incredible, immediate environmental change.

Speaker: 2
35:26

And as I talk to people kind of throughout the economy — you know, certainly people who run businesses — I hear that all the time, which is just: this last 10 years of misery is just over. I mean, the one that I’m watching that’s really funny — I mean, Meta’s getting a lot of attention, but the other funny one is BlackRock, which — you know, I don’t know him, but I’ve watched for a long time.

Speaker: 2
35:43

And so, you know, Larry Fink, who’s the CEO of BlackRock, was, like, first in as a major, you know, investment CEO on, like, every dumb social trend and rule — so, like, every — okay, I’m going for it — every retarded thing you can imagine, every ESG, every possible way of saddling companies with every aspect of these crazed ideological positions.

Speaker: 2
36:09

He literally, like, had aggregated together trillions of dollars of shareholdings that were not his — that were, you know, his customers’ — right, and he, you know, seized voting control of their shares and was using it to force all these companies to do all of this, like, crazy ideological stuff.

Speaker: 2
36:24

And he was, like, the Typhoid Mary of all this stuff in corporate America. And he, in the last year, has been, like, backpedaling from that stuff, like, as fast as he possibly can. And as just an example: last week, he pulled out of the — whatever the corporate net zero thing is. You know, he pulled out of the crazy energy stuff.

Speaker: 2
36:39

And so, like, you know, he’s backing away as fast as he can. He’s doing — remember the Richard Pryor backwards walk? Mhmm. Richard Pryor had this way where he could back out of a room while looking like he was walking forward. And so, you know, even they’re doing that. And the whole thing — I mean, I don’t know if you saw, the court recently ruled on this — Nasdaq had these crazy board of directors composition rules.

Speaker: 2
37:01

One of the funniest moments in my life is when my friend Peter Thiel and I were on the Meta board, and these Nasdaq rules came down that mandated diversity on corporate boards, and so we sat around the table and had to figure out, you know, which of us counted as diverse. And the very professional attorneys, you know, explained with a 100% completely straight face that Peter Thiel counts as diverse, by virtue of being LGBT.

Speaker: 2
37:22

And this is a guy who literally wrote a book called The Diversity Myth.

Speaker: 0
37:26

Yeah.

Speaker: 2
37:27

And he literally looked like he swallowed a live goldfish. And this was imposed — I mean, this was, like, so incredibly offensive to him that, like, it was just absolutely appalling, and I felt terrible for him, but the look on his face was very funny.

Speaker: 2
37:40

And it was imposed by Nasdaq. You know, your stock exchange is imposing this stuff on you. And then the court — whatever, the court of appeals — just nuked that. You know, it’s like these things basically are being, like, ripped down one by one, and what’s on the other side of it is basically, you know, finally being able to get back to everything that, you know, everybody always wanted to do, which is, like, run their companies, have great products, have happy customers, you know, like, succeed, achieve, outperform, and, you know, work with the best and the brightest, and not be made to feel bad about it.

Speaker: 2
38:08

And I I think that’s happening in many areas of American society.

Speaker: 0
38:11

It’s great to hear that Peter Thiel is fundamentally a diversity hire.

Speaker: 2
38:15

Well, so it was very — you know, there was a moment. So Peter, of course, you know, is publicly gay, has been for a long time. You know, but, you know, there are other men on the board. Right? And, you know, we’re sitting there, and we’re all looking at it. We’re like, alright. Like, okay.

Speaker: 2
38:28

LGBT — and we just keep coming back to the B. Right? And it’s like, you know? It’s like —

Speaker: 0
38:37

Alright.

Speaker: 2
38:38

You know, I’m willing to do a lot for this company, but —

Speaker: 0
38:42

It’s all about sacrifice for diversity.

Speaker: 2
38:44

Well, yeah. And then it’s like, okay. Like, is there a test? Right. You know?

Speaker: 0
38:49

So Oh, yeah. Exactly. And how do you prove it?

Speaker: 2
38:52

The the questions that got asked. You know?

Speaker: 0
38:54

What are you willing to do? Yeah.

Speaker: 2
38:56

And I think I’m pretty good — I think I’m very good at asking lawyers completely absurd questions with a totally straight face.

Speaker: 0
39:02

And do they answer with a straight face?

Speaker: 2
39:05

Yes. Okay. I think, in fairness, they have trouble telling when I’m joking.

Speaker: 0
39:09

So you mentioned the Hollywood folks, maybe people in Silicon Valley, and the vibe shift. Maybe you can speak to preference falsification. What do they actually believe? How many of them actually hate Trump? Like, what percent of them are feeling this vibe shift and are interested in creating the roaring twenties in the way you’ve described?

Speaker: 2
39:34

So first, we should maybe talk about population. So there’s, like, all of Silicon Valley. And the way to just measure that is to look at voting records. Right? And what that shows consistently is that Silicon Valley, you know — at least historically, the entire time — has been overwhelmingly majority, just straight-up Democrat.

Speaker: 2
39:48

The other way to look at that is political donation records. And, again, you know, the political donations in the valley, you know, range from 90 to 99%, you know, to one side. And so, you know, I just bring it up because, like, we’ll see what happens with the voting and with donations going forward.

Speaker: 2
40:01

We can maybe talk about the fire later, but I can tell you there is a very big question of what’s happening in Los Angeles right now. I don’t wanna get into the fire, but, like, it’s catastrophic, and, you know, there’s a shift in the big cities in California, and I think a lot of people in LA are really thinking about things right now as they’re trying to, you know, literally save their houses and save their families.

Speaker: 2
40:20

But, you know, even in San Francisco, there was a big shift to the right in the voting in ’24. So we’ll see where that goes. But, you know, you observe that by just looking at the numbers over time. The part that I’m more focused on is — you know, and I don’t know how to exactly describe this, but it’s like the top 1,000 or the top 10,000 people.

Speaker: 2
40:38

Right? And, you know, I don’t have a list, but, like, it’s, you know, all the top founders, top CEOs, top executives, top engineers, top VCs, and then kind of on into the ranks. You know, the people who kind of built and run the companies. And they’re — you know, I don’t have numbers, but I have a much more tactile feel, you know, for what’s happening.

Speaker: 2
40:56

So the big thing I have now come to believe is that the idea that people have beliefs is mostly wrong. I think that most people just go along, and I think even most high-status people just go along, and I think maybe the most high-status people are the most prone to just go along, because they’re the most focused on status.

Speaker: 2
41:18

So the way I would describe that is, you know, one of the great forbidden philosophers of our time is the Unabomber, Ted Kaczynski. And amidst his madness, he had this extremely interesting articulation. You know, he was an insane lunatic murderer, but he was also, you know, a Harvard super genius.

Speaker: 2
41:35

Not that those are in conflict. But —

Speaker: 0
41:40

Shots fired. Yeah.

Speaker: 2
41:41

But he was a very bright guy, and he did this whole thing where he talked about — he was very right-wing and talked about leftism a lot, and he had this great concept that’s just stuck in my mind ever since I read it, which is this concept called oversocialization. And so, you know, most people are socialized. Like, most people are — you know, we live in a society.

Speaker: 2
42:02

Most people learn how to be part of a society. They give some deference to the society. There’s something about modern Western elites where they’re oversocialized, and they’re just, like, overly oriented towards what other people like themselves, you know, think and believe.

Speaker: 2
42:15

And you can get a real sense of that if you have a little bit of an outside perspective, which I just do, I think, as a consequence of where I grew up. Like, even before I had the views that I have today, there was always just this weird thing where it’s, like: why does every dinner party have the exact same conversation?

Speaker: 2
42:31

Why does everybody agree on every single issue? Why is that agreement precisely what was in the New York Times today? Why are these positions not the same as they were 5 years ago? Right? But why does everybody, like, snap into agreement every step of the way?

Speaker: 2
42:47

And that was true when I came to Silicon Valley, and it’s just as true today, 30 years later. And so I think most people are just literally taking their cues from — it’s some combination of the press, the universities, the big foundations. So it’s, like, basically the New York Times, Harvard, the Ford Foundation, and, you know, I don’t know, a few CEOs, and a few public figures, and, you know, maybe the president if your party’s in power.

Speaker: 2
43:09

And, like, whatever that is — everybody who’s sort of good and proper and elite and in good standing and in charge of things, and a sort of correct member of, you know, let’s call it coastal American elites — everybody just believes those things. And then, you know, the 2 interesting things about that are: number 1, there’s no divergence among the organs of power. Right?

Speaker: 2
43:28

So Harvard and Yale believe the exact same thing. The New York Times, The Washington Post believe the exact same thing. The Ford Foundation and the Rockefeller Foundation believe the exact same thing. Google and, you know, whatever. You know, Microsoft believe the exact same thing.

Speaker: 2
43:39

But those things change over time — there’s just never conflict in the moment. Right? And so, you know, the New York Times and the Washington Post agreed on exactly everything in 1970, like, 1990, 2000, 2010, and 2020, despite the fact that the specifics changed radically.

Speaker: 2
43:56

The lockstep was what mattered. And so I think basically we in the Valley are on the tail end of that, in the same way Hollywood’s on the tail end of that, in the same way New York’s on the tail end of that, the same way the media is on the tail end of that. It’s, like, some sort of collective hive-mind thing.

Speaker: 2
44:09

And I just go through that to say, like, I don’t think most people in my orbit — or, you know, let’s say the top 10,000 people in the Valley, or the top 10,000 people in LA — I don’t think they’re sitting there thinking, basically, I have rock-solid beliefs. I mean, they probably think they have rock-solid beliefs, but they don’t actually have, like, some inner core of rock-solid beliefs that they hold while they kind of watch reality change around them and try to figure out how to keep their beliefs, like, correct.

Speaker: 2
44:29

I don’t think that’s what happens. I think what happens is they conform to the belief system around them. And and I think most of the time, they’re not even aware that that that they’re basically part of a herd.

Speaker: 0
44:38

Is it possible that, beneath the surface chatter of dinner parties, there is a turmoil of ideas and thoughts and beliefs that’s going on, but you’re just talking to people really close to you, or in your own mind? And then the socialization happens at the dinner parties.

Speaker: 0
44:56

Like, when you go outside the inner circle of 1, 2, 3, 4 people who you really trust, then you start to conform. But inside there, inside the mind, there is an actual belief, or a struggle, a tension with the New York Times. For the listener: there’s a slow smile that overtook Marc Andreessen’s face.

Speaker: 2
45:17

So, like, I’ll just tell you what I think, which is: at the dinner parties and at the conferences, no. There’s none of that. What there is, is that all of the heretical conversations — anything that challenges the status quo, any heretical idea — and any new idea, you know, is a heretical idea —

Speaker: 2
45:32

Any deviation — it’s either discussed one-on-one, face-to-face — it’s like a whisper network — or it’s through a real-life social network. There’s a secret handshake, which is, like: okay, you meet somebody, and you, like, know each other a little bit, but, like, not well, and, like, you’re both trying to figure out if you can, like, talk to the other person openly or whether you have to, like, be fully conformist.

Speaker: 2
45:49

It’s a

Speaker: 0
45:51

joke. Oh, yeah. Humor.

Speaker: 2
45:53

Somebody cracks a joke. Right? Somebody cracks a joke. Yep. If the other person laughs, the conversation is on.

Speaker: 0
45:58

Yeah. Yeah.

Speaker: 2
45:59

If the other person doesn’t laugh, back slowly away from the scene. Yeah. I didn’t mean anything by it. Yeah. Right? And then, by the way, it doesn’t have to be, like, a super offensive joke. It just has to be a joke that’s just up against the edge of, to use the Sam Bankman-Fried term, one of the shibboleths.

Speaker: 2
46:14

You know, it has to be up against one of the things that you’re absolutely required to believe at the dinner parties. And then at that point, what happens is you have a peer-to-peer network. Right? You have a one-to-one connection with somebody, and then you have your little conspiracy of thought criminality.

Speaker: 2
46:31

And then, you’ve probably been through this. You have your network of thought criminals, and then they have their network of thought criminals, and then you have this, like, delicate mating dance as to whether you should bring the thought criminals together. Mhmm. Right?

Speaker: 0
46:42

And the fundamental mechanism of the dance is humor.

Speaker: 2
46:46

Yeah. It’s humor. Like, right. Well, of course.

Speaker: 0
46:48

Memes. Yeah.

Speaker: 2
46:48

Well, for two reasons. Number one, humor is a way to have deniability. Right? Humor is a way to discuss serious things while still having deniability. Oh, I’m sorry. It was just a joke. Right? So that’s part of it, which is one of the reasons why comedians can get away with saying things the rest of us can’t.

Speaker: 2
47:01

You know, they can always fall back on, oh, yeah, I was just going for the laugh.

Speaker: 0
47:04

But

Speaker: 2
47:04

the other key thing about humor, right, is that laughter is involuntary. Right? Like, you either laugh or you don’t, and it’s not like a conscious decision whether you’re gonna laugh. And everybody can tell when somebody’s fake laughing. Right? Every professional comedian knows this. Right? The laughter is the clue that you’re onto something truthful. Mhmm. Like, people don’t laugh at, like, made-up bullshit stories.

Speaker: 2
47:20

They laugh because, like, you’re revealing something that they either have not been allowed to think about or have not been allowed to talk about, right, or is off limits. And all of a sudden, it’s like the ice breaks and it’s like, oh, yeah, that’s the thing. And it’s funny. And, like, I laugh.

Speaker: 2
47:32

And then, of course, this is why live comedy is so powerful: because you’re all doing that at the same time, so you start to have, right, the safety of numbers. And so the comedians have, like, it’s no surprise to me, for example, that Joe has been as successful as he has, because they have this hack that, you know, the rest of us who are not professional comedians don’t have.

Speaker: 2
47:48

But you have your in-person version of it. Yeah. And then you’ve got the question of whether you can sort of join the networks together. And you’ve probably been to this, you know, at some point there’s, like, the alt dinner party, the thought-criminal dinner party, and you get 6 or 8 people together, and you join the networks.

Speaker: 2
48:02

And those are, like, the happiest, at least in the last decade, those are, like, the happiest moments of everybody’s lives, because everybody’s just ecstatic because they’re like, I don’t have to worry about getting yelled at and shamed, like, for every third sentence that comes out of my mouth, and we can actually talk about real things.

Speaker: 2
48:15

So that’s the live version of it. And then, of course, the other side of it is the, you know, the group chat phenomenon. Right. And then, basically, the same thing played out, you know, until Elon bought X and until Substack took off, you know, which were really the 2 big breakthroughs in free speech online.

Speaker: 2
48:30

The same dynamic played out online, which is you had absolute conformity on the social networks, right, literally enforced by the social networks themselves through censorship, and then also through cancellation campaigns and mobbing and shaming. Right? But then group chats grew up to be the equivalent of samizdat. Right? Mhmm.

Speaker: 2
48:45

Anybody who grew up in the Soviet Union under, you know, communism, they had the hard version of this. Right? It’s like, how do you know who you can talk to, and then how do you distribute information? And, you know, again, that was the hard authoritarian version of this, and we’ve been living through this weird mutant, you know, soft authoritarian version, but with some of the same patterns.

Speaker: 0
49:03

And WhatsApp allows you to scale and makes it more efficient to build these groups of heretical ideas bonded by humor?

Speaker: 2
49:13

Yeah. Exactly. Well, and this is the thing. This is kind of the running joke about group chats. It’s not even a joke. It’s true. If you’ve noticed this, it’s, like, a principle of group chats: every group chat ends up being about memes and humor.

Speaker: 2
49:26

And the game of the group chat is to get as close to the line of being actually objectionable

Speaker: 0
49:31

Yeah.

Speaker: 2
49:31

As you can get without actually tripping it. Right? And, like, literally every group chat that I have been in for the last decade, even if it starts in some other direction, what ends up happening is it becomes an absolute comedy fest where they walk right up to the line and they’re constantly testing it. Every once in a while somebody will trip the line and people will freak out, and it’s like, oh, too soon.

Speaker: 2
49:50

Okay. You know, we gotta wait till next year to talk about that. You know, they walk it back. And so it’s that same thing. And yeah.

Speaker: 2
49:55

And then group chats as a technological phenomenon, it was amazing to see, because, basically, number one, it was, you know, obviously, the rise of smartphones. Then it was the rise of the new messaging services. Then it was the rise specifically of, I would say, the combination of WhatsApp and Signal, and the reason for that is those were the 2 big systems that did full encryption.

Speaker: 2
50:12

So you actually felt safe, and then the real key, I think, was disappearing messages, which hit Signal probably 4 or 5 years ago, and hit WhatsApp 3 or 4 years ago. And then the combination of encryption and disappearing messages, I think, really unleashed it. Well, then there’s the next thing.

Speaker: 2
50:32

Then there’s the fight over the length of the disappearing messages. Mhmm. Right? And so it’s like, you know, I often get behind on my thing. So I set it to 7-day, you know, disappearing messages. And my friends, you know, are like, no, that’s way too much risk. Yeah.

Speaker: 2
50:44

It’s gotta be a day. And then every once in a while, somebody will set it to 5 minutes before they send something, like, particularly inflammatory.

Speaker: 0
50:49

Yeah. A 100%. Well, I mean, one of the things that bothers me about WhatsApp is that the choice is between 24 hours and, you know, 7 days. One day or 7 days. Right. And I have to have an existential crisis about deciding Yes. whether I can last for 7 days with what I’m about to say. Yeah. Exactly.

Speaker: 2
51:07

Now, of course, what’s happening right now is the big thaw. Right? And so what Yeah. The vibe shift. So what’s happening on the other side of the election is, you know, Elon on Twitter 2 years ago, and now Mark with Facebook and Instagram. And, by the way, with the continued growth of Substack and with other, you know, new platforms that are emerging, you know, like, I don’t know that everything just shifts back into public, but, like, a tremendous amount of the verboten conversations, you know, can now shift back into the public view.

Speaker: 2
51:32

And, I mean, quite frankly, this is one of those things: even if I were opposed to what people are saying, and I’m sure I am in some cases, you know, I would argue it’s still, like, net better for society that those things happen in public instead of in private.

Speaker: 2
51:44

You know, do you really want... yeah. Like, don’t you wanna know? Yeah. And so, look. It’s, I think, clearly much healthier to live in a society in which people are not literally scared of what they’re saying.

Speaker: 0
51:56

I mean, to push back and to come back to this idea that we’re talking about, I do believe that people have beliefs and thoughts that are heretical, like a lot of people. I wonder what fraction of people have that. To me, this preference falsification is really interesting.

Speaker: 0
52:12

What is the landscape of ideas that human civilization has in private as compared to what’s out in public? Because, like, the dynamical system that is the difference between those two is fascinating. Throughout history, the fall of communism in multiple regimes throughout Europe is really interesting, because everybody was following, you know, the party line, until they weren’t. Right.

Speaker: 0
52:39

But for sure, privately, there were a huge number of boiling conversations happening where, like, the bureaucracy of communism, the corruption of communism, all of that was really bothering people more and more and more. And all of a sudden, there’s a trigger that allows the vibe shift to happen.

Speaker: 2
52:58

That’s

Speaker: 0
52:58

right. To me, like, the interesting question here is, what is the landscape of private thoughts and ideas and conversations that are happening under the surface? Especially, my question is how much dormant energy is there for this roaring twenties, where people are like, no more bullshit.

Speaker: 0
53:18

Let’s get shit done.

Speaker: 2
53:19

Yeah. So let’s go through the theory of preference falsification. Just

Speaker: 0
53:23

by the way, amazing. The book, nonetheless, is fascinating.

Speaker: 2
53:26

Yeah. Yeah. So this is exactly, this is one of the all-time great books. Incredibly, it’s about a 20-, 30-year-old book, but it’s completely modern and current in what it talks about, as well as very deeply historically informed. It’s called Private Truths, Public Lies, and it’s written by a social science professor named Timur Kuran, at, I think, Duke, and it’s the definitive work on this.

Speaker: 2
53:47

And so he has this concept he calls preference falsification. And preference falsification is two things, and you get it from the title of the book, Private Truths, Public Lies. Preference falsification is when you believe something and you can’t say it, or, and this is very important, you don’t believe something and you must say it. Right?

Speaker: 2
54:05

And the commonality there is, in both cases, you’re lying. You believe something internally, and then you’re lying about it in public. And there are the 2 classic forms of it. There’s the, you know, for example, the I-believe-communism-is-broken-but-I-can’t-say-it version of it.

Speaker: 2
54:21

But then there’s also the famous parable, the real-life example, the thing that Vaclav Havel talks about in the other good book on this topic, which is The Power of the Powerless. You know, he was an anti-communist resistance fighter who ultimately became the president of Czechoslovakia after the fall of the Wall.

Speaker: 2
54:38

He wrote this book, and he describes the other side of this, which is, workers of the world unite. Right? He describes what he calls the parable of the greengrocer, which is: you’re a greengrocer in Prague in 1985. And for the last 70, or 50, years, it’s been absolutely mandatory to have a sign in the window of your store that says workers of the world unite.

Speaker: 2
54:58

Right? And it’s 1985. It is, like, crystal clear that the workers of the world are not going to unite. Like, of all the things that could happen in the world, that is not going to happen. The commies have been at that for 70 years.

Speaker: 2
55:10

It is not happening. But that slogan had better be in your window every morning, because if it’s not in your window every morning, you are not a good communist. The secret police are gonna come and they’re gonna get you. And so the first thing you do when you get to the store is you put that slogan in the window, and you make sure that it stays in the window all day long.

Speaker: 2
55:23

But he says the thing is, the greengrocer knows the slogan is fake. He knows it’s a lie. Every single person walking past the slogan knows that it’s a lie. Every single person walking past the store knows that the greengrocer is only putting it up there because he has to lie in public.

Speaker: 2
55:37

And the greengrocer has to go through the humiliation of knowing that everybody knows that he’s caving in to the system and lying in public. And so it turns into a demoralization campaign. It’s not just ideological enforcement. In fact, it’s not ideological enforcement anymore, because everybody knows it’s fake. The authorities know it’s fake. Everybody knows it’s fake.

Speaker: 2
55:56

It’s not that they’re enforcing the actual ideology of the workers of the world uniting. It’s that they are enforcing compliance. Right? Compliance with the regime, and fuck you. You will comply. Right? And so, anyway, that’s the other side of that.

Speaker: 2
56:09

And, of course, we have lived in the last decade through a lot of both of those. I think anybody listening to this could name a series of slogans that we’ve all been forced to chant for the last decade that everybody knows at this point are just, like, simply not true. I’ll let the audience, you know, speculate on those in their own group chats.

Speaker: 0
56:27

And share your memes online as well, please.

Speaker: 2
56:29

Yes. Yes. Exactly. But okay. So, anyway, it’s the two sides of that. Right? It’s private truths, and it’s public lies. So then what the theory of preference falsification does is extend that from the idea of the individual experiencing that to the idea of the entire society experiencing that. Right?

Speaker: 2
56:43

And this gets to your percentages question, which is, like, okay. What happens in a society in which people are forced to lie in public about what they truly believe? What happens, number 1, is that individually they’re lying in public, and that’s bad. But the other thing that happens is they no longer have an accurate gauge at all, or any way to estimate, how many people agree with them.

Speaker: 2
56:58

And, again, this literally is, like, how you get something like the communist system, which is, like, okay. You end up in a situation in which 80 or 90 or 99% of people can actually all be thinking individually, I really don’t buy this anymore. And if anybody would just stand up and say it, I would be willing to go along with it.

Speaker: 2
57:14

But I’m not gonna be the first one to put my head on the chopping block. But because of the suppression and censorship, you have no way of knowing how many other people agree with you. And if the people who agree with you are 10% of the population and you become part of a movement, you’re gonna get killed.

Speaker: 2
57:28

If 90% of the people agree with you, you’re gonna win the revolution. Right? And so the question of, like, what the percentage actually is, is, like, a really critical question. And then, basically, in any sort of authoritarian system, you can’t, like, run a survey, right, to get an accurate result.

Speaker: 2
57:42

And so you actually can’t know until you put it to the test. And what he describes in the book is it’s always put to the test in the same way. And this is exactly what’s happened for the last 2 years, like, a 100% exactly what’s happened. It’s, like, straight out of this book, which is somebody, Elon, sticks his hand up and says the workers of the world are not going to unite. Yeah. Right?

Speaker: 2
58:01

Or the emperor is actually wearing no clothes. Right? You know that famous parable. Right? So one person stands up and does it.

Speaker: 2
58:06

And literally that person is standing there by themselves, and everybody else in the audience is like, oh. Mhmm. I wonder what’s gonna happen to that guy. Right? But, again, nobody knows. Elon doesn’t know. The first guy doesn’t know. Other people don’t know, like, which way is this gonna go.

Speaker: 2
58:18

And it may be that that’s a minority position, and that’s a way to get yourself killed. Or it may be that that’s the majority position, and you are now the leader of a revolution. And then, basically, of course, what happens is, okay, the first guy does that, doesn’t get killed.

Speaker: 2
58:30

The second guy does. Well, a lot of the time, that guy does get killed. But when the guy doesn’t get killed, then a second guy pops his head up, says the same thing. Right? Now you’ve got 2. 2 leads to 4, 4 leads to 8, 8 leads to 16.

Speaker: 2
58:40

And then, as we saw with the fall of the Berlin Wall, this is what happened in Russia and Eastern Europe in ’89. When it goes, it can go. Right? And then it rips. And then what happens very, very quickly, if it turns out that you had a large percentage of the population that actually believed a different thing, is that all of a sudden everybody has this giant epiphany that says, oh, I’m actually part of the majority.
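[The doubling dynamic described here, one dissenter emboldening the next until the cascade either stalls or rips, is the threshold model at the heart of Kuran’s book. A minimal sketch, written for this transcript as an illustration; the function name, population, and threshold values are all hypothetical:]

```python
# Sketch of a Kuran-style preference-falsification cascade (illustration only,
# not code from the conversation). Each person i has a private threshold:
# the share of the population that must already be dissenting publicly
# before person i is willing to dissent too. A threshold of 0.0 is the
# "first guy" who sticks his hand up no matter what.

def cascade(thresholds):
    """Iterate until nobody else flips; return the final public-dissent share."""
    n = len(thresholds)
    dissenting = 0
    while True:
        # Everyone whose threshold is met by the current dissent share flips.
        now = sum(1 for t in thresholds if t <= dissenting / n)
        if now == dissenting:
            return dissenting / n
        dissenting = now

# An unbroken chain of thresholds: each new dissenter emboldens the next,
# so one person's heresy snowballs into a full revolution.
chain = [i / 100 for i in range(100)]
print(cascade(chain))  # 1.0 -- everyone ends up dissenting

# Same widespread private discontent, but a gap in the threshold chain:
# the first dissenter stands alone and the regime's slogan survives.
stuck = [0.0] + [0.5] * 99
print(cascade(stuck))  # 0.01 -- the cascade never starts
```

[The two populations look identical from the outside, which is the point made next: no survey can tell you in advance whether the first mover is a martyr or the start of a freight train; only the test reveals it.]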

Speaker: 2
58:59

And at that point, like, you’re on the freight train of revolution. Right? Like, it is rolling. Right? Now the other part of this is the distinction between the roles of the elites and the masses.

Speaker: 2
59:09

And here the best book is called The True Believer, which is the Eric Hoffer book. And so the nuance you have to put on this is that the elites play a giant role in this, because the elites do idea formation and communication, but the elites by definition are a small minority.

Speaker: 2
59:24

And so there’s also this giant role played by the masses, and the masses are not necessarily thinking these things through in the same intellectualized way that the elites are, but they are for sure experiencing these things in their daily lives, and they for sure have at least very strong emotional views on them.

Speaker: 2
59:38

And so when you really get the revolution, it’s when you get the elites lined up: either the current elites change, or a new set of elites, a set of counter-elites, basically comes along and says, no, there’s actually a different and better way to live. And then the people basically decide to follow the counter-elite.

Speaker: 2
59:53

So that’s the other dimension to it. And, of course, that part is also happening right now. And, again, case study number 1 of that would be Elon and his, it turns out, you know, truly massive following.

Speaker: 0
01:00:02

And he has done that over and over in different industries. Not just saying crazy shit online, but saying crazy shit in the realm of space, in the realm of autonomous driving, in the realm of AI, just over and over and over again. Turns out saying crazy shit is one of the ways to do a revolution and to actually make progress.

Speaker: 2
01:00:20

Yeah. And it’s like, well, but then there’s the test. Is it crazy shit or is it the truth? Yeah. Right? And, you know, this is where, you know, there are many more specific things about Elon’s genius, but one of the really core ones is an absolute dedication to the truth.

Speaker: 2
01:00:31

And so when Elon says something, it sounds like crazy shit, but in his mind it’s true. Now is he always right? No. Sometimes the rockets crash. Like, you know, sometimes he’s wrong. He’s human. He’s like anybody else. He’s not right all the time.

Speaker: 2
01:00:42

But at least my through line with him, both in what he says in public and what he says in private, which by the way are the exact same things: he does not do this. He doesn’t lie in public about what he believes in private. At least he doesn’t do that anymore. Like, he’s a 100% consistent in my experience.

Speaker: 2
01:00:56

By the way, there are 2 guys I know who are a 100% consistent like that: Elon and Trump. Yeah. Whatever you think of them

Speaker: 0
01:01:04

Yeah.

Speaker: 2
01:01:04

What they say in private is a 100% identical to what they say in public. Like, they are completely transparent. They’re completely honest in that way. Right? And, again, it’s not like they’re perfect people, but they’re honest in that way. And it makes them both, potentially, as they have been, very powerful leaders of these movements, because they’re both willing to stand up and say the thing that, if it’s true, turns out in many cases to be the thing that many or most or almost everyone else actually believed, but nobody was actually willing to say out loud.

Speaker: 2
01:01:28

And so they can actually catalyze these shifts. And, I mean, I think this framework is exactly why Trump took over the Republican party. I think Trump stood up there on stage with all these other, you know, kind of congressional Republicans, and he started saying things out loud that it turned out the base either already believed or was prone to believe.

Speaker: 2
01:01:41

And he was the only one who was saying them. And so, again, elites and masses. He was the elite. The voters were the masses, and the voters decided, you know, no. No more Bushes. Like, we’re going this other direction. That’s the mechanism of social change.

Speaker: 2
01:01:53

Like, what we just described is, like, the actual mechanism of social change. It is fascinating to me that we have been living through exactly this. We’ve been living through everything exactly as Timur Kuran describes. Everything that Vaclav Havel described. You know, black squares on Instagram, like the whole thing. Right? All of it.

Speaker: 2
01:02:09

And we’ve been living through the, you know, the True Believer elites-and-masses thing, with a set of, like, basically incredibly corrupt elites wondering why they don’t have the masses anymore, and a set of new elites that are running away with things.

Speaker: 2
01:02:20

And so, like, we’re living through this, like, incredible applied case study of these ideas. And, you know, if there’s a moral of the story, it is, you know, I think, fairly obvious, which is: it’s a really bad idea for a society to wedge itself into a position in which most people don’t believe the fundamental precepts of what they’re told they have to do, you know, to be good people.

Speaker: 2
01:02:37

Like, that is just not a good state to be in.

Speaker: 0
01:02:39

So one of the ways to avoid that in the future maybe is to keep the delta between what’s said in private and what’s said in public small.

Speaker: 2
01:02:47

Yeah. Well, this is sort of the siren song of censorship: we can keep people from saying things, which means we can keep people from thinking things. Yeah. And, you know, by the way, that may work for a while. Right? Like, I mean, again, the hard form: in the Soviet Union, pre-photocopiers, there were mimeograph machines that were used to make samizdat underground newspapers, which were the mechanism for written communication of radical ideas.

Speaker: 2
01:03:10

Ownership of a mimeograph machine was punishable by death. Right? So that’s the hard version. Right? You know, the soft version is somebody clicks a button in Washington and you are erased from the Internet. Right? Which, you know, good news, you’re still alive.

Speaker: 2
01:03:25

Bad news is, you know, shame about not being able to get a job. You know, too bad your family now, you know, they hate you and won’t talk to you. Or whatever the, you know, whatever the version of cancellation has been. And so, like, does that work?

Speaker: 2
01:03:36

Like, maybe it works for a while. Like, it worked for the Soviet Union for a while, you know, in its way, especially when it was coupled with, you know, official state power. But when it unwinds, it can unwind with, like, incredible speed and ferocity, because, to your point, there’s all this bottled-up energy.

Speaker: 2
01:03:49

Now your question was, like, what are the percentages? Like, what’s the breakdown? And so my rough guess, just based on what I’ve seen in my world, is it’s something like 20-60-20. It’s like you’ve got 20%, like, true believers in whatever is, you know, the current thing.

Speaker: 2
01:04:05

People who are just, like, true believers of whatever, like I said, The New York Times, Harvard professors, and the Ford Foundation believe. They just believe. And, by the way, maybe it’s 10, maybe it’s 5, but let’s say generously it’s 20.

Speaker: 2
01:04:17

So, you know, 20% kind of full-on revolutionaries. And then you’ve got, let’s call it, 20% on the other side that are, like, no. I’m not on board with this. This is crazy. I’m not signing up for this. But, you know, their view of themselves is they’re in a small minority, and in fact they start out in a small minority, because what happens is the 60% go with the first 20%, not the second 20%.

Speaker: 2
01:04:37

So you’ve got this large middle of people. And it’s not that the people in the middle are not smart or anything like that. It’s that they just have, like, normal lives, and they’re just trying to get by, and they’re just trying to go to work each day and do a good job and be a good person and raise their kids and, you know, have a little bit of time to watch the game.

Speaker: 2
01:04:54

And they’re just not engaged in the cut and thrust of, you know, political activism or any of this stuff. It’s just not their thing. But that’s where the oversocialization comes in. It’s just, like, okay: by default, the 60% will go along with the 20% of radical revolutionaries, at least for a while, and then the counter-elite is in this other 20%.

Speaker: 2
01:05:13

And over time, they build up a theory and a network and an ability to resist, and a new set of representatives and a new set of ideas. And then at some point, there’s a contest. Right? And then the question is, what happens in the middle? What happens with the 60%? And it’s kind of my point. It’s not even really, does the 60% change their beliefs, as much as it’s, okay:

Speaker: 2
01:05:34

What is the thing that that 60% now decides to basically fall into step with? And I think that 60% in the valley, that 60% for the last decade went along, and was, you know, extremely, I would say, on edge about a lot of things, and, you know, that 60% is pivoting in real time.

Speaker: 2
01:05:52

They’re just done. They’ve just had it.

Speaker: 0
01:05:55

And I would love to see where that pivot goes, because there are internal battles happening right now. Right?

Speaker: 2
01:06:00

So this is the other thing. Okay. So there are 2 forms of this, and Timur Kuran has actually talked about this. Professor Kuran has talked about this. So one thing he said is this is the kind of unwind where what you’re gonna have is people going in the other direction.

Speaker: 2
01:06:11

You’re gonna have people who claim that they supported Trump all along who actually didn’t. Right?

Speaker: 0
01:06:17

Right.

Speaker: 2
01:06:17

So it’s gonna swing the other way. And by the way, Trump’s not the only part of this, but, you know, he’s just a convenient shorthand for a lot of this. But, you know, whatever it is, you’ll have people who will say, well, I never supported that. Right?

Speaker: 2
01:06:27

Or I never supported ESG, or I never thought we should have canceled that person. Right? Where, of course, they were full-on part of the mob, like, you know, kind of at that moment. And so, anyway, you’ll have preference falsification happening in the other direction. And his prediction, I think, basically, is you’ll end up with the same, quote, problem on the other side. Now will that happen here?

Speaker: 2
01:06:44

I don’t know. You know? How far is American society willing to go into these things? I don’t know. But, like, there is some question there.

Speaker: 2
01:06:50

And then the other part of it is, okay, now you have this, you know, elite that is used to being in power for the last decade. And, by the way, many of those people are still in power, and they’re in very, you know, important positions. And The New York Times is still The New York Times, and Harvard is still Harvard. And, like, those people haven’t changed, like, at all. Right?

Speaker: 2
01:07:06

And, you know, they’ve been bureaucrats in the government and, you know, senior Democratic, you know, politicians and so forth. And they’re sitting there, you know, right now feeling like reality has just smacked them hard in the face, because they lost the election so badly.

Speaker: 2
01:07:18

But they’re now going into, specifically, the Democratic party is going into a civil war. Right? And the form of the civil war is completely predictable, and that’s exactly what’s happening, which is half of them are saying we need to go back to the center.

Speaker: 2
01:07:31

We need to deradicalize, because we’ve lost the people. We’ve lost the people in the middle, and so we need to go back to the middle in order to be able to get 50% plus 1 in an election. Right? And then the other half of them are saying, no. We weren’t true to our principles. We were too weak. We were too soft.

Speaker: 2
01:07:44

You know, we must become more revolutionary. We must double down, and we must, you know, celebrate, you know, murders in the street of health insurance executives. And that right now is, like, a real fight.

Speaker: 0
01:07:52

If I could tell you a little personal story that breaks my heart a little bit. There’s a professor, a historian, I won’t say who, who I admire deeply, love his work. He’s kind of a heretical thinker. And we were talking about doing a podcast, and he eventually said that, you know what?

Speaker: 0
01:08:13

At this time, given your guest list, I just don’t want the headache of being in the faculty meetings in my particular institution. And I asked, who are the particular figures in this guest list? He said, Trump. And the second one, he said that you announced your intent to talk to Vladimir Putin. So I just don’t want the headache.

Speaker: 0
01:08:39

Now, I fully believe it would surprise a lot of people if I said who it is. But, you know, this is a person who’s not bothered by the guest list. And I should also say that 80 plus percent of the guest list is left wing. Okay. Nevertheless, he just doesn’t want the headache.

Speaker: 0
01:08:59

And that speaks to the thing that you’ve kind of mentioned, that you just don’t want the headache. You just wanna have a pleasant morning with some coffee and talk to your fellow professors. And I think a lot of people are feeling that in universities and in other contexts, in tech companies. And I wonder if that shifts, how quickly that shifts.

Speaker: 0
01:09:20

And there, the percentages you mentioned, the 20, 60, 20, matter. And the contents of the private groups matter, and the dynamics of how that shifts matter. Because it’s very possible nothing really changes in universities and in major tech companies, or, there’s a kind of excitement right now for potential revolution, for these new ideas, these new vibes, to reverberate through these companies and universities as best as possible.

Speaker: 0
01:09:48

Or the wall will hold.

Speaker: 2
01:09:51

Yeah. So he’s a friend of yours. I respect that you don’t wanna name him. I also respect you don’t wanna beat on him. So I would like to beat on him on your behalf. Mhmm. Does he have tenure?

Speaker: 0
01:10:00

Yes. He should use it.

Speaker: 2
01:10:04

So this is the thing. Right? But this is the ultimate indictment of the corruption and the rot at the heart of our education system, at the heart of these universities. And by the way, it’s across the board, all the top universities. Because the siren song, right, what it’s been for 70 years, whatever, is the tenure system, the peer review system, which is, like, yeah, you work your butt off as an academic to get a professorship, and then to get tenure, because then you can say what you actually think.

Speaker: 2
01:10:31

Right? Then you can do your work and your research and your speaking and your teaching without fear of being fired. Right? Without fear of being canceled. Right, academic freedom.

Speaker: 2
01:10:43

I mean, think of the term academic freedom, and then think of what these people have done to it. Like, it’s gone. Like, that entire thing was fake and is completely broken, and these people have completely given up the entire moral foundation of the system that’s been built for them, which by the way is paid for virtually 100% by taxpayer money.

Speaker: 0
01:11:11

What’s the inkling of hope in this? Like, for this particular person and others who hear this, what can give them strength, inspiration, and courage?

Speaker: 2
01:11:21

That the population at large is gonna realize the corruption in their industry and it’s going to withdraw the funding.

Speaker: 0
01:11:26

Okay. So, desperation.

Speaker: 2
01:11:28

No. No. No. No. No. Think about what happens next. Okay. So let’s go through it. The universities are funded by 4 primary sources of federal funding. The big one is the federal student loan program, which is, you know, in the many trillions of dollars at this point and only spiraling, growing way faster than inflation.

Speaker: 2
01:11:42

That’s number 1. Number 2 is federal research funding, which is also very large. And you probably know that when a scientist at the university gets a research grant, the university rakes off as much as 70% of the money for central uses.

Speaker: 0
01:11:56

Yeah.

Speaker: 2
01:11:57

Number 3 is tax exemption at the operating level, which is based on the idea that these are nonprofit institutions as opposed to, let’s say, political institutions. And then number 4 is tax exemptions at the endowment level, you know, which is the financial buffer that these places have.

Speaker: 2
01:12:12

Anybody who’s been close to a university budget will basically see what would happen if you withdrew those sources of federal taxpayer money, and then for the state schools, the state money: they all instantly go bankrupt. And then you could rebuild. Because here’s the problem right now. You know, the folks at University of Austin are mounting a very valiant effort, and I hope that they succeed, and I’m rooting for them, but the problem is: suppose you and I wanna start a new university, and we wanna hire all the free-thinker professors, and we wanna have the place that fixes all this.

Speaker: 2
01:12:42

Practically speaking, we can’t do it because we can’t get access to that money. And the most direct reason we can’t get access to that money? We can’t get access to federal student funding. Do you know how universities are accredited for the purpose of getting access to federal student funding, federal student loans?

Speaker: 2
01:12:56

They’re accredited by the government, but not directly, indirectly. They’re not accredited by the Department of Education. Instead, what happens is the Department of Education accredits accreditation bureaus that are nonprofits that do the accreditation. Guess what the composition of the accreditation bureaus is? The existing universities.

Speaker: 2
01:13:13

They are in complete control. The incumbents are in complete control as to who gets access to federal student loan money. Guess how enthusiastic they are about accrediting a new university. Right. And so we have a government-funded and supported cartel that has gone... I mean, it’s just obvious now.

Speaker: 2
01:13:32

It’s just gone, like, sideways in basically every possible way it could go, including, I mean, literally, as you know, students getting beaten up on campus for being, you know, the wrong religion. They’re just wrong in every possible way at this point. And it’s all on the federal taxpayer’s back. And there is no way.

Speaker: 2
01:13:46

I mean, in my opinion, there is no way to fix these things without replacing them. And there’s no way to replace them without letting them fail. And by the way, it’s like everything else in life. I mean, in a sense, this is the most obvious conclusion of all time, which is: what happens in the business world when a company does a bad job is they go bankrupt and another company takes its place.

Speaker: 2
01:14:05

Right? And that’s how you get progress. And, of course, below that, this is the process of evolution. Right? Why does anything ever get better? It’s because things are tested and tried, and, you know, the things that are good survive.

Speaker: 2
01:14:15

And so these places have cut themselves off, they’ve been allowed to cut themselves off, from both evolution at the institutional level and evolution at the individual level, as shown by the just widespread abuse of tenure. And so we’ve just stalled out. We built an ossified system, an ossified, centralized, corrupt system, and we’re surprised by the results.

Speaker: 2
01:14:34

They are not fixable in their current form.

Speaker: 0
01:14:38

I disagree with you on that. Maybe it’s grounded in hope, but I believe you can revolutionize a system from within, because I do believe Stanford and MIT are important.

Speaker: 2
01:14:48

Oh, but that logic doesn’t follow at all. That’s underpants gnome logic.

Speaker: 0
01:14:52

Underpants gnome? Can you explain what that means?

Speaker: 2
01:14:54

Underpants gnomes logic. So I just started watching a key touchstone of American culture with my 9 year old, which of course is South Park.

Speaker: 1
01:15:00

Yes.

Speaker: 2
01:15:01

And there is... which, by the way, is a little aggressive for a 9 year old.

Speaker: 0
01:15:05

Very aggressive.

Speaker: 2
01:15:05

But he likes it. So he’s learning all kinds of new words.

Speaker: 0
01:15:09

Yeah. All kinds of new ideas. But yeah.

Speaker: 2
01:15:11

I told him, I said, you’re gonna hear words on here that you are not allowed to use.

Speaker: 0
01:15:14

Right. Yeah. Education.

Speaker: 2
01:15:16

And I said, do you know how we have an agreement that we never lie to mommy? He said, yeah. I said, not using a word that you learn in here does not count as lying.

Speaker: 0
01:15:26

Mhmm. Wow.

Speaker: 2
01:15:28

Keep keep that in mind.

Speaker: 0
01:15:29

Orwellian redefinition of lying, but, yes, go ahead.

Speaker: 2
01:15:32

In the very opening episode, in the first 30 seconds, one of the kids calls the other kid a dildo. Right, so, like, we’re off to the races. Yep. Let’s go. Daddy, what’s a dildo? Yep. So, you know, I... sorry. Sorry, son, I don’t know. Yeah.

Speaker: 2
01:15:46

So, the underpants gnomes. Famous episode of South Park, the underpants gnomes. So there’s this run where all the kids basically realize that their underpants are going missing from their dresser drawers. Somebody’s stealing the underpants, and it’s just like, well, who on earth would steal the underpants? And it turns out it’s the underpants gnomes.

Speaker: 2
01:16:04

And it turns out the underpants gnomes have come to town, and they’ve got this little underground warren of tunnels and storage places for all the underpants. And so they go out at night, they steal the underpants, and the kids discover, you know, the underpants gnomes, and they’re, you know, what are you doing?

Speaker: 2
01:16:15

Like, what’s the point of this? And so the underpants gnomes present their master plan, which is a 3 part plan, which is step 1, collect underpants. Step 3, profit. Yeah. Step

Speaker: 0
01:16:25

2, question mark.

Speaker: 2
01:16:26

Yeah. So you just

Speaker: 0
01:16:27

you just

Speaker: 2
01:16:28

proposed the

Speaker: 0
01:16:28

underpants gnome

Speaker: 2
01:16:29

Yeah. Which is very common in politics. So the form of this in politics is we must do something.

Speaker: 0
01:16:37

Yeah.

Speaker: 2
01:16:37

This is something. Therefore, we must do this. But there’s no causal logic chain in there at all to expect that that’s actually gonna succeed, because there’s no reason to believe that it is. Yeah. But it’s the same thing. This is what I hear all the time. And I will let you talk as the host of the show in a moment, but I hear this all the time.

Speaker: 2
01:16:56

I have friends who are on these boards. Right? Very involved in these places. And I hear this all the time, which is, like, oh, these are very important. We must fix them, and so therefore, they are fixable. There’s no logic chain there at all.

Speaker: 0
01:17:09

If there’s that pressure that you described in terms of cutting funding, then you have the leverage to fire a lot of the administration and have new leadership that steps up, that actually aligns with this vision that things really need to change, at the heads of universities. And they put students and faculty first, fire a lot of the administration, and realign and reinvigorate this idea of freedom of thought and intellectual freedom.

Speaker: 0
01:17:38

I mean, because there is already a framework of great institutions that’s there, and the way they talk about what it means to be a great institution is aligned with this very idea that you’re talking about: intellectual freedom, the idea of tenure. Right? On the surface, it’s aligned. Underneath, it’s become corrupted.

Speaker: 2
01:18:00

If we say free speech and academic freedom often enough, sooner or later, these tenured professors will get brave.

Speaker: 0
01:18:04

Wait. Do you think the universities are fundamentally broken? Okay. So how do you fix it? How do you have institutions for educating 20 year olds, and institutions that host researchers that have the freedom to do epic shit, like research that’s outside the scope of R&D departments inside companies?

Speaker: 0
01:18:26

So how do you create an institution like that?

Speaker: 2
01:18:28

How do you create a good restaurant when the one down the street sucks?

Speaker: 0
01:18:32

Okay. You invent something new?

Speaker: 2
01:18:34

You open a new restaurant. Yeah. Like, how often in your life have you experienced a restaurant that’s just absolutely horrible, and it’s poisoning all of its customers, and the food tastes terrible, and then 3 years later you go back and it’s fantastic? Charlie Munger, the great investor, actually had the best comment on this.

Speaker: 2
01:18:50

You know, General Electric was going through all these challenges, and he was asked in the Q&A, how would you fix the culture at General Electric? And he said, fix the culture at General Electric? I couldn’t even fix the culture at a restaurant. Like, it’s insane. Like, obviously, you can’t do it. Yeah.

Speaker: 2
01:19:05

I mean, nobody in business thinks you can do that. Like, it’s impossible. Now look. Having said all that, I should also express this, because I have a lot of friends who work at these places and are involved in various attempts to fix them.

Speaker: 2
01:19:17

I hope that I’m wrong. I would love to be wrong. I would love for the underpants gnome step 2 to be something clear and straightforward Mhmm. That they can figure out how to do. I would love for them to fix it. I’d love to see them come back to their spoken principles. I think that’d be great.

Speaker: 2
01:19:30

I’d love to see the professors with tenure get bravery. I would love to see it. I mean, it would be fantastic. You know, my partner and I have done a lot of public speaking on this topic. It’s been intended to not just be harsh, but also to be, like, okay, these challenges have to be confronted directly.

Speaker: 2
01:19:44

By the way, let me also say something positive. You know, especially post-October 7th, there are a bunch of very smart people who are major donors and board members of these institutions, like Marc Rowan, you know, who are really coming in, I think legitimately trying to fix these places.

Speaker: 2
01:19:57

I have a friend on the executive committee at one of the top technical universities. He’s working hard to try to do this. Man, I hope they can figure it out. But the counter question would just be, like, do you see it actually happening at a single one of these places?

Speaker: 0
01:20:11

I’m a person that believes in leadership. If you have the right leadership Right. The whole system can be changed.

Speaker: 2
01:20:18

So here’s a question for your friend who has tenure at one of these places, which is: who runs his university?

Speaker: 0
01:20:23

You know who I think runs it? Whoever the fuck says they run it. That’s what great leadership is. Like, a president has that power. The head of the university has the leverage, because they can mouth off like Elon can.

Speaker: 2
01:20:34

Can they fire the professors?

Speaker: 0
01:20:36

They can fire them through being vocal publicly. Yes.

Speaker: 2
01:20:39

Can they fire the professors?

Speaker: 0
01:20:41

What are you talking about? Legally?

Speaker: 2
01:20:42

Can we fire them? No.

Speaker: 0
01:20:42

Can we not fire the professors?

Speaker: 2
01:20:44

Then we know who runs the university.

Speaker: 0
01:20:45

The professors?

Speaker: 2
01:20:46

Yeah. The professors. The professors and the students. The professors and the feral students. And they’re, of course, in a radicalization feedback cycle, egging each other on.

Speaker: 0
01:20:53

The feral students.

Speaker: 2
01:20:53

The feral students. Yeah. The feral students. What happens when you’re put in charge of a bureaucracy where the thing that the bureaucracy knows is that they can outlast you? The thing that the tenured professors at all these places know is it doesn’t matter who the president is, because they can outlast them, because they cannot get fired.

Speaker: 2
01:21:09

By the way, it’s the same thing that bureaucrats in the government know. It’s the same thing that the bureaucrats in the Department of Education know. They know the exact same thing. They can outlast you. I mean, it’s the whole thing, the resistance. Like, they can be the resistance.

Speaker: 2
01:21:20

They can just sit there and resist, which is what they do. They’re not fireable.

Speaker: 0
01:21:24

That’s definitely a crisis that needs to be solved. That’s a huge problem. And I also don’t like that I’m defending academia here. I agree with you that the situation is dire, but I just think that institutions are important. And I should also add context, since you’ve been grilling me a little bit. You were using restaurants as an analogy.

Speaker: 0
01:21:43

And earlier, offline in this conversation, you said that Dairy Queen is a great restaurant. So let’s let the listener take that.

Speaker: 2
01:21:50

I said Dairy Queen is the best restaurant.

Speaker: 0
01:21:52

The best restaurant. There you go. So everything that Marc Andreessen is saying today

Speaker: 2
01:21:56

I don’t want you to cut that. You should go order a Blizzard. Just one day, you should march down there and order a Blizzard.

Speaker: 1
01:22:00

Yeah.

Speaker: 2
01:22:01

They can get, like, 4,000 calories in a cup.

Speaker: 0
01:22:03

They can. And they’re delicious.

Speaker: 2
01:22:04

Amazing. They are truly delicious. They’ll put anything in there you want. Alright. Okay. But anyway, let me just close by saying: look, to my friends in the university system, I would just pose this as the challenge.

Speaker: 2
01:22:16

Like, to me, having had a lot of these conversations, this is the conversation that actually has to happen. This is the bar that actually has to be hit. These problems need to be confronted directly. Because I think there’s been way too much...

Speaker: 2
01:22:28

I mean, actually, we’re kind of on the other side of it. There’s too much happy talk in these conversations. Mhmm. I think the taxpayers do not understand this level of corruption, and I think if the taxpayers come to understand it, the funding evaporates. And so I think the fuse is lit, through, you know, no fault of any of ours, but the fuse is going, and there’s some window of time, you know, to fix this and address it and justify the money. Because normal taxpayers sitting in normal towns in normal jobs are not gonna tolerate this for that much longer.

Speaker: 0
01:22:55

You’ve mentioned censorship a few times. Let us, if we can, go deep into the darkness of the past and how the censorship mechanisms were used. So you are a good person to speak about the history of this because you were there, on the ground floor, in 2013-ish Facebook. I heard that you were there when they invented or maybe developed the term hate speech in the context of censorship on social media.

Speaker: 0
01:23:27

So, take me through that history, if you can. The use of censorship.

Speaker: 2
01:23:32

So I was there on the ground floor of that.

Speaker: 0
01:23:36

There’s multiple floors to this building, apparently.

Speaker: 2
01:23:39

There are. Yeah. So I got the first ask to implement censorship on the Internet, which was in the web browser.

Speaker: 0
01:23:45

That is fascinating.

Speaker: 2
01:23:46

Yeah. Yeah. It’s actually 1992. I was asked to implement a nudity filter.

Speaker: 0
01:23:51

Did you have the courage to speak up back then?

Speaker: 2
01:23:53

I did not have any problem speaking up back then. I was making $6.25 an hour. I did not have a lot to lose. Right. I was asked at the time. And then, you know, it’s in some sense a legitimate request, which is, we were working out of a research project actually funded by the federal government at a public university.

Speaker: 2
01:24:09

So, you know, I don’t think my boss was, like, in any way out of line, but it was, like, okay. Like, this web browser thing is great, but, like, could it just make sure to not have any photos of naked people that show up? But if you think about this for a second as a technologist, I had an issue, which is this was, like, pre-ImageNet. Right?

Speaker: 2
01:24:22

And so I had a brief period where I tried to imagine an algorithm, that I referred to as the breast detection algorithm, that I was going to have to build, and then apparently a variety of other body parts people are also sensitive about.

Speaker: 0
01:24:35

Yeah.

Speaker: 2
01:24:36

And then I politely declined to do this.

Speaker: 0
01:24:38

For just the technical difficulty of it?

Speaker: 2
01:24:41

Well, number 1, I actually didn’t know how to do it. But number 2 is just, like, no. I’m not building a censorship engine. Like, I’m just not doing it. And in those days, the Internet generally was, you know, a free-fire zone for everything.

Speaker: 2
01:24:52

It was actually interesting. Sort of pre all of this, the Internet was such a specific niche community, like it was the million kind of highest-IQ nerds in the world. And so it actually, like, didn’t really have a lot of, you know, issues. The people were, like, super interested in talking about astrophysics and not very interested in, you know, even politics at that time.

Speaker: 2
01:25:11

So there really was not an issue there. But, you know, I didn’t wanna start the process. So, the way to think about this. First of all, I was involved in this with Facebook, by the way, every step of the way.

Speaker: 2
01:25:23

I joined the board there in 2007, so I’ve seen everything in the last, you know, almost 20 years, every step of the way. But also, I’ve been involved in most of the other companies over time. So I was an angel investor in Twitter. I knew them really well. We were the founding investor in Substack. I was part of the Elon takeover of Twitter with X. I was an angel at LinkedIn.

Speaker: 2
01:25:40

I’ve been in these... we were the funder of Pinterest. We were one of the main investors there. Reddit as well. And I was having these conversations with all these guys all the way through. So I won’t talk just specifically about Facebook, but I can tell you, like, the general pattern.

Speaker: 2
01:25:53

And for quite a while, it was kind of all the same across these companies. Yeah. So basically, the way to think about this, the true kind of nuanced view, is that there is, practically speaking, no Internet service that can have zero censorship. And by the way, that also mirrors the fact that there is no country that actually has unlimited free speech either.

Speaker: 2
01:26:10

The US first amendment actually has 12 or 13 formal carve-outs from the Supreme Court over time, you know, so incitement to violence and terrorist recruitment and child abuse and child pornography and so forth are, like, not covered by the first amendment.

Speaker: 2
01:26:24

And just practically speaking, if you and I are gonna start an Internet company and have a service, we can’t have that stuff either. Right? Because it’s illegal, or it will just clearly, you know, destroy the whole thing. So you’re always gonna have a censorship engine.

Speaker: 2
01:26:36

I mean, hopefully, it’s not actually in the browser, but, like, you’re gonna have it for sure at the level of an Internet service. But then what happens is, now you have a machine. Right? Now you have a system where you can put in rules saying we allow this, we don’t allow that. You have enforcement. You have consequences. Right?

Speaker: 2
01:26:53

And once that system is in place, like, it becomes the ring of power. Right? Which is like, okay. Now anybody in that company or anybody associated with that company or anybody who wants to pressure that company will just start to say, okay, you should use that machine for more than just terrorist recruitment and child pornography.

Speaker: 2
01:27:09

You should use it for X, Y, Z. And basically that transition happened, call it 2012, 2013, is when there was this very, very rapid pivot. I think the kickoff to it for some reason was the beginning of the 2nd Obama term. I think it also coincided with the sort of arrival of the first kind of super woke kids into these companies.

Speaker: 2
01:27:30

You know, it’s the kids that were in school between, like, you know, the Iraq war and then the global financial crisis, and, like, they came out, like, super radicalized. They came into these companies. They immediately started mounting these social crusades to ban and censor lots of things.

Speaker: 2
01:27:44

And then, you know, quite frankly, the Democratic party figured this out, and they figured out that these companies were, you know, very subject to being controlled. And, you know, the executive teams and boards of directors are almost all Democrats, and there’s tremendous circulation.

Speaker: 2
01:27:56

A lot of Obama people from the first term actually came and worked in these companies, and a lot of FBI people and other, you know, law enforcement and intelligence people came in and worked there, and they were all Democrats, for the most part. And so, you know, the ring of power was lying on the table.

Speaker: 2
01:28:10

It had been built, and they, you know, picked it up and put it on, and then they just ran. And the original discussions were basically always on 2 topics: hate speech and misinformation. Hate speech was the original one, and the hate speech conversation started exactly like you’d expect, which is, we can’t have the n word, to which the answer is, fair enough.

Speaker: 2
01:28:29

Let’s not have the n word. Okay? Now we’ve set a precedent. Right? And Jordan Peterson has talked a lot about this.

Speaker: 2
01:28:36

The definition of hate speech ended up being things that make people uncomfortable. Right? So we can’t have things that make people uncomfortable. Of course, people like me that are disagreeable raise their hands and say, well, that idea right there makes me uncomfortable.

Speaker: 2
01:28:48

But, of course, that doesn’t count as hate speech. Right? So, you know, the ring of power is on one hand and not on the other. And then basically that began this slide where it ended up being that, as Mark has been pointing out recently, completely anodyne comments that are completely legitimate on television or on the senate floor

Speaker: 2
01:29:08

all of a sudden are hate speech and can’t be said online. So, you know, the ring of power was wielded in grossly irresponsible ways. We could talk about all the stuff that happened there. And then the other one was misinformation. There was a little bit of that early on, but, of course, that really kicked in with Trump.

Speaker: 2
01:29:22

So the hate speech stuff predated Trump by, like, 3 or 4 years. The misinformation stuff was basically a little bit later, and it was a consequence of the Russiagate hoax. And then that was, you know, a ring of power that was even more powerful. Right?

Speaker: 2
01:29:38

Because, you know, hate speech is, like, okay, whether something is offensive or not, at least you can have a question as to whether that’s the case. But the problem with misinformation is, like, is it the truth or not? You know, what have we known for 800 years or whatever of western civilization?

Speaker: 2
01:29:52

It’s that, you know, there’s only a few entities that can determine the truth on every topic. You know, there’s God. There’s the king. We don’t have those anymore, and the rest of us are all imperfect and flawed. And so the idea that any group of experts is gonna sit around the table and decide on the truth is, you know, deeply anti-Western and deeply authoritarian. And somehow the misinformation kind of crusade went from the Russiagate hoax into just full blown:

Speaker: 2
01:30:14

We’re gonna use that weapon for whatever we want. And then, of course, the culminating moment, the straw that really broke the camel’s back, was: we’re gonna censor all theories that the COVID virus might have been manufactured in a lab as misinformation.

Speaker: 0
01:30:27

And?

Speaker: 2
01:30:27

And inside these companies, that was the point, this is, like, what, 3 years ago, where for the first time it sunk in, where it’s just, like, okay, this has spun completely out of control. But anyway, that’s how we got to where we are. And then basically that spell lasted.

Speaker: 2
01:30:44

That complex existed and got expanded basically from, call it, 2013 to 2023. I think basically 2 things broke it. One is Substack, and I’m super proud of those guys, because they started from scratch and declared right up front that they were gonna be a free speech platform, and they came under intense pressure, including from the press, and, you know, they tried to beat them to the ground and kill them.

Speaker: 2
01:31:10

And intense pressure, by the way, from, you know, let’s say, certain of the platform companies, basically threatening them. And they stood up to it. And, you know, sitting here today, they have the widest spectrum of speech and conversation anywhere on planet Earth, and they’ve done a great job.

Speaker: 2
01:31:23

And it’s worked, by the way. It’s great. And then, obviously, Elon, you know, with X, was the hammer blow. And then the third one now is what Mark is doing at Facebook.

Speaker: 0
01:31:33

Mhmm. And there’s also, like, singular moments. I think you’ve spoken about this, like Jon Stewart going on Stephen Colbert and talking about the lab leak theory. Yes. There’s certain moments that just kinda shake everybody up. The right person, the right time,

Speaker: 1
01:31:52

it’s just a wake up call.

Speaker: 2
01:31:54

And I will tell you, and I should say, Jon Stewart attacked me recently, so I’m not that thrilled about him. But I would say, I was a long-run fan of Jon Stewart. I watched probably every episode of The Daily Show when he was on it, for probably 20 years.

Speaker: 2
01:32:08

But he did a very important public service, and it was that appearance on the Colbert show. And I don’t know how broadly this is remembered, you know, at the time it was in the news briefly. But I will tell you, in the rooms where people discuss what is misinformation in these policies, that was a very big moment.

Speaker: 2
01:32:21

That was probably actually the key catalyzing moment, and I think he exhibited, I would say, conspicuous bravery and had a big impact with that. And for people who don’t recall what he did: this was in the full-blown, like, you-absolutely-must-lock-down-for-two-years period.

Speaker: 2
01:32:34

You absolutely must keep all the schools closed. You absolutely must have everybody work from home. You absolutely must wear a mask, like, the whole thing. And one of those was: you absolutely must believe that COVID was completely natural. You must believe that, and not believing it means you’re a fascist Trump supporter, an evil QAnon person. Right?

Speaker: 2
01:32:53

And that was, like, uniform, and that was enforced by the social media companies. And as I said, that was the peak. And Jon Stewart went on the Colbert show, and I don’t know if they planned it or not, because Colbert looked shocked. I don’t know how much it was a bit, but he went on there and he just had one of these the-emperor’s-wearing-no-clothes things, where he said it’s just not plausible that you had the COVID super virus appear 300 yards down the street from the Wuhan Institute of lethal coronaviruses.

Speaker: 2
01:33:18

Like, it’s just not plausible, certainly, that you could just rule that out. And then there was another moment; actually, the more serious version was, I think, the author Nicholson Baker wrote a big piece for New York Magazine. Nicholson Baker is, like, one of our great novelists of our time, and he wrote the piece, and he addressed it completely.

Speaker: 2
01:33:35

And that was the first I think that was the first legit there had been, like, alt, you know, renegade there had been, you know, people running around saying this, but getting censored all over the place. That was the first one that was, like, in the mainstream press where he and he talked to all the heretics, and he just, like, laid the whole thing out.

Speaker: 2
01:33:49

And that was a moment. And I remember, let’s say, a board meeting at one of these companies after that, where basically everybody looked around the table and was like, I guess we don’t need to censor that anymore. And then of course what immediately follows from that is, well, wait a minute, why were we censoring that in the first place?

Speaker: 2
01:34:05

And okay. And then the downstream conversations, not that day, but later, were like, okay, if in retrospect we all made such a giant collective mistake censoring that, then what does that say about the rest of our regime?

Speaker: 2
01:34:17

And I think that was the thread in the sweater that started to unravel it.

Speaker: 0
01:34:20

I should say it again. I do think that the Jon Stewart appearance and the statement he made was a courageous act.

Speaker: 2
01:34:26

Yep. I agree.

Speaker: 0
01:34:27

I think we need to have more of that in the world. And as you said, Elon, everything he did with X is a series of courageous acts. And I think what Mark Zuckerberg did on Rogan a few days ago is a courageous act. Can you just speak to that?

Speaker: 2
01:34:49

He has become, I think, an outstanding communicator. Right? And he’s somebody who came in for a lot of criticism earlier in his career on that front, and I think he’s one of these guys who can sit down and talk for three hours and make complete sense.

Speaker: 2
01:35:00

And, you know, as you do with all of your episodes, when somebody sits and talks for three hours, you really get a sense of them, because it’s really hard to be artificial for that long. And, you know, he’s now done that repeatedly. He’s really good at it.

Speaker: 2
01:35:12

And then, look, I would maybe put him in the third category now, certainly after that appearance. I would put him up there now where, you know, the public and the private are synchronized.

Speaker: 2
01:35:23

I guess I’d say that. Like, he said on that show what he really believes. He said all the same things that he says in private. Like, I don’t think there’s really any discrepancy anymore. I would say he has always taken upon himself a level of obligation and responsibility to running a company the size of Meta and to running services that are that large.

Speaker: 2
01:35:42

And I think, you know, his conception of what he’s doing, which I think is correct, is that he’s running services that are bigger than any country. Right? Over three billion people use those services. And then the company has many tens of thousands of employees and many investors, and it’s a public company, and he thinks very deeply and seriously about his responsibilities.

Speaker: 2
01:36:00

And so, you know, he has not felt like he has had, let’s just say, the complete flexibility that Elon has had. And people could argue that one way or the other, but he’s talked about it a lot. He’s evolved a lot. A lot of it was he learned a lot, and by the way, I’ll put myself right in there with him.

Speaker: 2
01:36:17

Like, I’m not claiming any huge foresight or heroism on any of this. Like, I’ve also learned a lot. Like, my views on things are very different than they were 10 years ago on lots of topics. And so I’ve been on a learning journey. He’s been on a learning journey. He is a really, really good learner.

Speaker: 2
01:36:33

He assimilates information as well as or better than anybody else I know. The other thing I guess I would just say is he talked on that show about something very important, which is when you’re in a role where you’re running a company like that, there are a set of decisions that you get to make, and then you deserve to be criticized for those decisions and so forth, and it’s valid.

Speaker: 2
01:36:52

But you are under tremendous external pressure as well. And by the way, you’re under tremendous internal pressure. You’ve got your employees coming at you. You’ve got your executives in some cases coming at you. You’ve got your board in some cases coming at you. You’ve got your shareholders coming at you.

Speaker: 2
01:37:07

So you’ve got your internal pressures, but you also have the press coming at you. You’ve got academia coming at you. You’ve got the entire nonprofit activist complex coming at you. And then, really critically, you know, he talked about this on Rogan, and these companies all went through it.

Speaker: 2
01:37:22

Especially in these last few years, you had the government coming at you. And, you know, that’s the really stinky end of the pool, where the government was, in my view, illegally exerting pressure, in flagrant violation of the First Amendment and federal laws on speech and coercion and conspiracy, forcing these companies to engage in activities.

Speaker: 2
01:37:42

Things that, again, in some cases they may have wanted to do, but in other cases they clearly didn’t wanna do and felt like they had to do. And the level of pressure, like I was saying, like, I’ve known every CEO of Twitter. They’ve all had the exact same experience, which is when they were in the job, it was just daily beatings.

Speaker: 2
01:37:57

Like, it’s just getting punched in the face every single day constantly, and, you know, Mark is very good at getting physically punched in the face. Getting better and better. Yeah. And he’s very good at taking a punch, and he has taken many, many punches.

Speaker: 2
01:38:13

So I would encourage people to have a level of sympathy. These are not kings. These are people who operate with, like, I would say, extraordinary levels of external pressure. I think if I had been in his job for the last decade, I would be a little puddle on the floor.

Speaker: 2
01:38:25

And so it says, I think, a lot about him that he has risen to this occasion the way that he has. And by the way, I should also say, you know, the cynicism, of course, is immediately out, and it’s a legitimate thing for people to say, but it’s like, oh, you’re only doing this because of Trump or whatever, and it’s just like, no.

Speaker: 2
01:38:40

Like, he has been thinking about and working on these things and trying to figure them out for a very long time. And so I think what you saw are legitimate, deeply held beliefs, not some, you know, sort of in-the-moment thing that could change at any time.

Speaker: 0
01:38:52

So what do you think it’s like to be him, and other leaders of companies, to be you, and withstand internal pressure and external pressure? What’s that life like? Is it deeply lonely?

Speaker: 2
01:39:04

That’s a great question. So leaders are lonely to start with, and this is one of those things where almost nobody has sympathy. Right? Nobody feels sorry for a CEO. Right? Like, it’s not a thing. Right? And, again, legitimately so.

Speaker: 2
01:39:15

Like, CEOs get paid a lot, like, the whole thing. There’s a lot of great things about it, so it’s not like they should be out there asking for a lot of sympathy. But it is the case that they are human beings, and it is the case that it is a lonely job. And the reason it’s a lonely job is because your words carry tremendous weight, and you are dealing with extremely complicated issues, and you’re under a tremendous amount of personal emotional stress.

Speaker: 2
01:39:36

And, you know, you often end up not being able to sleep well, and you end up not being able to, like, keep up an exercise routine and all those things, and you come under family stress because you’re working all the time. Like, my partner Ben, you know, he was CEO of our last company before we started the venture firm.

Speaker: 2
01:39:49

He said, you know, the problem he had with his family life was that even when he was home at night, he wasn’t home, because he was in his head trying to solve all the business problems. So he was, like, supposed to be having dinner with his kids.

Speaker: 2
01:39:59

And he was physically there, but he wasn’t mentally there. So, you know, you get that a lot. But the key thing is, like, you can’t talk to people. Right? I mean, you can talk to your spouse and your kids, but, like, they don’t understand; they’re not working in your company.

Speaker: 2
01:40:10

They don’t have the context to really help you. If you talk to your executives, they all have agendas. Right? And they can’t resist. Like, it’s just human nature, and so you can’t necessarily rely on what they say.

Speaker: 2
01:40:23

It’s very hard in most companies to talk to your board because they can fire you. Right? Now, Mark has the situation, because he has control, that it actually turns out he can talk to his board, and Mark talks to us about many things that most CEOs won’t talk to their boards about, literally because we can’t fire him.

Speaker: 2
01:40:38

But the general case, including all the CEOs of Twitter, is that none of them had control, and so they could all get fired. So you can’t talk to the board; maybe they’ll fire you. You can’t talk to shareholders, because they’ll just dump your stock. Right? Like, okay.

Speaker: 2
01:40:51

So who’s left? So every once in a while, what you find is basically the best-case scenario they have is they can talk to other CEOs. And there’s these little organizations where they kinda pair up and do that, and so they maybe get a little bit out of that. But even that’s fraught with peril, because can you really talk about confidential information with another CEO? There’s trading risk.

Speaker: 2
01:41:08

And so it’s just a very lonely and isolating thing to start with. And then on top of that, you apply pressure. Right? And that’s where it gets painful. And then maybe I’ll just spend a moment on this internal versus external pressure thing.

Speaker: 2
01:41:21

My general experience with companies is that they can withstand most forms of external pressure as long as they retain internal coherence. Right? So as long as the internal team is really bonded together and supporting each other, most forms of external pressure you can withstand.

Speaker: 2
01:41:40

And by that I mean: investors dump your stock, you lose your biggest customers, negative articles, negative press, you know, you can withstand all that. And, in fact, many of those forms of pressure can be bonding experiences for the team, where they come out stronger.

Speaker: 2
01:41:56

What you 100% cannot withstand is the internal crack. And what I always look for in high pressure corporate situations now is the moment when the internal team cracks. Because I know the minute that happens, we’re in a different regime. Like, it’s like the, you know, the solid is turning into liquid.

Speaker: 2
01:42:14

Like, we’re in a different regime and, like, the whole thing can unravel in the next week, because then people turn. I mean, I guess this is what’s happening in Los Angeles right now. The mayor and the fire chief turned on each other, and that’s it. That government is dysfunctional.

Speaker: 2
01:42:27

It is never going to get put back together again. It is over. It is not going to work ever again. And that’s what happens to companies. And so somebody like Mark is under, like, profound internal pressure and external pressure at the same time.

Speaker: 2
01:42:41

Now, he’s been very good at maintaining the coherence of his executive team, but he has had over the years a lot of activist employees, as a lot of these companies have had. And so that’s been continuous pressure. And then the final thing I’d say is: I said that companies can withstand most forms of external pressure, but not all, and the one they especially cannot withstand is government pressure.

Speaker: 2
01:42:59

When your government comes for you, like, yeah. Any CEO who thinks that they’re bigger than their government has that notion beaten out of them in short order.

Speaker: 0
01:43:09

Can you just linger on that? Because it is maybe educating and deeply disturbing. You’ve spoken about it before, but we’re speaking about it again, this government pressure. So you think they’ve crossed the line into essentially criminal levels of pressure?

Speaker: 2
01:43:27

Flagrant criminality. Felonies, like, obvious felonies. And I can actually cite the laws. But, yes, absolute criminality.

Speaker: 0
01:43:36

Can you explain how it was possible for that to happen, and maybe on a hopeful note, how we can avoid it happening again?

Speaker: 2
01:43:44

So to start with, a lot of this now is in the public record, which is good, because it needs to be in the public record. And so there are three forms of things that are in the public record that people can look at. So one is the Twitter Files, right, which Elon put out with the help of journalists when he took over.

Speaker: 2
01:43:58

And I will just tell you, the Twitter Files are 100% representative of what I’ve seen at every other one of these companies. And so you can just see what happened at Twitter, and you can just assume that that happened in these other companies, you know, for the most part.

Speaker: 2
01:44:09

Certainly in terms of the kind of pressure that they got. So that’s number one. That stuff, you can just read it, and you should if you haven’t. The second is, Mark referenced this on the Rogan podcast: there’s a congressman, Jim Jordan, who has a congressional committee called the Weaponization Committee, and in the last, you know, whatever, three years they have done a full-scale investigation of this. Facebook produced a lot of documents into that investigation, and many of those have now been made public, and you can download those reports; there’s like 2,000 pages worth of material on that.

Speaker: 2
01:44:37

And that’s essentially the Facebook version of the Twitter Files, just arrived at by a different mechanism. And then third is Mark himself talking about this on Rogan, so I’ll just defer to his comments there. But yeah. Basically, what those three forms of information show you is that the government, over time, and then culminating in 2020, 2021, you know, in the last four years, just decided that the First Amendment didn’t apply to them.

Speaker: 2
01:45:01

And they just decided that federal laws around free speech and around conspiracies to take away the rights of citizens just don’t apply. And they can just arbitrarily pressure; they literally arbitrarily call up companies and threaten and bully and yell and scream, and threaten repercussions, and force them to censor.

Speaker: 2
01:45:21

And, you know, there’s this whole thing of, like, well, the First Amendment applies to the government; it doesn’t apply to companies. It’s like, well, there’s actually a little bit of nuance to that. First of all, it definitely applies to the government. Like, 100%, the First Amendment applies to the government.

Speaker: 2
01:45:36

By the way, so do the Fourth Amendment and the Fifth Amendment, including the right to due process. There was no due process at all to any of the censorship regime that was put in place. There was no due process put in place, by the way, for debanking either. Those are just as serious violations as the free speech violations.

Speaker: 2
01:45:52

And so this is just flagrantly unconstitutional behavior, and then there are specific federal statutes. It’s 18 U.S.C. 241 and 18 U.S.C. 242, and one of them applies to federal employees, government employees, and the other one applies to private actors, around what’s called deprivation of rights and conspiracy to deprive rights.

Speaker: 2
01:46:10

And it is not legal, according to the United States criminal code, for government employees, or private entities in a conspiracy, to take away constitutional rights. And interestingly, some of those constitutional rights are enumerated, for example, in the First Amendment, freedom of speech, and some of those rights actually do not need to be enumerated.

Speaker: 2
01:46:27

If the government takes away rights that you have, they don’t need to be specifically enumerated rights in the Constitution in order for that to still be a felony. The Constitution very specifically does not say you only have the rights that it gives you. It says you have all the rights that have not been previously defined as being taken away from you. Right?

Speaker: 2
01:46:45

And so debanking qualifies: the right of access to the financial system is every bit as subject to these laws as free speech. And so, yeah, this has happened. And then I’ll just add one final thing, which is we’ve talked about two parties so far.

Speaker: 2
01:46:58

We’ve talked about the government employees, and we’ve talked about the companies. The government employees for sure have misbehaved. The companies, there’s a very interesting question there as to whether they are victims or perpetrators or both. You know, they will defend themselves and they will argue, and I believe they have a good case, that they are victims, not perpetrators. Right?

Speaker: 2
01:47:15

They are the downstream subjects of pressure, not the cause of pressure. But there’s a big swath of people who are in the middle, and specifically the ones that are funded by the government, that I think are possibly in pretty big trouble, and that’s all of these third-party censorship bureaus.

Speaker: 2
01:47:29

I mean, the one that’s most obvious is the so-called Stanford Internet Observatory, that got booted up over the last several years, and they basically were funded by the federal government to be third-party censorship operations. And they’re private sector actors, but acting with federal funding, and so it puts them in this very interesting spot where there’s a very obvious theory under which they’re basically acting as agents of the government.

Speaker: 2
01:47:54

And so I think they’re also very exposed on this and have behaved in just flagrantly illegal ways.

Speaker: 0
01:47:59

So the government should not apply any kind of pressure, even soft pressure, on companies to censor.

Speaker: 2
01:48:07

Can’t.

Speaker: 0
01:48:08

Not allowed. It really is disturbing. I mean, it probably started soft, slowly, and then it escalates, as the old will to power will instruct them to do. I mean, that’s why there’s protection, because otherwise you can’t put a check on government power. Right?

Speaker: 2
01:48:31

There are so many ways that they can get you. Like, there are so many ways they can come at you and get you. And, you know, the thing here to think about is, a lot of times when people think about government action, they think about legislation. Right? Like, when I was a kid, we got trained in how government works.

Speaker: 2
01:48:45

There was this famous animated short; the thing we got shown was just a cartoon of how a bill becomes a law. It’s this, you know, fancy little bill singing along, and it goes like this.

Speaker: 0
01:48:52

I’m just a bill.

Speaker: 2
01:48:53

Yeah. Exactly. And it’s like, alright. Number one, that’s not how it works at all. Like, that doesn’t actually happen. We could talk about that. But even beyond that, mostly what we’re dealing with is not legislation. When we talk about government power these days, mostly it’s not legislation.

Speaker: 2
01:49:05

Mostly it’s either regulation, which is basically the equivalent of legislation but having not gone through the legislative process, which is a very big open legal issue and one of the things that DOGE is very focused on. Most government rules are not legislated. They’re regulated.

Speaker: 2
01:49:18

And there are tons and tons of regulations that these companies are subject to. This is another cliché you’ll hear a lot: private companies can do whatever they want. It’s like, oh, no, they can’t. They’re subject to tens of thousands of regulations that they have to comply with.

Speaker: 2
01:49:30

And the hammer that comes down when you don’t comply with regulations is profound. Like, they can completely wreck your company with no ability for you to do anything about it. So regulation is a big part of the way the power gets exercised. And then there’s what’s called just flat-out administrative power, a term that you’ll hear.

Speaker: 2
01:49:45

Administrative power is just literally the government calling you and telling you what to do. Here’s an example of how this works. So Facebook had this whole program a few years back to do a global cryptocurrency for payments called Libra. And they built the entire system, and it was this high-scale, you know, sort of new cryptocurrency, and they were gonna build it into every product, and there were gonna be three billion people who could transact with Libra.

Speaker: 2
01:50:03

And they went to the government, and they went to all these different agencies and tried to figure out how to make it, like, fully compliant with anti-money-laundering rules and all these, you know, controls and everything, and they had the whole thing ready to go. Two senators wrote letters to the big banks saying: we’re not telling you that you can’t work with Facebook on this, but if you do, you should know that every aspect of your business is going to come under a greatly increased level of regulatory scrutiny. Which is, of course, the exact equivalent of: it sure is a nice corner restaurant you have here.

Speaker: 2
01:50:30

It would be a shame if, you know, somebody tossed a Molotov cocktail through the window and burned it down tonight. Right? And so that letter, like, what is that letter? Like, it’s not a law. It’s not even a regulation. It’s just, like, straight, direct state power.

Speaker: 2
01:50:42

And then it culminates in literally calls from the White House, where they’re just, like, flat-out telling you what to do, which is, of course, what a king gets to do, but not what a president gets to do. And so anyway, what these companies experienced was the full panoply of this, and the level of intensity was in that order.

Speaker: 2
01:51:00

Legislation was actually the least important part. Regulation was more important. Administrative power was more important. And then just, like, flat-out demands, I mean, flat-out threats, were ultimately the most important. How do you fix it? Well, first of all, like, you have to elect people who don’t do it. So, like, as with all these things, ultimately the fault lies with the voters.

Speaker: 2
01:51:18

And so, you know, you have to decide you don’t wanna live in that regime. I have no idea what part of this recent election mapped to the censorship regime. I do know a lot of people on the right got very angry about the censorship, and I think it probably at least helped with enthusiasm on that side.

Speaker: 2
01:51:30

You know, maybe some people on the left will now not want their Democratic nominees to be so pro-censorship. So the voters definitely get a vote, number one. Number two, I think you need transparency. You need to know what happened. We know some of what happened.

Speaker: 2
01:51:46

Peter Thiel has written in the FT just now saying, after what we’ve been through in the last decade, we need broad-based truth and reconciliation efforts to really get to the root of things. So maybe that’s part of it. We need investigations for sure. Ultimately, we need prosecutions.

Speaker: 2
01:52:02

Like, ultimately we need people to go to jail, because we need to set object lessons that say you don’t get to do this. And on those last two, I would say that those are both up to the new administration, and I don’t wanna speak for them. And I don’t wanna predict what they’re gonna do, but they for sure have the ability to do both of those things.

Speaker: 2
01:52:18

And, you know, we’ll see where they take it.

Speaker: 0
01:52:20

Yeah. It’s truly disturbing. I don’t think anybody wants this kind of overreach of power for government, including perhaps the people that were participating in it. It’s like this dark momentum of power. They just get caught up in it, and that’s the reason there’s that kind of protection. Nobody wants that.

Speaker: 2
01:52:37

So I use the metaphor of the ring of power, and Yeah. For people who don’t catch the reference, it’s Lord of the Rings. And the thing with the ring of power in Lord of the Rings, it’s the ring that Gollum has in the beginning, and it turns you invisible, and it turns out it, like, unlocks all this fearsome power; it’s the most powerful thing in the world.

Speaker: 2
01:52:51

It’s the key to everything, and basically the moral lesson of Lord of the Rings, which was, you know, written by a guy who thought very deeply about these things, is: yeah, the ring of power is inherently corrupting. The characters at one point are like, Gandalf, just put on the ring and, like, fix this. Right?

Speaker: 2
01:53:04

And he’s like, he will not put the ring on, even to, like, end the war, because he knows that it will corrupt him. And then, you know, the character of Gollum is the result; he starts as, you know, just a normal character who ultimately becomes this incredibly corrupt and deranged version of himself.

Speaker: 2
01:53:20

And, I mean, I think you said something actually quite profound there, which is the ring of power is infinitely tempting. You know, the censorship machine is infinitely tempting. If you have it, like, you are going to use it. It’s overwhelmingly tempting, because it’s so powerful, and it will corrupt you. And yeah, I don’t know whether any of these people feel any of this today.

Speaker: 2
01:53:40

They should. I don’t know if they do, but, yeah, you go out five or ten years later, you know, you would hope that you would realize that your soul has been corroded, and you probably started out thinking that you were a patriot and you were trying to defend democracy, and you ended up being, you know, extremely authoritarian and anti-democratic and anti-Western.

Speaker: 0
01:53:57

Can I ask you a tough question here? Staying on the ring of power, Elon is quickly becoming the most powerful human on Earth.

Speaker: 2
01:54:11

I’m not sure about that.

Speaker: 0
01:54:12

You don’t think he

Speaker: 2
01:54:13

Well, he doesn’t have the nukes. So Nukes.

Speaker: 0
01:54:19

Yeah. There’s different definitions and perspectives on power. Right? Yep. How can he and/or Donald Trump avoid the corrupting aspects of this power?

Speaker: 2
01:54:29

I mean, I think the danger is there with power. It’s just flat-out there. I would say with Elon, I mean, we’ll see. And I would say, by the way, overwhelmingly, so far so good. I’m extremely thrilled by what he’s done on almost every front for, like, you know, the last 30 years.

Speaker: 2
01:54:44

But including all this stuff recently; like, I think he’s been a real hero on a lot of topics where we needed to see heroism. But look, I would say, I guess, the sort of case that he has this level of power is some combination of the money and the proximity to the president, and obviously both of those are instruments of power.

Speaker: 2
01:54:59

The counterargument to that is, I do think a lot of how Elon is causing change in the world right now, I mean, there’s the companies he’s running directly, where I think he’s doing very well, and we’re investors in multiple of them. But I think, like, a lot of the stuff that gets people mad at him is the social and political stuff; it’s, you know, his statements, and then it’s the downstream effects of his statements.

Speaker: 2
01:55:21

So, like, for example, for the last couple weeks it’s been him, you know, kind of weighing in on this rape gang scandal, this organized child rape thing in the UK, and, you know, it’s actually a preference cascade.

Speaker: 2
01:55:33

It’s one of these things where people knew there was a problem, they weren’t willing to talk about it, it kind of got suppressed, and then Elon brought it up, and all of a sudden there’s now in the UK this, like, massive explosion of basically open conversation about it for the first time.

Speaker: 2
01:55:46

And, you know, it’s like this catalyzing thing. Yeah. All of a sudden, everybody’s kind of woken up and is like, oh my god, this is really bad. And there will now, pretty clearly, be big changes as a result.

Speaker: 2
01:55:56

So Elon, you know, played the role of the boy who said the emperor has no clothes. Right? But here’s the thing, here’s my point. Like, he said it about something that was true. Right? And so had he said it about something that was false, you know, he would get no credit for it.

Speaker: 2
01:56:09

He wouldn’t deserve any credit for it. But he said something that was true. And by the way, everybody over there instantly was like, oh yeah, he’s right. Like, nobody seriously disputed it. They’re just arguing the details now. So number one, it’s like, okay, he says true things.

Speaker: 2
01:56:23

I’ll put it this way: like, how worried are we about somebody becoming corrupt by virtue of their power being that they get to speak the truth? And I guess I would say, especially after the last decade of what we’ve been through, where everybody’s been lying all the time about everything, I think we should run this experiment as hard as we can to get people to tell the truth. So I don’t feel that bad about that.

Speaker: 2
01:56:42

And then the money side, you know, this rapidly gets into the money-and-politics question, and that’s a very interesting question, because it seems like there’s a clear-cut case that the more money in politics, the worse things are and the more corrupted the system is.

Speaker: 2
01:56:57

That was a very popular topic of public conversation up until 2016, when Hillary outspent Trump 3 to 1 and lost. You’ll notice that money in politics has almost vanished as a topic in the last 8 years. And once again, Kamala raised and spent $1.5 billion on top of what Biden spent.

Speaker: 2
01:57:15

So they were, I don’t know, something like $3 billion total, and Trump, I think, spent a third or a fourth of that. And so the money-in-politics topic has kind of vanished from the popular conversation over the last 8 years. It has come back a little bit now that Elon is spending. You know, but again, it’s like, okay, he’s spending.

Speaker: 2
01:57:35

But the data would seem to indicate, at least over the last 8 years, that money doesn’t win the political battles. The voters actually have a voice, and they actually exercise it, and they don’t just listen to ads. And so again there, I would say, yeah, clearly there’s some power there, but I don’t know if it’s some weapon that he can just, like, turn on and use in a definitive way.

Speaker: 0
01:57:54

And I don’t know if there’s parallels there, but I could also say, just on a human level, he has a good heart. And I interact with a lot of powerful people, and that’s not always the case. So that’s a good thing there. Yeah. If we can draw parallels to The Hobbit or whatever. Who gets to put on the ring? Frodo? Frodo. Yeah.

Speaker: 2
01:58:14

Yeah. Well, maybe one of the lessons of Lord of the Rings, right, is that even Frodo would have been corrupted. Right? But, you know, nevertheless, you had somebody who could do what it took at the time. The thing that I find just so amazing about the Elon phenomenon and all the critiques is, you know, the one thing that everybody in our societies universally agrees on, because of our sort of post-Christian egalitarianism.

Speaker: 2
01:58:34

So, you know, we live in sort of this post-secularized Christian context in the West now, and we consider Christianity kind of, you know, backwards, but we still believe essentially all the same things. We just dress them up in sort of fake science.

Speaker: 2
01:58:47

So the one thing that we’re all taught from early on is that the best people in the world are the people who care about all of humanity. Right? And all of our venerated figures are people who cared about all of humanity. Jesus cared about all of humanity.

Speaker: 2
01:59:00

Gandhi cared about all of humanity. Martin Luther King cared about all of humanity. Like, it’s this universal ideal: the person who cares the most about everybody. And with Elon, you have a guy who literally, he talks about this constantly, and he talks exactly the same way in private.

Speaker: 2
01:59:13

He is literally operating on behalf of all of humanity, to try to get us to a multi-planetary civilization so that we can survive a strike on any one planet, so that we can extend the light of human consciousness into the universe and have it persist, you know, for the good of the whole thing.

Speaker: 2
01:59:27

And, like, literally, the critique is, yeah, we want you to care about all of humanity, but not like that.

Speaker: 0
01:59:33

Yeah. All the surface turmoil, all the critics will be forgotten.

Speaker: 2
01:59:39

Yeah. I think that’s yeah. That’s clear.

Speaker: 0
01:59:42

You said that we always end up being ruled by elites of some kind. Can you explain this law, this idea?

Speaker: 2
01:59:50

So this comes from an Italian political philosopher from about a hundred years ago named Robert Michels, and I’m gonna mangle however you pronounce the Italian. And I learned about it through a famous book on politics, probably the best book on politics written in the 20th century, called The Machiavellians, by this guy James Burnham, who has had a big impact on me.

Speaker: 2
02:00:12

But in The Machiavellians, he resurrects what he calls this sort of Italian realist school of political philosophy from the 1910s and ’20s. And these were people, to be clear, this was not like a Mussolini thing. These were people who were trying to understand the actual mechanics of how politics actually works.

Speaker: 2
02:00:26

So, to get to the actual mechanical substance of, like, how the political machine operates. And this guy Michels had this concept he ended up with called the iron law of oligarchy. And before getting to the iron law of oligarchy, let me take a step back to say what he meant by oligarchy, because it has multiple meanings.

Speaker: 2
02:00:44

So basically, in classic political theory, there are three forms of government at core. There’s democracy, which is rule of the many. There’s oligarchy, which is rule of the few. And there’s monarchy, which is rule of the one. And you can use that as a general framework: any government you’re gonna be under is gonna be one of those. Just a mechanical observation.

Speaker: 2
02:01:00

Without even saying which one’s good or bad, just a structural observation. And so the question that Michels asked was: is there actually such a thing as democracy? Is there ever actually, like, direct government? And what he did was he mounted this sort of incredible historical exploration of whether democracies had ever existed in the world, and the answer basically is almost never.

Speaker: 2
02:01:21

And we could talk about that. But the other thing he did was he sought out the most democratic private organization in the world that he could find at that point, which he concluded was basically some communist German auto workers’ union that was, like, wholly devoted to the workers of the world uniting, you know, back when that was, like, the hot thing.

Speaker: 2
02:01:37

And he went in there and he’s like, okay, this is the organization, out of all organizations on planet Earth, that must be operating as a direct democracy. And he went in there and he’s like, oh, nope. There’s a leadership class. You know, there’s like 6 guys at the top and they control everything.

Speaker: 2
02:01:48

And then they lead the rest of the membership along, you know, by the nose. Which is, of course, the story of every union. The story of every union is always the story of, you know, there’s a Jimmy Hoffa in there, you know, kind of running the thing. You know, we just saw that with the dockworkers’ union. Right? Like, you know, there’s a guy, and he’s in charge.

Speaker: 2
02:02:05

And by the way, the number 2 is his son. Right? Like, that’s not, you know, an accident. Right? So the iron law of oligarchy basically says democracy is fake. There’s always a ruling class. There’s always a ruling elite, structurally.

Speaker: 2
02:02:16

And he said the reason for that is because the masses can’t organize. Right? That’s the fundamental problem: whether the masses are 25,000 people in a union or 250,000,000 people in a country, the masses can’t organize. The majority cannot organize. Only a minority can organize. And to be effective in politics, you must organize.

Speaker: 2
02:02:32

And therefore, every political system in human history has been some form of a small organized elite ruling a large and dispersed majority. Every single one. The Greeks and the Florentines had brief experiments in direct democracy, and they were total disasters. In Florence, I forget the name of it.

Speaker: 2
02:02:52

It was called, like, the workers’ revolt or something like that. There was, like, a 2-year period where they basically experimented with direct democracy during the Renaissance, and it was a complete disaster, and they never tried it again. In the state of California, we have our own experiment in this, which is the proposition system, which is an overlay on top of the legislature, and anybody who looks at it for 2 seconds concludes it’s been a complete disaster.

Speaker: 2
02:03:15

It’s just a catastrophe, and it’s caused enormous damage to the state. And so basically, the presumption that we are in a democracy is just sort of by definition fake. Now, good news for the US: it turns out the founders understood this. And so, of course, they didn’t give us a direct democracy. They gave us a representative democracy.

Speaker: 2
02:03:30

Right? And so they built the oligarchy into the system, in the form of Congress and the executive branch and the judicial branch. But so, anyway, as a consequence, democracy is always and everywhere fake. There is always a ruling elite. And basically the lesson of the Machiavellians is: you can deny that if you want, but you’re fooling yourself.

Speaker: 2
02:03:47

The way to actually think about how to make a system work and maintain any sort of shred of freedom, is to actually understand that that is actually what’s happening.

Speaker: 0
02:03:55

And, lucky for us, the founders saw this and figured out, given that there’s going to be a ruling elite, how to create a balance of power among that elite

Speaker: 2
02:04:08

Yes.

Speaker: 0
02:04:08

So it doesn’t get out of hand.

Speaker: 2
02:04:09

And it was very clever. Right? And, you know, some of this was based on earlier experiments. By the way, these were very, very smart people. Right? And so they knew tremendous amounts of, like, Greek and Roman history. They knew their history. You know, in the Federalist Papers they argued this at great length. You can read it all.

Speaker: 2
02:04:23

You know, they ran, like, one of the best seminars in world history trying to figure this out. And they went through all this. And so they thought through it very carefully. But just to give you an example, which continues to be a hot topic.

Speaker: 2
02:04:34

So, you know, one way they did it is through the 3 branches of government. Right? Executive, legislative, and judicial, the sort of balance of powers. But the other way they did it, sort of echoing what had been done earlier, I think, in the UK, was they created the 2 different bodies of the legislature. Right?

Speaker: 2
02:04:50

And so, you know, the House and the Senate. And as you know, the House is apportioned on the basis of population and the Senate is not. Right? The small states have just as many senators as the big states. And then they made the deliberate decision to have the House get reelected every 2 years, to make it very responsive to the will of the people, and they made the decision to have the Senate get reelected every 6 years, so that it had more buffer from the passions of the moment.

Speaker: 2
02:05:11

But what’s interesting is they didn’t choose one or the other. Right? They did them both. And then to get legislation passed, you have to get through both of them. And so they built in, like, a second layer of checks and balances.

Speaker: 2
02:05:21

And then there are a thousand observations we could make about, like, how well the system is working today, and how much it lives up to the ideal, and how much we’re actually complying with the Constitution, and there are lots of open questions there.

Speaker: 2
02:05:34

But, you know, this system has survived for coming on 250 years, with a country that has been spectacularly successful, and I don’t think any of us would trade this system for any other one. And so it’s, yeah, one of the great all-time achievements.

Speaker: 0
02:05:46

Yeah. It’s incredible. And we should say they were all pretty young relative to our current set of leaders.

Speaker: 2
02:05:52

Many were in their twenties at the time. And, like, supergeniuses. This is one of those things where it’s just like, something happened where there was a group of people where, you know, nobody ever tested their IQs, but, like, these are the Einsteins of politics. Yeah. An amazing thing.

Speaker: 2
02:06:03

But anyway, I go through all that because they were very keen students of the actual mechanical practice of democracy, not fixated on what was desirable. They were incredibly focused on what would actually work, which is, you know, I think the way to think about these things.

Speaker: 0
02:06:17

They were engineers of a sort, not the fuzzy humanities students.

Speaker: 2
02:06:22

They were shape rotators, not word cells.

Speaker: 0
02:06:24

I remember that. Wow. That meme came and went. I think you were central to them. You’re central to

Speaker: 1
02:06:30

a lot of memes.

Speaker: 2
02:06:31

I was. I was.

Speaker: 0
02:06:32

You’re you’re the meme dealer and the meme popularizer.

Speaker: 2
02:06:35

That meme I get some credit for, and then “the current thing” is the other one I get some credit for. I don’t know that I invented either one, but I popularized them.

Speaker: 0
02:06:42

Take credit and run with it.

Speaker: 2
02:06:43

Yep. Thanks.

Speaker: 0
02:06:46

If we can just linger on The Machiavellians. It’s a study of power and power dynamics. Like you mentioned, looking at the actual reality of the machinery of power. From everything you’ve seen now in government but also in companies, what are some interesting things you can continue to say about the dynamics of power, the jostling for power that happens inside these institutions?

Speaker: 2
02:07:11

Yeah. So a lot of it, you know, we’d already talked about this a bit with the universities, where you can apply a Machiavellian-style lens. It’s why I posed the question to you that I did, which is: okay, who runs the university? The trustees, the administration, the students, or the faculty?

Speaker: 2
02:07:25

And, you know, the true answer is some combination of the four, plus the donors, by the way, plus the government, plus the press, etcetera. Right? And so there’s a mechanical interpretation of that. I mean, companies operate under the exact same, you know, set of questions.

Speaker: 2
02:07:39

Who runs a company? You know, the CEO, but, like, the CEO runs the company basically up to the day that either the shareholders or the management team revolt. If the shareholders revolt, it’s very hard for the CEO to stay in the seat. If the management team revolts, it’s very hard for the CEO to stay in the seat.

Speaker: 2
02:07:52

By the way, if the employees revolt, it’s also hard to stay in the seat. By the way, if the New York Times comes at you, it’s also very hard to stay in the seat. If the Senate comes at you, it’s very hard to stay in the seat. So, you know, a reductionist version of this that’s a good shorthand is: who can get who fired?

Speaker: 2
02:08:08

You know, so who has more power? The newspaper columnist who makes, you know, $200,000 a year, or the CEO who makes, you know, $200,000,000 a year? And it’s like, well, I know for sure that the columnist can get the CEO fired. I’ve seen that happen before. I have yet to see a CEO get a columnist fired.

Speaker: 0
02:08:24

Did anyone ever get fired from the Bill Ackman assault on journalism? Bill, like, really showed the bullshit that happens in journalism. No.

Speaker: 2
02:08:36

Because what happens is, I mean, to their credit, they wear it as a badge of honor, and then to their shame, they wear it as a badge of honor. Right? Which is: if they’re doing the right thing, then they are justifiably priding themselves on standing up under pressure.

Speaker: 2
02:08:49

But it also means that they can’t respond to legitimate criticism, and, you know, they’re obviously terrible at that now. As I recall, he went straight to the CEO of, I think, Axel Springer Mhmm. That owns Business Insider. And, you know, I happen to know the CEO, and I think he’s quite a good CEO, but what’s a good example is: does the CEO of Axel Springer run his own company?

Speaker: 2
02:09:08

Right? Well, there’s a fascinating thing playing out right now. Not to dwell on these things. But it’s a case where you see how pressure reveals things. Right? And so, if you’ve been watching what’s happened with the LA Times recently.

Speaker: 2
02:09:21

So this guy, a biotech entrepreneur, buys the LA Times, like, whatever, 8 years ago. It is just, like, the most radical social revolutionary thing you can possibly imagine. It endorses every crazy left-wing radical you can imagine. It endorses Karen Bass, it endorses Gavin Newsom.

Speaker: 2
02:09:34

It’s just, like, a litany of all the people who, you know, are currently burning the city to the ground. It’s just, like, endorsed every single bad person every step of the way. He’s owned it the entire time. And then, for the first time, I think, he put his foot down right before the November election.

Speaker: 2
02:09:49

He said, we’re gonna get out of this thing where we just always endorse the Democrat. I think he said, we’re not endorsing for the presidency. And, like, the paper flipped out. Right? It’s like, our billionaire backer. And I don’t know what he spends, but, like, he must be burning $50 to $100 million a year out of his pocket to keep this thing running.

Speaker: 2
02:10:05

He paid $500 million for it, which is amazing, back when people still thought these things were businesses. And then he’s probably burned another $500 million over the last decade keeping it running, and he burns probably another $50 to $100 million a year to do this. And the journalists at the LA Times hate him with the fury of a thousand suns.

Speaker: 2
02:10:23

Like, they just absolutely freaking despise him, and they have been, like, attacking him. And, you know, the ones that can get jobs elsewhere quit and do it, and the rest just stay and say the worst, most horrible things about him, and they wanna constantly run these stories attacking him.

Speaker: 2
02:10:35

And so he has had this reaction that a lot of people in LA are having right now to this fire, to this just, like, incredibly vivid collapse of leadership, where all these people that he had his paper endorse are just disasters. And he’s decided to be the boy who says the emperor has no clothes, but he’s doing it to his own newspaper. Very smart guy.

Speaker: 2
02:10:57

He’s now on a press tour, and he’s basically saying: yes, we did all that, and we endorsed all these people, and it was a huge mistake, and we’re gonna completely change. And his paper is, you know, in complete internal revolt. But I go through it because, okay.

Speaker: 2
02:11:08

Now we have a very interesting question, which is: who runs the LA Times? Because for the last 8 years, it hasn’t been him. It’s been the reporters. Now, for the first time, the owner is showing up saying, oh no, I’m actually in charge, and the reporters are saying, no, you’re not.

Speaker: 2
02:11:25

And, like, it is freaking on. And so, again, the Machiavellian mindset on this is, like, okay, how is power actually exercised here? Can a guy who’s, like, even super rich and super powerful, who even owns his own newspaper, can he stand up to a full-scale assault, not only by his own reporters, but by every other journalism outlet, who also now thinks he’s the antichrist?

Speaker: 0
02:11:45

And he is trying to exercise power by speaking out publicly, and so that’s the game of power there.

Speaker: 2
02:11:50

And firing people. And, you know, he has removed people and he has set new rules. I mean, I think he’s saying that he’s now, at long last, actually exercising the prerogatives of an owner of a business, which is deciding on the policies and staffing of the business.

Speaker: 2
02:12:02

There are certain other owners of these publications that are doing similar things right now. He’s the one I don’t know, so he’s the one I can talk about. But there are others that are going through the same thing right now. And I think it’s a really interesting open question.

Speaker: 2
02:12:15

Like, you know, in a fight between the employees and the employer, it’s not crystal clear that the employer wins that one.

Speaker: 0
02:12:20

And just to stay on journalism for a second, we mentioned Bill Ackman. I just wanna say, put him in the category we mentioned before of a really courageous person. I don’t think I’ve ever seen anybody so fearless in going after, you know, in following what he believes in publicly. That’s courage.

Speaker: 0
02:12:40

Several things he’s done publicly have been really inspiring. Just being courageous.

Speaker: 2
02:12:47

What do you think is, like, the most impressive example?

Speaker: 0
02:12:49

Where he went after a journalist. Yeah. I mean, it’s, you know, like kicking the beehive or whatever. You know what’s gonna follow. Yep. And to do that, I mean, that’s why it’s difficult to challenge journalistic organizations: there are just so many mechanisms they use, including, like, writing articles that get cited by Wikipedia, then driving the narrative, and then they can get you fired, all this kind of stuff.

Speaker: 0
02:13:18

Bill Ackman, like a badass, just tweets these essays and goes after them, legally and also in the public eye. I don’t know, that was truly inspiring. There are not many people like that out in public. And hopefully that inspires not just me but many others to be courageous themselves.

Speaker: 2
02:13:42

Did you know of him before he started doing this in public?

Speaker: 0
02:13:45

I knew of Neri, his wife. She’s just a brilliant researcher and scientist, and so I admire her. I look up to her. I think she’s amazing.

Speaker: 2
02:13:52

Well, the reason I ask if you knew about Bill is because a lot of people had not heard of him before, especially before October 7th and before some of the campaigns he’s been running since in public, with Harvard and so forth. But he was very well known in the investment world before that.

Speaker: 2
02:14:05

He was a famous, so-called activist investor, you know, very, very successful and very widely respected, for probably 30 years before now. And I bring that up because it turns out those weren’t, for the most part, battles that happened in full public view.

Speaker: 2
02:14:20

They weren’t national stories. But in the business and investing world, the activist investor is like in the movie Taken: it’s a very specific set of skills. Yeah. On how to, like, really take control of situations, and how to wreck the people you’re going up against.

Speaker: 2
02:14:34

And there’s been controversy over the years on this topic, and there’s too much detail to go into. But the defense of activist investing, which I think is valid, is, you know, these are the guys who basically go in and take stakes in companies that are being poorly managed or are underperforming.

Speaker: 2
02:14:51

And then generally what that means, at least in theory, is that the existing management has become entrenched and lazy, mediocre, you know, whatever, not responding to the needs of the shareholders. Often not responding to the customers. And the activists basically go in with a minority position, and then they rally support among other investors who are not activists, and then they basically show up and they force change.

Speaker: 2
02:15:13

But they are the aggressive version of this. And I’ve been involved in companies that have been on the receiving end of these Uh-oh. Where it is amazing how much somebody like that can exert pressure on situations even when they don’t have formal control. So it would be another chess piece on the mechanical board of how power gets exercised.

Speaker: 2
02:15:31

And basically what happens is the effective activists, a large amount of the time, end up taking over control of companies even though they never own more than, like, 5% of the stock. And so, anyway, it turns out Bill’s is such a fascinating case, because he has that, like, complete skill set.

Speaker: 0
02:15:45

Yeah.

Speaker: 2
02:15:45

And he has now decided to bring it to bear in areas that are not just companies. And two interesting things on that. One is, you know, some of these battles are still ongoing. But number one, like, a lot of people who run universities or newspapers are not used to being up against somebody like this.

Speaker: 2
02:16:00

And by the way, also now with infinitely deep pockets and lots of experience in courtrooms and all the things that go with that. But the other is, through example, he is teaching a lot of the rest of us the activist playbook, like, in real time. And so the Liam Neeson skill set is getting more broadly diffused, just by being able to watch and learn from him.

Speaker: 2
02:16:19

So I think, you know, I would put him up there with Elon in terms of somebody who’s really affecting how all this is playing out.

Speaker: 0
02:16:25

But even skill set aside, just courage and Yes.

Speaker: 2
02:16:27

Including, by the way, courage to go outside of his own zone. Yeah. Right? You know, because, like, I’ll give you an example. My firm, a venture capital firm, we have LPs. There are things that I feel like I can’t do or say because I feel like I would be bringing embarrassment or other consequences to our LPs.

Speaker: 2
02:16:43

He has investors also where he worries about that. And so, a couple of things. One is his willingness to go out a bit and risk his relationship with his own investors. But I will tell you the other thing, which is, and I know this for a fact, his investors have been remarkably supportive of him doing that, because as it turns out, a lot of them actually agree with him.

Speaker: 2
02:17:01

And it’s the same thing he does in his activism campaigns. He is able to be the tip of the spear on something that actually a lot more people agree with.

Speaker: 0
02:17:10

Yeah. It turns out if you have truth behind you, it helps.

Speaker: 2
02:17:14

And again, you know, it’s how I started: a lot of people are just fed up.

Speaker: 0
02:17:18

You’ve been spending a bunch of time in Mar-a-Lago and Palm Beach helping the new administration in many ways, including interviewing people who might join. So what’s your general sense about the people who are coming into the new administration?

Speaker: 2
02:17:33

I should start by saying I’m not a member of the new administration. I’m not, like, in the room when a lot of these people are being selected.

Speaker: 0
02:17:40

I believe you said unpaid intern.

Speaker: 2
02:17:42

I’m an unpaid intern. I’m a volunteer, and, you know, helpful when I can be, but I’m not making the decisions, nor am I in a position to, you know, speak for the administration. So I don’t wanna say anything that would cause people to think I’m doing that. It’s a very unusual situation, right, where you had an incumbent president, and then you had a 4-year gap where he’s out of office, and then you have him coming back.

Speaker: 2
02:17:58

Right? And as you’ll recall, there was a fair amount of controversy over the end of the first term.

Speaker: 0
02:18:04

Oh, yeah.

Speaker: 2
02:18:05

The fear, the specific concern, was, you know, the first Trump administration, and they will all say this, they didn’t come in with a team. Right? Most of the sort of institutional base of the Republican Party were Bush Republicans, and many of them had become Never Trumpers, and so they had a hard time putting the team together.

Speaker: 2
02:18:21

And, by the way, they had a hard time getting people confirmed. And so if you talk to the people who were there in the first term, it took them 2 to 3 years to even get the government in place. And then they basically only had the government in place for, you know, basically, like, 18 months, and then COVID hit.

Speaker: 2
02:18:35

You know, and then the aftermath and everything, and all the drama and headlines. And so the concern, you know, including from some very smart people in the last 2 years, has been: boy, if Trump gets a second term, is he gonna be able to get a team that is as good as the team he had last time, or a team that is actually not as good?

Speaker: 2
02:18:50

Because maybe people got burned out, maybe they’re more cynical now, maybe they’re not willing to go through the drama. By the way, a lot of people in the first term came under, like, you know, their own withering legal assaults, and, you know, some of them went to prison, and, like, a lot of stuff happened.

Speaker: 2
02:19:04

Lots of investigations, lots of legal fees, lots of bad press, lots of debanking, by the way. A lot of the officials in the first Trump term got debanked, including the president’s wife and son.

Speaker: 0
02:19:16

Yeah. I heard you tell that story. That’s insane. That’s just insane.

Speaker: 2
02:19:18

In the wake of the first term, yes. We now take out spouses and children with our ring of power. And so there's, like, this legitimate question as to, okay, what will the team for the second term look like? And at least what I've seen, and what you're seeing in the appointments, is it looks much, much better.

Speaker: 2
02:19:33

First of all, it just looks better than the first term, and not because the people in the first term were not necessarily good, but you just have this, like, influx of, like, incredibly capable people that have shown up that wanna be part of this. And you just didn't have that the first time. And so they're just drawing on a much deeper, richer talent pool than they had the first time.

Speaker: 2
02:19:50

And they're drawing on people who know what the game is. Like, they're drawing on people now who know what is gonna happen, and they're still willing to do it. And so they're gonna get, I think, you know, some of the best people from the first term, but they're bringing in a lot of people who they couldn't get the first time around.

Speaker: 2
02:20:04

And then also, there's a bunch of people, including people in the first term, where they're just 10 years older. And so they went through the first term, and they just learned how everything works. Or they're young people who just had a different point of view, and now they're 10 years older and they're ready to go serve in government.

Speaker: 2
02:20:18

And so there's a generational shift happening. And, actually, one of the interesting things about the team that's forming up is it's remarkably young. Some of the cabinet members, and then many of the second- and third-level people, are, like, in their thirties and forties, you know, which is a big change from the gerontocracy that, you know, we've been under for the last 30 years.

Speaker: 2
02:20:34

And so I think the caliber has been outstanding, you know, and we could sit here and list tons and tons of people. But, like, you know, it's everything from the people who are running all the different departments at HHS, to, you know, the number two at the DOD, Steve Feinberg, who's just, like, an incredible legend of private equity, an incredibly capable guy.

Speaker: 2
02:20:52

We've got, actually, two of my partners going in, who I both think are amazing. Yeah. In many, many parts of the government, the people are, like, really impressive.

Speaker: 0
02:21:02

Well, I think one of the concerns is actually that, given the human being that Donald Trump is, there would be more tendency towards, let's say, favoritism versus merit. That there's kind of circles of sycophancy that form, and if you're able to be loyal and never oppose and basically suck up to the president, then you'll get a position.

Speaker: 0
02:21:32

So that's one of the concerns. And I think you're in a good position to speak to the degree that's happening, versus hiring based on merit and just getting great

Speaker: 2
02:21:43

teams? Yeah. So look, I'll just start by saying: any leader at that level, by the way, any CEO, there's always some risk of that. Right? It's just a natural thing that reality warps around powerful leaders. And so there's always some risk of that.

Speaker: 2
02:21:55

Of course, the good and powerful leaders are, you know, very aware of that, and Trump at this point in his life, I think, is highly aware of that. At least in my interactions with him, he definitely seems very aware of that. So that's one thing. And look, like I said, I don't wanna predict what's gonna happen once this whole thing starts unfolding.

Speaker: 2
02:22:11

But I would just say, again, the caliber of the people who are showing up and getting the jobs, and the fact that these are some of the most accomplished people in the business world and in the medical field. I mean, you know, Jay Bhattacharya coming in to run NIH.

Speaker: 2
02:22:24

So I was actually part of the interview team for a lot of the HHS folks. Nice.

Speaker: 0
02:22:29

Jay is amazing.

Speaker: 1
02:22:30

Oh, I was so I was

Speaker: 0
02:22:31

so happy to see that.

Speaker: 2
02:22:32

So, literally, I got to the transition office for one of the days of the HHS interviews, and I was on one of the interviewing teams. I didn't know who the candidates were, and they gave us the sheet in the beginning. So I go down the sheet and I saw Jay's name, and I, like, almost physically fell out of my chair.

Speaker: 0
02:22:45

Yeah.

Speaker: 2
02:22:45

And I, you know, I happen to know Jay, and I, like, respect him enormously. And then he proved himself, talk about a guy who proved himself under extraordinary pressure Yeah. Over the last five years.

Speaker: 0
02:22:57

And didn't go radical under the pressure. He maintained balance and thoughtfulness and depth. I mean, incredibly

Speaker: 2
02:23:05

Very serious, very analytical, very applied, and, yes, 100%. Tested under pressure, and he came out of it well. Like, the more people look back at what he said and did, and, you know, none of us are perfect, but, like, overwhelmingly, overwhelmingly insightful throughout that whole period.

Speaker: 2
02:23:21

And, you know, we would all be much better off today had he been in charge of the response. And so, just an incredibly capable guy. And look, he learned from all that. Right? He learned a lot in the last five years.

Speaker: 2
02:23:31

And so the idea that somebody like that could be head of NIH, as compared to the people we've had, is just, like, breathtaking. It's just a gigantic upgrade, you know. And then Marty Makary coming in to run the FDA, exact same thing. The guy coming in to run the CDC, exact same thing.

Speaker: 2
02:23:44

I mean, I've been spending time with Doctor Oz. So, you know, again, I'm not on these teams, I'm not in the room, but, like, I've been spending enough time trying to help that, like, his level of insight into the health care system is, like, astounding.

Speaker: 2
02:23:59

And it comes from being a guy who's been, like, in the middle of the whole thing, and been talking to people about this stuff and working on it and serving as a doctor himself in medical systems for, you know, his entire life. And it's just like, you know, he's like a walking encyclopedia on these things. And, you know, very dynamic, very charismatic, very smart, organized, effective.

Speaker: 2
02:24:16

So, you know, to have somebody like that in there, and so on. Anyway, I have, like, 30 of these stories now across all these different positions. And so, to be quite candid, I did do the compare and contrast to the last four years, and it's not even, these people are not in the same ballpark. They're just, like, wildly better.

Speaker: 2
02:24:34

And so, you know, pound for pound, it's maybe the best team in the White House since, you know, I don't even know. Maybe the nineties, maybe the thirties, maybe the fifties. You know, maybe Eisenhower had a team like this or something. But there's a lot of really good people in there now.

Speaker: 0
02:24:53

Yeah. The potential for change is certainly extremely high. Well, can you speak to DOGE? What's the most wildly successful next two years for DOGE? Can you imagine, maybe also, can you think about the trajectory that's the most likely, and what kind of challenges would it be facing?

Speaker: 2
02:25:12

Yeah. So I'll start by saying, again, disclaimer, I have to disclaim: I'm not on DOGE. I'm not a member of DOGE.

Speaker: 0
02:25:19

We should say there's about 10 lawyers in the room. They're staring. No. I'm just kidding.

Speaker: 2
02:25:25

Both the angels and the devils on my shoulder

Speaker: 0
02:25:27

are lawyers.

Speaker: 2
02:25:27

So, yeah. So I'm not speaking for DOGE. I'm not in charge of DOGE.

Speaker: 0
02:25:31

Yep.

Speaker: 2
02:25:32

Those guys are doing it. I'm not doing it. But, again, I'm volunteering to help as much as I can, and I'm 100% supportive. Yeah. So look, I think the basic outlines are in public. Right?

Speaker: 2
02:25:43

Which is: it's a time-limited, you know, basically commission. It's not a formal government agency. It's, you know, time-limited, 18 months. And in terms of implementation, it will advise the executive branch. Right? And so the implementation will happen through the White House.

Speaker: 2
02:25:58

And the president has total latitude on what he wants to implement. And then, basically, what I think about is three sort of streams, you know, sort of target sets. And they're related but different: money, people, and regulations. And so, you know, there's the headline number that was put out, the $2 trillion number, and there's already, you know, disputes over that, whatever, and there's a whole question there.

Speaker: 2
02:26:20

But then there's the people thing. And the people thing is interesting because you get into these very fascinating questions. And I've been doing this, I won't do this for you as a pop quiz, but I do this for people in government as a pop quiz, and I can stump them every time. Which is: A, how many federal agencies are there?

Speaker: 2
02:26:35

Mhmm. And the answer is somewhere between 415 and 520, and nobody's quite sure. And then the other is: how many people work for the federal government? And the answer is, you know, something on the order of, I forget, but, like, 4 million full-time employees and maybe up to 20 million contractors, and nobody is quite sure.

Speaker: 2
02:26:51

And so there's a large people component to this. And then, by the way, there's a related component to that, which is: how many of them are actually in the office? And the answer is not many. Most of the federal buildings are still empty. Right?

Speaker: 2
02:27:03

And then there's questions of, like, are people, you know, working from home, or are they actually working at all? So there's the people thing. And of course, the money and the people are connected. And then there's the third, which is the regulation thing. Right?

Speaker: 2
02:27:13

And I described earlier how, basically, our system of government is now much more based on regulations than legislation. Right? Most of the rules that we all live under are not from a bill that went through Congress; they're from an agency that created a regulation. That turns out to be very, very important.

Speaker: 2
02:27:28

So one is, like I've already described, DOGE wants to do broad-based regulatory relief, and Trump has talked about this: basically get the government off people's backs and liberate the American people to be able to do things again. So that's part of it.

Speaker: 2
02:27:40

But there's also something else that's happened, which is very interesting, which was: there were a set of Supreme Court decisions about two years ago that went directly after the idea that the executive branch can create regulatory agencies and issue regulations and enforce those regulations without corresponding congressional legislation.

Speaker: 2
02:27:56

And most of the federal government that exists today, including most of the departments and most of the rules and most of the money and most of the people, most of it is not enforcing laws that Congress passed. Most of it is regulation. And the Supreme Court basically said: large parts, you know, large to maybe all, of that regulation that did not directly result from a bill that went through Congress, the way the cartoon said it should, may not actually be legal.

Speaker: 2
02:28:25

Now, the previous White House, of course, was super in favor of big government. They had no desire to, they did nothing based on this. They didn't, you know, pull anything back in. But the new regime, if they choose to, could say: look, the thing that we're doing here is not, you know, challenging the laws.

Speaker: 2
02:28:41

We're actually complying with the Supreme Court decision that basically says we have to unwind a lot of this. We have to unwind the regulations, which are no longer legal or constitutional; we have to unwind the spend; and we have to unwind the people. And so that's how you get, basically, you connect the thread from the regulation part back to the money part, back to the people part.

Speaker: 2
02:28:58

They have work going on on all three of these threads. They have, I would say, incredibly creative ideas on how to deal with this. I know lots of former government people, 100% of whom are super cynical on this topic, and they're like: this is impossible, this could never possibly work.

Speaker: 2
02:29:12

And I'm like, well, I can't tell you what the secret plans are, but, like, they blow my mind. Like, on all three of those, they have ideas that are, like, really quite amazing, as you'd expect from, you know, the people involved. And so, over the course of the next few months, you know, that'll start to become visible.

Speaker: 2
02:29:28

And then the final thing I would say is: this is going to be very different than attempts like, there have been other programs like this in the past. The Clinton-Gore administration had one, and there were others before that. Reagan had one. The difference is, this time there's social media. Mhmm.

Speaker: 2
02:29:45

And so, there has never been, like, it's interesting. One of the reasons people in Washington are so cynical is because they know all the bullshit. Like, they know all the bad spending and all the bad rules. I mean, look, we're adding a trillion dollars to the national debt every 100 days right now.

Speaker: 2
02:30:03

And that's compounding, and it's now passing the size of the Defense Department budget, and pretty soon it's gonna be adding a trillion dollars every 90 days, and then it's gonna be adding a trillion dollars every 80 days, and then it's gonna be a trillion dollars every 70 days. And then, if this doesn't get fixed, at some point we enter a hyperinflationary spiral and we become Argentina or Brazil, and kablooie.

Speaker: 2
02:30:22

Right? And so, like, everybody in DC knows that something has to be done. And then everybody in DC knows for a fact that it's impossible to do anything. Right? They know all the problems, and they also know the sheer impossibility of fixing it.

Speaker: 2
02:30:33

But what I think they're not taking into account, what the critics are not taking into account, is these guys can do this in the full light of day, and they can do it on social media. They can completely bypass the press. They can completely bypass the cynicism. They can expose any element of, you know, unconstitutional or, you know, silly government spending.

Speaker: 2
02:30:50

They can run victory laps every single day on what they're doing. They can bring the people into the process. And again, if you think about it, this goes back to our Machiavellian structure: you've got democracy, oligarchy, monarchy; rule of the many, rule of the few, rule of the one.

Speaker: 2
02:31:07

You could think about what's happened here as a little bit of a sandwich. Right? Which is: we don't have a monarch, but we have a president, rule of the one, with some power. And then we have the people, who can't organize, but they can be informed and they can be aware and they can express themselves through voting and polling. Right?

Speaker: 2
02:31:22

And so there's a sandwich happening right now, is the way to think about it, which is you've got, basically, monarchy, rule of the one, combining with rule of the many. Right? And rule of the many is: they do get to vote, right, the people do get to vote. And then essentially Congress, and this sort of permanent bureaucratic class in Washington, as the oligarchy in the middle.

Speaker: 2
02:31:40

And so the White House plus the people, I think, have the power to do all kinds of things here. And I think that would be the way I would watch it.

Speaker: 0
02:31:48

The transparency, I mean, Elon, just by who he is, is incentivized to be transparent and show the bullshit in the system and to celebrate the victories. So it's gonna be so exciting. I mean, honestly, it just makes government more exciting, which is a win for everybody.

Speaker: 2
02:32:08

These people are spending our money. Yeah. These people have enormous contempt for the taxpayer. Okay, here's the thing you hear in Washington, here's one of the things. So the first thing you hear is: this is impossible, they'll be able to do nothing. And then, when I walk them through this, it starts to dawn on them that this is a new kind of thing.

Speaker: 2
02:32:23

And then they're like, well, it doesn't matter, because all the money is in entitlements and the debt and the military, and so, yeah, you've got, like, this silly fake whatever, you know, NPR funding or whatever, and it's a rounding error and it doesn't matter. And you look it up in the budget and it's, like, whatever, $500 million or $5 billion. Or it's the charging stations that don't exist.

Speaker: 2
02:32:44

It's the $40 billion of charging stations where they built eight charging stations. Or it's the broadband Internet plan that delivered broadband to nobody. Right? And cost you $30 billion. You know, so, these boondoggles. And what everybody in Washington says is: $30 billion is a rounding error on the federal budget. It doesn't matter.

Speaker: 2
02:32:58

Who cares if they make it go away? And, of course, any taxpayer is like, what the?

Speaker: 0
02:33:04

What do you mean?

Speaker: 2
02:33:06

It's $30 billion. Yeah. Right? And then, and the press is in on this too, then the experts are like, well, it doesn't matter because it's a rounding error. No. No. It's $30 billion. And if you're this cavalier about $30 billion, imagine how cavalier you are about the $3 trillion.

Speaker: 0
02:33:20

Yeah.

Speaker: 2
02:33:20

Okay. Then there's the, okay: is $30 billion a lot of the federal budget in percentage terms? No, it's not. But do the math. $30 billion divided by, let's say, 300 million taxpayers. Right? Like, what's that math? $100. $100 per taxpayer per year. Okay.

Speaker: 2
02:33:37

So, $100 to an ordinary person working hard every day to make money and provide for their kids. $100 is a meal out. It's a trip to the amusement park. It's the ability to, you know, buy additional educational materials. It's the ability to have a babysitter, to be able to have a romantic relationship with your wife.

Speaker: 2
02:33:55

There's, like, a hundred things that that person can do with $100 that they're not doing, because it's going to some bullshit program where the money is being looted out in the form of just, like, ridiculousness and graft. And so the idea that that $30 billion program is not something that is a very important thing to go after, like, the level of contempt for the taxpayer is just off the charts.

Speaker: 2
02:34:16

And that's just one of those programs, and there's, like, a hundred of those programs. And they're all just like that. Like, it's not like any of this stuff is running well. Like, the one thing we know is that none of this stuff is running well. Like, we know that for sure. Right? And we know these people aren't showing up to work, and, like, we know that all this crazy stuff is happening.

Speaker: 2
02:34:31

Right? And, like, you know, do you remember Elon's story of what got the Amish to turn out to vote in Pennsylvania? Oh, okay. So, Pennsylvania is, like, a wonderful state, great history.

Speaker: 2
02:34:43

It has these cities, like Philadelphia, that have descended, like other cities, into just, like, complete chaos, violent madness, and death. Right? And the federal government has just, like, let it happen in these incredibly violent places. And so the Biden administration decided that the big pressing law enforcement thing that they needed to do in Pennsylvania was that they needed to start raiding Amish farms, with armed raids, to prevent them from selling raw milk.

Speaker: 0
02:35:05

Right.

Speaker: 2
02:35:05

And it turns out it really pissed off the Amish. It turns out they weren't willing to drive to the polling places, because they don't have cars, but if you came and got them, they would go and they would vote, and that's one of the reasons why Trump won. Anyway, so, like, the law enforcement agencies are off working on, like, crazy things. Like, the system's not working. And so you add up, just pick a hundred $30 billion programs.

Speaker: 2
02:35:26

Alright. Now, okay, math major: a hundred times a hundred.

Speaker: 0
02:35:30

$20,000.

Speaker: 2
02:35:31

Okay. $10,000 per taxpayer per year.
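[Editor's note: the round-number arithmetic in this exchange can be sketched directly. All figures below are the round numbers used on air (300 million taxpayers, $30 billion per program, a hundred such programs), not official budget data.]

```python
# Sketch of the per-taxpayer arithmetic from the conversation.
# All inputs are the round numbers quoted on air, not official data.

taxpayers = 300_000_000        # assumed ~300 million US taxpayers
program_cost = 30_000_000_000  # one $30 billion program
num_programs = 100             # "just pick a hundred $30 billion programs"

per_taxpayer_one = program_cost / taxpayers
per_taxpayer_all = per_taxpayer_one * num_programs

print(f"one program:  ${per_taxpayer_one:,.0f} per taxpayer per year")   # $100
print(f"100 programs: ${per_taxpayer_all:,.0f} per taxpayer per year")   # $10,000
```

So Lex's quick guess of $20,000 overshoots; a hundred programs at $100 each comes to $10,000 per taxpayer per year, as Marc says.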

Speaker: 0
02:35:34

But it's also not just about money. Obviously, money is a hugely important thing, but it's the cavalier attitude. Yes. That, in the ripple effect of that, it makes it so nobody wants to work in government and be productive. It breeds corruption. It breeds laziness.

Speaker: 0
02:35:54

It breeds secrecy, because you don't wanna be transparent about having done nothing all year, all this kind of stuff. And you wanna reverse that, so it would be exciting in the future to work in government. Because the amazing thing, if you steel-man government, is you can do shit at scale.

Speaker: 0
02:36:12

You have money, and you can directly impact people's lives in a positive sense, at scale. It's super exciting. As long as there's no bureaucracy that slows you down, or not huge amounts of bureaucracy that slows you down significantly.

Speaker: 2
02:36:30

Yeah. So, here's the term. This blew my mind, because, you know, once you open the hellmouth of looking into the federal budget, you learn all kinds of things. So there is a term of art in government called impoundment. Mhmm. And if you're like me, you've learned this the hard way when your car has been impounded.

Speaker: 2
02:36:48

The government meaning of impoundment, the federal budget meaning, is a different meaning. Impoundment is as follows. The Constitution requires Congress to authorize money to be spent by the executive branch. Right? So the executive branch goes to Congress and says, we need money X.

Speaker: 2
02:37:02

Congress does their thing, and they come back and say, you can have money Y. The money is appropriated by Congress. The executive branch spends it on the military or whatever they spend it on, or on roads to nowhere or charging stations to nowhere or whatever. And what's in the Constitution is that Congress appropriates the money.

Speaker: 2
02:37:17

Over the last 60 years, there has been an additional interpretation of appropriations, applied by the courts and by the system, which is: the executive branch not only needs Congress to appropriate X amount of money, the executive branch is not allowed to underspend.

Speaker: 1
02:37:33

Yeah. I’m aware of this. I’m aware of this.

Speaker: 2
02:37:37

And so there's this thing that happens in Washington at the end of every fiscal year, which is September 30th: the great budget flush. Any remaining money that's in the system that they don't know how to productively spend, they deliberately spend unproductively. Yeah.

Speaker: 2
02:37:50

To the tune of hundreds and hundreds of billions of dollars. A president that doesn't wanna spend the money cannot not spend it. Yeah. Like, okay: A, that's not what's in the Constitution. And there's actually quite a good Wikipedia page that goes through the great debate on this.

Speaker: 2
02:38:04

It's played out in the legal world over the last 60 years. And, like, basically, if you look at this with anything resembling an open mind, you're like, okay, this is not what the founders meant. And then, number two, again, we go back to this thing of contempt. Like, can you imagine showing up and running the government like that and thinking that you're doing the right thing, and not going home at night and thinking that you've sold your soul? Right?

Speaker: 2
02:38:25

Like, it's just, I actually think you made a really good point, which is: it's even unfair to the people who have to execute this. Yeah. Right? Because it makes them bad people, and they didn't start out wanting to be bad people.

Speaker: 2
02:38:35

And so there is stuff like this, like

Speaker: 0
02:38:37

Yeah. Everywhere.

Speaker: 2
02:38:38

Everywhere. And so we'll see how far these guys get. I am extremely encouraged by what I've seen so far.

Speaker: 0
02:38:44

It seems like a lot of people will try to slow them down, but, yeah, hopefully they get far. Yeah. Another difficult topic: immigration. What's your take on the, let's say, heated H-1B visa debate that's going on online, and legal immigration in general?

Speaker: 2
02:38:59

Yeah. So I should start by saying I am not involved in any aspect of government policy on this. I am not planning to be. This is not an issue that I'm working on or that I'm going to work on. This is not part of the agenda of what my firm is doing.

Speaker: 2
02:39:11

So, like, I'm not in the new administration or the government, and I'm not planning to be. So this is purely just personal opinion. I would describe what I have as a complex or nuanced, hopefully nuanced, view on this issue. This may be a little bit different than what a lot of my peers have.

Speaker: 2
02:39:27

And I kind of thought about this. You know, I didn't say anything about it all the way through the big debate over Christmas, but I thought about it a lot and read everything. I think what I realized is that I just have a very different perspective on some of these things, and the reason is because of the combination of where I came from and then where I ended up.

Speaker: 2
02:39:44

And so, let me start with where I ended up: Silicon Valley. I have made the pro-high-skilled-immigration argument many, many times, the H-1B argument many times. In past lives, I've been in DC many times arguing with prior administrations about this, always on the side of trying to get more H-1Bs and trying to get more high-skilled immigration. And, you know, I think that argument is very strong and very solid, and it has paid off for the US in many, many ways, and we can go through it, but I think it's the argument everybody already knows.

Speaker: 2
02:40:15

Right? It's, like, the stock argument. You take any Silicon Valley person, you press the button, and they tell you why we need to brain-drain the world to get more of these people. Right? So everybody kinda gets that argument.

Speaker: 0
02:40:23

So, basically, just to summarize: it's a mechanism by which you can get super smart people from the rest of the world, bring them in, and keep them here to increase the productivity of US companies.

Speaker: 2
02:40:35

Yeah. And then it's not just good for them, and it's not just good for Silicon Valley or the tech industry. It's good for the country, because they then create new companies and create new technologies and create new industries that then create many more jobs for native-born Americans than would have previously existed.

Speaker: 2
02:40:50

And so you've got a positive-sum flywheel thing where everybody wins. Like, everybody wins. There are no trade-offs. It's all absolutely glorious in all directions. There cannot possibly be a valid argument against it under any circumstances.

Speaker: 2
02:41:04

Anybody who argues against it is obviously doing so from a position of racism, is probably a fascist and a Nazi. Right? Right? I mean, that. Right. That's the thing.

Speaker: 2
02:41:12

And like I said, I've made that argument many times. I'm very comfortable with that argument. And then I'd also say, look: number one, I believe a lot of it. I'll talk about the parts I don't believe, but I believe a lot of it. And then the other part is, look, I benefit every day.

Speaker: 2
02:41:23

I always describe it as: I work in the United Nations. Like, my own firm and our founders and our companies and the industry and my friends, you know, are just this, like, amazing, you know, panoply, cornucopia of people from all over the world. And, you know, I've worked at this point with people from, it's gotta be, I don't know, 80 countries or something.

Speaker: 2
02:41:44

And hopefully over time, it'll be, you know, the rest as well. And, you know, it's been amazing, and they've done many of the most important things in my industry, and it's been really remarkable. So that's all good. And then, you know, there's just the practical version of the argument, which is: we are the main place these people get educated anyway. Right?

Speaker: 2
02:42:00

The best and the brightest tend to come here to get educated, and so, you know, this is the old Mitt Romney staple of a green card to every, you know, at least, maybe not every university degree, but every technical degree. Maybe the sociologists we could quibble about, but, you know, the roboticists for sure. For sure. For sure.

Speaker: 2
02:42:16

We can all agree that

Speaker: 0
02:42:17

At least I won you over on something today.

Speaker: 2
02:42:19

Well, no. I'm exaggerating for effect.

Speaker: 0
02:42:21

And I lost you. I had you for a second.

Speaker: 2
02:42:24

I haven’t gotten to the other side of the argument yet.

Speaker: 0
02:42:26

Okay. Thank you.

Speaker: 2
02:42:27

So surely we can all agree, okay, that we need to staple a green card.

Speaker: 0
02:42:31

The roller coaster is going up.

Speaker: 2
02:42:32

The roller coaster is ratcheting slowly up. So, yeah. So surely we can all agree that the roboticists should all get green cards. And, again, like, there's a lot of merit to that. Obviously, look, we want the US to be the world leader in robotics. What's step one to being the world leader in robotics? Having all the great robotics people. Right? Like, you know, very unlike the Underpants Gnomes, you know.

Speaker: 2
02:42:49

It's, like, a very straightforward formula. Right? Alright. That's all well and good. Alright.

Speaker: 2
02:42:52

But it gets a little bit more complicated, because there is kind of an argument sort of right underneath that, that you also hear from, you know, these same people, and I have made this argument myself many times, which is: we need to do this because we don't have enough people in the US who can do it otherwise.

Speaker: 2
02:43:06

Right? We have all these unfilled jobs. We’ve got all these, you know, all these companies that wouldn’t exist. We don’t have enough good founders. We don’t have enough engineers.

Speaker: 2
02:43:12

We don’t have enough scientists. Or then the next version of the argument below that is: our education system is not good enough to generate those people. Which is a weird argument, by the way, because, like, our education system is good enough for foreigners to be able to come here preferentially in, like, a very large number of cases, but somehow not good enough to educate our own native-born people.

Speaker: 2
02:43:31

So there are, like, little cracks in the matrix that you can kind of stick your fingernail into and kind of wonder about. Now we’ll come back to that one. But, like, at least, yes, our education system has its flaws. And then underneath that is the argument that, you know, Vivek made, which is, you know, we have cultural rot in the country, and, you know, native-born people in the country don’t work hard enough and spend too much time watching TV and TikTok and don’t spend enough time studying, you know, differential equations.

Speaker: 2
02:43:55

And again, it’s like, alright, you know, there’s a fair amount to that. Like, there’s a lot of American culture that is, you know... there’s a lot of frivolity. I mean, we have well-documented social issues on many fronts, many things that cut against having a culture of just, like, straightforward high achievement and effort and striving.

Speaker: 2
02:44:13

Anyway, like, you know, those are the basic arguments. But then I have this kind of other side of my, you know, kind of personality and thought process, which is, well, I grew up in a small farming town in rural Wisconsin, the rural Midwest, and, you know, it’s interesting.

Speaker: 2
02:44:25

There’s not a lot of people who make it from rural Wisconsin to, you know, high tech. And so it’s like, alright, why is that exactly? Right? And then I noticed I’m an aberration. Like, I was the only one from anybody I ever knew who ever did this. Right?

Speaker: 2
02:44:37

So I know what an aberration I am, and I know exactly how that aberration happened, and it’s a very unusual, you know, set of steps, including, you know, many that were just luck. But, like, there is in no sense a talent flow from rural Wisconsin into high tech, like, not at all.

Speaker: 2
02:44:54

There is also, like, in no sense a talent flow from the rest of the Midwest into high tech. There is no talent flow from the South into high tech. There is no flow from the Sunbelt into high tech. There is no flow from, you know, the Deep South into high tech. Like, literally, there’s this whole blank section of the country where the people just, like, for some reason, don’t end up in tech.

Speaker: 2
02:45:15

Now that’s a little bit strange, because these are the people who put a man on the moon. Mhmm. These are the people who built the World War 2 war machine. These are the people, at least their ancestors, who built the 2nd industrial revolution and built the railroads and built the telephone network and built, you know, logistics and transportation in the auto... I mean, the auto industry was built in Cleveland and Detroit.

Speaker: 2
02:45:36

And so at least these people’s parents and grandparents and great-grandparents somehow had the wherewithal to, like, build all of these amazing things, invent all these things. And then there’s many, many, many stories in the history of American invention and innovation and capitalism where you had people who grew up in the middle of nowhere.

Speaker: 2
02:45:51

Philo Farnsworth, who invented the television, and, you know, tons and tons of others, endless stories like this. Now you have, like, a puzzle. Right? And the conundrum, which is, like, okay, what is happening in the blank spot of the map? And then, of course, you also can’t help noticing that the blank spot on the map, the Midwest, the South: you’ve also just defined Trump country. Mhmm.

Speaker: 2
02:46:10

The Trump voter base. Right? And it’s like, oh, well, that’s interesting. Like, how did that happen? Right?

Speaker: 2
02:46:15

And so either you really, really, really have to believe the very, very strong version of, like, the Vivek thesis or something, where you have to believe that basically the culture, the whole sort of civilization in the middle of the country and the South of the country, is so deeply flawed, either inherently flawed or culturally flawed, such that for whatever reason they are not able to do the things that their, you know, parents or grandparents were able to do and that their peers are able to do, or something else is happening.

Speaker: 2
02:46:38

Would you care to guess on what else is happening?

Speaker: 0
02:46:40

You mean, what, affirmative action? Affirmative action.

Speaker: 2
02:46:45

Okay. So think about this. This is very entertaining. Right? What are the three things that we know about affirmative action? It is absolutely 100% necessary. But, however, it cannot explain the success of any one individual. Right. Nor does it have any victims at all.

Speaker: 0
02:47:01

Okay, that could explain maybe the disproportionality, but, like, it surely doesn’t explain why you’re probably the only person in Silicon Valley from Wisconsin.

Speaker: 2
02:47:11

What educational institution in the last 60 years has wanted farm boys from Wisconsin?

Speaker: 0
02:47:15

But what institution rejected farm boys from Wisconsin?

Speaker: 2
02:47:18

All of them.

Speaker: 0
02:47:19

All of them.

Speaker: 2
02:47:19

Of course. Okay. So we know this. We know this. The reason we know this is because of the Harvard and UNC Supreme Court cases. So this was, like, 3 years ago. These were big court cases. You know that because the idea of affirmative action has been litigated for many, many years and through many court cases, and the Supreme Court repeatedly in the past had upheld that it was a completely legitimate thing to do.

Speaker: 2
02:47:38

And there’s basically 2 categories of affirmative action that, like, really matter. Right? The one is admissions into educational institutions, and then the other is jobs, right, getting hired. Like, those are the 2 biggest areas. And the education one has been a super-potent political issue for a very long time; people have written and talked about this for many decades.

Speaker: 2
02:47:56

I don’t need to go through it. There’s many arguments for why it’s important. There’s many arguments as to how it could backfire. It’s been this thing, but the Supreme Court upheld it for a very long time. The most recent ruling... I’m not a lawyer.

Speaker: 2
02:48:07

I don’t have the exact reference in my head, but there was a case in 2003 in which Sandra Day O’Connor famously wrote that, although it had been 30 years of affirmative action and although it was not working remotely as it had been intended, you know, basically, we need to try it for another 25 years.

Speaker: 2
02:48:25

But she said, basically, as a message to future Supreme Court justices: if it hasn’t resolved basically the issues it’s intended to resolve within 25 years, then we should probably call it off. By the way, we’re coming up on the 25 years. It’s a couple years away.

Speaker: 2
02:48:38

The Supreme Court just had these cases, a Harvard case and, I think, a University of North Carolina case. And what’s interesting about those cases is the lawyers in those cases put a tremendous amount of evidence into the record of how the admissions decisions actually happen at Harvard and at UNC.

Speaker: 2
02:48:55

And it is, like, every bit as cartoonishly garish and racist as you could possibly imagine, because it’s a ring of power. And if you’re an admissions officer at a private university, or an administrator, you have unlimited power to do what you want, and you can justify any of it under any of these rules or systems.

Speaker: 2
02:49:13

And up until these cases, that had been a black box where you didn’t have to explain yourself and show your work. And what the Harvard and UNC cases did is they basically required showing the work. And there was, like, all kinds of, like, phenomenal detail.

Speaker: 2
02:49:25

I mean, number 1 is there were text messages in there that will just curl your hair, of students being spoken of in just, like, crude racial stereotypes that would just make you wanna jump out the window. It’s horrible stuff. But also, there was statistical information.

Speaker: 2
02:49:37

And of course, the big statistical kicker to the whole thing is that at top institutions, it’s common for different ethnic groups to have different cutoffs for the SAT that are as wide as 400 points. Right? So, specifically, Asians need to perform 400 SAT points higher than other ethnicities in order to actually get admitted into these institutions.

Speaker: 2
02:49:58

I mean, white people are part of this, but, like, Asians are, like, a very big part of this. And actually, the Harvard case was actually brought by an activist on behalf of the Asian students who were being turned away. And it’s basically the cliche now in the valley and in the medical community, which is, like, if you want a super genius, you hire an Asian from Harvard, because they are guaranteed to be freaking Einstein.

Speaker: 2
02:50:17

Because if they weren’t, they were never getting admitted. Right? Almost all the qualified Asians get turned away. So they’ve been running this very, very explicit, very, very clear program. This, of course, has been a third rail of things that people are not supposed to discuss under any circumstances.

Speaker: 2
02:50:33

The thing that has really changed the tenor on this is, I think, two things. Number 1, those Supreme Court cases: the Supreme Court ruled that they can no longer do that. I will tell you, I don’t believe there’s a single education institution in America that is conforming with the Supreme Court ruling.

Speaker: 2
02:50:47

I think they are all flagrantly ignoring it, and we could talk about that.

Speaker: 0
02:50:51

Mostly because of momentum probably or what?

Speaker: 2
02:50:53

They are trying to make the world a better place, they are trying to solve all these social problems. They are trying to have diverse student populations. They are trying to live up to the expectations of their donors. They are trying to make their faculty happy. They are trying to, have their friends and family think that they’re good people.

Speaker: 0
02:51:08

Right.

Speaker: 2
02:51:09

They’re trying to have the press write nice things about them. Like, it’s nearly impossible for them. And, you know, to be clear, like, nobody has been fired from an admissions office for, you know, the 25 years prior of what the Supreme Court now has ruled to be illegal. And so they’re all the same people under the exact same pressures.

Speaker: 2
02:51:26

And so, like, you know, the numbers are moving a little bit, but, like, I don’t know anybody in the system who thinks that they’re complying with the Supreme Court. Like, who’s in charge, in the rank ordering of who rules whom? The universities rule the Supreme Court way more than the Supreme Court rules the universities. Right?

Speaker: 2
02:51:42

Well, another example of that is, I think every sitting member of the Supreme Court right now went to either Harvard or Yale. Right? Like, the level of incestuousness here is... anyway, so there’s that. And so this has been running for a very long time.

Speaker: 2
02:51:54

So one is, the Harvard and UNC cases kinda gave up the game, number 1, or at least showed what the mechanism was. And then number 2, the other thing is, obviously, the aftermath of October 7th. Right? And what we discovered was happening with Jewish applicants. And what was happening at all the top institutions for Jewish applicants was they were being actively managed down as a percentage of the base.

Speaker: 2
02:52:15

And, let’s say, I’ve heard reports of, like, extremely explicit, basically, plans to manage the Jewish admissions down to their representative percentage of the US population, which is 2%. And, you know, there’s a whole backstory here, which is, a 100 years ago, Jews were not admitted into a lot of these institutions, and then there was a big campaign to get them in.

Speaker: 2
02:52:33

Once they could get in, they immediately became 30% of these institutions, because there’s so many smart, talented Jews. So it went from 0% to 30%, and then the most recent generation of leadership has been trying to get it down to 2%. And a lot of Jewish people, at least a lot of Jewish people I know, sort of kinda knew this was happening, but they discovered it the hard way after October 7th.

Speaker: 2
02:52:53

Right? And so all of a sudden... so basically, the Supreme Court case meant that you could address this in terms of the Asian victims. October 7th meant that you could address it in terms of the Jewish victims. And for sure, both of those groups are being systematically excluded. Right?

Speaker: 2
02:53:06

And then, of course, there’s the thing that you basically can’t talk about, which is all the white people who are being excluded. And then it turns out it’s also happening to black people. And this is the thing that, like, blew my freaking mind when I found out about it. So I just assumed that, like, this was great news for, like, American blacks, because, like, you know, obviously if whites, Asians, and Jews are being excluded, then, you know, the whole point of this in the beginning was to get the black population up, and so this must be great for American blacks.

Speaker: 2
02:53:34

So then I discovered this New York Times article from 2004, called, blacks are being admitted into top schools in greater numbers, but which ones? Uh-oh. And, by the way, this is in the New York Times. This is not in, like, you know, whatever, National Review. This is the New York Times, 2004.

Speaker: 2
02:53:53

And the two authorities that were quoted in the story are Henry Louis Gates, who’s the dean of the African American studies, you know, community in the United States. Super brilliant guy. And then Lani Guinier, who was a potential Supreme Court appointee; I think she was a close friend of Hillary Clinton.

Speaker: 2
02:54:09

And for a long time, she was on the shortlist for the Supreme Court. So one of the top, you know, jurists, lawyers in the country. Both of them sort of legendarily successful in the academic and legal worlds, and both black. And they are quoted as the authorities in this story.

Speaker: 2
02:54:24

And the story that they tell is actually amazing. And by the way, it’s happening today in education institutions, and it’s happening in companies, and you can see it all over the place in the government. Which is, at least at that time, the number was: half of the black admits into a place like Harvard were not American-born blacks.

Speaker: 2
02:54:44

They were foreign-born blacks. Specifically, Northern African, generally Nigerian, or West Indian. Right. And by the way, many Nigerians and Northern Africans have come to the US and have been very successful. Nigerian Americans as a group, like, way outperform.

Speaker: 2
02:55:02

They’re, you know, a super smart cohort of people. And then West Indian blacks in the US are incredibly successful. Most recently, by the way, Kamala Harris, as well as Colin Powell, just two examples of that. And so, basically, what Henry Louis Gates and Lani Guinier said in the story is: Harvard is basically struggling to, whatever it was, identify, recruit, make successful American-born native blacks.

Speaker: 2
02:55:25

And so therefore, they were using high-skill immigration as an escape hatch to go get blacks from other countries. And this was 2004, when you could discuss such things. Obviously, that is a topic that nobody has discussed since. It has sailed on.

Speaker: 2
02:55:40

All of the DEI programs of the last 20 years have had this exact characteristic. There’s large numbers of black people in America who are fully aware of this and are like, it’s obviously not us that are getting these slots. We’re literally competing with people who are being imported.

Speaker: 2
02:55:53

And, you know, if you believe in the basis of affirmative action, you are trying to make up for the historical injustice of American black slavery. And so the idea that you import somebody from, you know, Nigeria who never experienced that, you know, is, like, tremendously insulting to black Americans.

Speaker: 2
02:56:08

Anyway, so you can see where I’m heading with this. We have been in a 60-year social engineering experiment to exclude native-born people from the educational slots and jobs that high-skill immigration has been funneling foreigners into. Right? And so it turns out it’s not a victim-free thing. There’s, like, 100% there’s victims, because, right?

Speaker: 2
02:56:26

For sure, there’s only so many education slots, and then for sure, there’s only so many of these jobs. Right? You know, Google only hires so many, you know, whatever, level 7 engineers. Right? And so that’s the other side of it. Right?

Speaker: 2
02:56:38

And so you’re a farm boy in Wisconsin. Right? Or a, you know, black American whose ancestors arrived here, you know, on a slave ship 300 years ago in Louisiana. Or a, you know, Cambodian immigrant in, you know, the Bronx, and your kid. Or a Jewish immigrant, or, you know, from a very successful Jewish family, and, you know, for 3 generations, you and your parents or grandparents went to Harvard.

Speaker: 2
02:57:02

And what all of those groups know is the system that has been created is not for them. Right? It’s designed specifically to exclude them. And then what happens is all of these tech people show up in public and say, yeah, let’s bring in more foreigners. Right?

Speaker: 2
02:57:16

And so, anyway, the short version of it is: you can’t anymore, I don’t think, just have the, quote, high-skill immigration conversation, for either education or for employment, without also having the DEI conversation. And DEI is just another word for affirmative action. So it’s the affirmative action conversation.

Speaker: 2
02:57:35

And you need to actually deal with this in substance, and to see what’s actually happening to people, you need to join these topics. And I think it is much harder to make the moral claim for high-skill immigration given the extent to which DEI took over both the education process and the hiring process.

Speaker: 1
02:57:52

Okay. So first of all,

Speaker: 0
02:57:53

that was brilliantly laid out, the nuance of it. So just to understand, it’s not so much a criticism of H-1B high-skill immigration. It’s that there needs to be more people saying, hey, we need more American-born hires.

Speaker: 2
02:58:08

So I spent the entire Christmas holiday reading every message on this and not saying anything. Which... Yeah. You know me well enough to know that’s a serious up-level of

Speaker: 0
02:58:17

Yeah. That’s very zen.

Speaker: 2
02:58:18

Yes. Thank you. No. It wasn’t. There was tremendous rage on the other side of it, but... Sure. I suppressed it. So, I was waiting for the dog who didn’t bark. Right? And the dog who didn’t bark was... I did not, and tell me if you saw it, I did not see a single example of somebody pounding the table for more high-skill immigration who was also pounding the table to go get more smart kids who are already here into these educational institutions and into these jobs.

Speaker: 2
02:58:42

I didn’t see a single one.

Speaker: 0
02:58:44

That’s true. I think I agree with that. There really was a divide.

Speaker: 2
02:58:49

But it was, like, literally, it was like the proponents of high-skilled immigration... and, again, this was me for a very long time. I mean, I kinda took myself by surprise on this, because, you know, I had the, let’s say, simpler version of this story for a very long time. And like I said, I’ve been in Washington many times under past presidents lobbying for this.

Speaker: 2
02:59:03

By the way, never made any progress, which we could talk about. Like, it never actually worked. But, you know, I’ve been on the other side of this one. But I was literally sitting there being, like, alright, which of these, like, super geniuses, who, you know, many of whom by the way are very, you know, successful high-skill immigrants or children of high-skill immigrants...

Speaker: 2
02:59:18

You know, which of these super geniuses are gonna, like, say, actually, we have this, like, incredible talent source here in the country. Which, again, to be clear, I’m not talking about white people, I’m talking about native-born Americans: whites, Asians, Jews, blacks, for sure. For sure. For sure. Those four groups. But also, yeah, white people. Yeah. And also white people.

Speaker: 0
02:59:36

People that are making the case for American-born hires are usually not also supporting H-1B. It’s an extreme divide, and those people that are making that case are often making it in quite a radical way. Let’s put it this way.

Speaker: 2
02:59:56

But you have this interesting thing. You have a split between the sides that I’ve noticed, which is one side has all of the experts.

Speaker: 0
03:00:01

Right.

Speaker: 2
03:00:02

Right? And I’m using scare quotes; for people listening to audio, I’m making quotes in the air with my fingers as vigorously as I can. Yep. One side has all the certified experts. The other side just has a bunch of people who, like, know that something is wrong and don’t quite know how to explain it.

Speaker: 2
03:00:15

And what was so unusual about the Harvard UNC cases, by the way, in front of the Supreme Court, is they actually had sophisticated lawyers, for the first time in a long time, actually put all the evidence together and actually put it in the public record. They actually had experts, which is just really rare.

Speaker: 2
03:00:28

Generally, what you get is... because if you don’t have experts, what do you have? You know something is wrong, but you have primarily an emotional response. You feel it. But can you put it, you know, in the words and tables and charts that a certified expert can?

Speaker: 2
03:00:42

And, no, you can’t. Like, that’s not, you know, that’s not who you are. That doesn’t mean that you’re wrong, and it also doesn’t mean that you have less of a moral stance. Yeah. And so it’s just like, alright.

Speaker: 2
03:00:51

Now, by the way, like, I think there are ways to square the circle. I think there’s a way to have our cake and eat it too. Like, I think there’d be many ways to resolve this. I think, again, the way to do it is to look at these issues combined. Look at DEI combined with high-skill immigration.

Speaker: 2
03:01:05

It so happens that DEI is under much more scrutiny today than it has been for probably 20 years; affirmative action is. The Supreme Court did just rule that it is not legal for universities to do that. They are still doing it, but they should stop. And then there are more and more... you’ve seen more companies now also ditching their DEI programs. Mhmm.

Speaker: 2
03:01:29

In part, that’s happening for a bunch of reasons, but it’s happening in part because a lot of corporate lawyers will tell you that the Supreme Court rulings in education either already apply to businesses, or, just as clear foreshadowing, the Supreme Court will rule on new cases that will ban it in businesses.

Speaker: 2
03:01:44

And so there is a moment here to be able to look at this on both sides. Let me add one more nuance to it that makes it even more complicated.

Speaker: 0
03:01:54

Yep.

Speaker: 2
03:01:54

So the the cliche is we’re gonna brain drain the world. Right? You’ve heard that? We’re gonna we’re gonna take all the smart people from all over the world, we’re gonna bring them here, we’re gonna educate them, and then we’re gonna keep them, and then they’re gonna raise their families here, create businesses here, create jobs here.

Speaker: 2
03:02:05

Right?

Speaker: 0
03:02:05

In the cliche, that’s a super positive thing.

Speaker: 2
03:02:07

Yeah. Okay. So what happens to the rest of the world?

Speaker: 0
03:02:11

They lose?

Speaker: 2
03:02:13

Well, how fungible are people? How many highly ambitious, highly conscientious, highly energetic, high-achieving, high-IQ super geniuses are there in the world? And if there’s a lot, that’s great, but if there just aren’t that many, and they all come here, and they all aren’t where they would be otherwise, what happens to all those other places?

Speaker: 2
03:02:37

So it’s almost impossible for us here to have that conversation, in part because we become incredibly uncomfortable as a society talking about the fact that people aren’t just simply all the same, which is a whole thing we could talk about. But, also, we are purely the beneficiary of this effect. Right? We are brain-draining the world, not the other way around.

Speaker: 2
03:02:56

So if you look at the flow of high-skill immigration over time, there’s only 4 permanent sinks of high-skill immigration, in terms of places people go. It’s the US, Canada, the UK, and Australia.

Speaker: 0
03:03:07

It’s the... Oh, what? Australia.

Speaker: 2
03:03:08

It’s 4 of the, like, Five Eyes. It’s the major Anglosphere countries. And so for those countries, this seems like a no-lose proposition. It’s all the other countries... basically, what we 4 countries have been doing is pulling all the smart people out. Mhmm.

Speaker: 2
03:03:21

It’s actually much easier for people in Europe to talk about this, I’ve discovered, because the eurozone is, whatever, you know, 28 countries. And within the eurozone, the high-skilled people over time have been migrating to, originally, the UK, but also, specifically, I think, the Netherlands, Germany, and France.

Speaker: 2
03:03:36

But specifically, they’ve been migrating out of the peripheral eurozone countries. And the one where this really hit the fan was in Greece. Right? So, you know, Greece falls into chaos, disaster, and then, you know, you’re running the government in Greece and you’re trying to figure out how to put an economic development plan together.

Speaker: 2
03:03:50

All of your smart young kids have left. Like, what are you gonna do? Right? By the way, this is a potential... I know you care a lot about Ukraine. This is a potential crisis for Ukraine, in part because of this, because we enthusiastically recruit Ukrainians, of course, and so we’ve been brain-draining Ukraine for a long time.

Speaker: 2
03:04:09

Mhmm. But also, of course, you know, war does tend to cause people to, like, move out. And so, you know, when it comes time for Ukraine to rebuild as a peaceful country, is it gonna have the talent base even that it had 5 years ago? It’s, like, a very big and important question. By the way, Russia: like, we have brain-drained a lot of really smart people out of Russia.

Speaker: 2
03:04:25

A lot of them are here, right, over the last, you know, 30 years. And so there’s this thing. It’s actually really funny if you think about it. Like, the one thing that we know to be the height of absolute evil that the west ever did was colonization Mhmm. And resource extraction. Right?

Speaker: 2
03:04:40

So we know the height of absolute evil was when the Portuguese and English and, you know, everybody else went and had these colonies, and then went in and, you know, took all the oil, and took all the diamonds, or took all the, whatever, lithium. Right? Well, for some reason, we decided that’s a deeply evil thing to do when it’s a physical resource, when it’s non-conscious physical matter.

Speaker: 2
03:04:57

For some reason, we think it’s completely morally acceptable to do it with human capital. In fact, we think it’s glorious and beautiful and wonderful and, you know, the great flowering of peace and harmony and moral justice of our time to do it. And we don’t think for one second about what we’re doing to the countries that we’re pulling all these people out of.

Speaker: 2
03:05:14

And this is one of these things, like, I don’t know, maybe we’re just gonna live in this delusional state forever, and we’ll just keep doing it, and it’ll keep benefiting us, and we just won’t care what happens. But, like, this is like one of these submarines 10 feet under the waterline: I think it’s just a matter of time until people suddenly realize, oh my god, what are we doing?

Speaker: 2
03:05:33

Because, like, we need the rest of the world to succeed too. Right? Like, we need these other countries to, like, flourish. Like, we don’t wanna be the only successful country in the middle of just, like, complete chaos and disaster. And we just extract and we extract and we extract and we don’t think twice about it.

Speaker: 0
03:05:48

Well, this is so deeply profound, actually. So what is the cost of winning, quote, unquote? If these countries are drained in terms of human capital, on the level of geopolitics, what does that lead to? Even if we talk about wars and conflict and all of this, we actually want them to be strong in the way we understand strong, not just in every way.

Speaker: 0
03:06:13

So that cooperation and competition can build a better world for all of humanity. It’s interesting. This is one of those truths where you just speak it and it resonates. Yeah. And I didn’t even think about it.

Speaker: 2
03:06:28

Yeah. Exactly.

Speaker: 0
03:06:29

So this is what you were sitting on during the holiday season, just boiling over. So all that said... Yeah. Do you still see some good in the H-1B?

Speaker: 2
03:06:40

Okay. So then you get this other... okay. So then things come all the way around. There’s another nuance, which is: mostly in the valley, we don’t use H-1Bs anymore. Mhmm. Mostly, we use O-1s. So there’s a separate class of visas, and it turns out the O-1 is the super-genius visa.

Speaker: 0
03:06:57

Mhmm.

Speaker: 2
03:06:57

So the O-1 is basically our founder visa. Like, when we have somebody from anywhere in the world and they’ve, like, invented a breakthrough in new technology and they wanna come to the US to start a company, they come in through an O-1 visa.

Speaker: 2
03:07:07

And that actually is, like, a fairly high bar. It’s a high acceptance rate, but it’s a pretty high bar, and they do a lot of work. You have to put real work into it, really prove your case. Mostly what’s happened with the H-1B visa program is that it has gone to basically 2 categories of employers.

Speaker: 2
03:07:24

1 is basically a small set of big tech companies that hire in volume, which is exactly the companies that you would think. And then the other is, it goes to what they call kind of the mills, the consulting mills. Right? And so there’s this set of companies, with names... I don’t wanna pick on companies, but, you know, names like Cognizant, that basically have in their business model bringing in primarily Indians, in large numbers.

Speaker: 2
03:07:46

And, you know, they often have offices next to company-owned housing, and they’ll have, you know, organizations that are literally thousands of Indians living and working in the US, and they do basically, call it mid-tier, like, IT consulting.

Speaker: 2
03:08:00

So, you know, these folks are making good wages, but they’re making 60 or 80 or a $100,000 a year, not the, you know, $300,000 that you’d make in the valley. And so, like, in practice, the startup ecosystem, Little Tech, as we call it, or the startup world, mainly doesn’t use H-1Bs at this point, and mainly can’t, because the system is kind of rigged in a way such that we really can’t.

Speaker: 2
03:08:22

And then again, you get to the sort of underlying morality here, which is, it’s like, well, you know, Amazon... like, I love Amazon, but they’re a big powerful company. You know, they’ve got more money than God, they’ve got resources, they’ve got a long-term planning horizon, they do big, profound things over decades at a time.

Speaker: 2
03:08:39

You know, they, or any of these other companies, could launch massively effective programs to go recruit the best and brightest from all throughout the country, and, you know, you’ll notice they don’t do that. You know, they bring in 10,000, 20,000 H-1Bs a year. And so you’ve got a question there.

Speaker: 2
03:08:56

And then these mills, like, there’s lots of questions around them, and whether that’s even an ethical way... you know, I don’t wanna say they’re unethical, but there’s questions around, like, exactly what the trade-offs are there. And so, you know... Yeah. And this is like a Pandora’s box that nobody really wanted opened.

Speaker: 2
03:09:13

You know, to play devil’s advocate on all this, in terms of, like, national immigration issues, none of this is, like, a top-end issue, just because the numbers are small. Right? And so, you know, the administration has said, like, this is not a priority of theirs for right now.

Speaker: 2
03:09:26

But I guess what I would say is, like, there is actually a lot of complexity and nuance here. Like I said, I have a lot of friends and colleagues who came over on H-1Bs, O-1s, green cards, many are now citizens, and, you know, every single one of them was... not every single one.

Speaker: 2
03:09:42

A lot of them were enthusiastic to, you know, defend the honor of immigrants throughout this whole period. And they said to me, it’s like, well, Marc, how can we more clearly express, you know, the importance of high-skill immigration to the US?

Speaker: 2
03:09:51

And I was like, I think you can do it by advocating for also developing our native-born talent. And be like, do you wanna inflame the issue, or do you wanna defuse the issue? Mhmm. Right? And I think the answer is to defuse the issue.

Speaker: 2
03:10:03

Let me give you one more positive scenario, and then I’ll also beat up on the universities some more. Do you know about the National Merit Scholarship system? Have you heard about this?

Speaker: 0
03:10:16

Not really. Can you explain?

Speaker: 2
03:10:17

So there’s a system that was created during the Cold War, called the National Merit Scholars, and it was created, I forget, in the late fifties or sixties, when people in government actually wanted to identify the best and the brightest. As heretical an idea as that sounds today. And so it’s basically a national talent search for, basically, IQ.

Speaker: 2
03:10:37

Its goal is to identify basically the top 0.5% of the IQ distribution in the country. By the way, completely regardless of other characteristics. So there’s no race, gender, or any other aspect to it. It’s just going for straight intelligence. It uses first the PSAT, which is the preparatory SAT that you take, and then the SAT. So it uses those scores. That is the scoring.

Speaker: 2
03:11:00

It’s a straight PSAT/SAT scoring system. So they use the SAT as a proxy for IQ, which it is. They run this every year. They get down to, like, the 1% of the population of the kids, the 18-year-olds in a given year, who scored highest on the PSAT, and then they further filter down to the 0.5% that also replicate on the SAT.
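
The two-stage filter just described can be sketched with toy numbers. Everything below (scores, cohort size, noise levels) is invented for illustration; the real program is administered by the College Board with its own scales and cutoffs.

```python
# Toy sketch of the National Merit two-stage filter: stage 1 keeps
# the top ~1% of a cohort by PSAT score, stage 2 keeps those whose
# SAT score "replicates" in the cohort's top ~0.5%.
import random

random.seed(0)

def make_student(i):
    # PSAT and SAT scores are correlated via a shared latent ability.
    ability = random.gauss(1000, 200)
    return (i, ability + random.gauss(0, 50), ability + random.gauss(0, 50))

cohort = [make_student(i) for i in range(100_000)]

def top_fraction(students, key, fraction):
    """Keep the highest-scoring `fraction` of `students` by `key`."""
    ranked = sorted(students, key=key, reverse=True)
    return ranked[:int(len(students) * fraction)]

# Stage 1: top 1% of the cohort on the PSAT.
stage1 = top_fraction(cohort, key=lambda s: s[1], fraction=0.01)

# Stage 2: of those, keep the ones whose SAT also clears the
# cohort-wide top-0.5% SAT cutoff (the score "replicates").
sat_cutoff = top_fraction(cohort, key=lambda s: s[2], fraction=0.005)[-1][2]
finalists = [s for s in stage1 if s[2] >= sat_cutoff]

print(len(cohort), len(stage1), len(finalists))
```

Because the two scores are correlated in the toy model, most stage-1 students replicate; with independent scores almost none would, which is the point of requiring the second test.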

Speaker: 2
03:11:21

And then the scholarship amount is, like, $2,500. Right? So it was a lot of money 50 years ago, not as much today. But it’s a national system being run literally to find the best and the brightest. Mhmm. How many of our great and powerful universities use this as a scouting system?

Speaker: 2
03:11:37

Like, our universities all have sports teams. They all have national scouting. They have full-time scouts who go out to every high school and try to find all the great basketball players and bring them into the NCAA, into all these leagues. How many of our great and powerful and enlightened universities use the National Merit system to go do a talent search for the smartest kids and just bring them in?

Speaker: 0
03:11:58

Let me guess. Very few. Zero. As you say it, that’s brilliant. There should be that same level of scouting for talent internally.

Speaker: 2
03:12:07

Go get the smartest ones. I’ll give you one more kicker on this topic, if I haven’t beaten it to death. You know, the SAT has changed. Mhmm. So the SAT used to be a highly accurate proxy for IQ, and that caused a bunch of problems. People really don’t like the whole idea of IQ.

Speaker: 2
03:12:24

And so the SAT has been actively managed over the last 50 years by the College Board that runs it, and it has been, essentially, like everything else, dumbed down. And in 2 ways. Number 1, it’s been dumbed down in that an 800 from 40 years ago does not mean what an 800 means today.

Speaker: 2
03:12:42

An 800, 40 years ago, was almost impossible to get. Today, there are so many 800s that you could stock the entire Ivy League with them. Right? And so it’s been deliberately dumbed down.

Speaker: 2
03:12:54

And then 2 is, they have tried to pull out a lot of what’s called the g-loading. And so they’ve tried to detach it from being an IQ proxy, because IQ is such an inflammatory concept. And the consequence of that, and this is sort of perverse, is they’ve made it more coachable. Right?

Speaker: 2
03:13:09

So for the SAT 40 years ago, coaching didn’t really work. And more recently, it has really started to work. And one of the things you see is the Asian spike: this giant leap upward in Asian performance over the last decade. And looking at the data, I think a lot of that is because it’s more coachable now, and the Asians do the most coaching.

Speaker: 2
03:13:24

So there’s a bunch of issues with this. And the coaching thing is really difficult, because the coaching thing is then a subsidy to the kids whose parents can afford coaching. Mhmm.

Speaker: 0
03:13:33

Right?

Speaker: 2
03:13:34

And I don’t know about you, but where I grew up there was no SAT coaching. So there’s, like, an issue there. I didn’t even know what the SAT was until the day I took it, much less that there was coaching, much less that it could work, much less that we could afford it. So, number 1, there’s issues there. But the other issue is, think about what’s happened by the dumbing down.

Speaker: 2
03:13:49

800 no longer captures all of this. 800 is too crude of a test. It’s like the AI benchmarking problem; it’s the same problem we’re having with AI benchmarking right now. 800 is too low of a threshold. There are too many kids scoring 800. Mhmm.

Speaker: 2
03:14:04

Because what you want is, whatever it is, if it’s gonna be 50,000 kids a year scoring 800, you also then want kids to be able to score 900, 1,000, 1,100, 1,200, and you want to ultimately identify the top 100 kids and make sure that you get them into MIT.

Speaker: 2
03:14:20

Mhmm. And the resolution of the test has been reduced so that it actually is not useful for doing that. And again, as I would say, this is part of the generalized corruption that’s taken place throughout this entire system, where we have been heading in the reverse direction from wanting to actually go get the best and brightest and actually put them in the places where they should be.
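
The ceiling problem being described here can be shown with a toy simulation (all numbers invented): once scores clip at a maximum, everyone above the cap becomes indistinguishable, which is exactly the lost resolution.

```python
# Toy illustration of a score ceiling destroying resolution:
# clip a wide ability distribution at 800 and the whole top
# of the range collapses into a single bucket of ties.
import random

random.seed(1)

# Hypothetical underlying abilities on an SAT-like scale.
abilities = [random.gauss(650, 120) for _ in range(10_000)]

# Scores are reported in steps of 10 and capped at 800.
scores = [min(round(a / 10) * 10, 800) for a in abilities]

tied_at_ceiling = sum(1 for s in scores if s == 800)
# A test that wants to rank the top 100 kids can't: far more
# than 100 of them share the identical maximum score.
print(tied_at_ceiling)
```

In this toy cohort, over a thousand students hit the cap, so no ranking among them is recoverable from the reported score alone.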

Speaker: 2
03:14:37

And then just the final comment would be, the great thing about standardized testing and the National Merit system is, like I said, it’s completely race-blind, it’s gender-blind. It’s blind on every other characteristic. It’s only done on test scores. And you can make an argument about whether that’s good or bad, but it is, for sure, the closest thing that we have to merit.

Speaker: 2
03:14:56

It was the thing that they did when they thought they needed merit to win the Cold War, and of course we could choose to do that anytime we want. And I just say, I find it incredibly striking, and an enormous moral indictment of the current system, that there are no universities that do this today.

Speaker: 2
03:15:10

So back to the immigration thing, just real quick. It’s like, okay, we aren’t even trying to go get the smart kids out of the center and the South. And even if they think that they can get into these places, they get turned down. And the same thing for the smart Asians, and the same thing for the smart Jews, and the same thing for the smart black people.

Speaker: 2
03:15:23

And, like, I don’t know how that’s moral. Like, I don’t get it at all.

Speaker: 0
03:15:31

As you said about the 800: I took the SAT and ACT many times, and I’ve always gotten a perfect 800 on math. And I’m not special. Like, it doesn’t identify genius. Right. I think you wanna search for genius, and you wanna create measures that find genius of all different kinds, speaking of diversity.

Speaker: 0
03:15:56

And I guess we should reiterate and say, over and over and over: defend immigrants. Yes. But also say we should hire more and more native-born talent.

Speaker: 2
03:16:09

You asked me in the beginning, like, what’s the most optimistic forecast, right, that we could have? And the most optimistic forecast would be, my god, what if we did both? Like...

Speaker: 0
03:16:20

So that’s the reasonable, the rational, the smart thing to say here. In fact, we don’t have to have a war.

Speaker: 2
03:16:27

Well, it would defuse the entire issue. Yeah. If everybody in the center and the south of the country, and every Jewish family, Asian family, black family, knew they were getting a fair shake, like, it would defuse the issue. Like, how about defusing the issue? Like, what a crazy radical idea.

Speaker: 2
03:16:40

I don’t mean to really get out over my skis here, but

Speaker: 0
03:16:43

I think your profile on X states it’s time to build. It feels like 2025 is a good year to build. So I wanted to ask your advice, maybe, for anybody who’s trying to build something useful in the world. Maybe launch a startup, or maybe just launch apps, services, whatever, ship software products. So maybe by way of advice, how do you actually get to shipping?

Speaker: 2
03:17:21

So, I mean, a big part of the answer, I think, is we’re in the middle of a legit revolution, and I know you’ve been talking about this on your show, but, like, AI coding. I mean, this is the biggest earthquake to hit software in certainly my life, maybe since the beginning of software.

Speaker: 2
03:17:34

And I’m sure, you know, we’re involved in various of these companies, but these tools, you know, from a variety of companies, are just absolutely revolutionary. And they’re getting better at leaps and bounds, like, every day. And you know all this, but, like, the thing with coding is, there are open questions of whether AI can get better at, I don’t know, understanding philosophy or whatever, creative writing or whatever.

Speaker: 2
03:17:56

But, like, for sure, we can make it much better at coding. Mhmm. Right? Because you can validate the results of coding. And so, you know, there’s all these methods of synthetic data and self-training and reinforcement learning that for sure you can do with coding.

Speaker: 2
03:18:07

And so everybody I know who works in the field says AI coding is gonna get to be phenomenally good. And it’s already great. And, I mean, anybody who wants to see this, just go on YouTube and look at AI coding demos, you know, little kids making apps in 10 minutes working with an AI coding system.

Speaker: 2
03:18:21

And so, I mean, I think this is an area where it’s clearly the golden age. The toolset is extraordinary. You know, as a coder, for sure, in a day you can retrain yourself. You know, start using these things, get a huge boost in productivity. As a noncoder, you can learn much more quickly than you could before.

Speaker: 0
03:18:36

That’s actually a tricky one, in terms of learning as a noncoder to build stuff. I feel like you still need to learn how to code. It becomes a superpower. It helps you be much more productive. Like, you could legitimately be a one-person company and get quite far.

Speaker: 2
03:18:55

I agree with that, up to a point. So, I think for sure, for quite a long time, the people who are good at coding are gonna be the best at actually having AIs code things, because, I mean, very basic, they’re gonna understand what’s happening. Right? And they’re gonna be able to evaluate the work, and they’re gonna be able to, you know, literally, like, manage AIs better.

Speaker: 2
03:19:13

Like, even if they’re not literally handwriting the code, they’re just gonna have a much better sense of what’s going on. So I definitely think, like, a 100%, my 9-year-old is doing all kinds of coding classes, and he’ll keep doing that certainly through 18. We’ll see after that. And so, like, for sure that’s the case.

Speaker: 2
03:19:28

But look, having said that, one of the things you can do with an AI is say, teach me how to code. Right? And so, you know, I’ll name names: Khan Academy. Like, there’s a whole bunch of work that they’re doing at Khan Academy for free. And then we have this company, Replit, which was originally specifically built for kids for coding, that has AI built in that’s just absolutely extraordinary now.

Speaker: 2
03:19:51

And then, you know, there’s a variety of other systems like this. And, yeah, I mean, the AI is gonna be able to teach you to code. AI, by the way, is, as you know, spectacularly good at explaining code. Mhmm. Right?

Speaker: 2
03:20:03

And so, you know, the tools have these features now where you can talk to the code base, and so you can literally ask the code base questions about itself, and you can also just do the simple form, which is, you can copy and paste code into ChatGPT and just ask it to explain it: what’s going on, rewrite it, improve it, make recommendations.

Speaker: 2
03:20:20

And so there’s... Yeah. There’s dozens of ways to do this. By the way, you can also, I mean, even more broadly than code, like, okay, you wanna make a video game? Okay, now you can do AI art generation, sound generation, dialogue generation, voice generation. Right?

Speaker: 2
03:20:33

And so all of a sudden, like, you don’t need designers, you know, you don’t need voice actors. So, yeah, there’s just, like, unlimited possibilities. And then, you know, a big part of coding is so-called glue, you know, it’s interfacing into other systems.

Speaker: 2
03:20:46

So it’s interfacing into, you know, Stripe to take payments, or something like that, and, you know, AI is fantastic at writing glue code. So, you know, it’s really good at making sure that you can plug everything together, really good at helping you figure out how to deploy.
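
As a concrete picture of what "glue code" means here, a minimal sketch: translating one system’s data shape into the shape another system expects. The payment-API field names below are invented for illustration, not any real provider’s API.

```python
# Glue code in miniature: adapt an internal order record into
# the payload some hypothetical external payments API expects.
import json

def order_to_payment_request(order):
    """Build a hypothetical payment-API payload from an order dict."""
    return {
        # Payment APIs commonly take integer minor units, not floats.
        "amount_cents": int(round(order["total_usd"] * 100)),
        "currency": "usd",
        "metadata": {"order_id": order["id"]},
    }

order = {"id": "ord_123", "total_usd": 19.99}
payload = order_to_payment_request(order)
print(json.dumps(payload, sort_keys=True))
```

Almost nothing here is clever; it is pure translation between two interfaces, which is exactly the kind of code AI assistants handle well.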

Speaker: 2
03:20:59

You know, it’ll even write a business plan for you. It’s like everything happening with AI right now. It’s this latent superpower, and there’s this incredible spectrum: there are people who have really figured out massive performance increases, productivity increases with it already.

Speaker: 2
03:21:14

There’s other people who aren’t even aware it’s happening. And there’s some gearing to whether you’re a coder or not, but I think there are lots of noncoders that are off to the races, and I think there are lots of professional coders who are still, like, you know... the blacksmiths were not necessarily in favor of, you know, the car business.

Speaker: 2
03:21:31

So, there’s the old William Gibson quote, the future is here, it’s just not evenly distributed yet, and this is maybe the most potent version of that that I’ve ever seen.

Speaker: 0
03:21:41

Yeah. There’s, you know, the old meme with the bell curve. The people on both extremes say AI coding is the future. Right. It is very common for programmers to say, you know, if you’re any good of a programmer, you’re not going to be using it. That’s just not true.

Speaker: 0
03:21:58

Now, I consider myself a reasonably good programmer, and my productivity, and the joy of programming, have just skyrocketed, as every aspect of programming is more efficient, more productive, more fun, all that kind of stuff.

Speaker: 2
03:22:15

I would also say, code has, of anything in, like, industrial society, the highest elasticity, which is to say, the easier it is to make it, the more it gets made. Like, I think effectively there’s unlimited demand for code. Like, in other words, there’s always some other idea for a thing that you can do, a feature that you can add, or a thing that you can optimize.

Speaker: 2
03:22:34

And so, like, overwhelmingly, you know, the amount of code that exists in the world is a fraction of even the ideas we have today, and then we come up with new ideas all the time. And, you know, it was late eighties, early nineties, when sort of automated coding systems started to come out, the expert systems, big deal in those days. And there was a famous book called The Decline and Fall of the American Programmer that predicted that these new coding systems were gonna mean we wouldn’t have programmers in the future, and, of course, the number of programming jobs exploded by, like, a factor of a 100.

Speaker: 2
03:23:03

My guess is we’ll have more coding jobs, probably by, like, an order of magnitude, 10 years from now. They will be different jobs. They’ll involve orchestrating AIs. But we will be creating so much more software that the whole industry will just explode in size.

Speaker: 0
03:23:21

Are you seeing the size of companies decrease in terms of startups? What’s the landscape of Little Tech?

Speaker: 2
03:23:28

All we’re seeing right now is the AI hiring boom of all time.

Speaker: 0
03:23:32

Oh, for the big tech companies.

Speaker: 2
03:23:33

Big tech, and little tech.

Speaker: 0
03:23:34

And little tech.

Speaker: 2
03:23:35

Everybody’s trying to hire as many engineers as they can to build AI systems. It’s just, it’s a 100%. I mean, there’s a little bit. There’s customer service, you know. We have some companies, and others, I think it’s Klarna that’s publicizing a lot of this in Europe, where, you know, there are jobs that can be optimized and jobs that can be automated.

Speaker: 2
03:23:56

But for engineering jobs, like, it’s just an explosion of hiring. At least so far, there’s no trace of any sort of diminishing effect. Now, having said that, I am looking forward to the day. I am waiting for the first company to walk in saying... well, the more radical form of it. But basically, the companies that we see are one of 2 kinds.

Speaker: 2
03:24:15

We see the companies that are... I sometimes use weak form, strong form. So for the weak form companies, I sometimes use the term, it’s called the 6th bullet point. AI is the 6th bullet point on whatever they’re doing.

Speaker: 0
03:24:28

Sure.

Speaker: 2
03:24:29

Right? And it’s on the slide. Right? So they’ve got the, you know, whatever, and then AI is the 6th thing. And the reason AI is the 6th thing is because they had already written the slide before the AI revolution started, and so they just added the 6th bullet point, AI, which is how you’re getting all these products that have, like, the AI button up in the corner.

Speaker: 2
03:24:43

Right? The little sparkly button. Yep. Right? And all of a sudden, Gmail is offering to summarize your email, which I’m, like, I don’t need that.

Speaker: 2
03:24:49

Like, I need you to answer my email, not summarize it. Like, what the hell? Okay. So we see those, and that’s fine. That’s, like, I don’t know, putting sugar on the cake or something. But then we see the strong form, which is the companies that are building from scratch for AI.

Speaker: 2
03:25:03

Right? And they’re building it. I actually just met with a company that is building literally an AI email system, as an example. So just give me

Speaker: 0
03:25:08

I can’t wait.

Speaker: 2
03:25:11

Yeah. They’re gonna completely rebuild it. So, very obvious idea, very smart team. You know, it’s gonna be great. And then, you know, Notion, not one of our companies, just came out with a product. And so now companies are gonna basically come through, sweep through, and they’re gonna do AI-first versions of basically everything.

Speaker: 2
03:25:27

And those are, like, companies built... you know, AI is the first bullet point. Mhmm. It’s the strong form of the argument.

Speaker: 0
03:25:32

Yeah. Cursor is an example. They basically said, okay, we’re gonna rebuild the thing with AI as a first-class citizen.

Speaker: 2
03:25:38

What if we built from scratch, knowing that we could build on this? And again, this is part of the full employment act for startups and VCs, which is just, like, if a technology transformation is sufficiently powerful, then you actually need to start the product development process over from scratch, because you need to reconceptualize the product.

Speaker: 2
03:25:54

And then usually what that means is you need a new company, because most incumbents just won’t do that. And so, yeah. So that’s underway across many categories. What I’m waiting for is the company where it’s like, no, our org chart is redesigned as a result of AI. Right?

Speaker: 2
03:26:08

And so I’m waiting for the company where it’s like, no, we’re gonna have, you know... and the cliche, here’s a thought experiment, right? The cliche would be, we’re gonna have, like, the human executive team, and then we’re gonna have the AIs be the workers. Right?

Speaker: 2
03:26:19

So we’ll have a VP of engineering supervising a 100 instances of coding agents. Right? Okay. Maybe. Right?

Speaker: 2
03:26:26

By the way, or maybe the VP of engineering should be the AI.

Speaker: 0
03:26:30

Mhmm.

Speaker: 2
03:26:31

Yeah. Maybe it’s supervising human coders who are supervising AIs. Right? Because one of the things that AI should be pretty good at is managing. Mhmm. Because it’s, like, you know, it’s process-driven. It’s the kind of thing that AI is actually pretty good at. Right? Performance evaluation, coaching.

Speaker: 2
03:26:44

And so, should it be an AI executive team? And then, of course, the ultimate question, which is, AI CEO. Right? And then maybe the most futuristic version of it would be an actual AI agent that goes fully autonomous. Yeah. Yeah.

Speaker: 2
03:27:00

What if you really set one of these things loose and let it basically build itself a business? And so I will say, we’re not yet seeing those, and I think it’s a little bit that the systems aren’t quite ready for that yet. And then I think it’s a little bit that you really do need, at that point, like, a founder who’s really willing to break all the rules and really willing to take the swing.

Speaker: 2
03:27:20

And those people exist, and so I’m sure we’ll see that.

Speaker: 0
03:27:23

And some of it is, as you know with all the startups, the execution. The idea that you have, like, an AI-first email client, this seems like an obvious idea, but actually creating one, executing, and then taking on Gmail is really difficult. I mean, Gmail... it’s fascinating to see that Google can’t do it. Because why? Because of the momentum, because it’s hard to reengineer the entirety of the system.

Speaker: 0
03:27:48

It feels like Google’s perfectly positioned to do it. Same with, like, you know, Perplexity, which I love. Like, Google could technically take on Perplexity and do it much better, but they haven’t. Not yet. So it’s fascinating why that is for large companies.

Speaker: 0
03:28:05

I mean, that is an advantage for little tech. They can be agile.

Speaker: 2
03:28:09

Yeah. That’s right.

Speaker: 1
03:28:10

They can move fast.

Speaker: 2
03:28:11

Yeah. Little companies can break glass in a way big companies can’t. Right. This is sort of the big breakthrough that Clay Christensen had in The Innovator’s Dilemma, which is, sometimes when big companies don’t do things, it’s because they’re screwing up, and that certainly happens.

Speaker: 2
03:28:22

But a lot of times they don’t do things because it would break too much glass. Specifically, it would interfere with their existing customers and their existing businesses, and they just simply won’t do that. And, by the way, responsibly, they shouldn’t do that. Right?

Speaker: 2
03:28:36

And so they just get... but this is Clay Christensen’s big thing: they often don’t adapt because they are well run, not because they’re poorly run. They’re optimizing machines. They’re optimizing against the existing business. And as you just said, this is, like, a permanent state of affairs for large organizations.

Speaker: 2
03:28:53

Like, every once in a while, one breaks the pattern and actually does it. But for the most part, this is a very predictable form of human behavior, and this fundamentally is why startups exist.

Speaker: 0
03:29:02

It feels like 2025 is when the race for dominance in AI will see some winners. Like, it’s a big year. So who do you think wins the race? OpenAI, Meta, Google, xAI? Who do you think wins the AI race?

Speaker: 2
03:29:15

I would say, like, I’m not gonna predict. I’d say there’s questions all over the place. And we have this category of question we call the trillion-dollar question, which is, like, literally, depending on how it’s answered, people make or lose a trillion dollars. And I think there’s, I don’t know, 5 or 6 trillion-dollar questions right now that are hanging out there, which is an unusually large number.

Speaker: 2
03:29:32

Yeah. And I’ll just hit a few of them and we can talk about them. So one is big models versus small models. Another is open models versus closed models. Another is whether you can use synthetic data or not.

Speaker: 2
03:29:43

Another is chain of thought: how far can you push that in reinforcement learning? And then another one is the political trillion-dollar questions, or the policy questions, which, you know, the US and the EU have both been flunking dramatically, and the US hopefully is about to really succeed at.

Speaker: 2
03:29:59

Yeah. And then there’s probably another half dozen big important questions after that. And so this is an industry that’s in flux in a way that’s even more dramatic, I think, than the ones I’ve seen before. And look, the most obvious example of the flux: sitting here less than 3 years ago, sitting here in December ’22, we would have said that OpenAI is just running away with everything.

Speaker: 2
03:30:22

Mhmm. And sitting here today, it’s like, you know, there’s at least 6 world-class god-model companies and teams that are, by the way, generating remarkably similar results. That’s actually been one of the most shocking things to me: it turns out that once you know that it’s possible to build one incredibly smart, Turing-test-passing large language model, which was a complete shock and surprise to the world...

Speaker: 2
03:30:44

It turns out that within, you know, a year, you can have 5 more. There’s also a money component to it, which is, to get the money to scale one of these things into the billions of dollars, there’s basically right now only 2 sources of money that will do that for you. 1 is the hyperscalers giving you the money, which you turn around and round-trip back to them, or, you know, foreign sovereigns.

Speaker: 2
03:31:02

You know, other countries’ sovereign wealth funds, which can be, you know, difficult in some cases for companies to access. So there’s maybe another trillion-dollar question: the financing question. Here’s one. So Sam Altman has been public about the fact that he wants to transition OpenAI from being a nonprofit to being a for-profit. Mhmm.

Speaker: 2
03:31:21

The way that that is legally done... there is a way to do it. There is a way in US law to do it. The IRS and other government entities watch this very carefully, because the US takes foundation and nonprofit law very seriously, because of the tax exemption.

Speaker: 2
03:31:35

And so historically the way that you do it is you start a for-profit, and then you raise money with the for-profit to buy the assets of the nonprofit at fair market value. And, you know, the last financing round at OpenAI was, you know, $150-some billion, and so logically, if the flip is going to happen, the for-profit has to go raise $150 billion out of the chute to buy the assets.

Speaker: 2
03:31:59

You know, raising $150 billion is a challenge. So, you know, is that even possible? If that is possible, then OpenAI maybe is off to the races as a for-profit company. If not, you know, I don't know. And then, you know, obviously the Elon lawsuit. So just because they're the market leader today, you know, there's big important questions there.

Speaker: 2
03:32:17

You know, Microsoft has this kind of love-hate relationship with them. Where does that go? Apple's, you know, lagging badly behind, but, you know, they're very good at catching up. Amazon, you know, is primarily a hyperscaler, but they now have their own models.

Speaker: 0
03:32:29

And then there's the other questions like you laid out, briefly and brilliantly: open versus closed, big versus little models, synthetic data. That's a huge, huge question. And then test-time compute with chain of thought, the role of that. And it's moving this fast. And these are, I think it's fair to say, trillion-dollar questions.

Speaker: 2
03:32:48

Yeah. These are big. Like, look, you know, oh, here's a trillion-dollar question, which is kind of embedded in that, which is just hallucinations. Right? I mean, so if you are trying to use these tools creatively, you're thrilled, because they can draw new images and they can make new music and they can do all this incredible stuff.

Speaker: 2
03:33:02

Right? They're creative. The flip side of that is if you need them to be correct, they can't be creative, and that's, you know, the term hallucination.

Speaker: 1
03:33:09

Mhmm.

Speaker: 2
03:33:09

And these things do hallucinate. And, you know, there have been, you know, court cases already where lawyers have submitted legal briefs that contain made-up case citations. The judge is like, wait a minute. This doesn't exist. And the very next question is, did you write this yourself?

Speaker: 2
03:33:23

And the lawyer goes, I mean,

Speaker: 0
03:33:27

that's what Elon is going for with Grok. Yes. Looking for truth. I mean, that's an open technical question. How close can you get to truth with LLMs?

Speaker: 2
03:33:35

Yeah. That's right. And my sense, this is a very contentious topic in the industry, my sense is, to the extent that there is a domain in which there is a definitive and checkable and provable answer, and you might say math satisfies that, coding satisfies that, and maybe some other fields, then you should be able to generate synthetic data.

Speaker: 2
03:33:53

You should be able to do chain-of-thought reasoning. You should be able to do reinforcement learning, and you should be able to ultimately, you know, eliminate hallucinations. Although, by the way, that's a trillion-dollar question right there, as to whether that's true. But then there's the question of, like, okay, is that gonna work in the more general domain?
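The loop being described here, generate candidate answers in a domain with a definitive checkable truth and keep only what a verifier confirms, can be sketched in a few lines. This is a toy illustration, not any lab's actual pipeline: `fake_model` is a hypothetical stand-in (a random guesser, not a real LLM), and the "domain" is simple arithmetic.

```python
import random

def make_problem(rng):
    # A domain with a definitive, checkable answer: integer addition.
    a, b = rng.randint(1, 99), rng.randint(1, 99)
    return f"{a} + {b}", a + b

def fake_model(prompt, rng):
    # Stand-in for an LLM: usually correct, sometimes "hallucinates".
    truth = eval(prompt)  # safe here: prompts are generated above
    return truth if rng.random() < 0.8 else truth + rng.randint(1, 9)

def generate_verified_data(n, seed=0):
    # Rejection sampling: keep only (prompt, answer) pairs the checker
    # can prove correct, yielding hallucination-free synthetic data.
    rng = random.Random(seed)
    kept = []
    while len(kept) < n:
        prompt, truth = make_problem(rng)
        answer = fake_model(prompt, rng)
        if answer == truth:  # the "checkable and provable" step
            kept.append((prompt, answer))
    return kept

data = generate_verified_data(100)
print(len(data), all(eval(p) == a for p, a in data))  # 100 True
```

The same shape scales up in the verifiable domains mentioned: math against a symbolic checker, code against unit tests; the open question raised here is whether any such verifier exists for, say, philosophy.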

Speaker: 2
03:34:08

I mean, so for example, one possibility is these things are gonna get truly superhuman at, like, math and coding, but at, like, discussing philosophy, they're basically as smart as they're ever gonna be. And they're gonna be kind of, you know, say, midwit-grad-student level. And the theory there would just be they're already out of training data.

Speaker: 2
03:34:26

I mean, they literally, if you talk to these people, like, the big models are, like, within a factor of 2x of consuming all the human-generated training data, to the point that some of these big companies are literally hiring people like doctors and lawyers to sit and write new training data by hand.

Speaker: 2
03:34:39

And so does this mean that, like, if you want your model to be better at philosophy, you have to go hire, like, a 1,000 philosophers and have them write new content? And is anybody gonna do that? And so, you know, maybe these things are topping out in certain ways, and they're gonna leap way ahead in other ways.

Speaker: 2
03:34:51

And so anyway, we just don't know. I guess, I'll tell you maybe my main conclusion: I don't buy any of these, anybody telling you these big sweeping conclusions, you know, all of this abstract generalized superintelligence AGI stuff. Like, you know, maybe it's the engineer in me, but, like, no.

Speaker: 2
03:35:09

Like, that's not the core. That's too abstract. Like, it's gotta actually work. And then, by the way, you have to actually be able to pay for it. I mean, this is a problem right now with, you know, the big models. The big models that are, like, really good at coding and math, they're, like, actually very expensive to run. You know, they're quite slow.

Speaker: 2
03:35:26

Another trillion-dollar question: future chips, which I know you've talked a lot about. Another trillion-dollar question, yeah, I mean, all the global issues. Oh, another trillion-dollar question: censorship. Right? Like, and, as they say, with all the human-feedback training process, exactly what are you training these things to do?

Speaker: 2
03:35:49

What are they allowed to talk about? How often do they give these incredibly preachy moral lectures? Or here's a good one, here's a trillion-dollar question: how many other countries want to run their education system, healthcare system, news system, political system on the basis of an AI that's been trained according to the most extreme left-wing California politics?

Speaker: 2
03:36:10

Right? Because that's kind of what they have on offer right now, and I think the answer to that is not very many. So there's, like, massive open questions there about, like, what, you know, and by the way, like, what morality are these things gonna get trained on? And so

Speaker: 0
03:36:25

And that one, we're cracking wide open with what's been happening over the past few months: censorship on every level of these companies, and just the very idea of what truth means and what it means to expand the Overton window of LLMs, or the Overton window of human discourse.

Speaker: 2
03:36:44

So what I experienced, you know, going back to how we started, what I experienced was, alright, the social media censorship regime from hell, debanking, right, at, like, a large scale, and then the war on the crypto industry trying to kill it, and then a basically declared intent to do the same thing to AI, and to put AI under the same kind of censorship and control regime as social media and the banks.

Speaker: 2
03:37:05

And I think this election tipped, I mean, I think this election tipped us from a timeline in which things were going to get really bad on that front, to a timeline in which I think things are gonna be quite good. But look, those same questions also apply outside the US, and, you know, the EU is doing their thing.

Speaker: 2
03:37:21

They're being extremely draconian, and they're trying to lock in a political censorship regime on AI right now that's so harsh that even American AI companies are not willing to launch new products in the EU right now. I mean, that's not gonna last, but, like, what happens there, right, and what are the trade-offs? You know, what levels of censorship are American companies going to have to sign up for if they want to operate in the EU? Or is the EU still capable of generating its own AI companies, or have we brain-drained them so that they can't?

Speaker: 2
03:37:49

So big questions.

Speaker: 0
03:37:52

Quick questions. So you're very active on X, a very unique character: flamboyant, exciting, bold. You post a lot. I think there's a meme, I don't remember it exactly, but Elon posted something like, inside Elon there are 2 wolves. One is, please be kind, or more positive, and the other one is, I think, you know, doing the, take a big step back and fuck yourself in the face guy.

Speaker: 0
03:38:23

How many wolves are inside your mind when you're tweeting?

Speaker: 2
03:38:27

To be clear, a reference from the comedy classic, Tropic Thunder.

Speaker: 0
03:38:31

Tropic Thunder. Yeah. A legendary movie.

Speaker: 2
03:38:33

Yes. Any Zoomers listening to this who haven’t seen that movie, go watch it immediately.

Speaker: 0
03:38:39

Yeah. There’s nothing offensive about it.

Speaker: 2
03:38:41

Nothing offensive about it at all. Tom Cruise's greatest performance. So, yeah. No. Look. I'll just start by saying, like, I'm not supposed to be tweeting at all. So

Speaker: 0
03:38:56

yeah.

Speaker: 2
03:38:56

Yes. Yes. Yes. And so but, you know.

Speaker: 0
03:38:59

So how do you approach that? Like, how do you approach what to tweet?

Speaker: 2
03:39:02

So, I mean, I don't. I don't know. I don't do it well enough. It's mostly an exercise in frustration. Look. There's a glory to it, and there's an issue with it. And the glory of it is, like, you know, it's instantaneous global communication, and, you know, X in particular is, like, the, you know, the town square on all these, you know, social issues, political issues, everything else, current events.

Speaker: 2
03:39:22

I mean, look, there's no question the format, the format of at least the original tweet, is, you know, prone to be inflammatory. You know, I'm the guy who at one point the entire nation of India hated. Like, I once tweeted something that, it turned out, was still politically sensitive to the entire continent.

Speaker: 2
03:39:38

I stayed up all night that night as I became the front-page headline and leading television news in each time zone in India, for a single tweet. So, like, a single tweet out of context is a very dangerous thing. Obviously, X now has the middle ground where they, you know, they now have the longer-form posts.

Speaker: 2
03:39:56

And so, you know, probably the most productive thing I can do is longer-form things.

Speaker: 0
03:40:03

You’re not going to do it though, are you?

Speaker: 2
03:40:04

I do. I do from time to time.

Speaker: 0
03:40:05

So, I mean, I should, I

Speaker: 2
03:40:06

should do more of them. And then, yeah, I mean, look, obviously X is doing great. And then, like I said, like, Substack, you know, has become the center for a lot, you know, a lot of, I think, the best kind of, you know, deeply-thought-through, you know, certainly intellectual content.

Speaker: 2
03:40:19

You know, tons of current-events stuff there as well. And then, yeah, there's a bunch of other, you know, a bunch of new systems that are very exciting. So I think one of the things we can look forward to in the next 4 years is, number 1, just, like, a massive reinvigoration of social media as a consequence of the changes that are happening right now.

Speaker: 2
03:40:35

I'm very excited to see what's gonna happen with that. And it's happened on X, but it's now gonna happen on other platforms. And then the other is crypto, you know, crypto is gonna come right back to life. And actually that's very exciting.

Speaker: 2
03:40:49

Actually, that's worth noting, that's another trillion-dollar question on AI, which is: in a world of pervasive AI, and especially in a world of AI agents, imagine a world of billions or trillions of AI agents running around, they need an economy. And crypto, in our view, happens to be the ideal economic system for that. Right? Because it's programmable money.

Speaker: 2
03:41:08

It's a very easy way to plug in and do that, and there's this transaction-processing system that can do that. And so I think the crypto-AI intersection, you know, is potentially a very, very big deal. And that was going to be impossible under the prior regime, and I think under the new regime, hopefully, it'll be something we can do.
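The "agents need an economy" idea can be made concrete with a toy: a shared ledger whose transfer rule is just code ("programmable money"), which two hypothetical agents use to settle a task with no human in the loop. This is purely illustrative and not tied to any real chain or protocol; all names here are made up.

```python
class Ledger:
    """Toy shared ledger; the transfer rule is plain code."""
    def __init__(self, balances):
        self.balances = dict(balances)

    def transfer(self, payer, payee, amount):
        # "Programmable money": the rules of payment are enforced in code.
        if amount <= 0 or self.balances.get(payer, 0) < amount:
            raise ValueError("invalid or unfunded transfer")
        self.balances[payer] -= amount
        self.balances[payee] = self.balances.get(payee, 0) + amount

def hire(ledger, buyer, seller, task, price):
    # One agent buys a unit of work from another; payment settles
    # programmatically the moment the task is accepted.
    ledger.transfer(buyer, seller, price)
    return f"{seller} did '{task}' for {buyer} at {price}"

ledger = Ledger({"agent_a": 100, "agent_b": 0})
receipt = hire(ledger, "agent_a", "agent_b", "summarize report", 5)
print(ledger.balances)  # {'agent_a': 95, 'agent_b': 5}
```

The design point is the one made above: because the transfer rule is code, machines can transact with each other directly, which is hard to do with human-mediated payment rails.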

Speaker: 0
03:41:25

Almost for fun, let me ask about a friend of yours, Yann LeCun. What are your top, like, favorite things about Yann LeCun? I think he's a brilliant guy. I think he's important to the world. I think you guys disagree on a lot of things, but I personally like vigorous disagreement.

Speaker: 0
03:41:43

I, as a person in the stands, like to watch the gladiators go at it. And

Speaker: 2
03:41:49

No. He's a super genius. I mean, look, I wouldn't say we're super close, but, you know, casual friends. I worked with him at Meta, you know, he was the chief scientist at Meta for a long time and still, you know, works with them. And, you know, obviously, he's a legendary figure in the field and one of the main people responsible for what's happening.

Speaker: 2
03:42:04

I mean, my serious observation would be, it's the thing I've talked to him about for a long time, and I keep trying to read and follow everything he does: he is, see if you agree with this, the smartest and most credible critic of LLMs as the path for AI. Yeah.

Speaker: 2
03:42:23

And, he's not, you know, there are certain, I would say, troll-like characters who are just, like, crapping on everything. But, like, Yann has, like, very deeply-thought-through, basically, theories as to why LLMs are an evolutionary dead end. And I actually, like, I try to do this thing where I try to have a mental model of, like, the 2 different sides of a serious argument.

Speaker: 2
03:42:42

So I try to, like, internalize that argument as much as I can, which is difficult because, like, we're investing behind AI as aggressively as we can. So if he's right, like, that could be a big problem, but, like, we should also know that. And then I sort of use his ideas to challenge all the bullish people, you know, to really kind of test their level of knowledge.

Speaker: 2
03:43:01

So I like to kind of grill people. Like, I'm not, you know, I got my CS degree 35 years ago, so I'm not, like, deep in the technology, but, like, to the extent I can understand Yann's points, I can use them to, you know, really surface a lot of the questions for the people who are more bullish.

Speaker: 2
03:43:17

And that's been, I think, very productive. Yeah. So, yeah, it's very striking that you have somebody who is, like, that central in the space who is actually, like, a full-on skeptic. And, you know, again, this could go different ways: he could end up being very wrong.

Speaker: 2
03:43:32

He could end up being totally right, or it could be that he will provoke the evolution of these systems to be much better than they would have been.

Speaker: 0
03:43:39

Yeah. He could be both right and wrong. First of all, I do agree with that. He's one of the most legit and rigorous and deep critics of the LLM path to AGI. You know, his basic notion is that AI needs to have some understanding of the physical world, and that's very difficult to achieve with LLMs.

Speaker: 0
03:43:59

And that is a really good way to challenge the limitations of LLMs and so on. He's also been a vocal and huge proponent of open source. Yes. Which is a whole other

Speaker: 2
03:44:11

Yes. Which you

Speaker: 0
03:44:12

have been as well.

Speaker: 2
03:44:12

Which is very useful.

Speaker: 0
03:44:13

Yeah. And that’s been just fascinating to watch.

Speaker: 2
03:44:15

And anti doomer.

Speaker: 0
03:44:16

Anti doomer.

Speaker: 2
03:44:17

Yeah.

Speaker: 0
03:44:18

Yeah. He's

Speaker: 2
03:44:19

very anti doomer.

Speaker: 0
03:44:20

He also has many wolves inside of him.

Speaker: 2
03:44:23

Yes he does. Yes he does.

Speaker: 0
03:44:25

So it’s been really really fun to watch.

Speaker: 2
03:44:27

The other 2, okay, here's my other wolf coming out. Yeah. The other 2 of the 3 godfathers of AI are, like, radicals. Like, full-on, you know, far left, I would say, like, either Marxist or borderline Marxist, and they're, like, I think, quite extreme in their social and political views.

Speaker: 2
03:44:44

And I think that feeds into their doomerism. And I think, you know, they are lobbying for, like, draconian, I think what would be ruinously destructive, government legislation and regulation. And so it's actually super helpful to have Yann as a counterpoint to those 2.

Speaker: 0
03:44:59

Another fun question. Our mutual friend, Andrew Huberman. Yes. First, maybe, what do you love most about Andrew? And second, what score on a scale of 1 to 10 do you think he would give you on your approach to health?

Speaker: 2
03:45:11

Oh, 3. Physical. 3.

Speaker: 0
03:45:13

You think you'd score that low? Okay.

Speaker: 2
03:45:16

Exactly. 10.

Speaker: 0
03:45:17

That’s good.

Speaker: 2
03:45:18

Exactly. Well, so he did, he convinced me to stop drinking alcohol, which was a big, Successfully. Well, other than my family, it was my favorite thing in the world. Yeah. And so it was a major, major reduction. Like, having, like, a glass of Scotch at night was, like, the thing I would do to relax, and so he has profoundly negatively impacted my emotional health.

Speaker: 2
03:45:36

I blame him for making me much less happy as a person, but much, much healthier. Yeah. Physically healthier, so, like, I credit him with that. I'm glad I did that. But then his sleep stuff, like, I'm not doing any of that.

Speaker: 0
03:45:51

Yeah.

Speaker: 2
03:45:51

I have no interest in his sleep stuff. Like, no. This whole, like, natural light thing, no. We're not doing it.

Speaker: 0
03:45:57

Are you too hardcore for this?

Speaker: 2
03:45:58

I don't see any natural light in

Speaker: 0
03:46:00

here. It’s all covered. It’s all horrible.

Speaker: 2
03:46:04

And I’m very happy. I would be very happy living and working here because I’m totally happy without natural light.

Speaker: 0
03:46:09

In darkness. Yes. That must be a metaphor for something.

Speaker: 2
03:46:12

Yes. Look, it's a test of manhood as to whether you can have a blue screen in your face for 3 hours and then go right to sleep. Like, I don't understand why you'd wanna take shortcuts.

Speaker: 0
03:46:22

I now understand what they mean by toxic masculinity. Alright. So, let's see. You're exceptionally successful by most measures. But what do you use as the definition of success?

Speaker: 2
03:46:39

I would probably say it is a combination of 2 things. I think it is contribution. So, you know, have you done something that mattered, ultimately? And, you know, specifically, it mattered to people. And then the other thing is I think happiness is either overrated or almost a complete myth.

Speaker: 2
03:47:01

And in fact, interestingly, Thomas Jefferson did not mean happiness the way that we understand it when he said "pursuit of happiness" in the Declaration of Independence. He meant it more in the Greek meaning, which is closer to satisfaction or fulfillment. And so I think about happiness as: the first ice cream cone makes you super happy, the first mile of the walk in the park during sunset makes you super happy.

Speaker: 2
03:47:26

The first kiss makes you super happy. The 1,000th ice cream cone, not so much. The 1,000th mile of the walk through the park. The 1,000th kiss can still be good, but maybe just not right in a row. Right?

Speaker: 2
03:47:41

And so happiness is this very fleeting concept, and the people who anchor on happiness seem to go off the rails pretty often. So it's the deep sense of having been, I don't know how to put it, useful.

Speaker: 0
03:47:57

So that’s a good place to arrive at in life. Yeah.

Speaker: 2
03:48:00

I think so. Yeah. I mean, like, can you sit, can you, yeah. You know, who was it who said that the source of all the ills in the world is man's inability to sit in a room by himself doing nothing? Right? Like, if you're sitting in a room by yourself at, like, you know, 4 in the morning, it's like, alright, have I, like, you know, have I lived up to my expectation of myself?

Speaker: 2
03:48:18

Like, if you have, you know, the people I know who feel that way are, like, pretty centered, and, you know, generally seem very, I don't know how to put it, pleased, you know, proud, calm, so to speak. The people who are, you know, sensation seekers, you know, some of the sensation seeking, by the way, in some sense, you know, there are certain entrepreneurs, for example, who are, like, into every form of extreme sport, and they get, you know, huge satisfaction out of that, you know, there's sensation-seeking in sort of useful and productive ways.

Speaker: 2
03:48:48

You know, Larry Ellison was always like that, Zuckerberg is like that. And then, you know, there are a lot of entrepreneurs who end up in, you know, drugs, you know, sexual escapades that seem like they'll be fun at first and then backfire.

Speaker: 0
03:49:02

Yeah. But at the end of the day, if you're able to be at peace by yourself in a room at 4 AM. Yeah. And I would even say happy, but I know, I understand Thomas Jefferson didn't mean it the way maybe I mean it. But I can be happy by myself at 4 AM with a blue screen.

Speaker: 2
03:49:20

That’s good. Exactly.

Speaker: 0
03:49:21

Staring at cursor.

Speaker: 2
03:49:22

Exactly.

Speaker: 0
03:49:26

As a small tangent, a quick shout-out to an amazing interview you did with Bari Weiss, and just to her in general. Yep. Bari Weiss of The Free Press. She has a podcast called Honestly with Bari Weiss. She's great. People should go listen. You were asked if you believe in God.

Speaker: 0
03:49:42

One of the joys, see, we talked about happiness. One of the things that makes me happy is making you uncomfortable.

Speaker: 2
03:49:50

Thank you.

Speaker: 0
03:49:50

So this question is designed, many of the questions today are designed, for that. You were asked if you believe in God, and you said, after a pause, that you're not sure. So it felt like the pause, the uncertainty there, was some kind of ongoing search for wisdom and meaning. Are you in fact searching for wisdom and meaning?

Speaker: 2
03:50:14

I guess I'd put it this way. There's a lot to just understand about people, and I feel like I'm only starting to understand, and that's certainly a simpler concept than God. So that's what I've spent a lot of the last, you know, 15 years trying to figure out. I feel like I spent my first, like, whatever, 30 years figuring out machines, and now I'm spending 30 years figuring out people, which turns out to be quite a bit more complicated.

Speaker: 2
03:50:40

And then, I don't know, maybe God's the last 30 years or something. And then, you know, I mean, just, you know, like Elon, it's just, like, okay, the known universe is, like, very, you know, complicated and, you know, mystifying. I mean, every time I, you know, pull up astronomy and, like, get super into astronomy, and it's, like, you know, daddy, how many galaxies are there in the universe?

Speaker: 2
03:51:00

And, you know, how many galaxies are there in the universe?

Speaker: 0
03:51:03

100 billion?

Speaker: 2
03:51:04

Okay. Like, how? Yeah. Like, how is that freaking possible? Like, it's just such a staggering concept that I

Speaker: 0
03:51:15

I actually wanted to show you a tweet that blew my mind, from Elon, from a while back. He said, Elon said, as a friend called it, this is the ultimate skill tree. This is a wall of galaxies a billion light years across. Yeah. So these are all galaxies.

Speaker: 2
03:51:32

Yeah. Like, what the, like, how is it that big? Like, how the hell? I mean, like, you know, I can read the textbook, and then this and then that and the whatever, 8 billion years and the Big Bang and the whole thing, and then it's just, like, alright. Wow. And then it's, like, alright. The Big Bang. Alright.

Speaker: 2
03:51:45

Like, what was before the Big Bang?

Speaker: 0
03:51:50

Do you think we, humans, will ever colonize, like, a galaxy and maybe even go beyond?

Speaker: 2
03:51:56

Sure. I mean, yeah. I mean, in the fullness of time.

Speaker: 0
03:51:58

Hmm. So you have that kind of optimism, you have that kind of hope that extends across thousands

Speaker: 2
03:52:02

In the fullness of time, I mean, yeah. You know, all the problems, all the challenges with it, I get. But, like, yeah, why not? I mean, again, in the fullness of time. It'll take a long time.

Speaker: 1
03:52:10

You don’t think we’ll destroy ourselves?

Speaker: 2
03:52:11

No. I doubt it. I doubt it. And, you know, fortunately, we have Elon giving us the backup plan. So, I don't know. Like, I grew up, you know, rural Midwest, sort of just, like, conventionally kind of Protestant Christian. It never made that much sense to me.

Speaker: 2
03:52:24

Got trained as an engineer and a scientist. I'm like, oh, that definitely doesn't make sense. I'm like, I know, I'll spend my life as an empirical, you know, rationalist, and I'll just figure everything out. And then, you know, again, you bump up against these things.

Speaker: 2
03:52:35

You know, you bump up against these things and you're just, like, alright. I'm, like, okay, I guess there's a scientific explanation for this, but, like, wow. And then it's, like, alright, where did that come from? Right? And then how far back can you go on the causality chain? Yeah. And then, like.

Speaker: 2
03:52:52

I mean, even just, you know, experiences that we all have on Earth, it's hard to rationally explain it all. And then, you know, so, yeah, I guess I'd just say I'm kind of radically open-minded, at peace with the fact that I'll probably never know.

Speaker: 2
03:53:03

The other thing, though, that's happened, and maybe the more practical answer to the question, is I think I have a much better understanding now of the role that religion plays in society than I had when I was younger. And my partner, Ben, has a great, he quotes his father on this.

Speaker: 2
03:53:18

He's, like, if a man does not have a real religion, he makes up a fake one, and the fake ones go very, very badly. And so there's this class, it's actually really funny, there's this class of intellectual that has what appears to be a very patronizing point of view, which is: yes, I'm an atheist, but it's very important that the people believe in something.

Speaker: 2
03:53:37

Right? And Marx had, like, the negative view on that, which is religion is the opium of the masses, but there are a lot of, like, right-wing intellectuals, who are themselves, I think, pretty atheist or agnostic, who are, like, it's deeply important that the people be Christian, or something like that.

Speaker: 2
03:53:50

And on the one hand, it's, like, wow, that's arrogant and presumptuous, but on the other hand, you know, maybe it's right, because, you know, what have we learned in the last 100 years? In the absence of a real religion, people will make up fake ones. There's this writer, this political philosopher, who's super interesting on this, named Eric Voegelin, and he wrote in the mid-to-late part of the 20th century.

Speaker: 2
03:54:13

He was, like, born in, I think, 1900, and, like, died in, like, '85, so he saw the complete run of communism and Nazism. And he himself, you know, fled, I think he fled Europe, and, you know, the whole thing. And, you know, his sort of big conclusion was basically that both communism and Nazism/fascism were basically religions, but, like, in the deep way of religions.

Speaker: 2
03:54:36

Like, they were, you know, you'd call them political religions, but they were, like, actual religions, and, you know, they were what Nietzsche forecast when he said, you know, God is dead, we've killed him, and we won't wash the blood off our hands for a 1,000 years. Right?

Speaker: 2
03:54:48

We will come up with new religions that will just cause mass murder and death. And, like, you read his stuff now and you're like, yep, that happened. Right? And then, of course, as fully, you know, elite moderns, of course, we couldn't possibly be doing that to ourselves right now, but of course we are.

Speaker: 2
03:55:03

And, you know, I would argue, Eric Voegelin for sure would argue, that the last 10 years, you know, we have been in a religious frenzy. You know, that woke has been a full-scale religious frenzy and has had all of the characteristics of a religion, including everything from patron saints to holy texts to, you know, sin.

Speaker: 2
03:55:20

It's had, what, I think it's had every single aspect of an actual religion other than redemption. Right? Which is maybe, like, the most dangerous religion you could ever come up with: the one where there's no forgiveness. Right?

Speaker: 2
03:55:36

And so I think if Voegelin were alive, I think he would have zeroed right in on that, would have said that, and, you know, we just, like, sailed right off. I mentioned earlier, like, we somehow rediscovered the religions of the Indo-Europeans. We're all into identity politics and environmentalism. Like, I don't think that's an accident.

Speaker: 2
03:55:51

So anyway, like, there is something very deep going on in the human psyche on religion that is not dismissible and needs to be taken seriously, even if one struggles with the specifics of it.

Speaker: 0
03:56:09

I think I speak for a lot of people: it has been a real joy, and for me an honor, to get to watch you seek to understand the human psyche, as you described. You're in that 30-year part of your life, and it's been an honor to talk with you today. Thank you, Marc.

Speaker: 2
03:56:26

Thank you, Lex. Is that it? That's only, how long is that?

Speaker: 0
03:56:31

4 hours with Marc Andreessen. It's, like, 40 hours of actual content.

Speaker: 2
03:56:36

So I'll accept being one of the short ones.

Speaker: 0
03:56:38

Oh, for

Speaker: 2
03:56:39

the for

Speaker: 0
03:56:40

the listener, Marc looks like he's ready to go for 20 more hours, and I need a nap. Thank you, Marc.

Speaker: 2
03:56:48

Thank you, Lex.

Speaker: 0
03:56:49

Thanks for listening to this conversation with Marc Andreessen. To support this podcast, please check out our sponsors in the description. And now, let me leave you with some words from Thomas Sowell: "It takes considerable knowledge just to realize the extent of your own ignorance."

Speaker: 0
03:57:07

Thank you for listening, and hope to see you next time.
