Elon Musk: 3 Years of X, OpenAI Lawsuit, Inside Twitter HQ, Grokipedia & The Future of Everything



Elon Musk: 3 Years of X, OpenAI Lawsuit, Inside Twitter HQ, Grokipedia & The Future of Everything Podcast Episode Description

(0:00) Disgraziad Corner: The most disgraceful things of the week!

(3:10) Elon on X’s new algorithm, why there has been so much Sydney Sweeney content lately

(11:35) Creating Grokipedia: Wikipedia’s failures, the future of information on the internet, confirmation bias

(24:52) Three years of X: Looking back on the Twitter acquisition and how it changed free speech on the internet

(42:49) Tesla vote on Elon’s compensation, would he leave Tesla if it doesn’t pass?

(47:40) OpenAI lawsuit, for-profit conversion, how much Elon should own, OpenAI’s great irony

(56:24) AI power efficiency, Robotaxis, future of self-driving

(1:09:34) Bill Gates flips on climate change, solar, energy production

Follow Elon:

https://x.com/elonmusk

Follow the besties:

https://x.com/chamath

https://x.com/Jason

https://x.com/DavidSacks

https://x.com/friedberg

Follow on X:

https://x.com/theallinpod

Follow on Instagram:

https://www.instagram.com/theallinpod

Follow on TikTok:

@theallinpod

Follow on LinkedIn:

https://www.linkedin.com/company/allinpod

Intro Music Credit:

https://rb.gy/tppkzl

https://x.com/yung_spielburg

Intro Video Credit:

https://x.com/TheZachEffect

Elon Musk: 3 Years of X, OpenAI Lawsuit, Inside Twitter HQ, Grokipedia & The Future of Everything Podcast Episode Summary

In this episode, recorded around the three-year anniversary of the Twitter acquisition, the besties are joined by Elon Musk for a wide-ranging conversation. After opening with a new segment, Disgraziad Corner, Elon explains recent changes to X's recommendation algorithm: a bug that hid in-network posts, over-weighting of any interaction (hence the flood of Sydney Sweeney content), a planned curated following tab, Grok reading on the order of 100 million posts per day, and upcoming semantic search. He describes how Grokipedia was built by training Grok for critical thinking and having it research, correct, and expand the most popular Wikipedia articles. The group then looks back on the first days inside Twitter HQ (the sink, the empty second building, the wasted SaaS spend) and the free-speech legacy of the acquisition: reinstated accounts, shadow banning, the Twitter Files, and speech laws abroad. The conversation closes with the Tesla shareholder vote on Elon's compensation and his concerns about ISS and Glass Lewis, the OpenAI lawsuit and for-profit conversion, open-source models, and the coming impact of AI, robotaxis, and robotics on jobs.


Elon Musk: 3 Years of X, OpenAI Lawsuit, Inside Twitter HQ, Grokipedia & The Future of Everything Podcast Episode Transcript (Unedited)

Speaker: 0
00:00

Let’s get started. You know, we wanted to try something new this week. Every week, you know, I get a little upset. Things perturb me, Sai. And, when it does, I just yell and scream, Disgracia! And so I bought the domain name, disgracias.com, for no reason other than my own amusement. But you know what?

Speaker: 0
00:18

I I’m not alone in my absolute disgust at what’s going on in the world. So this week, we’re gonna bring out a new feature here on the All-In podcast, Disgraziad Corner.

Speaker: 1
00:32

Desgratiad. He

Speaker: 2
00:33

was the best guy around.

Speaker: 0
00:36

What about the people he murdered?

Speaker: 2
00:37

What murder? You can act like a man.

Speaker: 0
00:41

What the hell

Speaker: 3
00:42

are you doing?

Speaker: 1
00:42

He’s just delusional. He managed. We insulted him a little bit.

Speaker: 0
00:46

I splashed, and I want this spaz. Smells good.

Speaker: 1
00:49

Your hair was in the toilet water. Disgusting. I gotta suffocate you, little

Speaker: 0
00:53

bit. It’s a fucking disgrace.

Speaker: 1
00:58

Disgraziad. Discratiad.

Speaker: 2
01:00

This is fantastic.

Speaker: 0
01:02

This is our new feature. Chamath, you look like you’re ready to go. Why don’t you tell everybody who gets your disgraziad this week?

Speaker: 2
01:12

Wait. We all had to come with a disgraziad?

Speaker: 1
01:14

You’ve ai first

Speaker: 3
01:15

of all, you

Speaker: 0
01:16

said you missed the memo. Alright. Fine. Enough.

Speaker: 2
01:19

I got one. I got one.

Speaker: 0
01:19

Okay. Alright. Just calm down.

Speaker: 2
01:21

My disgraziad corner goes to Jason Calacanis.

Speaker: 3
01:24

Oh, here we go. Come on,

Speaker: 0
01:25

man. You can’t.

Speaker: 2
01:26

And Pete Buttigieg where they, in the first 30 minutes of the interview, compared virtue-signaling points about how each one worked at various moments at Amnesty International. Literally effecting zero change, making no progress in the world, but collecting a badge that they use to hold over other people. Disgrazia.

Speaker: 0
01:47

A lot of letters.

Speaker: 3
01:47

We wrote a lot of

Speaker: 1
01:48

letters. Disgrazia.

Speaker: 0
01:49

Which is good. That means it’s a good one because

Speaker: 2
01:51

it’s ai the scenes. Jason, Thomas sana Pete Buttigieg. Great.

Speaker: 0
01:56

I’m glad that I got the first one, and you you can imagine what’s coming next week for you. I saw the Sydney Sweeney dress today trending on social. Disgraziad. It’s too much. What? It’s too much.

Speaker: 2
02:09

What is it? I didn’t I didn’t even know what this is.

Speaker: 3
02:11

You didn’t

Speaker: 0
02:11

see it?

Speaker: 1
02:12

Bring it up, Nick.

Speaker: 0
02:12

Bring it

Speaker: 3
02:13

up, Nick.

Speaker: 4
02:13

Bring it up. It’s a little floppy.

Speaker: 1
02:14

How is this a disgraziad? What’s too

Speaker: 0
02:19

much? It’s disgraceful. A little bit of like, look at this. Oh my god. Too much.

Speaker: 1
02:24

It’s elegant.

Speaker: 0
02:25

Too much. In my day, Sacks, a little cleavage maybe. Perhaps in the nineties or 2000s, some side view. This is too much.

Speaker: 5
02:34

Hey, ai.

Speaker: 2
02:37

Hey. Hey.

Speaker: 5
02:38

Very high bryden subject, madame.

Speaker: 3
02:40

Yeah. Sorry. We were discussing the royal

Speaker: 1
02:42

politics and

Speaker: 3
02:43

the Sydney Sweeney dress. I don’t know what’s trending on X.

Speaker: 1
02:49

Hi, Hi, dad. Hi, dad. Put away the phone, Jason.

Speaker: 6
02:54

We’ll let your winner ride. Rain man David. And it said we

Speaker: 3
03:02

open sourced it to the fans, and they’ve just gone crazy with it.

Speaker: 6
03:02

Love you, Wes. I

Speaker: 0
03:10

What’s going on with the algorithm? I’m getting Sydney Sweeney’s dress all day. And lastly, Saks

Speaker: 5
03:15

Well, maybe you should stop everything in.

Speaker: 3
03:17

I’m gonna have to favorite

Speaker: 0
03:19

it 15 times. And then Saks poor Saks got he got invited to Slucon for two weeks straight on the algorithm. No.

Speaker: 4
03:27

I say the algorithm has to come. If you if you demonstrate

Speaker: 1
03:31

It’s actual right.

Speaker: 5
03:32

You can’t even tell if that’s a joke or a real thing.

Speaker: 3
03:34

It’s a real thing. It’s a

Speaker: 1
03:35

real thing.

Speaker: 4
03:36

It’s all too real.

Speaker: 0
03:37

Oh, it it’s actually real? Like, oh, yeah.

Speaker: 5
03:39

That kind of is for real?

Speaker: 4
03:41

I’ve noticed yeah. If you if you demonstrate interest in anything on X now, if you click on it, god forbid you like something, man,

Speaker: 0
03:50

the algorithm is on it.

Speaker: 4
03:52

It will give you more of that. It will give you a lot more.

Speaker: 5
03:55

Yes. Yes. So we we did have an issue. We still have somewhat of an issue where there was an important bug that was figured out, that was solved over the weekend, which caused in-network posts to not be shown. So you basically, if you followed someone, you wouldn’t see them, wouldn’t

Speaker: 3
04:20

see their

Speaker: 5
04:20

posts. Got it. It’s obviously a big bug. The big bug. Then the algorithm was not properly taking into account if you just dwell on something. Mhmm. But if you interacted with it, it would go hog wild. So, as David said, if you favorite, reply, or engage with it in some way, it is gonna give you a torrent of that same thing.

Speaker: 0
04:53

Oh, Saks. So maybe

Speaker: 1
04:55

you Saks, what was your interaction?

Speaker: 0
04:57

Did you bookmark Slocane? I think you bookmarked it.

Speaker: 4
05:00

Here’s what I thought was good about it, though,

Speaker: 3
05:02

is all of a sudden

Speaker: 5
05:06

I would see If you happen to spot Sydney Sweeney’s boobs Yeah.

Speaker: 1
05:09

And you you would get a lot more of it.

Speaker: 4
05:12

Yeah. That that

Speaker: 5
05:13

yeah. Feel it. Okay.

Speaker: 4
05:15

But what I thought was good about it was that you would see who else had a take on the same subject matter, and that actually has been a useful part of it. Yeah. And so you do you do get more of a you get more of, like, a three sixty view on whatever it is that you’re showing interest in.

Speaker: 5
05:32

Yeah. Yeah. It it it just it’s like it was giving you if you you take a you’d have, like it was just going too far, obviously. It was over correcting. It had too much gain on, just turning up the gain way too high ai any interaction would would you would then get a tar into that.

Speaker: 5
05:49

It’s like, oh, you had a taste of it? Here, we’re gonna give you three helpings. Yeah. We’re gonna force you, we’re gonna give you the food funnel.

Speaker: 0
05:59

And and that’s all

Speaker: 1
06:00

being done

Speaker: 0
06:01

I assume it’s all being done with Grok now, so it’s not like the old hard coded algorithm, or is it using Grok?

Speaker: 5
06:08

Well, what’s happening is that we’re, you know, we’re gradually deleting the legacy Twitter heuristics. Now the problem is that it’s like, as you delete these heuristics, it turns out the one heuristic, the one bug was covering for the other bug. And so when you delete one side of the bug, you know, it’s like that meme with the Internet where there’s, like, this very complicated machine, and there’s, like, a tiny little wooden stick that’s keeping it going, which was, I guess, Amazon AWS East or whatever had something like that.

Speaker: 5
06:37

You know, when when when somebody pulled out the little stick, you hit, they would watch this Oops.

Speaker: 1
06:44

I think it’d be good if it

Speaker: 5
06:45

half of Earth. You know?

Speaker: 0
06:47

It would be great if it showed, like, one person you follow, and then, like, it blended the old style, which was just reverse chronological of your friends, the original version, with this new version. So you get, like, a little bit of both.

Speaker: 5
07:01

Well, you can still, everyone still has the following tab. Yeah. Now something we’re gonna be adding is the ability to have a curated following tab. Because the problem is, like, if you follow some people and they’re maybe a little more prolific than you are, you know, Scoble.

Speaker: 0
07:18

Or a Scoble.

Speaker: 5
07:20

You know, you follow someone and some people are much more prolific, you know, say a lot more than others. That makes the following tab hard to use. So we’re gonna add an option where you can have the following tab be curated. So Grok will say, what are the most interesting things posted by your friends? And we’ll show you that in the following tab. It will also show everything.

Speaker: 5
07:44

But, but I think having that option, will make the following tab much more useful. So it’ll be a curated list of people you follow, like, ideally, most interesting stuff that they’ve said, which is kinda what you you you would wanna look at. And then, we’ve we’ve we’ve mostly fixed the bug, which would, give you way too much of something if you interacted with a particular subject matter.

Speaker: 5
08:10

And then the really big change, which is where Grok literally reads everything that’s posted to the platform, which actually, there’s about a 100,000,000 posts per day. So it’s a 100,000,000 pieces of content per day. I think that’s actually maybe just in English. I think it goes beyond that because not all of it is English.

Speaker: 5
08:36

So, Grok is gonna we we’re gonna start off reading the, really what what Grok thinks are the top 10,000,000 of the 100,000,000, and and we’ll actually read them, and understand them and, categorize them and match them to users. It’s like this is a this is not a job humans could ever do.

Speaker: 5
08:57

And then once that is scaling reasonably well, we’ll add the entire 100,000,000 a day. So it’s literally gonna read through a 100,000,000 things and show you the things that it thinks, out of a 100,000,000 posts per day, are the most interesting posts to you.
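
What gets described here is, at a high level, a two-stage pipeline: a model scores every post for general interest, keeps the top slice, and then matches those survivors to each user's interests. A minimal sketch of that shape in Python, where Post, score_with_model(), and embed() are hypothetical stand-ins invented for illustration, not anything xAI has published:

from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    text: str

def score_with_model(post: Post) -> float:
    """Hypothetical stand-in: ask a model how broadly interesting a post is (0 to 1)."""
    raise NotImplementedError

def embed(text: str) -> list[float]:
    """Hypothetical stand-in: map text to an embedding vector."""
    raise NotImplementedError

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def build_feed(posts: list[Post], user_interest_text: str,
               keep_top: int = 10_000_000, feed_size: int = 50) -> list[Post]:
    # Stage 1: keep only the posts the model scores as most interesting overall.
    shortlisted = sorted(posts, key=score_with_model, reverse=True)[:keep_top]
    # Stage 2: rank the shortlist against this particular user's interests.
    user_vec = embed(user_interest_text)
    ranked = sorted(shortlisted,
                    key=lambda p: cosine(embed(p.text), user_vec),
                    reverse=True)
    return ranked[:feed_size]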

Speaker: 0
09:16

How much of colossus will that take?

Speaker: 3
09:18

Like Lot

Speaker: 5
09:18

of work.

Speaker: 0
09:20

Yeah. That’s, is it tens of thousands of servers, like, to do that every day?

Speaker: 5
09:25

Yeah. My my guess is it’s probably on the order of 50k H100s, something like that.

Speaker: 0
09:29

Wow. And that will replace search, so you’ll be able to actually search on Twitter and find things in, like, with a with a plain language.

Speaker: 5
09:39

We’ll have semantic search where you can just ask a question, and it will show you all content, whether that is text, pictures, or video that matches your search query semantically.
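
A minimal sketch of what semantic search of this kind typically means in practice: embed the query and every piece of content into the same vector space and return the nearest neighbors by meaning. The embed() function below is a hypothetical placeholder, not X's actual model:

import numpy as np

def embed(text: str) -> np.ndarray:
    """Hypothetical stand-in: map text to a unit-length embedding vector."""
    raise NotImplementedError

def semantic_search(query: str, corpus: dict[str, str], k: int = 10) -> list[str]:
    """Return ids of the k items whose meaning best matches the plain-language query.
    Corpus values could be post text, image captions, or video transcripts."""
    q = embed(query)
    scores = {item_id: float(np.dot(q, embed(text))) for item_id, text in corpus.items()}
    return sorted(scores, key=scores.get, reverse=True)[:k]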

Speaker: 2
09:51

How, how’s it been three years in? It was the three-year anniversary, like, a couple days ago.

Speaker: 5
09:57

Three years?

Speaker: 0
09:58

Yeah. Yeah. Remember it was Halloween?

Speaker: 5
10:01

Yeah. Halloween’s back.

Speaker: 0
10:03

Halloween’s back, but the weekend you took over was Halloween.

Speaker: 3
10:11

Yeah. We had

Speaker: 0
10:12

a good time. Yeah. Wow. Yeah. Three years.

Speaker: 5
10:16

We will think three years from now.

Speaker: 0
10:20

Yeah. What’s the takeaway? Three years later, you obviously don’t regret buying it. It saved free speech. That was good. Seemed to have turned that whole thing around. That was, I think, a big part of your mission. But then you added it to xAI, which makes it incredibly valuable as a data source.

Speaker: 0
10:38

So when you look back on it, the reason you bought it is to stop crazy woke mind virus and make truth exist in the world again. Great. Mission accomplished. And now it has this great future.

Speaker: 5
10:51

Yeah. We’ve got community notes. You can also ask Grok about anything you see on the platform. You know, you just press the Grok icon on any X post, and it’ll analyze it for you, and research it as much as you want. So you can basically, just by tapping the Grok icon, you can assess whether that post is the truth, the whole truth, and nothing but the truth, or whether there’s something supplemental that needs to be explained.

Speaker: 5
11:18

So I think it, I think it’s actually, we made a lot of progress towards, yeah, freedom of speech, and people being able to tell whether something is false or not false. You know, propaganda. The recent update to Grok is actually, I think, very good at piercing through propaganda.

Speaker: 5
11:39

So, and then we used that latest version of Grok to create Grokipedia, which I think is, it’s not just, I think, more neutral and more accurate than Wikipedia, but actually has a lot more information than a Wikipedia page.

Speaker: 2
11:59

Did you seed it with Wikipedia? Actually, take a step back. How did you guys how did you do this?

Speaker: 5
12:06

Well, we used AI.

Speaker: 2
12:09

But meaning, like, totally unsupervised, just a complete training run on its own, totally synthetic data, no no seeded set, nothing.

Speaker: 5
12:19

Well, it was only just recently possible for us to do this. So we finished training a maximally truth-seeking AI, a version of Grok that is good at cogent analysis. So breaking down any given argument into its axiomatic elements, assessing whether those axioms are, you know, the basic test for cogency, the axioms are likely to be true.

Speaker: 5
12:51

They’re not contradictory, and the conclusion most likely follows from those axioms. So we’re training Grok on a lot of critical thinking. So it just got really good at critical thinking, which was quite hard. And then we took that version of Grok and said, okay.

Speaker: 5
13:14

Cycle through the million most popular articles in Wikipedia and add, modify, and delete. So, that means research the rest of the Internet, whatever’s publicly available, and correct Wikipedia articles and fix mistakes, but also add a lot more context.
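
The workflow described here, cycle through the most popular articles, research publicly available sources, then add, modify, or delete, is essentially a batch revision loop. A rough sketch of that loop in Python, with research_sources() and revise_article() as hypothetical stand-ins for the model; this is illustrative only, not xAI's actual pipeline:

def research_sources(topic: str) -> list[str]:
    """Hypothetical stand-in: gather publicly available source material on a topic."""
    raise NotImplementedError

def revise_article(original: str, sources: list[str]) -> str:
    """Hypothetical stand-in: correct errors, add missing context, drop unsupported claims."""
    raise NotImplementedError

def rebuild_encyclopedia(articles_by_popularity: list[tuple[str, str]],
                         top_n: int = 1_000_000) -> dict[str, str]:
    # articles_by_popularity: (topic, original_text) pairs, most popular first.
    revised = {}
    for topic, original in articles_by_popularity[:top_n]:
        sources = research_sources(topic)                    # research the rest of the internet
        revised[topic] = revise_article(original, sources)   # add, modify, and delete
    return revised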

Speaker: 5
13:36

So so sometimes, really, the the nature of the propaganda is that, you know, facts are stated that are that are technically true, but but are not represent do not properly represent a picture of the individual or event.

Speaker: 0
13:53

This is critical because when you have a bio as you do, actually, we all do, on Wikipedia, over time, it’s just the people you fired or you beat in business or have an axe to grind. So it just slowly becomes, like, the place where everybody, you know, kinda who hates you then puts their information. I looked at mine.

Speaker: 0
14:14

It was so much more representative, and it was five times longer, six times longer. And the Yeah. What it gave way to, was much more accurate, much more accurate. And this opportunity was sitting here, I think, for a long time.

Speaker: 5
14:30

And

Speaker: 0
14:30

it’s just great that you got to it because they don’t update my page, but, you know, I don’t know, twice a month with, you know, and then who is this secret cabal? There’s 50 people who are anonymous, who decide what gets put on it. It was a much better, much more updated page in version one.

Speaker: 5
14:49

Yes. This is version 0.1, as we put it, as we show at the top. So I do think actually by the time we get to version 1.0, it’ll be 10 times better. But even at this early stage, as you just mentioned, it’s not just that it’s correcting errors, but it is creating a more accurate, realistic, and fleshed out description of people and events.

Speaker: 5
15:12

Elon, do you think that And and and subject matters. It’s like you can look at articles on on physics and Grokipedia. They’re they’re much better than Wikipedia by far. This is what

Speaker: 2
15:21

I was gonna ask you is, like, do you think that you can take this corpus of pages now and get Google to deboost Wikipedia or boost Grokipedia in traditional search? Because a lot of people still find this, and they believe that it’s authoritative because it comes up number one. Right? So how do we how do we do that?

Speaker: 2
15:39

How do you flip Google?

Speaker: 5
15:42

Yeah. So it really can, if people share a lot of it, if Grokipedia is used elsewhere, like, if people cite it on their websites, or post about it on social media, or when they do a search and Grokipedia shows up, if they click on Grokipedia, it will naturally, you know, rise in Google’s, you know, rankings.

Speaker: 5
16:06

I did text Sundar because, you know, even sort of a day after launch, if you typed in Grokipedia, Google would just say, did you mean Wikipedia?

Speaker: 2
16:16

Wikipedia. Yeah.

Speaker: 3
16:17

Yeah. And it

Speaker: 5
16:17

wouldn’t bring Grokipedia up at all. Yeah. Yeah.

Speaker: 2
16:20

That’s true.

Speaker: 5
16:21

So so now it brings How’s

Speaker: 1
16:22

the use how’s the usage been? Have you seen good growth since it launched?

Speaker: 5
16:26

Yeah. But it’s very early. It went super viral. So we’re, yeah, we’re seeing it cited all over the place. But, yeah, and I think we’ll see it used more and more as people refer to it. Like, and people will judge for themselves. When you read a Grokipedia article about a subject or a person that you know a lot about and you see, wow, this is way better than Wikipedia, it’s more comprehensive, it’s way more accurate, it’s neutral instead of biased, then you’re gonna forward those links around and say that this is actually the better source.

Speaker: 5
17:06

Like, Grokipedia will succeed, I think, very well because it is fundamentally a superior product to Wikipedia. It is a better source of information. And we haven’t even added images and video yet. So we’re gonna add, yeah, we’re gonna add a lot of video.

Speaker: 5
17:27

So, using Grok Imagine to create videos. And so if you’re trying to explain something, Grok Imagine can take the text from Grokipedia and then generate a video, an explanatory video. So if you’re trying to understand anything from how to tie a bow tie to, you know, how some chemical reactions work or, you know, really anything, dietary things, medical things, well, you can just go on and see the video of how it works.

Speaker: 5
18:04

That’s created by AI.

Speaker: 2
18:05

When when you have this version that’s maximally truth seeking as a model, do you think that there needs to be a better eval or a benchmark that people can point to that shows how off of the truth things are? So that if you’re gonna start a training run with common crawl or if you’re gonna use Reddit or if you’re gonna use meh

Speaker: 3
18:23

Yeah.

Speaker: 2
18:23

Is it important to be able to, like, say, hey. Hold on a second. This eval just suck like, you guys suck on this eval. Like, it’s just this is crappy data.

Speaker: 5
18:34

Yeah. I guess, I’m not, I think, I mean, there are a lot of evals out there. I’ve complete confidence that Grokipedia is gonna succeed, because Wikipedia is actually not a very good product. Yeah. The information is sparse, wrong, and out of date. And if you find, and it doesn’t have, you know, there are very few images. There’s basically no video.

Speaker: 5
19:01

So if you have something which is, you know, accurate, comprehensive, has videos, where moreover, if there’s any part of it that you’re curious about, you can just highlight it and ask Grok right there. Like, if you’re trying to learn something, it’s just great. It’s not gonna be a little bit better than Wikipedia.

Speaker: 5
19:27

It’s gonna be a 100 times better than

Speaker: 1
19:29

what we do. Elon, do you think you’ll see, like, good uniform usage? Like, if you look back on the last three years since you bought Twitter, there was a lot of people after you bought Twitter that said, I’m leaving Twitter. Elon’s bought it. I’m gonna go to this other wherever the hell they went. And there’s all these news and there’s all these and there’s all these articles that follow-up.

Speaker: 1
19:52

Happened to that

Speaker: 5
19:53

black creature. You know?

Speaker: 1
19:54

Yeah. But but Blue ai

Speaker: 0
19:55

is falling is my favorite.

Speaker: 1
19:56

I guess my question is, as you destroy the woke mind virus’s control of the system, and as you bring truth to the system, whether the system is through Grokipedia or through X, do people, like, just look for confirmation bias and actually not accept the truth? Like, what do you, like, or do you think people are actually going to see the truth and then they’re gonna change? Yeah.

Speaker: 1
20:23

But, I mean, is that, like... These are Sydney Sweeney’s boobs. Look great.

Speaker: 5
20:27

Let me see ai

Speaker: 0
20:30

Looking good. Yeah. Solid

Speaker: 5
20:31

solid week up there. Yeah. Putting a

Speaker: 3
20:33

little a

Speaker: 5
20:33

little sheer. Yeah.

Speaker: 3
20:34

Yeah. I

Speaker: 1
20:36

think we just got flagged on YouTube. Yeah.

Speaker: 3
20:39

We did.

Speaker: 1
20:39

We that that was definitely gonna give us a sense of urgency. Yeah.

Speaker: 5
20:43

Grade a moves.

Speaker: 1
20:44

Yeah. No. But but, like, like, but but do people change their mind?

Speaker: 5
20:47

I mean If there’s sai Ai could say there’s no such thing as a grade a move.

Speaker: 0
20:55

God. We’re off the rails already. David, you were trying to ask a serious question. Go ahead.

Speaker: 1
20:59

Well, I just wanna know if people change their mind. Like, can you actually change people’s minds by putting the truth in front of them? Or do people just, you know, kind of ignore the truth because they feel like they’re in some sort of camp, and they’re like, I’m on this side.

Speaker: 2
21:12

They want the confirmation bias.

Speaker: 1
21:13

They want the confirmation bias, and they wanna stay in a camp, and they wanna be tribal about everything.

Speaker: 5
21:18

It is remarkable how much people believe things simply because it is the belief of their in-group, you know, whatever their sort of political or ideological tribe is. So, I mean, there’s some pretty hilarious videos, you know, there was, like, some guy going around, is Elon a racist Nazi or whatever.

Speaker: 5
21:44

And then he was, like, trying to show them the videos of the thing that they are talking about, where he is in fact condemning the Nazis in the strongest possible terms and condemning racism in the strongest possible terms. And they literally don’t even wanna watch the videos.

Speaker: 5
22:00

So so, yeah, the people or at least some people would would they would prefer, They they they will stick to whatever their ideological views are, whatever their sort of political tribal views are, no matter what. The the the evidence could be staring them in the face, and and they’re just gonna be a flat earther. You know?

Speaker: 5
22:23

There is no evidence that you could show to a flat earther to convince them the world is round, because everything is just a conspiracy. The world is flat type of thing. I think

Speaker: 0
22:32

the ability to hit @grok in a reply and ask it a question in the thread has really become like a truth-seeking missile on the platform. So when I put up metrics or something like that, I reply to myself and I say, hey Grok, is the information I just shared correct, and can you find any better information?

Speaker: 0
22:52

And please tell me if my argument is correct or if I’m wrong. And then it goes through, and then it DM’s Sachs, and then Sachs gets in my replies and tries to correct me. No. But it does actually a really good job of, like and that combined with community notes, now you’ve got, like, two swings at bat.

Speaker: 0
23:07

The community’s consensus view and then Grok coming in. I think it would be, like, really interesting if Grok on, like, really powerful threads kinda did, like, its own version of community notes and had it sitting there ahead of time. You know? Like, you could look at the thread and it just had next to it, you know?

Speaker: 0
23:24

Or maybe on, like, the specific statistic, you could click on it and it would show you, like, ah, here’s where that statistic’s from.

Speaker: 5
23:30

I mean, you can, I mean, pretty much every, I mean, essentially, every post on X, unless it’s, like, advertising something, has the Grok symbol on it.

Speaker: 0
23:39

Yeah.

Speaker: 5
23:39

And you just tap that symbol, and you’re one tap away from a Grok analysis. Literally, just one tap. And we don’t wanna clutter the interface with... Sure.

Speaker: 3
23:47

Yeah.

Speaker: 5
23:47

Finding an explanation. But I’m just saying, if you go on X right now, it’s one tap to get analysis. And Grok will research the X post and give you an accurate answer. And you can even ask it to do further research and further due diligence, and you can go as far down the rabbit hole as you wanna go.

Speaker: 5
24:08

But I do think, like, this is consistent with, we want X to be the best source of truth on the planet by far, and I think it is. And where you hear, you know, any and all points of view, but where those points of view are corrected by human editors with community notes.

Speaker: 5
24:27

And I think the essence of community notes is that people who historically disagree agree that this community note is correct. And all of the community notes code is open source, and the data is open source. So you can recreate any community note from scratch, independently.
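
The property described here, that a note only shows when raters who normally disagree both find it helpful, is the core idea behind the open-source Community Notes ranking. The real algorithm works over the full rating history (matrix factorization); the toy check below only illustrates the idea, assuming raters are already labeled with a viewpoint camp purely for illustration:

def note_shows(ratings: list[tuple[str, bool]], threshold: float = 0.6) -> bool:
    """ratings: (camp, rated_helpful) pairs, e.g. camp in {"left", "right"}.
    The note is shown only if raters from BOTH camps independently rate it helpful."""
    for camp in ("left", "right"):
        camp_votes = [helpful for c, helpful in ratings if c == camp]
        if not camp_votes:
            return False  # no signal from one side means no cross-camp consensus
        if sum(camp_votes) / len(camp_votes) < threshold:
            return False
    return True

# Example: shown, because raters from both camps mostly rate it helpful.
# note_shows([("left", True), ("left", False), ("left", True), ("right", True), ("right", True)])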

Speaker: 4
24:49

By and large, it’s worked very well. Yeah.

Speaker: 0
24:51

Yeah.

Speaker: 4
24:52

I think we originally had the idea to have you back on the pod because it was the three year anniversary of the Twitter acquisition. So

Speaker: 5
24:59

Okay.

Speaker: 4
24:59

I just wanted to kinda reminisce a little bit. And I remember yeah. I mean, I remember

Speaker: 5
25:04

Where’s that sink?

Speaker: 4
25:05

Where’s that sink? Well, yeah. So Elon was staying at my house. We had talked the week before, and he told me the deal was gonna close. And so I was like, hey. Do you need a place to stay? And he took me up on it. And the day before he went to the Twitter office, there was a request made to my staff, do you happen to have an extra sink?

Speaker: 4
25:23

And they did not, but they were able to Yeah.

Speaker: 5
25:26

Who who has an extra sink, really?

Speaker: 4
25:29

But they were able to to locate one at a nearby hardware store, and I think they paid extra to get it out of the window or something.

Speaker: 3
25:36

Gotcha.

Speaker: 5
25:37

Well, I think the store was confused because my security team was asking for any kind of sink. And, like, normally, people wouldn’t ask for any kind of sink. You need a sink that fits in your bathroom or connects to a certain kind of plumbing. So the guy asked, like, well, what kind do you want? And it’s no. No.

Speaker: 5
25:56

I just wanted

Speaker: 1
25:57

a sink. Yeah. They think

Speaker: 0
25:58

it’s a mental person going on. I think

Speaker: 5
26:01

the store was confused that we just wanted a sink. Yes. And didn’t care what the sink connected to.

Speaker: 4
26:09

That was that was a relief.

Speaker: 3
26:10

They were

Speaker: 5
26:10

just like they they were, like, almost not letting us buy the sink because because they they thought maybe we’d buy the wrong sink. You know? It’s just rare that somebody wants a sink for a specific sai.

Speaker: 4
26:23

For meme purposes.

Speaker: 0
26:25

One of my favorite memories was Elon said, hey. You know, swing by. Check it out. And I said, okay. I’ll come by. And I drive up there, and I’m looking where to park the car, and I realized there’s just parking spaces around the entire building. And I’m like, okay. This can’t be, like, legal parking, but I park, and it’s legal parking.

Speaker: 5
26:41

Yeah. I mean, you’re in downtown SF, so you might get your window broken. Yeah.

Speaker: 0
26:45

I might not be there when I get back. But we get in there, and the place is empty. And then Yeah. Yeah. It it was seriously empty except the cafeteria.

Speaker: 5
26:55

There was an entire, there were two, the Twitter headquarters was two buildings. One of the buildings was completely and utterly empty, and the other building had, like, 5% occupancy.

Speaker: 0
27:06

And in the 5% occupancy, we go to the cafeteria, we all go get something to eat, and we’re like, there’s more people working in the cafeteria than at Twitter.

Speaker: 5
27:15

There there there were more people making the food than eating the food. Correct. In this giant you know, this giant really nice really nice cafeteria. The you know, this this is where we discovered that the the actual price of of the lunch was $400.

Speaker: 0
27:32

Absolutely.

Speaker: 5
27:34

The original price was $20, but it was at 5% occupancy, so it was 20 times higher. And they still kept making the same amount, pretty much, and charging the same amount. So effectively, lunch was $400. And, That’s

Speaker: 0
27:49

a great meeting.

Speaker: 5
27:51

Yes. And and then and then there was that that that, where we had the initial meetings sort of the sort of trying to figure out what the heck’s going on meetings in the in in these in in the because, you know, there’s the two buildings, the two Twitter Twitter buildings, and one the one with literally no one in it.

Speaker: 5
28:07

That’s where we had the initial meetings. And then we’d be drawing on the whiteboard, and the markers had gone dry. So nobody had used the whiteboard markers in, like, two years.

Speaker: 2
28:26

So sad.

Speaker: 5
28:27

None of the markers worked. So we’re like, this is totally bizarre, but it was totally clean because the cleaning crew had come in and done their job and cleaned an already clean place for, I don’t know, two, three years straight.

Speaker: 0
28:42

It was fun.

Speaker: 3
28:43

And and then and then

Speaker: 5
28:45

I mean, honestly, this is more crazy than any sort of Mike Judge movie or, you know, Silicon Valley or anything like that. And then, I remember going into the men’s bathroom, and there’s a table with, you know, what? Menstrual hygiene products. Yep. Yeah.

Speaker: 5
29:10

Refreshed every week. Tampons. Like, a fresh box of tampons. And we’re like, but there’s literally no one in this building. So, but nope. No.

Speaker: 5
29:22

The order to send fresh tampons to the men’s bathroom in the empty building had not been turned off. No. So every week, they would put a fresh box of tampons in an empty building for years. This happened for years. And it must have been very confusing to the people that were being asked to do this because they’re like Okay.

Speaker: 1
29:46

Okay. I’ll throw them away.

Speaker: 3
29:48

Well, I remember when you

Speaker: 5
29:50

I guess they’re paying us, so we’ll just put tampons. So seriously, I have to consider the string of possibilities necessary in order for anyone to possibly use that tampon in the men’s bathroom at the unoccupied second building of Twitter headquarters, because you’d have to be a burglar, who is a trans man burglar, who’s unwilling to use the women’s bathroom that also has tampons.

Speaker: 1
30:18

Statistically, you’re probably in the building.

Speaker: 5
30:21

So you broke into the building. And at at that moment, you have a period. Yes. And

Speaker: 3
30:28

you’re on your period.

Speaker: 5
30:30

I mean, we are more likely to be struck by a meteor than need that tampon. Okay?

Speaker: 1
30:37

Well, I remember it

Speaker: 5
30:38

was I

Speaker: 4
30:40

think it was shortly after that, you discovered an entire room

Speaker: 5
30:45

Yeah.

Speaker: 4
30:45

At the office that was filled with stay woke T shirts.

Speaker: 5
30:49

Yeah. Do you remember this? An an entire pile of merch.

Speaker: 4
30:52

Meh.

Speaker: 5
30:52

Hashtag stay woke.

Speaker: 4
30:54

Stay woke.

Speaker: 5
30:54

And also a big sort of button, like those magnetic buttons that you put on your shirt, that said, I am an engineer. I’m like, look, if you’re an engineer, you don’t need a button, like a big

Speaker: 1
31:09

Who’s the button for? Who are you telling that to?

Speaker: 0
31:12

You could just ship code. We would know. We could check your GitHub.

Speaker: 5
31:17

But, yeah, they’re they’re like scarves, hoodies, all kinds of merch that said hashtag stay woke.

Speaker: 0
31:24

Yeah. A couple of music groups.

Speaker: 4
31:25

That, I was like, my god, man. The barbarians are fully within the gates now.

Speaker: 3
31:29

I mean

Speaker: 5
31:30

The the the barbarian have smashed through the gates and are looting the merch.

Speaker: 4
31:34

Yes. You are rummaging through their holy relics and defiling them.

Speaker: 0
31:39

I mean, but when you think about it, David, the amount of waste that we saw there during those first thirty days, just to be serious about it for a second, this was a publicly traded company.

Speaker: 1
31:50

Right.

Speaker: 3
31:50

So

Speaker: 0
31:50

if you think about the fiduciary duty of those individuals, there was a list of SaaS software we went through, and none of it was being used. Some of it had never been installed and they had been paying for it for two years. They’d been paying for a SaaS product for two years. And the one that blew my mind the most that we canceled was they were paying a certain amount of money per desk to have desk seating software in an office where nobody came to work.

Speaker: 0
32:17

So

Speaker: 5
32:17

they were

Speaker: 0
32:18

paying to rob nobody.

Speaker: 5
32:20

There was millions of dollars here being paid for, yes, but for analysis of pedestrian, like, software that uses cameras to analyze the pedestrian traffic to figure out where you can alleviate pedestrian traffic jams, in an empty building.

Speaker: 0
32:36

Right.

Speaker: 5
32:38

That’s like 11 out of 10 on a Dilbert scale.

Speaker: 0
32:41

Yeah. It was pretty shout out Scott Adams.

Speaker: 5
32:43

You’ve gone off scale, on on your Dilbert level at that point.

Speaker: 4
32:48

Let’s talk about the free speech aspect for a second, because I think that is the most important legacy of the Twitter acquisition. And I think people have short memories and they forget how bad things were three years ago. First of all, you had figures as diverse as President Trump, Jordan Peterson, Jay Bhattacharya, Andrew Tate. They were all banned from Twitter.

Speaker: 4
33:10

And I remember when you opened up the Twitter jails and reinstated their accounts, kinda you know, freed all the bad boys of free speech.

Speaker: 5
33:18

Stormed the Bastille.

Speaker: 4
33:19

Yes. So you basically gave all the bad boys of free speech their accounts back. But second, beyond just the bannings, there was the shadow bannings. And Twitter had claimed for years that they were not shadow banning. This was said to be a paranoid conservative conspiracy theory. Yeah.

Speaker: 5
33:36

There was very aggressive shadow banning by what was called the trust and safety group, which, of course, naturally would be the one that is doing the nefarious shadow banning. And I just think we shouldn’t have a group called trust and safety.

Speaker: 5
33:53

I mean, this is an Orwellian name if there ever was one. Like, hi, I’m from the trust department. Oh, really?

Speaker: 3
34:03

We wanna talk to you about your

Speaker: 1
34:05

tweets. Okay. Can we see your DMs?

Speaker: 5
34:07

So you’re from the trust department? It’s literally, that’s the Ministry of Truth right there. Yeah. And it was just like the Ministry of Truth.

Speaker: 4
34:16

They they had maintained for years that they were not engaged in this practice, including under oath. And on the heels Yeah. Of you opening that up and exposing that because, by the way, it wasn’t just the fact they were doing it. They created an elaborate set of tools to do this.

Speaker: 4
34:29

They had checkboxes in

Speaker: 3
34:31

the app.

Speaker: 5
34:31

Set of tools to to, yes, to deboost accounts. Yes.

Speaker: 4
34:36

Yes. And, you know, subsequently, we found out that other social networking properties have done this as well, but you were really

Speaker: 5
34:43

forced to

Speaker: 4
34:43

expose it.

Speaker: 5
34:44

This is still being done at the other social media companies. It’s Google, by the way. So, for, you know, I don’t wanna pick on Google because they’re all doing it. But for search results, if you simply push a result pretty far down the page or, you know, to the second page of results, like, you know, the joke used to be, or still is, I think, like, where do you hide a dead, what’s the best place to hide a dead body?

Speaker: 5
35:10

The second page of Google search results, because nobody ever goes to the second page of Google search results. So you could hide a dead body there, nobody would find it. And you still, it’s not like you’ve made them go away. You’ve just put them on page two.

Speaker: 4
35:26

Yes. So shadow banning, I think, was number two. So first was banning. Second was shadow banning. I think third to me was government collusion, government interference. So you released the Twitter files. Nothing like that had ever been done before where you just, you actually let investigative reporters go through Twitter’s emails

Speaker: 5
35:43

Unfitter sai groups.

Speaker: 3
35:44

I

Speaker: 5
35:44

didn’t, I was not looking over their shoulder at all. They just had direct access to everything.

Speaker: 4
35:50

And they found that there was extensive collusion between the FBI and the Twitter trust and safety group where it turns out the FBI had 80 agents submitting takedown requests, and they were very involved in the banning, the shadow banning, the censorship, which I don’t think we ever had definitive evidence of that before. That was pretty extraordinary.

Speaker: 5
36:10

Yeah. And the US House of Representatives had hearings on the matter, and a lot of this, you know, was unearthed. It’s public record. So some people on the left still think this is, like, made up. I’m like, this is just literally, the Twitter files are literally the files at Twitter.

Speaker: 5
36:29

I mean, we’re literally just talking about, these are the emails that were sent internally that confirm this. This is what’s on the Slack channels, and this is what is shown in the Twitter database as to where people have made either suspensions or shadow bans.

Speaker: 0
36:45

Has the government come and asked you to take stuff down since, or they just have to your the policy is, hey. Listen. You gotta file a warrant. You gotta you you gotta come correct as opposed to just putting pressure on executives.

Speaker: 5
36:58

Yeah. Our policy at this point is to follow the law. Now, the laws are obviously different in different countries. So, like, you know, I get criticized for, like, why don’t I push free speech in X, Y, Z country that doesn’t have free speech laws? I’m like, because that’s not the law there. Yeah. And if we don’t obey the law, we’ll simply be blocked in that country.

Speaker: 5
37:21

So, the policy is really just adhere to the laws in any given country. It is not up to us to agree or disagree with those laws. And if the people of that country want laws to be different, then they should, you know, ask their leaders to change the laws. Yeah.

Speaker: 5
37:43

But but anything that but as as soon as you start going beyond the law, now you’re putting your thumb on the scale. So so the, yeah, the the the I I think I think that’s the right policy is just adhere to the laws within any given country. Now sometimes we get, you know, in a bit of a bind like we had got into with Brazil where, you know, this this this judge in Brazil was asking us to or or telling us to break the law in Brazil and ban accounts contrary to the law of Brazil.

Speaker: 5
38:16

And now we’re sort of somewhat stuck. We’re like, wait a second. We’re reading the law, and it says this is not allowed to happen, and also they’re giving us a gag order. So, like, we’re not allowed to say it’s happening, and we have to break the law, and the judge is telling us to break the law.

Speaker: 5
38:34

The law is breaking the law. That’s where things get, very difficult. And we were actually banned in Brazil for a while because of that.

Speaker: 4
38:41

I just wanna make one final point on the free speech issue, and then we can move on. It’s just I think people forget that the censorship wasn’t just about COVID. There was a growing number of categories of thought and opinion that were being outlawed. The, quote, content moderation, which is another Orwellian euphemism for censorship

Speaker: 5
39:00

Yes.

Speaker: 4
39:00

Was being applied to categories like gender and even climate change. The definition of hate speech was constantly growing.

Speaker: 5
39:08

Yes.

Speaker: 4
39:09

And more and more people were being banned or shadow banned, and there was more and more things that you couldn’t say. This trend of censorship was growing, it was galloping, and it would have continued if it wasn’t, I think, for the fact that you decided to buy Twitter and opened it up.

Speaker: 4
39:23

And it was only on the heels of that that the other social networks were willing to, I think, be a little bit chastened in their policies and start to push back

Speaker: 5
39:32

more. Yeah. That’s right. When once Twitter broke ranks, the others, had to, it became very obvious what the others were doing. And so they had to mitigate, their their censorship substantially as because of what Twitter did. And I mean, firstly, to give them some credit, they also felt that they had the air cover to, to be more inclined towards free speech.

Speaker: 5
39:55

They still do a a lot of sort of, you know, shadow banning and and whatnot at at the other social media companies, but it’s it’s much less than it used to

Speaker: 4
40:05

be. Meh.

Speaker: 1
40:06

Elon, what do you what have you seen in terms of, like, governments creating new laws? So we’ve seen a lot of this crackdown in The UK on what’s being called hateful speech on social media and folks getting arrested and actually going to prison over it. And it seems like when there’s more freedom, the side that is threatened by that comes out and creates their own counter. Right?

Speaker: 1
40:30

There’s a reaction to that, and there seems to be reaction. Are you seeing more of these laws around the world in response to your opening up free speech through Twitter and those changes and what they’re enabling, that the governments and the parties that control those governments aren’t liking, and they’re stepping in and saying, let’s create new ways of maintaining our control through law.

Speaker: 5
40:52

Yeah. There there is there’s been an an overall global movement to suppress free speech, under the name of in in in the under the guise of suppressing hate speech. But then, you know, the the it’s the the the problem with with that is that, your freedom of speech only matters, if people are allowed to say things that you’d that that you don’t like or even that things that you hate.

Speaker: 5
41:18

Because, if you’re allowed to suppress speech that you don’t like, then, you know, you don’t have freedom of speech. And it’s only a matter of time before things switch around, and then the shoe’s on the other foot, and they will suppress you. So, suppress not lest you be suppressed.

Speaker: 5
41:37

But there is a movement, there was a very strong movement to put speech suppression into the law throughout the world, including the Western world, you know, Europe and Australia.

Speaker: 0
41:54

UK and Germany were very, yeah, aggressive in this regard.

Speaker: 5
41:59

Yes. And my understanding is that in The UK, there’s something like two or three thousand people in prison for social media posts. And in fact, there’s so many people that were in prison for social media posts. And many of these things are, like, you can’t believe that someone would actually be put in prison for this.

Speaker: 5
42:18

They they they have in in a lot of cases released people who have committed violent crimes in order to to imprison people who have simply made posts on social media, which is deeply wrong.

Speaker: 3
42:29

Mhmm.

Speaker: 5
42:30

And it underscores why the founders of this country made the first amendment, the first amendment was freedom of speech. Why did they do that? It’s because in the places that they came from, there wasn’t freedom of speech, and you could be imprisoned or killed for saying things.

Speaker: 5
42:48

Can I

Speaker: 2
42:49

ask you a question just to maybe move to a different topic? If you came and did this next week, we will be past the Tesla board vote. We talked about it last week, and we talked about how crazy ISS and Glass Lewis is. And

Speaker: 5
43:01

Right.

Speaker: 2
43:01

We use this one insane example where, like, Ira Ehrenpreis didn’t get the recommendation from ISS and Glass Lewis because he didn’t meet the gender requirements, but then Kathleen

Speaker: 5
43:13

also did It it it doesn’t make any sense.

Speaker: 2
43:15

Can you so the the board vote is on

Speaker: 5
43:18

the sixth. So. She was an African American woman. Yeah. Yeah. They recommended against her, but then also recommended against Ira Ehrenpreis on the grounds he was insufficiently diverse. So I’m like, these things don’t make any sense. Yeah. So I do think we’ve got a fundamental issue with corporate governance in publicly traded companies, where you’ve got about half of the stock market controlled by passive index funds.

Speaker: 5
43:44

And most of them outsource their decision to advisory firms, particularly Glass Lewis and ISS. I call them corporate ISIS. You know, so it’s like, all they do is basically, they’re just terrorists. So, and they own no stock in any of these companies.

Speaker: 0
44:09

Right.

Speaker: 5
44:10

So I think that there’s a fundamental breakdown of fiduciary responsibility here, where really, you know, any company that’s managing, even though they’re passively managing, you know, index funds or whatever, they do at the end of the day have a fiduciary duty to vote, you know, along the lines of what would maximize the shareholder returns, because people are counting on them. Like, people, you know, have all their savings in a 401(k) or something like that.

Speaker: 5
44:43

And they’re they’re counting on, the index funds to vote, do company votes in the direction that would, ensure that their retirement savings, do as well as possible. But the problem is if that is then outsourced to ISS and Glass Lewis, which have been infiltrated by far left activists, because, you know, we’re you know, we’re far like that.

Speaker: 5
45:06

You know where basically political activists go. They go where the power is. And so, effectively, Glass Lewis and ISS control the vote of half the stock market. Now if you’re a political activist, you know what a great place would be to go work? ISS and Glass Lewis, and they do.

Speaker: 5
45:29

So my concern for the future, because the, you know, the Tesla thing is, it’s called sort of compensation, but really, it’s not about compensation. It’s not like I’m gonna go out and buy, you know, a yacht with it or something. It’s just that, in order, if I’m going to build up Optimus and, you know, have all these robots out there, I need to make sure we do not have a Terminator scenario, and that I can, you know, maximize the safety of the robots.

Speaker: 5
45:58

But I feel like I need to have something like a 25% vote, which is enough of a vote to have a strong influence, but not so much of a vote that I can’t be fired if I go insane. So it’s kinda, but my concern would be, you know, creating this army of robots and then being fired for political reasons, because of ISS and Glass Lewis, you know, ISS and Glass Lewis fire me effectively, or the activists at those firms fire me, even though I’ve done everything right.

Speaker: 5
46:42

Yeah. That’s my concern. Yeah. And then I and then then then you’ve got and then I and then I cannot assure the the safety of the robots.

Speaker: 0
46:49

If you don’t get that vote, if it doesn’t go your way, it looks like it’s going to, would you leave? I mean, is that even in the cards? I heard they were the board was very concerned about that.

Speaker: 5
47:01

Well, let’s just say I’m not gonna build a robot army if I can be easily kicked out by activist investors. Yeah. No way.

Speaker: 0
47:09

No way. Yeah. Makes sense. I mean, yeah. And who is capable of running the four or five major product lines at Tesla? I mean, this is the madness of it. It’s a very complex business. People don’t understand what’s under the hood there. It’s not just a car company. You got batteries, you got trucks, you got the self-driving group, and this is a very complex business that you’ve built over decades now.

Speaker: 0
47:34

It’s it’s not a very simple thing to run. I don’t think there’s a Elon equivalent out there who can just jump into the cockpit.

Speaker: 2
47:40

By the way, if we take a full turn around corporate governance corner also this week, what was interesting about the OpenAI restructuring was I read the letter and Yeah. Your lawsuit was excluded from the allowances of the California attorney general basically saying this thing can go through, which means that your lawsuit is still out there. Right?

Speaker: 2
48:03

And I think it’s gonna go to a jury trial. Yes. So there, that corporate governance thing is still very much in question. Do you have any thoughts on that?

Speaker: 5
48:11

Yes. I believe that will go to a jury trial in February or March, and then we’ll see what the results are there. But there’s a mountain of evidence that shows that OpenAI was created as an open source nonprofit.

Speaker: 5
48:30

That’s literally the exact description in the incorporation documents. And in fact, the incorporation documents explicitly say that no officer or founding member will benefit financially from OpenAI. And they have completely violated that.

Speaker: 5
48:48

And moreover, you can just use the Wayback Machine and look at the website of OpenAI. Again, open source nonprofit, open source nonprofit the whole way, until it looked like, wow, there’s a lot of money to be gained here.

Speaker: 5
49:02

And then suddenly it starts changing. And they tried to change the definition of open to mean open to everyone instead of open source, even though it always meant open source. Right. I came up with the name. Yeah. That’s how I know.

Speaker: 0
49:16

So if they open sourced it, or they gave you, I mean, you don’t need the money, but if they gave you the percentage ownership in it that you would rightfully have, which, for $50 million into a startup, would be half at least. But they must have made an overture towards you and said, hey, can we just give you 10% of this thing and give us your blessing?

Speaker: 0
49:38

Like, you obviously have a different goal here. Yeah?

Speaker: 5
49:42

Yeah. I mean, essentially, since I came up with the idea for the company, named it, provided the A, B, and C rounds of funding, recruited the critical personnel, and told them everything I know. If that had been a commercial corporation, I’d probably own half the company. And I could have chosen to do that.

Speaker: 5
50:09

It was totally at my discretion. I could have done that. But I created it as a nonprofit for the world, an open source nonprofit for the world.

Speaker: 2
50:19

Do you think the right thing to do is to take those models and just open source them today? If you could effect that change, is that the right thing to do?

Speaker: 5
50:28

Yeah. I think that is what it was created to do, so it should. I mean, the best open source models right now, actually, ironically, because fate seems to be an irony maximizer, the best open source models are generally from China.

Speaker: 0
50:43

Yeah.

Speaker: 5
50:45

Like, that’s bizarre. And then I think the second best one, or maybe it’s better than second best, but the Grok 2.5 open source model is actually very good. And we’ll continue to open source our models.

Speaker: 5
51:06

But, you know, try using any of the recent, so-called OpenAI open source models they put out. They don’t work. They basically open sourced a broken, nonworking version of their models as a fig leaf. I mean, do you know anyone who’s using OpenAI’s open source models? Exactly.

Speaker: 0
51:27

Yeah. Nobody. We’ve had a big debate about jobs here. Obviously, there’s gonna be job displacement. You and I have talked about it for decades. What’s your take on the pace of it? Because, obviously, you’re building self-driving software. You’re building Optimus.

Speaker: 5
51:46

Yeah.

Speaker: 0
51:46

And we’re seeing Amazon take some steps here where they’re like, yeah, we’re probably not gonna hire for these positions in the future. And maybe they’re getting rid of people now because they were bloated, but maybe some of it’s AI. It’s all debatable.

Speaker: 0
51:59

What do you think the pace is? And what do you think, as a society, we’re gonna need to do to mitigate it if it goes too fast?

Speaker: 5
52:09

Well, you know, I call AI the supersonic tsunami. So, not the most comforting description in the world. But it’s

Speaker: 0
52:21

Fast and big?

Speaker: 5
52:22

If there was a tsunami, a giant wall of water moving faster than the speed of sound, that’s AI.

Speaker: 0
52:28

When does it land?

Speaker: 5
52:30

Yeah. Exactly. So this is happening whether I want it to or not. I actually tried to slow down AI from

Speaker: 1
52:38

that point.

Speaker: 5
52:40

And then the reason I wanted to create OpenAI was to serve as a counterweight to Google, because at the time Google essentially had unilateral power in AI, basically all the AI, essentially. And, you know, Larry Page was not taking AI safety seriously. I don’t know. Jason, I’m not sure.

Speaker: 5
53:12

Were you there when he called me a speciesist?

Speaker: 0
53:14

Yes. I was there. Yeah.

Speaker: 5
53:17

K. So

Speaker: 0
53:18

You were more concerned about the human race than you were about the machines. And, yeah, you had a clear bias for humanity.

Speaker: 5
53:25

Yes, exactly. I was like, Larry, we need to make sure that the AI doesn’t destroy all the humans. And then he called me a speciesist, like a racist or something, for being pro human intelligence instead of machine intelligence. I’m like, well, Larry, what side are you on?

Speaker: 5
53:41

I mean, you know, that’s kind of a concern. And at the time, Google had essentially a monopoly on AI.

Speaker: 0
53:52

Yeah. They bought DeepMind, which you were on the board of and had an investment in. Larry and Sergey had invested as well, and it’s really interesting.

Speaker: 5
53:59

They found out about it because I told them about it. I showed him some stuff from DeepMind, and I think that’s how they found out about it and bought them, actually. I gotta be careful what I say. But the point is, it’s like, look, Larry was not taking AI safety seriously.

Speaker: 5
54:17

And Google had essentially all the AI and all the computers and all the money, and I’m like, this is a unipolar world where the guy in charge is not taking things seriously, and called me a speciesist for being pro-human. What do you do in those circumstances? You know?

Speaker: 0
54:33

Build a competitor.

Speaker: 5
54:34

Yes. So OpenAI was created essentially as the opposite, which is an open source nonprofit, the opposite of Google. Now, unfortunately, it needs to change its name to Closed for Maximum Profit AI. Yeah. For maximum profit, to be clear. They’re going for the most amount of profit possible. I mean, it’s comical.

Speaker: 3
55:00

I

Speaker: 0
55:00

mean, when you hear some

Speaker: 5
55:03

irony maximizer.

Speaker: 3
55:04

You have

Speaker: 5
55:04

to say, like, what is the most ironic outcome for a company that was created to do open source, nonprofit AI? It’s super closed source. The OpenAI source code is locked up like it’s in Fort Knox, and they are going

Speaker: 0
55:25

for maximum profit. Like, I’m

Speaker: 5
55:28

like, get the bourbon and the steak knife, you know. They’re going for the buffet, and they’re just diving head first into the profit buffet. I mean, or at least aspirationally, the revenue buffet at least, profit we’ll see. It’s like ravenous wolves for revenue.

Speaker: 5
55:52

The revenue buffet.

Speaker: 0
55:55

No. No. It’s literally, like, super bad. It’s like a Bond villain level flip. Like, it went from being the United Nations to being Spectre in, like, James Bond land.

Speaker: 5
56:05

Yeah.

Speaker: 0
56:06

When you hear him say... when Sam says he’s gonna, like, raise $1.4 trillion to build out data centers

Speaker: 5
56:12

Yeah. No. But I think he means it.

Speaker: 0
56:14

Yeah. I mean, I would say audacious, but I wouldn’t want to insult the word.

Speaker: 2
56:21

Actually, I have a question about this.

Speaker: 0
56:23

How is that possible? In the

Speaker: 2
56:24

earnings call, you said something that was insane, and I think the math actually nets out, but you said we could connect all the Teslas and allow them in downtime to actually offer up inference, and you can string them all together. I think the math is, like, it could actually be, like, a 100 gigawatts.

Speaker: 0
56:43

Is that right? Did you

Speaker: 5
56:44

do... If ultimately there’s a Tesla fleet that is a 100 million vehicles, which I think we probably will get to at some point, a 100 million vehicle fleet, and they have mostly state-of-the-art inference computers in them that each, say, are a kilowatt of inference compute, and have built-in power and cooling, and

Speaker: 2
57:08

Yeah. Yeah.

Speaker: 5
57:08

Connect to the Wi Fi.

Speaker: 1
57:09

That’s the key.

Speaker: 2
57:10

Yeah. Exactly.

Speaker: 5
57:12

Yeah. Exactly. And at the end, you’d have a 100 gigawatts of inference compute.
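
For context, the arithmetic behind that figure is simple to check; a minimal Python sketch using the round numbers cited in this exchange, 100 million vehicles and roughly a kilowatt of inference compute each, both of which are assumptions from the conversation rather than announced specs.

# Back-of-the-envelope fleet inference estimate (illustrative numbers from the conversation).
fleet_size = 100_000_000          # assumed eventual Tesla fleet, vehicles
compute_per_vehicle_kw = 1.0      # assumed inference compute per vehicle, kilowatts

total_gw = fleet_size * compute_per_vehicle_kw / 1_000_000   # kW -> GW
print(f"Distributed fleet inference: {total_gw:,.0f} GW")     # -> 100 GW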

Speaker: 1
57:17

Elon, do you think that the architecture... like, there is an attention-free model that came out in the last week. There’s been all of these papers, all of these new models that have been shown to reduce power per token of output by many, many orders of magnitude. Not just an order of magnitude, but maybe three or four.

Speaker: 1
57:34

What’s your view, in all the work you’ve been doing, on where we’re headed in terms of power per unit of compute, per token of output?

Speaker: 5
57:46

Well, we have a clear example of power-efficient compute, which is the human brain. Our brains use about 20 watts of power, but of all that, only about 10 watts is higher brain function. The rest of it is just housekeeping functions, keeping your heart going and breathing and that kind of thing.

Speaker: 5
58:07

So you’ve got maybe 10 watts of higher brain function in a human. And we’ve managed to build civilization with 10 watts of a biological computer. And that biological computer has, like, a twenty-year boot sequence. Yeah. So it’s pretty funny. But it’s very power efficient.

Speaker: 5
58:31

So, given that humans are capable of inventing, you know, general relativity and quantum mechanics, or rather discovering general relativity, inventing aircraft, lasers, the Internet, and discovering physics with a 10 watt meat computer, essentially, there’s clearly a massive opportunity for improving the efficiency of AI compute.

Speaker: 5
59:01

Because it’s currently many orders of magnitude away from that. And it’s still the case that a 100 megawatt, or even, you know, a gigawatt, AI supercomputer at this point can’t do everything that a human can do. It will be able to, but it can’t yet.
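
A rough sense of the gap being described here, as a sketch using the figures mentioned above, 10 watts of higher brain function against a gigawatt-class AI supercomputer; both are the round numbers from the conversation, not measurements.

# Orders of magnitude between a gigawatt datacenter and ~10 W of higher brain function (illustrative).
import math

brain_w = 10           # higher brain function, watts (figure cited above)
datacenter_w = 1e9     # gigawatt-class AI supercomputer, watts

ratio = datacenter_w / brain_w
print(f"~{ratio:.0e}x more power, about {math.log10(ratio):.0f} orders of magnitude")   # ~1e+08x, 8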

Speaker: 5
59:23

But like I said, we’ve got this obvious case of human brains being very power efficient and building civilization with 10 watts of compute. And our bandwidth is very low.

Speaker: 5
59:43

The speed at which we communicate information to each other is extremely low. We’re not communicating at a terabit. We’re communicating more like 10 bits per second. So,

Speaker: 1
59:58

I’d add that that should

Speaker: 5
59:59

naturally lead you to the conclusion that there is massive opportunity for being more power efficient with AI. And at Tesla and at xAI, we both continue to see massive improvements in inference compute efficiency. So, yeah.

Speaker: 2
01:00:16

Do you think that there’s a moment where you would justify stopping all the traditional cars and just going completely all in on Cybercab, if you felt like the learning was good enough and the system was safe enough? Is there ever a moment like that, or do you think you’ll always kind of dual track and always do both?

Speaker: 5
01:00:38

I mean, all of the cars we make right now are capable of being a robotaxi. So there’s a little confusion of the terminology, because our cars look normal. You know, a Model 3 or a Model Y, it’s a good-looking car, but it looks normal.

Speaker: 5
01:00:53

But it has an advanced AI computer and advanced AI software and cameras. And we didn’t want the cameras to stick out, we didn’t want them to be ugly or stick out, so we put them in sort of unobtrusive locations. The forward-looking cameras are in front of the rearview mirror.

Speaker: 5
01:01:11

The side view cameras are on the side repeaters. The rear camera is just above the license plate, actually, typically where the rearview camera is in a car. And the diagonal forward ones are in the B-pillars.

Speaker: 5
01:01:31

If you look closely, you can see all the cameras, but you have to look closely. We just didn’t want them to stick out like warts or something. But, actually, all the cars we make are hyper intelligent and have the cameras in the right places. They just look normal.

Speaker: 5
01:01:47

So all of the cars we make are capable of unsupervised full autonomy. Now, we have a dedicated product, which is the Cybercab, which has no steering wheel or pedals, which are obviously vestigial in an autonomous world. And we start production of the Cybercab in Q2 next year. And we’ll scale that up to quite high volume.

Speaker: 5
01:02:13

I think, ultimately, we’ll make millions of Cybercabs per year. But it is important to emphasize that all of our cars are capable of being robotic taxis.

Speaker: 0
01:02:23

The Cybercab is gorgeous. I told you I’d buy two of those if you put a steering wheel in them, and there is a big movement around them.

Speaker: 5
01:02:30

Putting a steering wheel in.

Speaker: 0
01:02:31

People are begging for it. Why not? Why not let us buy a couple, you know, just the first ones off the line, and drive them? They look great. It’s like the perfect model. You always had a vision for a Model 2, right? Isn’t it, like, the perfect Model 2 in addition to being a Cybercab?

Speaker: 5
01:02:48

Look. The reality is people may think they wanna drive their car, but the reality is that they don’t. How many times have you been, say, in an Uber or Lyft and you said, you know what, I wish I could take over from the driver. I wish I could get off my phone and take over from the Uber driver and drive to my destination.

Speaker: 5
01:03:07

How many times have you thought that to yourself?

Speaker: 0
01:03:10

No. It’s quite the opposite.

Speaker: 5
01:03:12

So, zero times. Okay.

Speaker: 0
01:03:14

I have the Model Y, I have the Juniper, and I just got the 14.1 release, and I put it on Mad Max mode the last couple of days. That is

Speaker: 5
01:03:23

Mad Max mode.

Speaker: 0
01:03:24

A unique experience.

Speaker: 1
01:03:27

I was like,

Speaker: 0
01:03:28

wait a second. This thing is driving in a very unique fashion.

Speaker: 5
01:03:32

Yeah.

Speaker: 0
01:03:33

Yeah. It

Speaker: 5
01:03:33

it assumes you wanna get to your destination in a hurry.

Speaker: 0
01:03:36

Yeah. I used to get a cab like that. It was an extra $20 to

Speaker: 5
01:03:40

do that. Missed appointment or something. I don’t know.

Speaker: 0
01:03:43

Yeah. But it feels like it’s getting very close, but you have to be very careful. You know, Uber had a horrible accident with a safety driver. Cruise had a terrible accident. It wasn’t their fault exactly, except somebody got hit, and then they hit the person a second time and they got dragged.

Speaker: 5
01:04:02

Yeah. Yeah.

Speaker: 0
01:04:03

You know, these are pretty high stakes, so you’re being extremely cautious.

Speaker: 5
01:04:07

The car is actually extremely capable right now. Yeah. But we are being extremely cautious, and we’re being paranoid about it because, to your point, even one accident would be headline news. Probably worldwide headline news.

Speaker: 0
01:04:21

Especially if it’s a Tesla. Waymo, I think, gets a bit of a pass. I think half the country, or a number of people, probably would go extra hard on you.

Speaker: 5
01:04:31

Yes. Yeah, exactly. Not everyone in the press is my friend. It’s like,

Speaker: 0
01:04:39

I hadn’t noticed. Yeah.

Speaker: 5
01:04:41

You know, some of them are a little antagonistic.

Speaker: 1
01:04:43

Yep. So you can just

Speaker: 0
01:04:46

but people are pressuring you to go fast, and I think everybody’s gotta just take their time with this thing. It’s obviously gonna happen, but I just get very nervous that the pressure to put these things on the road faster than they’re ready is a little crazy.

Speaker: 0
01:05:03

So I applaud you for putting the safety monitor in, doing the safety driver, and no shame in the safety driver game. It’s so much the right decision, obviously, but people are criticizing you for it. I think that’s dumb. It’s the right thing to do.

Speaker: 5
01:05:16

Yes. And we do expect to not have any sort of safety occupant, or, it’s hardly a driver, it’s a safety monitor. They just sit in the car and don’t do anything.

Speaker: 0
01:05:32

Safety dude.

Speaker: 5
01:05:34

Yeah. So we do expect that the cars will be driving around without any safety monitor before the end of the year. So sometime in December.

Speaker: 0
01:05:44

In Austin. Yeah. I mean, you’ve got a number of reps under your belt in Austin, and it feels like you guys have done a great job figuring out where the trouble spots are. Maybe you could talk a little bit about what you learned in the first, I don’t know, it’s been, like, three or four months of this so far.

Speaker: 0
01:06:01

What did you learn in the first three or four months of the Austin experiment?

Speaker: 5
01:06:06

Actually, it’s gone pretty smoothly. A lot of the things that we’re learning are just how to manage a fleet, because you’ve gotta write all the fleet management software. Right? So

Speaker: 0
01:06:18

Yep.

Speaker: 5
01:06:18

And you’ve gotta write the ride-hailing software. You’ve gotta write basically the software that Uber has. It’s just summoning a robot car instead of a car with a driver. So a lot of the things we’re doing, we’re scaling up the number of cars to see, like, what happens if you have a thousand cars?

Speaker: 5
01:06:36

So we think probably we’ll have a thousand cars or more in the Bay Area by the end of this year, and probably, I don’t know, 500 or more in the Greater Austin area. And you have to make sure the cars don’t all, for example, go to the same Supercharger.

Speaker: 0
01:07:00

That’s fine.

Speaker: 5
01:07:01

Right? Or don’t all go to the same intersection. It’s like, what do these cars do? And then sometimes there’s high demand and sometimes there’s low demand. What do you do during those times? Do you have a car circle the block?

Speaker: 5
01:07:18

Do you have it try to find a parking space? And then sometimes, say it’s a disabled parking space or something, but the paint has faded or the markings have faded. The car is like, oh, look, a parking space, I’ll jump right in there. It’s like, yeah.

Speaker: 0
01:07:35

Get a ticket.

Speaker: 5
01:07:36

Gotta look carefully and make sure it’s not an illegal parking space. Or it sees a space to park and it’s, like, ridiculously tight, but it’s like, I can get in there. Yeah, but with, like, three inches on either side.

Speaker: 0
01:07:54

Bad computer.

Speaker: 5
01:07:56

Yeah. That could be true. But nobody else would be able to get in the car if you do that. Yeah. So, you know, there’s just, like, all these oddball corner cases. And,

Speaker: 0
01:08:09

And regulators. Regulators all have different levels of persnicketiness and regulations depending on the city, depending on the airport. I mean, it’s just

Speaker: 5
01:08:23

Yeah.

Speaker: 0
01:08:24

You know, very different everywhere. That’s gonna just be a lot of blocking and tackling, and it just takes time.

Speaker: 1
01:08:29

Elon, let me ask

Speaker: 5
01:08:30

you another... Like, in order to take people to San Jose Airport, you actually have to connect to San Jose Airport’s servers, because you have to pay a fee every time you drop off. So the car actually has to do a remote call. The robot car has to do a remote procedure call to San Jose Airport’s servers to say, I’m dropping someone off at the airport, and charge me whatever, $5. There are all these quirky things like that.

Speaker: 5
01:09:00

Like, airports are somewhat of a racket. Yeah. So that’s, you know, we have to solve that thing. But it’s kinda funny that the robot car is, like, calling the airport server to, you know, charge its credit card or whatever, to drop someone off.

Speaker: 0
01:09:20

Or send them a fax. Yeah. We’re gonna be dropping off at this time.

Speaker: 5
01:09:23

But it will soon become extremely normal to see cars going around with no one in them. Yeah. Extremely normal.

Speaker: 1
01:09:30

Just before we lose you, I wanna, like, ask if you saw the Bill Gates memo that he put out. A lot of people are talking about this memo. Like, you know, did... I

Speaker: 5
01:09:42

guess, like, Billy G is not my lover.

Speaker: 0
01:09:48

Oh, man.

Speaker: 1
01:09:49

Like, did climate change become woke? Did it become, like, woke? And is it over being woke? Like, what happened, and what happened with Billy G? I mean, you know,

Speaker: 0
01:10:03

Tough task. Great question. Great question.

Speaker: 1
01:10:06

Yeah.

Speaker: 5
01:10:08

You know, you’d think that somebody like Bill Gates, who clearly started a technology company that’s one of the biggest companies in the world, Microsoft, you’d think he’d be really quite strong in the sciences. But actually, at least in my direct conversations with him, he is not strong in the sciences. Like, he didn’t... yeah.

Speaker: 5
01:10:37

This is really quite surprising. Like, he came to visit me at the Tesla Gigafactory in Austin and was telling me that it’s impossible to have a long-range semi truck. And I was like, well, but we literally have them, and you can drive them. And Pepsi is literally using them right now, and you can drive them yourself or send someone.

Speaker: 5
01:11:02

Obviously, Bill Gates is not gonna drive it himself, but you can send a trusted person to drive the truck and verify that it can do the things that we say it’s doing. And he’s like, no, no, it doesn’t work. It doesn’t work. And I’m like, okay. I was, like, kind of stuck there.

Speaker: 5
01:11:17

Then I was like, well, so it must be that you disagree with the watt-hours per kilogram of the battery pack, that you must think that perhaps we can’t achieve the energy density of the battery pack, or that the watt-hours per mile of the truck is too high, because when you combine those two numbers, the range is low.

Speaker: 5
01:11:40

And so which one of those numbers do you think we have wrong, and what numbers do you think are correct? And he didn’t know any of the numbers. And I’m like, well, then doesn’t it seem perhaps premature to conclude that a long-range semi cannot work if you do not know the energy density of the battery pack or the energy efficiency of the truck chassis?
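
The argument reduces to two numbers, pack energy density and energy per mile, which combine into range. A minimal sketch with purely hypothetical figures chosen only to show how they combine; they are not Tesla’s actual Semi specs.

# Range from energy density and efficiency (hypothetical numbers, for illustration only).
pack_mass_kg = 5_000            # assumed battery pack mass
pack_wh_per_kg = 180            # assumed pack-level energy density, Wh/kg
truck_kwh_per_mile = 1.7        # assumed loaded semi-truck efficiency, kWh/mile

pack_kwh = pack_mass_kg * pack_wh_per_kg / 1_000      # Wh -> kWh
range_miles = pack_kwh / truck_kwh_per_mile
print(f"Pack: {pack_kwh:.0f} kWh, range: {range_miles:.0f} miles")   # ~900 kWh, ~530 miles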

Speaker: 0
01:12:06

But, yeah, he’s now taken a one-eighty on climate. He’s saying maybe this should

Speaker: 5
01:12:12

be a top priority. Ai is gay. It

Speaker: 1
01:12:15

just yeah.

Speaker: 5
01:12:16

Why would he say climate is gay?

Speaker: 0
01:12:17

That’s wrong. It’s totally retarded.

Speaker: 5
01:12:22

You see. Well, Bill Gates said, like, climate is getting retarded. Come on.

Speaker: 0
01:12:26

Maybe he’s got some data centers he’s gotta put up. Does he have to stand up a data center for Sam Altman or something? I don’t know. What is Azure?

Speaker: 5
01:12:36

I don’t know.

Speaker: 0
01:12:39

He changed his position? I can’t figure out why.

Speaker: 5
01:12:43

I mean, the reality of the whole climate change thing is that you’ve just had sort of people who say it doesn’t exist at all, and then people on the other extreme saying, you know, we’re gonna be underwater in five years. And, obviously, neither of those two positions is true. But the reality is you can measure the carbon concentration in the atmosphere.

Speaker: 5
01:13:08

Again, you could just literally buy a CO2 monitor from Amazon. It’s like $50, and you can measure it yourself. And you can say, okay, well, look, the parts per million of CO2 in the atmosphere has been increasing steadily at two to three per year.

Speaker: 5
01:13:27

At some point, if you continue to take billions, eventually trillions, of tons of carbon from deep underground and transfer it to the atmosphere and oceans, so you transfer it from deep underground into the surface cycle, you will change the chemical constituency of the atmosphere and oceans. You just literally will.

Speaker: 5
01:13:49

Then the only question is, to what degree and over what time scale. And the reality, in my opinion, is that we’ve got at least fifty years before it’s a serious issue. I don’t think we’ve got five hundred years, but we’ve probably got, you know, fifty. It’s not five years.
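
To put the time-scale point in numbers, a sketch that simply extends the 2 to 3 ppm-per-year growth rate mentioned above; the roughly 420 ppm starting point is an assumed present-day value, and the linear trend is an assumption, not a climate model.

# Linear extrapolation of atmospheric CO2 at the cited growth rate (rough, assumes the trend holds).
start_ppm = 420              # assumed present-day concentration, parts per million
growth_ppm_per_year = 2.5    # midpoint of the 2-3 ppm/year figure above

for years in (5, 50, 500):
    print(f"In {years:>3} years: ~{start_ppm + growth_ppm_per_year * years:.0f} ppm")
# 5 yr -> ~433 ppm, 50 yr -> ~545 ppm, 500 yr -> ~1670 ppm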

Speaker: 5
01:14:10

So if you’re trying to get to the right order of magnitude of accuracy, I’d say the concern level for climate change is on the order of fifty years. It’s definitely not five, and I think it probably isn’t 500. So really, the right course of action is just the reasonable course of action, which is to lean in the direction of sustainable energy, to lean in the direction of solar, of a sort of solar-battery future.

Speaker: 5
01:14:38

And generally have the rules of the system lean in that direction. I don’t think we need massive subsidies, but then we also shouldn’t have massive subsidies for the oil and gas industry. Okay? The oil and gas industry has massive tax write-offs. They don’t even think of them as subsidies, because these things have been in place, in some cases, for eighty years. But they’re not there for other industries.

Speaker: 5
01:15:10

So when you’ve got special tax conditions that are in one industry and not another industry, I call that a subsidy. Obviously, it is. But they’ve taken it for granted for so long in oil and gas that they don’t think of it as a subsidy. So the right course of action, in my opinion, is to remove subsidies from all industries.

Speaker: 5
01:15:26

But the political reality is that the oil and gas industry is very strong in the Republican party but not in the Democratic party. So you will obviously not see even the tiniest subsidy being removed from the oil, gas, and coal industry. In fact, there were some that were added to the oil, gas, and coal industry in the sort of Big Beautiful Bill.

Speaker: 5
01:15:49

And there were a massive number of sustainable energy incentives that were removed, some of which I agreed with, by the way. Some of the incentives had gone too far. But, anyway, I think the objectively correct scientific conclusion, in my opinion, and I think we can back this up with solid reasoning.

Speaker: 5
01:16:15

Ask Grok, for example, is that we should lean in the direction of moving towards a sustainable energy future. We will eventually run out of oil, gas, and coal to burn anyway, because there’s a finite amount of that stuff, and we will eventually have to go to something that lasts a long time, that is sustainable.

Speaker: 1
01:16:41

But to your point about the irony of things, it seems to be the case that making energy with solar is cheaper than making energy with some of these carbon-based sources today. And so the irony is it’s already working. I mean, the market is moving in that direction. And this notion that we need to kind of force everyone into a model of behavior, it’s just naturally gonna change, because we’ve got better systems.

Speaker: 1
01:17:03

You know, you and others have engineered better systems that make these alternatives cheaper, and therefore, they’re winning. Like, they’re actually winning in the market, which is great. Yes. They can’t win if there are subsidies to support the old systems, obviously.

Speaker: 5
01:17:17

Yeah. I mean, by the way, there are actually massive disincentives for solar, because China is a massive producer of solar panels. China does an incredible job of solar panel manufacturing. Really incredible. They have roughly one and a half terawatts of solar production right now. And they’re only using a terawatt per year.

Speaker: 5
01:17:42

By the way, that’s a gigantic number. The average US power consumption is only half a terawatt. So just think about that for a second. China’s solar panel production max capacity is one and a half terawatts per year. US steady state power usage is half a terawatt.

Speaker: 5
01:18:05

Now, you do have to reduce that. If you do produce one and a half terawatts a year of solar, you need to pair that with batteries, take into account the differences between night and day, the fact that the solar panel is not always pointed directly at the sun, that kind of thing.

Speaker: 5
01:18:21

So you can divide by five-ish. But that still means that China has the ability to produce solar panels each year with a steady state output that is roughly two thirds that of the entire US economy from all sources, which means that, just with solar alone, China can, in eighteen months, produce enough solar panels to power the entire United States, all the electricity of The United States.
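
The two-thirds and eighteen-months claims follow from the round figures in this exchange: 1.5 TW of annual panel output, the divide-by-five derating for night and sun angle, and 0.5 TW of US average consumption. A minimal sketch of that arithmetic; the derating factor is the conversational rule of thumb above, not a measured capacity factor.

# China's yearly solar panel output vs. US average consumption (round numbers from the conversation).
china_panel_tw_per_year = 1.5    # nameplate panel production per year, terawatts
derate_factor = 5                # night, weather, sun angle: divide nameplate by ~5
us_average_tw = 0.5              # average US power consumption, terawatts

steady_tw_per_year = china_panel_tw_per_year / derate_factor    # ~0.3 TW of steady output per year of production
fraction_of_us = steady_tw_per_year / us_average_tw             # ~0.6, roughly two thirds
months_to_cover_us = 12 / fraction_of_us                        # ~20 months, on the order of the eighteen cited
print(f"{fraction_of_us:.0%} of US demand per year of production; ~{months_to_cover_us:.0f} months to cover all of it")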

Speaker: 2
01:18:48

What do you think about near-field solar, AKA nuclear?

Speaker: 3
01:18:53

I’m in

Speaker: 5
01:18:53

favor of... look, make energy any way you want that isn’t, like, obviously harmful to the environment. Generally, people don’t welcome a nuclear reactor in their backyard. They’re not, like, championing

Speaker: 2
01:19:09

Put it here.

Speaker: 1
01:19:10

Put it under my bed. Put it

Speaker: 5
01:19:13

on my roof. If your next-door neighbor said, hey, I’m selling my house and they’re putting a reactor there, what would you... you know, the typical homeowner response will be negative. Very few people will embrace a nuclear reactor adjacent to their house. But nonetheless, I do think nuclear is actually very safe.

Speaker: 5
01:19:39

There’s a lot of sort of scaremongering and propaganda around fission, if we’re talking about fission. But fission is actually very safe. The US Navy has this on submarines and aircraft carriers, with people walking right by it. I mean, a submarine’s a pretty crowded place, and they have a nuclear-powered submarine.

Speaker: 5
01:20:03

So I think fission’s fine as an option. The regulatory environment makes it very difficult to actually get that done. And then it is important to appreciate just the sheer magnitude of the power of the sun. So here are some important basic facts.

Speaker: 5
01:20:26

Even Wikipedia has these facts. Right? You know? You don’t even have to... Grokipedia has the best answers, but even Wikipedia has... Yeah.

Speaker: 0
01:20:35

Even Wikipedia got it. Right? Yes.

Speaker: 5
01:20:36

Yes. What I’m saying is, even Wikipedia’s got these facts. Right? Yeah. The sun is about 99.8% of the mass of the solar system. Everything else is in the remaining 0.2%, and we are way less than 0.1%. So if you burnt all the rest of the mass of the solar system, okay, the total energy produced by the sun would still round up to a 100%. Mhmm.

Speaker: 5
01:21:10

You could just burn Earth, the whole planet, and burn Jupiter, which is very big and quite challenging to burn, you know, turn Jupiter into a thermonuclear reactor. It wouldn’t matter compared to the sun. The sun is 99.8% of the mass of the solar system, and everything else is in the miscellaneous category.

Speaker: 5
01:21:36

So, basically, no matter what you do, the total energy produced in our solar system rounds up to 100% from the sun. You could even throw another Jupiter in there. Say we snag a Jupiter from somewhere else, and you could somehow teleport two more Jupiters into our solar system to burn them, and the sun would still round up to a 100%.

Speaker: 5
01:22:03

You know, as long as you’re at 99.6%, you’re still rounding up to a 100%. Maybe that gives some perspective on why solar is really the thing that matters. And as soon as you start thinking about things at sort of a grander scale, like Kardashev scale civilizations, it becomes very, very obvious.

Speaker: 5
01:22:25

It’s like, I’m not saying anything that’s new, by the way. Anyone who studies physics has known this for a very long time. In fact, Kardashev, I think, was a Russian physicist who came up with this idea, I think, in the sixties, just as a way to classify civilizations, where Kardashev scale one would be you’ve harnessed most of the energy of the planet.

Speaker: 5
01:22:52

Kardashev scale two, you’ve harnessed most of the energy of your sun. Kardashev three, you’ve harnessed most of the energy of your galaxy. Now, we’re only about, I don’t know, 1% or a few percent of Kardashev scale one right now, optimistically. But as soon as you go to Kardashev scale two, where you’re talking about the power of the sun, then you’re really just saying everything is solar power, and the rest is in the noise.

Speaker: 5
01:23:25

And, yeah. So, like, the sun produces about a billion times, or call it well over a billion times, more energy than everything on Earth combined.
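
For scale, the "well over a billion times" figure can be sanity-checked against two standard reference values, the Sun's total luminosity and humanity's average rate of energy use; the constants below are approximate textbook numbers, not figures from the conversation.

# Sun's total output vs. humanity's total energy use (approximate reference values).
solar_luminosity_w = 3.8e26      # total power output of the Sun, watts
world_energy_use_w = 2e13        # ~600 EJ/year of primary energy, averaged to watts

ratio = solar_luminosity_w / world_energy_use_w
print(f"Sun / humanity: ~{ratio:.0e}x")    # ~2e13x, comfortably 'well over a billion times'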

Speaker: 0
01:23:45

It’s crazy. It’s mind blowing.

Speaker: 5
01:23:48

Right.

Speaker: 0
01:23:49

Yeah. Yeah. Solar is the obvious solution to all this. And, yeah, I mean, short term

Speaker: 2
01:23:54

Yeah. We have

Speaker: 0
01:23:55

to use some of these other sources. But, hey, there it is. An hour and a half, give or take.

Speaker: 5
01:23:59

Star powered. Like, maybe we got a branding issue here.

Speaker: 0
01:24:01

Yeah. Star powered.

Speaker: 5
01:24:03

Instead of solar powered, it’s starlight.

Speaker: 0
01:24:06

Yeah. Starlight.

Speaker: 5
01:24:08

Perfect. It’s the power of a blazing sun. The blaze... how much energy does an entire star have? Yeah. Well, more than enough.

Speaker: 0
01:24:21

More than enough.

Speaker: 5
01:24:22

Alright. And also, you really need to keep the power local. Honestly, I’ve had these discussions so many times, where they say, well, would you beam the power back to Earth? I’m like, do you want to melt the Earth? Because you would melt the Earth if you did that. We’d be fried in an instant.

Speaker: 5
01:24:46

So you really need to keep the power local, basically distributed power. And I guess most of it would be used for intelligence. So in the end, the future is like a whole bunch of solar-powered AI satellites.

Speaker: 1
01:25:01

But Elon, the only thing that makes the star work is it just happens to have a lot of mass, so it has that gravity to ignite the fusion reaction. Right? But, like, we could ignite the fusion reaction on Earth now. I don’t know if your view has changed.

Speaker: 1
01:25:15

I think we talked about this a couple years ago, where you were pretty much like, we don’t know if or when fusion becomes real here. But, theoretically, we could take,

Speaker: 2
01:25:22

like, 10

Speaker: 5
01:25:23

No. I wanna be clear. My opinion on... so, yeah, I studied physics in college. At one point in high school, I was thinking about a career in physics. One of my sons actually is doing a career in physics. But the point is, I came to the conclusion that I’d be waiting for a collider or a telescope.

Speaker: 5
01:25:42

I didn’t want to wait for that collider, and I’m not in physics, but I have a strong interest in the subject. So my opinion on, say, creating a fusion reactor on Earth is, I think this is actually not a hard problem. I mean, it’s a little hard. It’s not, like, totally trivial. But if you just scale up a tokamak, the bigger you make it, the easier the problem gets.

Speaker: 5
01:26:06

You’ve got a surface-to-volume ratio thing, where you’re trying to maintain a really hot core while having a wall that doesn’t melt. That’s a similar problem to rocket engines. You’ve got a super hot core in the rocket engine, but you don’t want the chamber walls of the rocket engine to melt.

Speaker: 5
01:26:29

So you have a temperature gradient, where it’s very hot in the middle, and it gradually gets cold enough as you get to the perimeter, to the chamber walls of the rocket engine, so that it doesn’t melt, because you’ve lowered the temperature. You’ve got a temperature gradient.

Speaker: 5
01:26:48

So if you just scale up the donut reactor, the tokamak, and improve your surface-to-volume ratio, that becomes much easier. And you can absolutely, in my opinion, and I think anyone who looks at the math will agree, make a reactor that generates more energy than it consumes.

Speaker: 5
01:27:15

And the bigger you make it, the easier it is. And in the limit, you just have a giant gravitationally contained thermonuclear reactor like the sun, which requires no maintenance and is free. So this is also why... why would we bother making a little itty bitty sun that’s so microscopic you’d barely notice it on Earth when we’ve got the giant free one in the sky?
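
The surface-to-volume point is pure geometry: wall area grows with the square of the machine's size while the hot core volume grows with the cube, so the relative amount of wall you have to keep from melting shrinks as you scale up. A sketch for an idealized spherical core, meant only to show the scaling, not to model a tokamak.

# Surface-to-volume ratio of an idealized spherical core falls as 1/R (geometry only).
import math

for radius_m in (1, 2, 5, 10):
    area = 4 * math.pi * radius_m ** 2           # wall area ~ R^2
    volume = (4 / 3) * math.pi * radius_m ** 3   # hot core volume ~ R^3
    print(f"R = {radius_m:>2} m  ->  surface/volume = {area / volume:.2f} per metre")
# 3.00, 1.50, 0.60, 0.30: the bigger the machine, the less wall per unit of hot core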

Speaker: 1
01:27:41

Yeah. But we only get a fraction of 1% of that energy on the planet Earth. We have to go

Speaker: 5
01:27:47

Much less than 1%. Yeah.

Speaker: 3
01:27:48

Yeah.

Speaker: 1
01:27:49

Right. So we’ve gotta figure out how to wrap the sun if we’re gonna harness that energy. That’s our long term.

Speaker: 5
01:27:56

If people wanna have fun with reactors, you know, that’s fine. Have fun with reactors. But it’s not a serious endeavor compared to the sun. It’s sort of a fun science project to make a nuclear reactor, but it’s just peanuts compared to the sun.

Speaker: 5
01:28:14

And even the solar energy that does reach Earth is a gigawatt per square kilometer, or, call it roughly two and a half gigawatts per square mile. So that’s a lot. You know? And the commercially available panels are around 25, almost 26% efficiency.

Speaker: 5
01:28:35

And then, if you pack it densely, you get an 80% packing density, which I think in a lot of places you could get. You effectively have about 200 megawatts per square kilometer, and you need to pair that with batteries so you have continuous power.

Speaker: 5
01:29:01

Although our power usage drops considerably at night, so you need fewer batteries than you think. And

Speaker: 1
01:29:10

doesn’t the question

Speaker: 5
01:29:11

then become... A rough way to look at it, maybe, is that a gigawatt-hour per square kilometer per day is roughly the correct number.
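
Those figures chain together into the gigawatt-hour-per-square-kilometer-per-day estimate: roughly 1 GW of insolation per square kilometer, about 25% panel efficiency, about 80% packing, and an assumed five or so full-sun hours per day standing in for night, weather, and sun angle. A minimal sketch; the full-sun-hours figure is an assumption added here, not from the conversation.

# From insolation to daily energy per square kilometer (figures from the conversation plus one assumption).
insolation_gw_per_km2 = 1.0     # peak solar power reaching the ground, GW per km^2
panel_efficiency = 0.25         # ~25%, commercially available panels
packing_density = 0.80          # ~80% of the land actually covered by panels
full_sun_hours_per_day = 5      # assumed stand-in for night, clouds, and sun angle

peak_gw = insolation_gw_per_km2 * panel_efficiency * packing_density    # ~0.2 GW = 200 MW per km^2
daily_gwh = peak_gw * full_sun_hours_per_day                            # ~1 GWh per km^2 per day
print(f"~{peak_gw * 1000:.0f} MW peak and ~{daily_gwh:.0f} GWh per square kilometer per day")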

Speaker: 1
01:29:21

But then doesn’t your technical challenge become the scalability of manufacturing those systems? So, you know, accessing the raw materials and getting them out of the ground of planet Earth to make enough of them to get to that sort of scale and that volume that you’re talking about.

Speaker: 1
01:29:35

And as you kinda think about what it would take to get to that scale, like, do we have an ability to do that with what we have today? Like, can we pull that much material out of the ground?

Speaker: 5
01:29:46

Yes. Solar panels are made of silicon, which is sand, essentially. And, I

Speaker: 1
01:29:52

guess more on the battery side. But

Speaker: 5
01:29:54

Oh, the battery side. Yeah. So on the battery side, you know, like, iron phosphate lithium-ion battery cells... I’d like to throw out some interesting factoids here that most people don’t know. If you asked, as measured by mass, what is the biggest element, what is Earth made of as measured by mass?

Speaker: 5
01:30:17

Actually, it’s iron.

Speaker: 1
01:30:19

Iron? Yeah.

Speaker: 5
01:30:20

Iron. Yeah. We’re, I think, 32% iron, 30% oxygen, and then everything else is in the remaining percentage. So we’re basically a rusty ball bearing, that’s Earth, with, you know, a lot of silicon at the surface in the form of sand.

Speaker: 5
01:30:40

And iron phosphate lithium-ion cells: iron is extremely common, the most common element on Earth, and common even in the crust. And then phosphorus is also very common. And the anode is carbon, also very common. And then lithium is also very common.

Speaker: 5
01:30:59

So you can actually do the math. In fact, we did the math and published the math, we looked at it, and it’s on the Tesla website, which shows that you can completely power Earth with solar panels and batteries, and there’s no shortage of anything.

Speaker: 0
01:31:19

Alright. So on that note, go get to work, Elon, and just power the Earth while you’re putting implants into people’s brains and satellites and other good fun stuff. Good to see you, buddy.

Speaker: 5
01:31:33

Yeah. Good to see you all. Yeah. Thanks for

Speaker: 3
01:31:35

stopping by

Speaker: 1
01:31:35

anytime. Thanks for doing this. You got the Zoom link. Stop by anytime.

Speaker: 4
01:31:38

Thank you for coming today, and thank you for liberating free speech three years ago. So, yeah. That was a very important milestone.

Speaker: 5
01:31:45

And I see all you guys are in just different places. I guess this is a very virtual situation. It’s always been like, I’m at the ranch, so-and-so is on the line. Are you ever in the same room?

Speaker: 0
01:31:56

We try not to be.

Speaker: 1
01:31:57

Only when we do that summit. But otherwise, we avoid each other. Yeah.

Speaker: 5
01:32:03

Your summit is pretty fun. Yeah. Yeah.

Speaker: 1
01:32:08

We had

Speaker: 5
01:32:08

a great time recounting SNL sketches that didn’t make it.

Speaker: 0
01:32:11

Oh, god. There’s just so many good ones.

Speaker: 5
01:32:15

I mean,

Speaker: 0
01:32:16

we didn’t even get to the jeopardy ones.

Speaker: 5
01:32:19

Yeah. Yeah.

Speaker: 0
01:32:19

Like, no cost. So offensive.

Speaker: 5
01:32:21

Oh, wait. We did oh, no. Well, I think we skipped a few that would’ve, dramatically increased our probability of being killed.

Speaker: 0
01:32:27

We can take this one out.

Speaker: 2
01:32:29

Boys, I love you. I love you.

Speaker: 5
01:32:30

I love you all. I’m going

Speaker: 1
01:32:32

to poker. Later.

Speaker: 5
01:32:33

Take care. Alright.

Speaker: 3
01:32:33

Talk to you next

Speaker: 5
01:32:34

week. Bye bye.

Speaker: 2
01:32:35

Love you.

Speaker: 5
01:32:35

Take care. Take care.

Speaker: 6
01:32:37

We’ll let your winners ride. Rain Man David Sacks. And it said, we open sourced it to the fans, and they’ve just gone crazy with it. Love you, Westy.

Speaker: 3
01:32:49

I do. Queen of quinoa. Besties are called.

Speaker: 5
01:33:00

That is my dog taking it out of your driveway. Oh,

Speaker: 3
01:33:05

man. My happy

Speaker: 6
01:33:08

We should all just get a room and just have one big huge orgy because they’re all just useless. It’s like this, like, sexual tension and they just need to release somehow.

Speaker: 1
01:33:16

Let your feet be. Let

Speaker: 4
01:33:18

your feet.

Speaker: 1
01:33:20

Where did you

Speaker: 0
01:33:21

get murdered? Where did you get murdered? I’m doing all this.

Speaker: 3
01:33:30

Are back?
