Elon Musk | All-In Summit 2024

01:06:52
https://www.youtube.com/watch?v=pSFvOUswFwA

Summary

TL;DR: Elon Musk, after buying Twitter for $44 billion, is navigating significant challenges, particularly with Tesla's shares plummeting over 30% since his acquisition of the social media company. His entrepreneurial journey is marked by unconventional yet successful ventures, such as launching rockets with SpaceX and shattering critics' doubts. Amid Biden Administration investigations into Tesla and SpaceX, Musk remains undeterred. In a candid conversation, he discussed the significance of free speech and a concerning global trend toward its suppression. He reflected on government overreach, comparing economic productivity in historical examples like East and West Germany, and emphasized the need to cut unnecessary regulations to spur economic prosperity. As the world's richest man, Musk is on track to become the first trillionaire, driven by his relentless pursuit of innovation. His upcoming projects include SpaceX's plans for commercial spacewalks and Mars missions, and humanoid robots that could eventually outnumber the human population. Through humor and candid insights, Musk highlighted the potential of AI and robotics to revolutionize industries, envisioning an era of abundance made possible by AI. His optimism is matched by a call for realistic action against government overreach to enable swift progress in technology development.

Takeaways

  • 🚀 Elon Musk bought Twitter for $44 billion.
  • 📉 Tesla shares dropped over 30% since Twitter acquisition.
  • 🕵️‍♂️ Biden administration is investigating Elon Musk.
  • 🗣️ Advocates for free speech amidst global suppression trends.
  • 🌌 SpaceX aims for first commercial spacewalk.
  • 📋 Regulatory delays pose challenges to SpaceX's pace.
  • 🤖 Developing humanoid robots that could one day outnumber humans.
  • 🔧 Dojo aims to be a leading training computer.
  • ⚖️ Criticizes overregulation hindering economic growth.
  • 🌐 Envisions a tech-driven future of abundance and innovation.

Timeline

  • 00:00:00 - 00:05:00

    Elon Musk's acquisition of Twitter has been tumultuous, affecting Tesla's stock significantly. His next steps remain uncertain, and his behavior has been described as erratic. There are growing concerns about his dealings with other countries, leading to investigations from the U.S. government. Nonetheless, Musk's track record of overcoming obstacles is praised.

  • 00:05:00 - 00:10:00

    Musk's vision for a future among the stars fuels his ambitions despite skepticism. His leadership style, challenging his teams to achieve the impossible, is seen as a model by many executives. Recent successes in SpaceX reinforce his reputation as a visionary possibly on track to become the world's first trillionaire. His resilience under pressure is admired.

  • 00:10:00 - 00:15:00

    In an exchange about free speech, Musk highlights the global challenges of maintaining it, criticizing instances where people are jailed for online expressions. His stance is that platforms should adhere to local laws but not impose censorship beyond what's illegal. He stresses the importance of public dialogue and warns against authoritarian tendencies.

  • 00:15:00 - 00:20:00

    Musk discusses ongoing issues with Brazil, emphasizing the need to respect local laws while facing misunderstandings about their compliance. As a friend expresses concern for his safety when visiting volatile countries, Musk downplays personal risk but acknowledges the seriousness of international pressures and regulatory challenges that he faces.

  • 00:20:00 - 00:25:00

    Musk addresses recent calls by some media outlets for his imprisonment, which accuse him of allowing harmful dialogue on Twitter. He dismisses these as attacks on free speech, implying that those advocating his jailing aim to suppress inconvenient truths and maintain control over public discourse.

  • 00:25:00 - 00:30:00

    Continuing his critique of regulation, Musk argues for thoughtful reform to prevent bureaucratic overreach, citing examples like the inefficiencies of California's infrastructure projects. He suggests that excessive regulation stifles economic progress and stresses the need for a balance that promotes innovation without compromising safety.

  • 00:30:00 - 00:35:00

    Musk is quizzed on his possible role in government if Trump were elected, indicating that a focused effort could significantly reduce governmental inefficiency. However, he admits the transition would require careful management to avoid economic shock. His ambitious vision includes vast economic prosperity resulting from deregulation and a shift of roles from public to private sectors.

  • 00:35:00 - 00:40:00

    Musk discusses SpaceX's upcoming projects, expressing excitement about a major private spacewalk mission. The conversation touches on regulatory hurdles for SpaceX's Starship program, criticizing the slower bureaucratic processes that hinder progress. Musk dreams of a future where humanity deeply integrates with space exploration.

  • 00:40:00 - 00:45:00

    Speculating on the future of AI and robotics, Musk envisions a world with vast economic growth driven by AI's capabilities. He argues that AI will make goods and services essentially abundant, but notes potential challenges in finding purpose in a world where AI outperforms human labor. Despite possible risks, he remains optimistic about AI’s benefits.

  • 00:45:00 - 00:50:00

    The discussion shifts to Musk's Optimus project—humanoid robots designed to assist in various tasks. He sees robots as potentially outnumbering humans, providing companionship, and handling routine chores. Musk outlines a strategy to reduce production costs, suggesting robots could become widely affordable, significantly altering daily life.

  • 00:50:00 - 00:55:00

    Musk shares insights on Tesla's AI advancements, highlighting its emphasis on real-world problem-solving with vast amounts of data. He details Tesla's custom-designed chips for in-vehicle AI, which outperform available competitors. His overview suggests a strategic positioning in AI and robotics, aiming to innovate beyond mere software advancements.

  • 00:55:00 - 01:00:00

    Jokingly reflecting on his appearance on Saturday Night Live, Musk narrates behind-the-scenes anecdotes and skits that were cut from the airing. He describes the experience as liberating and fun, highlighting his often misunderstood humorous side. This section offers a glimpse into the human side of Musk amid his high-profile business activities.

  • 01:00:00 - 01:06:52

    The video concludes with a light-hearted tone as Musk and the hosts reminisce about humorous events and moments of personal enjoyment. They reveal Musk's capacity for humor and the need for levity amidst his intense professional life. Overall, the discourse emphasizes Musk's complex persona as both a business magnate and an individual with relatable, human attributes.



Video Q&A

  • How much did Elon Musk buy Twitter for?

    He bought Twitter for $44 billion.

  • What challenges has Elon Musk faced with Tesla since acquiring Twitter?

    Tesla's shares have plummeted more than 30% since he took over Twitter.

  • What investigations is the Biden Administration conducting on Elon Musk?

    The Biden Administration is conducting its second investigation involving Tesla and SpaceX in less than a week.

  • What is Elon Musk's stance on free speech?

    Elon Musk emphasizes the importance of free speech and is concerned about global efforts to suppress it.

  • What is SpaceX's recent achievement?

    SpaceX is launching the first commercial spacewalk and aims to reach the highest orbital altitude since the Apollo missions.

  • What challenges are associated with building rockets according to Elon Musk?

    Musk mentions regulatory approval delays as a significant challenge, stating rockets are being built faster than paperwork is processed.

  • What is Musk's vision for AI and robotics?

    He envisions a future where humanoid robots are more numerous than humans and can perform various tasks, enhancing productivity.

  • What role does Elon Musk anticipate for Dojo?

    Elon Musk suggests that Dojo could become a powerful training computer, comparable to the best servers currently available.

  • How does Musk view overregulation?

    He believes overregulation stifles progress and emphasizes reducing unnecessary regulations for economic prosperity.

  • What is Elon Musk's opinion on the balance between government regulation and economic growth?

    Musk advocates for streamlining regulations and reducing government size to enhance prosperity and productivity.


Subtitles
  • 00:00:03
    after buying Twitter for $44 billion
  • 00:00:06
    Musk's time as CEO has been a whirlwind
  • 00:00:09
    shares of Musk's other major company Tesla
  • 00:00:12
    have plummeted more than 30% since he
  • 00:00:14
    took over
  • 00:00:15
    Twitter as is often the case his next
  • 00:00:18
    move is unclear I'd go as far as to say that
  • 00:00:21
    he's demonstrating some erratic Behavior
  • 00:00:24
    go [ __ ]
  • 00:00:26
    yourself is that
  • 00:00:28
    clear I hope it is
  • 00:00:31
    hey Bob here in the
  • 00:00:33
    audience Elon Musk's cooperation and/or
  • 00:00:38
    relationships with other countries is
  • 00:00:42
    worthy of being looked at the Biden
  • 00:00:44
    Administration has just announced its
  • 00:00:46
    second investigation into Elon Musk in
  • 00:00:48
    less than a week both Tesla and
  • 00:00:50
    SpaceX there's a product road map that
  • 00:00:52
    they're on and that whether Elon is in
  • 00:00:55
    the building or not is not going to
  • 00:00:57
    impact the plan that they have
  • 00:01:02
    people said he'd never get the rocket in
  • 00:01:03
    space he did that people said the Roadster
  • 00:01:05
    would never get delivered he did
  • 00:01:07
    that people said he'd never get a
  • 00:01:08
    hundred of them done he's got 200 done
  • 00:01:11
    as an entrepreneur you can't listen to
  • 00:01:12
    the noise and you certainly can't listen
  • 00:01:14
    to losers who have never accomplished
  • 00:01:16
    anything with their life who are
  • 00:01:17
    obsessing about you
  • 00:01:18
    [Music]
  • 00:01:22
    please we're out there among the stars
  • 00:01:25
    and we're a multi-planet species across
  • 00:01:27
    many planets and many Star systems
  • 00:01:33
    [Music]
  • 00:01:36
    this is a great future and that's what
  • 00:01:39
    we should strive
  • 00:01:40
    for Tesla customer support At Your
  • 00:01:44
    Service nearly every VC I speak with
  • 00:01:47
    every CEO is looking to elon's behavior
  • 00:01:49
    and saying that's a model for how you
  • 00:01:51
    can challenge your team to achieve the
  • 00:01:53
    impossible in an impossibly difficult
  • 00:01:56
    environment and you can see those grid
  • 00:01:58
    fins on your left hand screen rotating
  • 00:02:00
    and turning to guide the booster and
  • 00:02:02
    there's that Landing
  • 00:02:05
    booster landing burn just begun and you can see
  • 00:02:09
    the water below and we have splash
  • 00:02:16
    down what an
  • 00:02:20
    incredible he's just a Visionary like
  • 00:02:22
    I've never seen how on Earth would you
  • 00:02:25
    bet against him Elon seems to be on
  • 00:02:27
    track to be not only the world's richest
  • 00:02:29
    man but the world's first trillionaire
  • 00:02:31
    Elon basically has had over the last 10
  • 00:02:34
    or 15 years an incredible amount of
  • 00:02:36
    challenges that he's overcome probably
  • 00:02:38
    had to deal with stuff that most of us
  • 00:02:40
    would have broken under and he just
  • 00:02:42
    fought through it and the guy just
  • 00:02:44
    basically bested all the haters until he
  • 00:02:47
    crushed their souls and I just think
  • 00:02:49
    that that's incredible
  • 00:02:51
    [Music]
  • 00:03:00
    the greatest
  • 00:03:01
    entrepreneur of this generation Elon
  • 00:03:05
    Musk hey
  • 00:03:10
    guys grab my
  • 00:03:14
    stuff there I'm gonna yep I'm going to
  • 00:03:17
    take this all
  • 00:03:20
    right want to do this thanks for taking
  • 00:03:22
    the
  • 00:03:23
    [Laughter]
  • 00:03:25
    time how um how you doing brother
  • 00:03:30
    you keep him
  • 00:03:31
    busy yeah I
  • 00:03:35
    mean it's rarely a slow
  • 00:03:38
    week I mean in the world as well yeah it
  • 00:03:42
    I mean any given week I it just seems
  • 00:03:44
    like the things getting nuttier it it
  • 00:03:46
    it's definitely a simulation we've
  • 00:03:47
    agreed on this at this
  • 00:03:49
    point I mean well look if if we are in
  • 00:03:53
    some alien Netflix series I think the
  • 00:03:54
    ratings are high yes ratings are High um
  • 00:03:59
    how are the um uh the the freedom of
  • 00:04:02
    speech Wars going um this is a uh you've
  • 00:04:06
    been you've been at war for two years
  • 00:04:09
    now yes uh the price of freedom of
  • 00:04:12
    speech is not cheap is it I think it's
  • 00:04:14
    like 44 billion something like that just
  • 00:04:16
    round numbers give or take a
  • 00:04:18
    billion yeah round numbers yeah yeah um
  • 00:04:23
    it's uh it's it's pretty nutty there
  • 00:04:25
    there is like this weird movement to uh
  • 00:04:28
    quell free speech
  • 00:04:30
    uh kind of around the world and um and
  • 00:04:33
    this is something we should be very
  • 00:04:34
    concerned about uh you know you have to
  • 00:04:37
    ask like why was the first amendment
  • 00:04:40
    like a high priority was like number one
  • 00:04:42
    um One is because uh people came from
  • 00:04:46
    countries where if you spoke freely you
  • 00:04:49
    would be imprisoned or killed and they
  • 00:04:51
    were like well we would like to not have
  • 00:04:53
    that here um cuz that was terrible yeah
  • 00:04:58
    and and actually you know there's a lot
  • 00:05:00
    of places in the world right now if you
  • 00:05:02
    if you're critical of the government uh
  • 00:05:05
    you get imprisoned or killed right yeah
  • 00:05:08
    we'd like to not have that are you
  • 00:05:11
    concerned
  • 00:05:13
    that I mean I suspect this is a
  • 00:05:15
    receptive audience to that message yeah
  • 00:05:22
    um you I I think we always thought that
  • 00:05:26
    the West Was the exception to that that
  • 00:05:28
    we knew there were authoritarian places
  • 00:05:30
    around the world but we thought that in
  • 00:05:31
    the west we'd have freedom of speech and
  • 00:05:33
    we've seen like you said it seems like a
  • 00:05:34
    global movement in Britain you've got
  • 00:05:38
    teenagers being put in prison for memes
  • 00:05:41
    or posting it's like oh you liked
  • 00:05:44
    the Facebook post throw them in the
  • 00:05:45
    prison
  • 00:05:47
    yeah people have got an actual you know
  • 00:05:50
    prison for for like like obscure
  • 00:05:53
    comments on social media not even [ __ ]
  • 00:05:55
    posting yet like not even yeah crazy
  • 00:05:59
    thrown in prison recently I'm like that
  • 00:06:01
    was about I was like what is the massive
  • 00:06:05
    crime that Pavel Durov committed in France and then of
  • 00:06:08
    course we got Brazil with judge
  • 00:06:11
    Voldemort that that one seems like the
  • 00:06:13
    one that impacts you the most can you
  • 00:06:15
    what's the latest on
  • 00:06:20
    that well we I guess we are trying to
  • 00:06:23
    figure out uh is there
  • 00:06:26
    some reasonable solution in Brazil uh
  • 00:06:30
    the you know the concern I mean I want
  • 00:06:33
    to just make sure that this is framed
  • 00:06:35
    correctly um and uh you know funny memes
  • 00:06:39
    aside I the the the nature of the
  • 00:06:42
    concern was
  • 00:06:43
    that at least at X Corp we had the
  • 00:06:46
    perception that um we were being asked
  • 00:06:50
    to do things that violated Brazilian law
  • 00:06:53
    so obviously we cannot as an American
  • 00:06:56
    company impose American laws and values
  • 00:06:59
    on other countries um that uh you know
  • 00:07:03
    we wouldn't get very far if we did that
  • 00:07:05
    um but but we do you know think that uh
  • 00:07:10
    if if a country's law laws are a
  • 00:07:11
    particular way and we're being asked to
  • 00:07:14
    what we we think we think we're being
  • 00:07:15
    asked to break them then and and be
  • 00:07:18
    silent about it then obviously that is
  • 00:07:20
    no good so so I just want to be clear
  • 00:07:23
    because it sometimes comes across
  • 00:07:26
    as uh elon's trying to just be a crazy
  • 00:07:30
    whatever billionaire and demand
  • 00:07:33
    outrageous things from other countries
  • 00:07:36
    and
  • 00:07:37
    um you know well that is
  • 00:07:41
    true
  • 00:07:43
    um in
  • 00:07:45
    addition there are um other things uh
  • 00:07:50
    that that I think are you know valid
  • 00:07:52
    which is like we we we obviously can't
  • 00:07:56
    uh you know I think any given thing that
  • 00:07:58
    we do at X Corp we've got to be able to
  • 00:08:01
    explain in the light of day and and not
  • 00:08:04
    feel that it was dishonorable or you
  • 00:08:07
    know we we we did the wrong thing you
  • 00:08:09
    know uh so we don't we that that that
  • 00:08:13
    that was the that that's the nature of
  • 00:08:14
    the concern so we actually are in uh
  • 00:08:18
    sort of discussions with uh the you know
  • 00:08:22
    judicial authorities in in Brazil to try
  • 00:08:25
    to you know run this to ground like uh
  • 00:08:30
    what what's actually going on like if if
  • 00:08:32
    we're being asked to break the law
  • 00:08:35
    Brazilian law then that's that that
  • 00:08:37
    obviously should not be should not sit
  • 00:08:38
    well with the Brazilian judiciary and if
  • 00:08:42
    we're not and we're mistaken we'd like
  • 00:08:43
    to understand how we are mistaken I
  • 00:08:46
    think that's a that's a pretty
  • 00:08:47
    reasonable uh position I'm a bit
  • 00:08:50
    concerned as your friend that you're
  • 00:08:53
    going to go to one of these countries
  • 00:08:55
    and I'm going to wake up one day and
  • 00:08:56
    you're going to get arrested and like
  • 00:08:58
    I'm going to have to go bail you out or
  • 00:08:59
    or something like this is feels very
  • 00:09:01
    acute like yes I mean it's not a joke
  • 00:09:04
    now like they're literally saying like
  • 00:09:07
    you know it's not just Biden saying like
  • 00:09:08
    we have to look into that guy now it's
  • 00:09:09
    become quite literal like this I who was
  • 00:09:12
    the guy who just wrote the um was it the
  • 00:09:15
    guardian piece about like oh yeah yeah
  • 00:09:18
    there have been three articles and I
  • 00:09:20
    think in the past three weeks Robert
  • 00:09:22
    Reich but it wasn't just him it was like
  • 00:09:24
    three different articles three different
  • 00:09:26
    articles it doesn't that's a trend
  • 00:09:29
    calling for me to be imprisoned right in
  • 00:09:32
    the in the guardian you know guardian of
  • 00:09:35
    what what are they protecting
  • 00:09:37
    exactly guardian of I don't know of
  • 00:09:41
    authoritarianism yeah guardian of uh
  • 00:09:44
    yeah yeah censorship censorship but but
  • 00:09:47
    the premise here is that you bought this
  • 00:09:50
    thing this online Forum this
  • 00:09:52
    communication platform and you're
  • 00:09:53
    allowing people to use it to express
  • 00:09:56
    themselves therefore you have to be
  • 00:09:58
    jailed
  • 00:10:00
    I don't understand the logic here right
  • 00:10:02
    um there's what do you think they're
  • 00:10:03
    actually afraid of at this
  • 00:10:05
    point what's the motivation here I mean
  • 00:10:08
    I think the if somebody's Afra if
  • 00:10:11
    somebody's sort of trying to push a
  • 00:10:13
    false premise on the world then and then
  • 00:10:15
    that that and that premise can be
  • 00:10:18
    undermined with public dialogue then
  • 00:10:20
    they will be opposed to public Dialogue
  • 00:10:22
    on that premise because they wish that
  • 00:10:24
    false premise to Prevail right um so
  • 00:10:27
    that's I think you know the the issue
  • 00:10:30
    there is uh if they don't like the truth
  • 00:10:33
    uh you know then we want to suppress it
  • 00:10:35
    so now the you know the the the sort of
  • 00:10:39
    the what what we're trying to do with
  • 00:10:41
    X Corp uh is uh I distinguish that from
  • 00:10:46
    my son who's also called X yes
  • 00:10:49
    uh you have you have parental goals and
  • 00:10:52
    you have goals everything is just called
  • 00:10:54
    X basically it's very difficult
  • 00:10:56
    disambiguation the son yeah it's X
  • 00:10:58
    everything um so uh what we're trying to
  • 00:11:03
    do is simply adhere to the uh you know
  • 00:11:07
    the the laws in a in a country um so so
  • 00:11:11
    if something is illegal in the United
  • 00:11:13
    States or if it's illegal in you know
  • 00:11:15
    Europe or Brazil or wherever it might be
  • 00:11:19
    uh then then we will take it down or
  • 00:11:20
    we'll suspend the account because we
  • 00:11:23
    we're not you know there to make the
  • 00:11:25
    laws we but but if speech is not illegal
  • 00:11:29
    then then what are we doing okay now
  • 00:11:31
    we're injecting ourselves in as as a
  • 00:11:35
    sensor and and where does it stop and
  • 00:11:37
    who
  • 00:11:38
    decides so and where where does that
  • 00:11:42
    path lead I think it leads to a bad
  • 00:11:44
    place uh so if the people if in a
  • 00:11:49
    country want the laws to be different
  • 00:11:50
    they should make the laws different but
  • 00:11:52
    otherwise we're going to obey the law in
  • 00:11:56
    each jurisdiction right and some of
  • 00:11:57
    these Europe that's it it's it's not
  • 00:11:59
    more complicated that we're not we're
  • 00:12:00
    not trying to flout the law to be clear
  • 00:12:01
    about that but we're trying to adhere to
  • 00:12:03
    the law and if the laws change we will
  • 00:12:05
    change and if if the laws don't change
  • 00:12:07
    we we won't we're just literally trying
  • 00:12:09
    to adhere to the law it's pretty pretty
  • 00:12:11
    straightforward there very
  • 00:12:13
    straightforward if somebody
  • 00:12:15
    thinks we are not adhering to the law
  • 00:12:16
    well they can file a lawsuit Bingo also
  • 00:12:19
    very straightforward I mean there are
  • 00:12:21
    European countries that don't want
  • 00:12:22
    people to promote Nazi propaganda yes
  • 00:12:25
    they have some sensitivity to it well
  • 00:12:26
    it's it is illegal it is illegal in
  • 00:12:28
    those countries in those countries if
  • 00:12:30
    somebody puts that up you take it down
  • 00:12:32
    yes but they typically file something
  • 00:12:34
    and say take down and no in some cases
  • 00:12:36
    it is just um obviously illegal like you
  • 00:12:39
    don't need to file a lawsuit for you
  • 00:12:42
    know if something is just you know
  • 00:12:44
    unequivocally illegal we can literally
  • 00:12:46
    read the law this violates the law you
  • 00:12:48
    know anyone anyone can see that like you
  • 00:12:51
    know you don't need like if if somebody
  • 00:12:54
    is stealing you don't need let me check
  • 00:12:55
    the law on that okay oh no they're
  • 00:12:57
    they're stealing in San Francisco let's talk
  • 00:12:59
    about so we had we had JD Vance here
  • 00:13:02
    this morning he did a great job um and
  • 00:13:05
    you know one of the things is there's
  • 00:13:06
    this image on acts of like basically
  • 00:13:09
    like you
  • 00:13:10
    Bobby uh Trump and JD are like the
  • 00:13:14
    Avengers I guess and then there's
  • 00:13:16
    another meme where you're in front of a
  • 00:13:18
    desk where it says DOGE the Department
  • 00:13:21
    of Government Efficiency yes yes I
  • 00:13:23
    posted that one tell us I I made it
  • 00:13:26
    using Grok the Grok image generator
  • 00:13:30
    and I posted it tell us about put it to
  • 00:13:32
    my profile see for
  • 00:13:35
    efficiency um how how do you do
  • 00:13:38
    it well I I mean
  • 00:13:45
    I I think with great difficulty uh but
  • 00:13:49
    you know look it's been a long time
  • 00:13:51
    since there was a serious effort to
  • 00:13:54
    reduce the size of government and to um
  • 00:13:57
    remove absurd regulations yeah and you
  • 00:14:00
    know the last time there was a really
  • 00:14:01
    concerted effort on that front was Reagan
  • 00:14:04
    in the early 80s so we're 40 years away
  • 00:14:06
    from um a a a a serious effort to remove
  • 00:14:10
    um you know regulations that don't
  • 00:14:14
    serve the greater good and and reduce
  • 00:14:16
    the size of government and I think it's
  • 00:14:18
    just if we don't do that then what's
  • 00:14:21
    what what's happening is that we get
  • 00:14:22
    regulations and laws accumulating every
  • 00:14:25
    year until eventually everything's
  • 00:14:27
    illegal uh and that's why we can't get
  • 00:14:30
    uh major infrastructure projects done in
  • 00:14:31
    the United States like if you look at
  • 00:14:33
    the absurdity of the California
  • 00:14:35
    high-speed rail I think they've spent
  • 00:14:36
    $7 billion and have a 1,600-foot segment
  • 00:14:39
    that doesn't actually have rail in
  • 00:14:41
    it I mean your tax dollars at work I
  • 00:14:44
    mean yeah what are we doing that's an
  • 00:14:45
    expensive 1,600 feet of concrete you know
  • 00:14:49
    um and and I mean I think it's like if
  • 00:14:53
    you know uh I realize sometimes I'm
  • 00:14:56
    perhaps a little optimistic with
  • 00:14:57
    schedules but uh uh you
  • 00:15:01
    know I mean I wouldn't be doing the
  • 00:15:03
    things I'm doing if I was uh you know
  • 00:15:07
    not an an optimist uh so but but but but
  • 00:15:12
    at the current Trend you know California
  • 00:15:14
    high-speed rail might finish sometime
  • 00:15:16
    next
  • 00:15:17
    Century maybe probably not we're just
  • 00:15:20
    we'll have teleportation by that time so
  • 00:15:22
    yeah exactly have ai do everything at
  • 00:15:24
    that point so so so so you I I think you
  • 00:15:29
    really think of um you know the the
  • 00:15:31
    United States and many countries it's
  • 00:15:33
    it's arguably worse than the EU as being
  • 00:15:36
    like galiva tied down by a million
  • 00:15:38
    little strings and like any one given
  • 00:15:41
    regulation is not is not that bad but
  • 00:15:44
    you've got a million of them and um or
  • 00:15:46
    Millions actually and and and then
  • 00:15:49
    eventually you just can't get anything
  • 00:15:50
    done and and this is a this is a massive
  • 00:15:53
    tax on the on the consumer on the people
  • 00:15:56
    uh it's just they don't they don't
  • 00:15:57
    realize that there's this this massive
  • 00:15:59
    tax in the form of irrational
  • 00:16:02
    regulations um I'm going to give you a
  • 00:16:04
    recent uh example that you know is is
  • 00:16:08
    just insane um is that uh like SpaceX
  • 00:16:10
    was fined by the EPA
  • 00:16:12
    $140,000 for um they claimed dumping uh
  • 00:16:16
    potable water on the ground drinking
  • 00:16:19
    water so and we're like uh this is that
  • 00:16:22
    Starbase and and we're like it's we're
  • 00:16:23
    in a tropical uh thunderstorm region
  • 00:16:26
    um that stuff comes from the sky all the
  • 00:16:28
    time
  • 00:16:29
    time and
  • 00:16:31
    um and there was no actual harm done you
  • 00:16:33
    know it was just water to cool the the
  • 00:16:35
    the launch pad during uh lift off and um
  • 00:16:39
    there's zero harm done like and they're
  • 00:16:40
    like they agree yes there's zero harm
  • 00:16:42
    done we're like okay so there's no harm
  • 00:16:43
    done and um you want us to pay $140,000
  • 00:16:47
    fine it's like yes because you didn't
  • 00:16:48
    have a
  • 00:16:49
    permit okay we didn't know there was a
  • 00:16:52
    permit needed for zero-harm fresh water
  • 00:16:55
    being on the ground in a place that
  • 00:16:57
    where fresh water falls from the sky all
  • 00:16:58
    the
  • 00:16:59
    time got it next to the ocean next to
  • 00:17:03
    the ocean cuz there's a little bit of
  • 00:17:04
    water there too yeah I mean sometimes it
  • 00:17:06
    rains so much the the roads are flooded
  • 00:17:08
    so we're like you
  • 00:17:09
    know how does this make any sense yeah
  • 00:17:13
    and and then like then then they were
  • 00:17:15
    like well we're not going to process any
  • 00:17:16
    more of your any more of your
  • 00:17:18
    applications for launch for Starship
  • 00:17:19
    launch unless you pay this $140,000 they
  • 00:17:21
    just ransomed us and we're like okay so
  • 00:17:24
    we' paid $140,000 but it was a it's like
  • 00:17:27
    this is no good I mean at this wait
  • 00:17:29
    we're never going to get to Mars I mean
  • 00:17:31
    that's
  • 00:17:32
    the that's the confounding part here
  • 00:17:34
    yeah is we're acting against our own
  • 00:17:37
    self-interest you know when you look at
  • 00:17:40
    we do have to make putting aside fresh
  • 00:17:43
    water but hey you know they the rocket
  • 00:17:45
    makes a lot of noise so I'm I'm certain
  • 00:17:48
    there's some complaints about noise once
  • 00:17:50
    in a while but sometimes you want to
  • 00:17:51
    have a party or you want to make
  • 00:17:52
    progress and there's a little bit of
  • 00:17:53
    noise therefore you know we we trade off
  • 00:17:56
    a little bit of noise for massive
  • 00:17:58
    progress or even even fun so like when
  • 00:18:01
    did we stop being able to make those
  • 00:18:02
    tradeoffs but talk about the difference
  • 00:18:05
    between California and Texas uh where
  • 00:18:07
    you and I now reside um Texas you were
  • 00:18:11
    able to build the gigafactory I remember
  • 00:18:14
    when you got the plot of land and then
  • 00:18:16
    it seemed like it was less than two
  • 00:18:18
    years when you had the party to open it
  • 00:18:21
    yeah from start of construction um to
  • 00:18:25
    completion uh was 14 months 14 14 months
  • 00:18:30
    is there anywhere on the planet that
  • 00:18:31
    would go faster is like China faster
  • 00:18:32
    than that uh China was 11 months got it
  • 00:18:36
    so Texas China 11 and 14 months
  • 00:18:40
    California how many months and just give
  • 00:18:42
    you a sense of size the Tesla Gigafactory
  • 00:18:46
    in China is three times the size of the
  • 00:18:47
    Pentagon which was the biggest building
  • 00:18:49
    in America uh no there are bigger
  • 00:18:50
    buildings but the pentagon's a pretty
  • 00:18:51
    big one yeah or it was the biggest in
  • 00:18:53
    units in units of Pentagon it's like
  • 00:18:55
    three okay three pentagons and Counting
  • 00:18:58
    yeah got it in 14 months um the just the
  • 00:19:04
    just the regulatory approvals in
  • 00:19:06
    California would have taken two
  • 00:19:08
    years yeah so that's that's the issue
  • 00:19:11
    where where do you think the regulation
  • 00:19:13
    helps like for the people that will say
  • 00:19:15
    we need some checks and balances we
  • 00:19:17
    can't have some because for every good
  • 00:19:18
    actor like you there'll be a bad actor
  • 00:19:20
    so where is that line then yeah I mean I
  • 00:19:22
    have a sort of you
  • 00:19:25
    know in in sort of doing sensible
  • 00:19:29
    regulation and um reduction in the size
  • 00:19:32
    of government the is is just like be
  • 00:19:34
    very public about it and say like which
  • 00:19:36
    of these rules do you if the public is
  • 00:19:38
    really excited about a rule and wants to
  • 00:19:40
    keep it we'll just keep it and and here
  • 00:19:42
    the thing about the rules is if like if
  • 00:19:44
    deleting the rule um you know turns out to be
  • 00:19:47
    a bad idea we'll just put it right back okay
  • 00:19:50
    and then you know problem solved it's
  • 00:19:51
    like it's easy to add rules but we don't
  • 00:19:53
    actually have a process for getting rid
  • 00:19:55
    of them that's the issue there's no
  • 00:19:57
    garbage collection for rules
  • 00:20:00
    um when we were um watching you work
  • 00:20:04
    David and I and Antonio um in that first
  • 00:20:07
    month at Twitter which was all hands on
  • 00:20:09
    deck and you were doing zero-based
  • 00:20:11
    budgeting you really quickly got the
  • 00:20:14
    cost under control and then miraculously
  • 00:20:16
    everybody said this site will go down
  • 00:20:19
    and you added 50 more features so maybe
  • 00:20:22
    explain because this is the first time
  • 00:20:24
    yeah there were like so many articles
  • 00:20:26
    like that said Twitter is
  • 00:20:29
    Dead Forever there's no way it could
  • 00:20:30
    possibly even continue at all it was
  • 00:20:33
    almost like the press was rooting for you
  • 00:20:34
    to fail let's write the obit here's the obituary
  • 00:20:37
    uh they were all saying their goodbyes
  • 00:20:38
    on Twitter remember that yeah yeah yeah
  • 00:20:40
    they were all leaving and saying their
  • 00:20:41
    goodbyes cuz the site was going to melt
  • 00:20:43
    down and yeah totally fail and uh all
  • 00:20:45
    the journalists left yeah and which is
  • 00:20:49
    if you ever want to like hang out with a
  • 00:20:50
    bunch of Hall monitors oh my God threads
  • 00:20:52
    is amazing every time I go over there
  • 00:20:54
    and post they're like they they're
  • 00:20:57
    really triggered but I mean if you like
  • 00:20:59
    being condemned repeatedly then you know
  • 00:21:02
    for reasons that make no sense then
  • 00:21:03
    threads is the way to go yeah it's
  • 00:21:05
    really it's it's the most miserable
  • 00:21:07
    place on Earth if Disney's the happiest
  • 00:21:10
    this is the anti-Disney but if we were
  • 00:21:12
    to go into government you went into the
  • 00:21:14
    Department of Education or pick the
  • 00:21:16
    department you've worked with a lot of
  • 00:21:18
    them actually sure you can't go in there
  • 00:21:20
    and do zero-based budgeting okay we get it but
  • 00:21:23
    if you could just pare two three four 5%
  • 00:21:27
    of those organizations
  • 00:21:29
    what kind of impact would that
  • 00:21:31
    have yeah I mean I think we' need to do
  • 00:21:34
    more than that I think ideally
  • 00:21:36
    compounding every year 2 to 3% a year I
  • 00:21:39
    mean it would be better than what's
  • 00:21:40
    happening
  • 00:21:42
    now yeah look I think we um you
  • 00:21:48
    know
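(Side note, not from the conversation itself: a quick sketch of the compounding arithmetic alluded to above, assuming a hypothetical steady 3% cut each year of a four-year term.)

```python
# Hypothetical illustration: a steady 3% annual spending cut,
# compounded over a four-year term. Each year keeps 97% of the
# previous year's budget, so the cuts compound multiplicatively.
def remaining_fraction(annual_cut: float, years: int) -> float:
    """Fraction of the original budget left after `years` of cuts."""
    return (1.0 - annual_cut) ** years

left = remaining_fraction(0.03, 4)
print(f"total reduction: {1 - left:.1%}")  # about 11.5%, slightly less than 4 x 3%
```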
  • 00:21:49
    uh if if Trump wins and and obviously I
  • 00:21:53
    suspect there are people with mixed
  • 00:21:54
    feelings about whether that should
  • 00:21:55
    happen but uh but if but we do have an
  • 00:21:59
    opportunity uh to do kind of a once in
  • 00:22:02
    a-lifetime deregulation and reduction in
  • 00:22:04
    the size of government um because
  • 00:22:06
    because the other thing besides the
  • 00:22:07
    regulations um America is also going
  • 00:22:09
    bankrupt extremely quickly um and and
  • 00:22:13
    nobody seems to everyone seems to be
  • 00:22:14
    sort of whistling past the graveyard on
  • 00:22:16
    this one um but they're all they're all
  • 00:22:19
    grabbing the silverware everyone's stuffing
  • 00:22:21
    their pockets with the silverware
  • 00:22:22
    before the Titanic sinks well
  • 00:22:25
    you know the the defense department
  • 00:22:27
    budget is a very big budget okay it's a
  • 00:22:30
    trillion dollars a year DOD Intel it's
  • 00:22:33
    a trillion dollars um and interest
  • 00:22:37
    payments on the national debt just
  • 00:22:39
    exceeded the defense department budget
  • 00:22:42
    they're over a trillion dollars a year
  • 00:22:45
    just in interest and Rising we're we're
  • 00:22:48
    adding a trillion dollars to
  • 00:22:51
    our debt which our you know kids and
  • 00:22:54
    grandkids are going to have to pay
  • 00:22:55
    somehow
  • 00:22:57
    um you know every every 3 months and
  • 00:23:01
    then soon it's going to be every two
  • 00:23:02
    months and then every month and then the
  • 00:23:04
    only thing we'll be able to pay is
  • 00:23:06
    interest yeah and and if if this is it's
  • 00:23:09
    just you know it's just like a person at
  • 00:23:11
    scale that has racked up too much credit
  • 00:23:15
    card debt um and
  • 00:23:18
    uh this this is not this does not have a
  • 00:23:22
    good
  • 00:23:23
    ending so we have to reduce the spending
  • 00:23:26
    let me ask one question because I've
  • 00:23:27
    brought this up a lot and the
  • 00:23:29
    counterargument I hear which I disagree
  • 00:23:31
    with um but the counterargument I hear
  • 00:23:32
    from a lot of politicians is if we
  • 00:23:35
    reduce spending because right now if you
  • 00:23:36
    add up federal state and local
  • 00:23:39
    government spending it's between 40 and
  • 00:23:41
    50% of GDP so nearly half of our economy
  • 00:23:46
    is supported by government spending and
  • 00:23:48
    nearly half of people in the United
  • 00:23:50
    States are dependent directly or
  • 00:23:51
    indirectly on government checks and
  • 00:23:54
    either through contractors that the
  • 00:23:56
    government pays or they're employed by
  • 00:23:57
    government um entity so if you go in and
  • 00:24:00
    you take too hard an axe too fast you
  • 00:24:03
    will have significant contraction job
  • 00:24:06
    loss and recession what's The Balancing
  • 00:24:09
    Act Elon just thinking realistically
  • 00:24:11
    because I'm 100% on board with you the
  • 00:24:14
    steps the next set of steps however
  • 00:24:16
    assume Trump wins and you become the the
  • 00:24:20
    chief uh DOE um uh DOG uh DO
  • 00:24:27
    GE how how like yeah and I think the
  • 00:24:30
    challenge is how quickly can we yeah how
  • 00:24:31
    quickly can we go in how quickly can
  • 00:24:33
    things change and without
  • 00:24:38
    without I want that on my business card
  • 00:24:41
    yeah without all the Lo without all the
  • 00:24:43
    contraction and
  • 00:24:44
    job loss yeah so so I guess how do you really
  • 00:24:47
    address it when so much of the economy
  • 00:24:48
    and so many people's jobs and
  • 00:24:49
    livelihoods are dependent on government
  • 00:24:52
    spending well I mean I I do think it's
  • 00:24:55
    it's it's sort of
  • 00:24:57
    um you know it's a false dichotomy
  • 00:24:59
    it's not like no government spending is
  • 00:25:01
    going to happen um you really have to
  • 00:25:03
    say like is it the right level um and
  • 00:25:06
    just remember that that you know any any
  • 00:25:09
    given person if they are doing things in
  • 00:25:12
    a less efficient organization versus
  • 00:25:14
    more efficient organization their
  • 00:25:16
    contribution to the economy their net
  • 00:25:18
    output of goods and services will reduce
  • 00:25:21
    um I mean you've got a couple of clear
  • 00:25:23
    examples between uh East Germany and
  • 00:25:24
    West Germany North Korea and South Korea
  • 00:25:27
    um I mean North Korea they're starving
  • 00:25:29
    uh South Korea it's like amazing it's
  • 00:25:32
    the future the compounding effect of
  • 00:25:33
    productivity gains yeah yeah it's night
  • 00:25:35
    and day yeah um and so in the north
  • 00:25:37
    North Korea you've got 100% government
  • 00:25:39
    um and in South Korea you've got
  • 00:25:41
    probably I don't know 40% government
  • 00:25:43
    it's not zero uh and yet you've got a
  • 00:25:45
    standard of living that is probably 10
  • 00:25:46
    times higher in South Korea at least at
  • 00:25:49
    least exactly um uh and then East and
  • 00:25:52
    West Germany um in West Germany uh you
  • 00:25:55
    you had just thinking in terms of cars I
  • 00:25:57
    mean you had BMW Porsche Audi Mercedes
  • 00:26:00
    um and and East Germany which is a
  • 00:26:03
    random line on a map um the
  • 00:26:07
    only car you could get was a
  • 00:26:08
    Trabant which is basically a lawn mower
  • 00:26:10
    with a shell on it um and it was
  • 00:26:13
    extremely unsafe and there was a 20-year
  • 00:26:18
    wait so you like you know put your kid
  • 00:26:20
    on the list as soon as they're conceived
  • 00:26:23
    um and even then only I
  • 00:26:26
    think um you know a quarter of people
  • 00:26:29
    maybe got this lousy car and the
  • 00:26:32
    same so so that's just an interesting
  • 00:26:34
    example of like basically the same
  • 00:26:35
    people different operating system and
  • 00:26:38
    and it's not like uh West Germany with
  • 00:26:40
    some you know you know a capitalist uh
  • 00:26:45
    haven it was quite socialist
  • 00:26:47
    actually so uh so when you look you know
  • 00:26:51
    probably it was half half government in
  • 00:26:53
    West Germany and 100% government in East
  • 00:26:56
    Germany and again sort of a a five to
  • 00:26:59
    like call it at least a 5 to 10x
  • 00:27:01
    standard of living difference and even
  • 00:27:03
    qualitatively vastly better and and it's
  • 00:27:06
    obviously you know people have
  • 00:27:07
    amazingly in this modern era this
  • 00:27:09
    debate as to which system is better well
  • 00:27:12
    I'll tell you which system is better um
  • 00:27:14
    the one that doesn't need to build the
  • 00:27:15
    wall to keep people in okay that's
  • 00:27:18
    that's how you can tell
  • 00:27:21
    okay it's a dead giveaway spoiler alert
  • 00:27:25
    dead giveaway are they climbing the wall
  • 00:27:27
    to get out or come in you have to build
  • 00:27:29
    a barrier to keep people in that is the
  • 00:27:32
    bad system um it wasn't West Berlin
  • 00:27:36
    that built the wall okay they were like
  • 00:27:38
    to you know anyone who wants to flee
  • 00:27:40
    West Berlin go ahead um speaking of walls
  • 00:27:42
    so it you know and and and and if you
  • 00:27:44
    look at sort of the flux of boats from
  • 00:27:47
    Cuba there's a large number of boats
  • 00:27:49
    from Cuba and there's a bunch of free
  • 00:27:52
    boats that you anyone can take to to
  • 00:27:55
    back to Cuba like plenty of people are
  • 00:27:58
    like hey wow an abandoned boat I could
  • 00:28:00
    use this boat to go to Cuba where they
  • 00:28:02
    have communism awesome Yes um and and
  • 00:28:05
    and yet nobody nobody picks up those
  • 00:28:07
    boats and does it amazing um so
  • 00:28:11
    given this a lot of thought yeah wait so
  • 00:28:12
    your point is jobs will be created if we
  • 00:28:14
    cut government spending in half jobs
  • 00:28:16
    will be created fast enough to make up
  • 00:28:18
    for right just the count yes obviously
  • 00:28:21
    you know I'm not suggesting that that
  • 00:28:23
    people you know um have like immediately
  • 00:28:26
    you know tossed out with no safety net
  • 00:28:28
    and and you know can't now can't pay
  • 00:28:30
    their mortgage you want to see some
  • 00:28:31
    reasonable offramp where yeah yeah um so
  • 00:28:35
    reasonable offramp where you know
  • 00:28:36
    they're still um you know earning
  • 00:28:39
    they're still receiving money but have
  • 00:28:40
    like I don't know a year or two to to
  • 00:28:42
    to find jobs in the private sector
  • 00:28:44
    which they will find and then they will
  • 00:28:46
    be in a different operating system um
  • 00:28:48
    again you can see the difference East
  • 00:28:50
    Germany was incorporated into West
  • 00:28:51
    Germany living standards in East Germany
  • 00:28:53
    uh Rose
  • 00:28:55
    dramatically um well so in four years if
  • 00:28:58
    you could shrink the size of
  • 00:29:00
    the government with Trump what would be
  • 00:29:01
    a good Target just in terms of like
  • 00:29:04
    ballpark I mean are you trying to get me
  • 00:29:05
    assassinated before this even happens no
  • 00:29:09
    no pick a low number I mean you know
  • 00:29:11
    there's that old phrase go postal I mean
  • 00:29:13
    it's like they might yeah me so we'll
  • 00:29:15
    keep the post office I mean I'm going to
  • 00:29:17
    need a whole security detail guys yes I
  • 00:29:20
    mean the number of disgruntled workers
  • 00:29:24
    uh former government employees is you
  • 00:29:26
    know
  • 00:29:28
    quite a scary number I mean I might not
  • 00:29:29
    make it you know I was saying low single
  • 00:29:31
    digits every year for four years would
  • 00:29:33
    be palatable yeah and I like your idea
  • 00:29:35
    the thing is that if if it's not done uh
  • 00:29:38
    like if you have a once once in a
  • 00:29:40
    lifetime or once in a generation
  • 00:29:41
    opportunity and you don't take Serious
  • 00:29:43
    action and and then you have four years
  • 00:29:46
    to get it done and then and if it
  • 00:29:49
    doesn't get done then how serious is
  • 00:29:51
    Trump about this like you've talked to
  • 00:29:53
    him about it yeah yeah I think he is
  • 00:29:55
    very serious about it got it um and no I
  • 00:29:58
    I think actually the reality is that if
  • 00:30:00
    we get rid of nonsense regulations and
  • 00:30:02
    shift people from the government sector
  • 00:30:05
    to the private sector we will have
  • 00:30:07
    immense
  • 00:30:08
    Prosperity um and and I think we will
  • 00:30:10
    have a golden age in this country and
  • 00:30:13
    it'll be
  • 00:30:14
    fantastic can we uh can we talk about
  • 00:30:19
    space um you have a bunch of critical
  • 00:30:22
    Milestones coming up um yeah in fact
  • 00:30:24
    there's an important a very exciting
  • 00:30:26
    launch
  • 00:30:28
    that may be happening tonight so if
  • 00:30:31
    if the weather is holding up
  • 00:30:32
    then I'm going to leave here head to
  • 00:30:34
    Cape Canaveral for the um the Polaris Dawn
  • 00:30:38
    mission which is a private mission
  • 00:30:39
    funded by um Jared Isaacman and he's
  • 00:30:43
    um awesome guy and and there this will
  • 00:30:46
    be the first time uh the first private
  • 00:30:49
    the first commercial spacewalk um and
  • 00:30:51
    and it'll be at the highest altitude uh
  • 00:30:55
    since Apollo so it's the furthest from
  • 00:30:57
    Earth that anyone's
  • 00:30:59
    gone um
  • 00:31:03
    yeah and you what comes after that let's
  • 00:31:06
    assume that's successful and I sure hope
  • 00:31:08
    so
  • 00:31:11
    man um no
  • 00:31:15
    pressure
  • 00:31:16
    um yeah we you know absolutely you know
  • 00:31:19
    astronaut safety is the priority man if
  • 00:31:23
    I had like all the wishes I
  • 00:31:25
    could say about that would be the one to
  • 00:31:27
    to put on
  • 00:31:28
    so you know space is dangerous um so the
  • 00:31:35
    the yeah the next
  • 00:31:37
    Milestone after that would be the next
  • 00:31:40
    flight of Starship um which um you know
  • 00:31:44
    Starship is
  • 00:31:46
    ready to fly we are waiting for
  • 00:31:48
    regulatory
  • 00:31:51
    approval you know yeah it it it it
  • 00:31:55
    really should not be possible to build a
  • 00:31:57
    giant rocket faster than the paper can
  • 00:32:01
    move from one desk to
  • 00:32:03
    [Applause]
  • 00:32:08
    another that stamp is really hard
  • 00:32:13
    approved yeah you ever see that movie
  • 00:32:15
    zootopia you ever see that movie
  • 00:32:17
    Zootopia there's like a
  • 00:32:20
    sloth doing the approvals yeah accidentally
  • 00:32:24
    tell a joke and and I was like oh no
  • 00:32:26
    this is going to take a long time sorry
  • 00:32:28
    sorry um but yeah zootopia you know you
  • 00:32:33
    know the funny thing is like so I went
  • 00:32:34
    to the
  • 00:32:36
    DMV about I don't know a year later
  • 00:32:39
    after Zootopia to get my whatever
  • 00:32:42
    license renewal and the guy in in an
  • 00:32:44
    exercise of incredible self-awareness
  • 00:32:46
    had the sloth from Zootopia in his um in
  • 00:32:50
    his cube and he was
  • 00:32:52
    actually Swift
  • 00:32:54
    yeah with the mandate beat the
  • 00:32:56
    sloth yeah yeah no personal agency
  • 00:32:59
    personal agency no I mean people like
  • 00:33:03
    think the you know the government is um
  • 00:33:07
    more competent than it than it is I'm
  • 00:33:08
    not saying that there aren't competent
  • 00:33:10
    people in the government they're just in
  • 00:33:11
    an operating system that is inefficient
  • 00:33:13
    um once you move them to a more
  • 00:33:14
    efficient operating system they their
  • 00:33:17
    output is dramatically greater as we've
  • 00:33:18
    seen for example you know when East Germany
  • 00:33:21
    was reintegrated with West
  • 00:33:23
    Germany and and and the same people um
  • 00:33:27
    were vastly more prosperous uh with a
  • 00:33:29
    basically half capitalist uh operating
  • 00:33:32
    system
  • 00:33:33
    so um but I
  • 00:33:36
    mean for a lot of people like the
  • 00:33:40
    maybe most direct experience with the
  • 00:33:41
    government is the DMV um and and then
  • 00:33:45
    the important thing to remember is the
  • 00:33:46
    the government is the DMV at
  • 00:33:50
    scale right that's the government got
  • 00:33:53
    the mental picture how much do you want
  • 00:33:54
    to scale it
  • 00:33:58
    yeah yeah sorry can you go back to
  • 00:34:01
    Chamath's um uh question on Starship so you
  • 00:34:03
    announced just the other day Starship
  • 00:34:05
    going to Mars in two years and by the
  • 00:34:08
    way huh yeah yeah yeah yeah yeah and
  • 00:34:11
    then four years for
  • 00:34:13
    crewed uh aspirational launch in the next
  • 00:34:15
    window and how much is the government
  • 00:34:17
    involved I'm not saying we'll hold you
  • 00:34:19
    to these you know uh but these
  • 00:34:22
    uh but based on our current progress
  • 00:34:26
    where with Starship we're able to
  • 00:34:28
    successfully reach orbital velocity twice uh we
  • 00:34:31
    were able to achieve soft Landings of
  • 00:34:33
    the booster and the ship in water
  • 00:34:36
    uh and that's despite the ship having
  • 00:34:39
    you know half its flaps cooked off um you
  • 00:34:41
    can see the video on the X platform it's
  • 00:34:43
    quite exciting um so you know we we
  • 00:34:47
    think we'll be able to have to launch
  • 00:34:51
    reliably and repeatedly and quite
  • 00:34:53
    quickly um and the the the fundamental
  • 00:34:57
    holy grail breakthrough for rocketry
  • 00:35:00
    what the fundamental breakthrough that
  • 00:35:01
    is needed for life to become
  • 00:35:03
    multiplanetary is a rapidly reusable
  • 00:35:07
    reliable rocket
  • 00:35:10
    R for a pirate somehow um throw fire in
  • 00:35:14
    there um so Starship is the
  • 00:35:20
    first rocket design
  • 00:35:24
    where success is one of the possible
  • 00:35:26
    outcomes with full
  • 00:35:28
    reusability um so you for any given
  • 00:35:30
    project you have to say this is the
  • 00:35:32
    circle so right Venn diagrams um here
  • 00:35:35
    is a circle and is the success
  • 00:35:38
    dot in the circle um is is success in
  • 00:35:42
    the set of possible outcomes that uh you
  • 00:35:45
    know sounds pretty obvious but there are
  • 00:35:47
    often projects where
  • 00:35:49
    success is not in the set of possible
  • 00:35:51
    outcomes um and so so
  • 00:35:54
    Starship not only is full
  • 00:35:57
    reusability in the set of possible
  • 00:35:58
    outcomes it is being proven with each
  • 00:36:00
    launch um and and and I'm confident it
  • 00:36:03
    will succeed it's simply a matter of
  • 00:36:05
    time and you know if if we can get some
  • 00:36:11
    improvement in the speed of Regulation
  • 00:36:13
    we we could actually move a lot faster
  • 00:36:15
    um uh so that would that would be very
  • 00:36:19
    helpful and and in fact if if this if
  • 00:36:22
    something isn't done about um reducing
  • 00:36:26
    regulation and and sort of speeding up
  • 00:36:28
    approvals and to be clear I'm not
  • 00:36:30
    talking about anything unsafe it's
  • 00:36:32
    simply that the processing of the safe thing
  • 00:36:35
    can be done as fast as the
  • 00:36:38
    rocket is built not slower then uh then
  • 00:36:42
    then we could become a spacefaring
  • 00:36:43
    civilization and a multi-planet species
  • 00:36:46
    ultimately and be out there among the
  • 00:36:47
    stars in the future and
  • 00:36:52
    there's you know it's it's just very
  • 00:36:55
    like it's incredibly important that we
  • 00:36:57
    have things that that we find inspiring
  • 00:37:00
    that you look to the Future and say the
  • 00:37:03
    future is going to be better than the
  • 00:37:04
    past things to look forward to and like
  • 00:37:08
    like kids are a
  • 00:37:10
    good way to assess this like what
  • 00:37:12
    are kids fired up about and if you can
  • 00:37:15
    say you know you you could you know you
  • 00:37:18
    could be an astronaut on Mars you could
  • 00:37:20
    maybe one day uh go beyond the solar
  • 00:37:23
    system um we could make Star Trek Starfleet
  • 00:37:27
    Academy real um that is an exciting
  • 00:37:31
    future that is
  • 00:37:33
    inspiring um you know just I mean you
  • 00:37:36
    need things that move your heart right
  • 00:37:40
    um
  • 00:37:41
    yeah [ __ ] yeah [ __ ] yeah let's do it I
  • 00:37:47
    mean like like life can't just be about
  • 00:37:49
    solving one miserable problem after
  • 00:37:51
    another there's got to be things that
  • 00:37:53
    you look forward to as well yeah uh and
  • 00:37:55
    and do do you think you might have to
  • 00:37:57
    move it to a different jurisdiction to
  • 00:37:59
    move faster I've always wondered if like
  • 00:38:01
    what it's rocket technology is
  • 00:38:03
    considered Advanced weapons technology
  • 00:38:05
    so we can't just go do it you know in
  • 00:38:07
    another country yes ITAR yeah
  • 00:38:09
    interesting and if we don't do it other
  • 00:38:10
    countries could do it I mean they're so
  • 00:38:13
    far behind us but theoretically there's
  • 00:38:15
    a national
  • 00:38:17
    security you know justification here if
  • 00:38:20
    if somebody can put their thinking caps
  • 00:38:22
    on like do we want to have this
  • 00:38:23
    technology that you're building the
  • 00:38:25
    team's working so hard on stolen by
  • 00:38:26
    other countries
  • 00:38:27
    and then you know maybe they don't have
  • 00:38:30
    as much red tape I I wish people were
  • 00:38:32
    trying to steal it um so that no no
  • 00:38:36
    one's trying to steal it it's just
  • 00:38:39
    too this just it's too crazy
  • 00:38:42
    basically um and that's from you yeah
  • 00:38:45
    it's way too crazy Elon what do you think
  • 00:38:47
    um is going on that led to
  • 00:38:51
    Boeing building the Starliner the way
  • 00:38:53
    that they did they were able to get it
  • 00:38:56
    up
  • 00:38:59
    but not complete but can't complete they
  • 00:39:01
    can't finish can't finish and now you're
  • 00:39:04
    going to have to go up and
  • 00:39:06
    finish um
  • 00:39:09
    um well I mean I think Boeing is a
  • 00:39:13
    company that is you they actually do so
  • 00:39:16
    much business with the government they
  • 00:39:17
    have sort of impedance match to the
  • 00:39:19
    government so they're they're like
  • 00:39:21
    basically one notch away from the
  • 00:39:23
    government maybe two they're not far
  • 00:39:25
    from the government from an efficiency
  • 00:39:27
    stand point because they derive so much
  • 00:39:28
    of the revenue from the government um
  • 00:39:31
    and a lot of people think well SpaceX is
  • 00:39:33
    super dependent on the government and
  • 00:39:34
    actually no most of our revenue is
  • 00:39:35
    commercial um
  • 00:39:38
    so
  • 00:39:41
    um and and and and
  • 00:39:44
    there's I think at least up until
  • 00:39:47
    perhaps recently because they have a new
  • 00:39:49
    CEO who actually shows up in the factory
  • 00:39:52
    yeah um and the CEO before that
  • 00:39:54
    I think had a degree in accounting and
  • 00:39:56
    and never went to the factory
  • 00:39:58
    and didn't know how airplanes
  • 00:40:00
    flew um so I think if you are in charge
  • 00:40:03
    of a company that makes airplanes fly
  • 00:40:08
    and spacecraft go to orbit you need to
  • 00:40:11
    know it can't be a total mystery as to
  • 00:40:14
    how they work
  • 00:40:17
    yeah
  • 00:40:18
    so you know I'm like sure if somebody's
  • 00:40:22
    like running Coca-Cola or Pepsi and they're
  • 00:40:24
    like great at marketing or whatever um
  • 00:40:26
    that's that's fine because it's you know
  • 00:40:29
    it's it's not a sort of Technology
  • 00:40:31
    dependent business um you know or if
  • 00:40:34
    they're running a you know financial
  • 00:40:36
    consulting firm and their degree is in
  • 00:40:38
    accounting that makes sense um but I
  • 00:40:40
    think uh you know if you're if you're
  • 00:40:43
    the Cavalry Captain you should know how
  • 00:40:44
    to ride a horse pretty basic yeah
  • 00:40:49
    yeah it's like it's disconcerting if the
  • 00:40:51
    Cavalry Captain just falls off the horse
  • 00:40:53
    you
  • 00:40:55
    know the team
  • 00:40:58
    I'm sorry I'm scared of horses gets on
  • 00:40:59
    backwards I'm like
  • 00:41:02
    oops um sh shifting gears to AI uh Peter
  • 00:41:05
    was here earlier and he was talking
  • 00:41:07
    about how so far the only company to
  • 00:41:08
    really make money off AI is NVIDIA with
  • 00:41:11
    the chips um do you have a sense yet of
  • 00:41:14
    where you think the big applications
  • 00:41:16
    will be from AI is it going to be an
  • 00:41:19
    enabling self-driving is it going to be
  • 00:41:20
    enabling robots is it transforming
  • 00:41:23
    Industries I mean it's still I think
  • 00:41:25
    early in terms of where the big business
  • 00:41:28
    impact is going to be do you have a
  • 00:41:29
    sense
  • 00:41:40
    yet I I mean I think I think
  • 00:41:43
    the the the spending on AI probably runs
  • 00:41:47
    ahead of I mean does run ahead of the
  • 00:41:49
    revenue right now that's there's no
  • 00:41:51
    question about that um but the rate of
  • 00:41:55
    improvement of AI is faster than any
  • 00:41:57
    technology I've ever seen by far
  • 00:42:00
    and and and it
  • 00:42:04
    it's I mean like the for example the
  • 00:42:07
    Turing test used to be a thing now you
  • 00:42:11
    know your basic uh open-source random
  • 00:42:14
    LLM running on a freaking
  • 00:42:15
    Raspberry Pi probably could uh you know
  • 00:42:18
    beat the Turing test
  • 00:42:21
    um so
  • 00:42:23
    there's I I I I think actually
  • 00:42:27
    like like the the the good future of AI
  • 00:42:31
    is one of immense
  • 00:42:33
    Prosperity
  • 00:42:35
    where there is an age of abundance no
  • 00:42:39
    shortage of goods and
  • 00:42:41
    services everyone can have whatever they
  • 00:42:44
    want except for things we
  • 00:42:46
    artificially Define to be scarce like
  • 00:42:48
    some special
  • 00:42:49
    artwork um but but anything that is a
  • 00:42:52
    manufactured good or provided Service uh
  • 00:42:55
    will I think with the advent of AI plus
  • 00:42:58
    robotics that the cost of goods and
  • 00:43:01
    services will be
  • 00:43:04
    will Trend to zero like I'm not saying
  • 00:43:08
    it be actually zero but it'll
  • 00:43:11
    be every everyone will be able to have
  • 00:43:13
    anything they want that that's the good
  • 00:43:15
    future of course and you know in my view
  • 00:43:19
    that's probably 80% likely so look on
  • 00:43:20
    the bright
  • 00:43:22
    side only 20% 20% probability of
  • 00:43:25
    annihilation nothing
  • 00:43:27
    um is is the 20% like what does that
  • 00:43:30
    look like I don't know man I mean
  • 00:43:34
    frankly I do have to go engage in some
  • 00:43:36
    degree of of deliberate suspension of
  • 00:43:37
    disbelief with respect to AI in order to
  • 00:43:40
    sleep well um and even then um because I
  • 00:43:44
    I I I think the actual issue the the
  • 00:43:47
    most likely issue is like well how do we
  • 00:43:48
    find meaning in a world where AI can do
  • 00:43:50
    everything we can do a bit better that
  • 00:43:52
    that is that is perhaps the bigger
  • 00:43:54
    challenge um
  • 00:43:57
    although you know at this point I know
  • 00:43:59
    more and more people who are retired and
  • 00:44:01
    they seem to enjoy that life
  • 00:44:04
    so uh but I think that that may be maybe
  • 00:44:07
    there'll be some crisis of meaning like
  • 00:44:09
    because the computer can do everything
  • 00:44:11
    you can do but better so maybe that'll be
  • 00:44:15
  • 00:44:20
But really, you need the end effectors: you need the autonomous cars, and you need the humanoid robots, or general-purpose robots. Once you have general-purpose humanoid robots and autonomous vehicles, you can build anything, and I think there's no actual limit to the size of the economy. Obviously the mass of Earth would be one limit. But the economy is really just the average productivity per person times the number of people; that's the economy. And if you've got humanoid robots, where there's no real limit on the number of humanoid robots and they can operate very intelligently, then there's no meaningful limit to the economy.
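His definition of the economy, average productivity per worker times number of workers, can be sketched numerically. A toy model in Python; the workforce size, productivity figure, and the 2:1 robot ratio he floats later are illustrative assumptions, not sourced figures:

```python
# Toy model of the point above: economy ≈ average productivity per worker
# times the number of workers. All numbers are illustrative, not sourced.

def economic_output(workers: float, productivity_per_worker: float) -> float:
    """Total output as productivity times labor supply."""
    return workers * productivity_per_worker

humans = 4e9                 # hypothetical human workforce
productivity = 25_000.0      # hypothetical output per worker per year (USD)

baseline = economic_output(humans, productivity)

# Add humanoid robots at 2 robots per human (the 2:1 ratio floated later),
# each assumed as productive as a human worker.
robots = 2 * humans
with_robots = economic_output(humans + robots, productivity)

print(f"baseline:    ${baseline:,.0f}")
print(f"with robots: ${with_robots:,.0f}  ({with_robots / baseline:.0f}x)")
```

Under these assumptions, output simply scales with the number of workers, human or robot, which is why removing the cap on worker count removes the meaningful cap on the economy.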
  • 00:45:22
You guys just turned on Colossus, which is, I guess, the largest private compute cluster of GPUs anywhere?

It's the most powerful supercomputer of any kind.

Which sort of speaks to what David said, and kind of what Peter said, which is that a lot of the economic value of AI so far has gone entirely to Nvidia. But there are people with alternatives, and you're actually one with an alternative. Now, you have a very specific case, because Dojo is really about images, large images, huge video.
  • 00:45:56
Yeah. The Tesla problem is different from the LLM problem; the nature of the intelligence, what matters in the AI, is different. To the point you just made, in Tesla's case the context length is very long, so we've got gigabytes of context.

Gigabyte context windows, yeah.

Billions of tokens of context, an insane amount of context, because you've got seven cameras, and if you've got, say, a minute of several high-def cameras, that's gigabytes. So you need to compress. The Tesla problem is that you've got to compress a gigantic context into the pixels that actually matter, and condense that over time. So in both the time dimension and the space dimension, you've got to compress the pixels in space and the pixels over time.
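The "a minute of several high-def cameras is gigabytes" claim is easy to sanity-check with back-of-the-envelope arithmetic. The resolution, frame rate, and bit depth below are illustrative assumptions, not Tesla's actual camera specs:

```python
# Rough arithmetic behind "a minute of several high-def cameras is gigabytes".
# Resolution, frame rate, and bit depth are assumed for illustration only.

cameras = 7                  # the seven cameras mentioned
width, height = 1280, 960    # assumed ~1.2 MP sensor
fps = 36                     # assumed frame rate
bytes_per_pixel = 1          # assumed 8-bit raw sensor data

bytes_per_second = cameras * width * height * fps * bytes_per_pixel
gigabytes_per_minute = bytes_per_second * 60 / 1e9

print(f"raw video: {gigabytes_per_minute:.1f} GB per minute")
```

Even with these modest assumed specs, one minute of raw multi-camera video is on the order of tens of gigabytes, which is why aggressive spatial and temporal compression is required before inference.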
  • 00:47:06
And then that inference has to be done on a tiny computer, relatively speaking: a few-hundred-watt, Tesla-designed AI inference computer, which is still the best. There isn't a better thing we could buy from suppliers; the Tesla-designed AI inference computer that's in the cars is better than anything we could buy from any supplier.
  • 00:47:28
Just by the way, the Tesla AI hardware is extremely good. On the design: there was a technical paper, a deck that somebody on your team at Tesla published, and it was stunning to me. You designed your own transport control layer over Ethernet. You were like, Ethernet's not good enough for us; you have this TTPoE thing. Like, oh, we're just going to reinvent Ethernet and string these chips together. It's pretty incredible stuff that's happening over there.

Yeah, the Tesla chip design team is extremely, extremely good.

So is there a world where, for example, other people who over time need some sort of video use case or image use case... theoretically you'd say, why not, I have some extra cycles over here? Which should kind of make you a competitor of Nvidia.

Not intentionally, per se.
  • 00:48:22
But, yeah, there's training and inference, and we have the two projects at Tesla: Dojo, which is the training computer, and then our inference chip, which is in every car's inference computer. With Dojo, we've only had Dojo 1. Dojo 2 should be in volume towards the end of next year, and we think that will be comparable to a B200-type training system. So I guess there's some potential for that to be used as a service.

Dojo is... I guess I have somewhat improved confidence in Dojo, but I think we won't really know how good Dojo is until probably version three. It usually takes three major iterations on a technology for it to be excellent, and we'll only have the second major iteration next year. The third iteration, I don't know, maybe late '26 or something like that.
  • 00:49:48
How's the Optimus project going? I remember when we talked last, you said publicly that it's doing some light testing inside the factory, so it's actually being useful. What's the bill of materials for something like that at scale? When you start making it like you're making the Model 3 now, with a million of them coming off the factory line, would they cost $20,000, $30,000, $40,000, do you think?

Yeah. I mean, what I've really discovered is that anything made in sufficient volume will asymptotically approach the cost of its materials. Now, I should say, some things are constrained by the cost of intellectual property, like paying for patents and such; a lot of what's in a chip is paying royalties, and depreciation of the chip fab. But the actual marginal cost of the chips is very low.
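The "asymptotically approach the cost of materials" claim follows from amortizing fixed costs over volume. A minimal sketch; all dollar figures below are invented for illustration:

```python
# Sketch of "anything made in sufficient volume asymptotically approaches
# the cost of its materials": per-unit cost is materials plus fixed costs
# (tooling, fab depreciation, IP) spread across the run. Numbers are made up.

def unit_cost(volume: int, materials: float, fixed_costs: float) -> float:
    """Per-unit cost: materials plus fixed costs amortized over volume."""
    return materials + fixed_costs / volume

materials = 10_000.0        # hypothetical materials + labor per robot (USD)
fixed = 2_000_000_000.0     # hypothetical tooling/fab/IP cost (USD)

for volume in (10_000, 100_000, 1_000_000, 10_000_000):
    print(f"{volume:>10,} units -> ${unit_cost(volume, materials, fixed):,.0f} each")
```

As volume grows, the fixed-cost term vanishes and unit cost converges to the materials figure, which is the shape of the argument for the roughly $10,000 number he gives next.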
  • 00:50:47
So Optimus, obviously, is a humanoid robot; it weighs much less and is much smaller than a car, so you could expect that in high volume. And I'd say you also probably need three production versions of Optimus: you need to refine the design at least three major times, and then you need to scale production to the million-unit-plus-per-year level. I think at that point the labor and materials on Optimus is probably not much more than $10,000.

Yeah. And that's a decade-long journey, maybe?

Basically, think of it like this: Optimus will cost less than a small car, at scale volume, with three major iterations of the technology. So if a small car costs $25,000, it's probably, I don't know, $20,000 for an Optimus, for a humanoid robot that can be your buddy, like a combination of R2-D2 and C-3PO, but better. Honestly, I think people are going to get really attached to their humanoid robot. You watch Star Wars, and it's like, R2-D2 and C-3PO, I love those guys. They're awesome, their personality... and all R2 could do was beep at you; it couldn't speak English. C-3PO had to translate the beeps.
  • 00:52:24
You know, so you're in year two of that. If you did two or three years per iteration or something, it's a decade-long journey for this to hit some sort of scale.

I would say major iterations are less than two years, so it's probably on the order of five years, maybe six, to get to a million units a year.

And at that price point, everybody on planet Earth can afford one. Is it going to be 1 to 1? 2 to 1? What do you think, ultimately, if we're sitting here in 30 years, is the number of robots on the planet versus humans?

Yeah, I think the number of robots will vastly exceed the number of humans. Vastly. I mean, you have to ask, who would not want their robot buddy? Everyone wants a robot buddy. Especially if it can take your dog for a walk, mow the lawn, watch your kids, teach your kids.

We could also send it to Mars, send a lot of robots to Mars to do the work needed to make it a colonized planet.

Mars is already the robot planet; there's a whole bunch of robots there, rovers and a helicopter. Only robots. So yeah, I think the useful-humanoid-robot opportunity is the single biggest opportunity ever, because I think the ratio of humanoid robots to humans is going to be at least 2 to 1, maybe 3 to 1. Everybody will want one, and then there'll be a bunch of robots that you don't see that are making goods and services.

And you think it's one generalized robot that then learns how to do different tasks?

Yeah. I mean, we are a generalized...

We're a generalized robot; we're just made of meat.

Exactly. We're a meat puppet; I'm operating my meat puppet, you know. So yeah, we are, actually.
  • 00:54:37
And by the way, it turns out that as we're designing Optimus, we learn more and more about why humans are shaped the way they're shaped: why we have five fingers, why your little finger is smaller than your index finger, obviously why you have opposable thumbs, but also why, for example, the major muscles that operate your hand are actually in your forearm. Your fingers are operated by tendons, little strings; the muscles that actuate your fingers are located such that the vast majority of your finger strength actually comes from your forearm.

The current version of the Optimus hand has the actuators in the hand, and it has only 11 degrees of freedom, so it doesn't have all the degrees of freedom of a human hand, which has, depending on how you count it, roughly 25 degrees of freedom. It's also not strong enough in certain ways, because the actuators have to fit in the hand. So the next-generation Optimus hand, which we have in prototype form, has the actuators moved to the forearm, just like a human, and they operate the fingers through cables, just like a human hand. And the next-generation hand has 22 degrees of freedom, which we think is enough to do almost anything a human can do.
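The mechanical payoff of moving the actuators to the forearm can be sketched with basic statics: a cable routed over a pulley at the finger joint turns tension into joint torque, and a bigger remote actuator can pull harder than one that must fit inside the finger. All numbers below are illustrative, not Optimus specs:

```python
# Why tendon (cable) drive helps: a remote actuator in the forearm pulls a
# cable over a pulley at the finger joint. Joint torque is cable tension
# times pulley radius; static fingertip force is that torque divided by the
# finger's lever arm. Values are illustrative, not Optimus specs.

def fingertip_force(cable_tension_n: float, pulley_radius_m: float,
                    finger_length_m: float) -> float:
    """Static fingertip force from a single tendon acting over one joint."""
    joint_torque = cable_tension_n * pulley_radius_m   # N*m
    return joint_torque / finger_length_m              # N

# A larger actuator fits in the forearm than in the finger, so it can
# sustain higher cable tension (assumed figures).
in_hand_actuator = fingertip_force(cable_tension_n=50.0,
                                   pulley_radius_m=0.008,
                                   finger_length_m=0.08)
forearm_actuator = fingertip_force(cable_tension_n=200.0,
                                   pulley_radius_m=0.008,
                                   finger_length_m=0.08)

print(f"in-hand actuator:  {in_hand_actuator:.1f} N at the fingertip")
print(f"forearm actuator:  {forearm_actuator:.1f} N at the fingertip")
```

Same geometry, stronger remote actuator, several times the fingertip force: a simplified version of why human anatomy (and the new hand) puts the muscles in the forearm.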
  • 00:56:20
And presumably... I think it was written that X and Tesla may work together and provide services, but my thought went to: oh, if you just provide Grok to the robot, then the robot has a personality and can process voice and video and images and all of that stuff.
  • 00:56:36
As we wrap here: everybody talks about all the projects you're working on, but people don't know you have a great sense of humor.

That's not true.

Oh, you do, you do. People don't see it, but I'd say one of the funniest weeks of my life was when you did SNL, and I got to tag along behind the scenes. What are some of your funniest recollections of that chaotic, insane week, when we laughed for 12 hours a day? It was a little terrorizing the first couple of days.

Yeah, I was a bit worried in the beginning there, because frankly nothing was funny. Day one was rough. Rough.
  • 00:57:24
Yeah, so, I mean, there's like a rule...

Can't you guys just say it? Just say the stuff that ended up on the cutting-room floor. The funniest skits were the ones they didn't let you do.

That's what I'm saying: can you just say there were a couple of funny ones they didn't like?

Well, you can say it, so that he doesn't get...

I mean, how much time do we have here?

We should just give them one or two.

In your mind, which one do we regret most not getting on air?

You really want to hear that? I mean, it was a little spicy. It was a little funny.
  • 00:58:05
Okay, here we go. All right, here we go, guys.

All right. So one of the things that I think everyone's been wondering this whole time is: is Saturday Night Live actually live? Like, live live? Or do they have a delay, just in case there's a wardrobe malfunction or something? Is it, like, a five-second delay? What's really going on? But there's a way to test this.

Right, we came up with the way.

There's a way to test it, which is: we don't tell them what's going on. I walk on and say, this is the script, and I throw it on the ground. We're going to find out tonight, right now, if Saturday Night Live is actually live. And the way that we're going to do this is, I'm going to take my [ __ ]
    [Applause]
  • 00:59:03
out!

This is the greatest pitch ever.

And if you see my [ __ ], you know it's true; and if you don't, it's been a lie. It's been a lie all these years!

All these years. Now, we're going to bust them right now.

And we're pitching this...

Yeah, we're pitching this on Zoom, on, like, a Monday, when they're kind of hung over from the weekend. And Jason's on...

And Mike. You and Mike.

Essentially, the friends of mine who I think are quite funny. Jason's quite funny; like, Jason's the closest thing to Cartman that exists in real life.

We've established that he's Butters and I'm Cartman.

Yeah. And my friend Mike's pretty funny too. So we come in, like, guns blazing with ideas, and we didn't realize that's actually not how it works: normally it's actors, and they just get told what to do. And we're like, oh, right, you mean we can't just do funny things that we thought of?

They're watching this on the Zoom, aghast at the pitch.

Yeah, silence. And I'm like, is this thing working? Are we muted? Is our mic on? And they're like, we hear you. And then, after a long silence, Mike just says the word: crickets.

Crickets. And they're not laughing, not even a chuckle. I'm like, what's going on?

Elon explains the punch line.

Yes, exactly. So there's more to it.

Okay.

That's just the beginning.
  • 01:01:01
So Elon says... so then I said: I'm going to reach down into my pants, and I stick my hand in my pants, and I'm going to pull my [ __ ] out. I tell this to the audience, and the audience is going to be like, gasp. And then I pull out a baby rooster.

You know? Yes.

And it's like, okay, this is kind of PG, not that bad. It's like, this is my tiny [ __ ].

And it's like, what do you think? Do you think it's a nice [ __ ]? I mean, I like it.

And then Kate McKinnon walks out.

Yeah, exactly. And I'm like, oh no...

But you haven't heard half of it. So Kate McKinnon comes out, and she says, Elon, I expected you would have a bigger [ __ ].

Yeah. Like, I don't mean to disappoint you, Kate, but I hope you like it anyway. But Kate's got to come out with her cat, okay? And, you can see where this is going, I say: nice... wow, that's a nice [ __ ] you've got there, Kate. Wow, that's amazing. It looks a little wet; was it raining outside? And then: do you mind if I stroke your [ __ ]? Is that cool? And it's like, oh no. Elon, actually, can I hold your [ __ ]? Of course, of course, Kate, you can definitely hold my [ __ ]. And then, you know, we exchange...

I think just the audio version of this is pretty good.

Right. And, you know, so it's like, wow, I really like stroking your [ __ ].

And Elon says, I'm really enjoying you stroking my [ __ ].

Yes, of course. And, yeah, they're looking at us like, oh my god, what have we done, inviting these lunatics onto the program.

Yeah. They said, well, it is Mother's Day.

It's Mother's Day; we might not want to go with this one, moms are in the audience. And I'm like, well, that's a good point. Fair, fair; it might be a bit uncomfortable for all the moms in the audience. Maybe, I don't know, maybe they'll dig it, maybe they'll like it. So, yeah, that was the cold open that didn't make it.
  • 01:03:50
We didn't get that on the air, but we did fight for Doge.

Yes, and we got Doge on.

I mean, there's a bunch of things that I said that were just not in the script. They have these cue cards for what you're supposed to say, and I just didn't say it; I just went off the rails.

They didn't see that coming.

Yeah, it's live! Well, it's live.

And so Elon wanted to do Doge. This is the other one: he wanted to do Doge on late night, and he says, hey, J-Cal, can you make sure... yeah, I want to do the Doge-father, like, redo that scene from The Godfather.

I mean, you kind of need the music to cue things up. You bring me, on my daughter's wedding...

Listen, you asked for Doge, yeah? You got... and I give you Bitcoin, but you want Doge.

Exactly. You really got to set the mood. You've got to have the tuxedo, you've got to have, like, Marlon Brando: you come to me on this day of my Doge's wedding, and you ask me for your private keys? Are you even a friend? You call me the Doge-father?
  • 01:05:13
You know, so that had great potential. So they come to me, and I'm talking to Colin Jost, who's got a great sense of humor; he's amazing, he loves Elon. And he's like, we can't do it, because of the law and stuff like that, the law and liability. So I said, it's okay: Elon called Comcast, and he put in an offer, and they just accepted it. He just bought NBC, so it's fine.

Yeah.

And Colin Jost looks at me, and he's like, you're... you're serious? I'm like, yep, we own NBC now. And he's like, okay, well, that kind of changes things, doesn't it? I'm like, absolutely, we're a go on Doge. And then he's like, you're [ __ ] with me. And I'm like, I'm [ __ ] with you. Or are we? Or are we?

It was the greatest week. And that's, like, two of ten stories.

Yeah, we'll save the other eight.

But it was... and I was just so happy for you, to see you have a great week of just joy and fun and letting go. Because you were launching rockets, you're dealing with so much [ __ ] in your life; to have those moments, to share them, and just laugh, it was just so great. More of those moments. I think we've got to get you back on SNL. Who wants Elon back on SNL one more time?

All right, ladies and gentlemen: our bestie, Elon Musk!
  • 01:06:38
    [Applause]
  • 01:06:41
    [Laughter]
  • 01:06:44
    [Applause]
Tags
  • Elon Musk
  • Twitter
  • Tesla
  • SpaceX
  • Free Speech
  • Regulation
  • AI
  • Robotics
  • Government
  • Innovation