How To Build The Future: Sam Altman

00:46:52
https://www.youtube.com/watch?v=xXCBz_8hM9w

Summary

TL;DR: In an insightful discussion, Sam Altman of OpenAI emphasizes the pursuit of Artificial General Intelligence (AGI), reflecting on how taboo the goal was in the field's earlier years. The conversation traces OpenAI's growth and success, highlighting its technological advances and the value of extreme conviction on a single innovative bet. Altman expresses optimism about technology, envisioning a future where AGI and abundant energy reshape our lives and help solve substantial challenges like climate change and space colonization. Drawing on lessons from startups, he underscores the importance of a like-minded peer group and of staying focused on a singular objective, rooted in a deep belief in the scaling potential of deep learning. The impact of scale and AI's rapid progression toward reasoning and agent capabilities are discussed. Altman urges startups to embrace this technology shift and exploit AI's transformative power while adhering to sound business practices. The talk sketches a hopeful, if challenging, future defined by innovative potential and emerging technological paradigms.

Takeaways

  • 🚀 Pursuit of AGI is central to OpenAI's mission.
  • 🤖 AI advancement is crucial for future technological solutions.
  • 🎯 Focused conviction on a single bet aids startup success.
  • 🔥 Abundant energy could unlock limitless innovation.
  • 🌌 Future tech could solve massive challenges like climate change.
  • 💡 Startups should leverage the rapid advancements in AI.
  • 🔍 Scaling neural networks is a key strategy for OpenAI.
  • 🧑‍🤝‍🧑 Peer groups are crucial in fostering startup success.
  • 📈 Growth of AI capabilities is exponential and transformative.
  • ⏩ The startup environment is ripe for tech innovation.

Timeline

  • 00:00:00 - 00:05:00

    The discussion begins with a mention of the initial pursuit of Artificial General Intelligence (AGI) by OpenAI, despite skepticism in the field. The speaker expresses enthusiasm about current times being the best period for starting tech companies, due to the significant potential and changes brought by technological revolutions.

  • 00:05:00 - 00:10:00

    The conversation then shifts to the future of superintelligence (ASI), predicting it's thousands of days away. The participants discuss the compounding nature of progress and envision a future where climate issues are resolved and space colonies established. They exhibit optimism about the potential of near-limitless intelligence and energy.

  • 00:10:00 - 00:15:00

    There is a focus on energy abundance and its transformative impact on physical work and quality of life. The discussion acknowledges solar and nuclear energy advancements, stressing the importance of achieving significant energy breakthroughs.

  • 00:15:00 - 00:20:00

    The talk shifts to the speaker's early days with Y Combinator (YC) and his drive as a sophomore to join. The significance of surrounding oneself with ambitious peers is discussed, highlighting the role of collaborative environments in fostering innovation.

  • 00:20:00 - 00:25:00

    Key insights are shared on peer influence, innovation, and founding experiences with Loopt, Y Combinator, and OpenAI. The speaker compares early experiences at Stanford with the dynamic startup ecosystem.

  • 00:25:00 - 00:30:00

    Foundational insights about the formation of OpenAI are detailed. The speaker reflects on assembling a team passionate about AGI and overcoming skepticism from established figures in the AI field, favoring conviction and ambitious goals.

  • 00:30:00 - 00:35:00

    The strategic choice to focus on deep learning and scalability in OpenAI's development is emphasized. The narrative describes overcoming criticism and developing strong faith in scaling as a principle for technological advancement.

  • 00:35:00 - 00:40:00

    OpenAI's approach of focusing on scaling is discussed further, highlighting its success in pushing AI capabilities forward. The significant role of a highly talented research team is acknowledged as critical to OpenAI's progress.

  • 00:40:00 - 00:46:52

    The final segment discusses the rapid pace and future potential of AI development, emphasizing startup possibilities. The importance of adapting and scaling within this fast-moving landscape is stressed, with reflections on managing success and failures within OpenAI.

Video Q&A

  • What is AGI?

    AGI stands for Artificial General Intelligence, aiming for machines with the ability to understand and learn tasks like a human.

  • Who is Sam Altman?

    Sam Altman is a prominent entrepreneur and CEO of OpenAI, focused on advancing artificial intelligence.

  • What does ASI mean?

    ASI stands for Artificial Superintelligence, a stage where machines surpass human intelligence capabilities.

  • What is the current state of AI progress?

    AI is progressing rapidly, with advancements in deep learning and unsupervised models leading the way.

  • How does OpenAI plan to scale AI technology?

    OpenAI focuses on leveraging deep learning and scaling neural networks to improve AI capabilities.

  • Why is conviction in one bet important for startups?

    Focusing on one technology or idea allows startups to innovate deeply without spreading resources thin.

  • What are some future possibilities mentioned in the discussion?

    Future possibilities include fixing climate issues, establishing space colonies, and achieving abundant energy.

  • How does Sam Altman feel about technological optimism?

    He is very optimistic and encourages a techno-optimistic approach to solving big future challenges.

  • What significance does the peer group have in startups?

    Having a supportive and inspiring peer group is crucial for staying motivated and achieving success.

  • What role did unsupervised learning have in OpenAI's progress?

    Unsupervised learning was a pivotal step, especially with the development of models like GPT-2.

Subtitles (en)
  • 00:00:00
    we said from the very beginning we were
  • 00:00:01
    going to go after AGI at a time when in
  • 00:00:04
    the field you weren't allowed to say
  • 00:00:05
    that because that just seemed impossibly
  • 00:00:08
    crazy I remember a rash of criticism for
  • 00:00:11
    you guys at that moment we really wanted
  • 00:00:13
    to push on that and we were far less
  • 00:00:17
    resourced than Deep Mind and others and
  • 00:00:19
    so we said okay they're going to try a
  • 00:00:21
    lot of things and we've just got to pick
  • 00:00:22
    one and really concentrate and that's
  • 00:00:23
    how we can we can win here most of the
  • 00:00:25
    world still does not understand the
  • 00:00:27
    value of like a fairly extreme level of
  • 00:00:29
    conviction on one bet that's why I'm so
  • 00:00:31
    excited for startups right now it is
  • 00:00:33
    because the world is still sleeping on
  • 00:00:35
    all this to such an astonishing
  • 00:00:42
    degree we have a real treat for you
  • 00:00:44
    today Sam Altman thanks for joining us
  • 00:00:47
    thanks this is actually a reboot of your
  • 00:00:50
    series how to build the future and so
  • 00:00:53
    welcome back to the series that you
  • 00:00:55
    started years ago I was trying to think
  • 00:00:56
    about that something like that that's
  • 00:00:57
    wild I'm glad it's being rebooted that's
  • 00:00:59
    right let's talk about your newest essay
  • 00:01:02
    on uh the age of intelligence you know
  • 00:01:04
    is this the best time ever to be
  • 00:01:07
    starting a technology company let's at
  • 00:01:10
    least say it's the best time yet
  • 00:01:11
    hopefully there'll be even better times
  • 00:01:12
    in the future I sort of think with each
  • 00:01:14
    successive major technological
  • 00:01:16
    Revolution you've been able to do more
  • 00:01:18
    than you could before and I would expect
  • 00:01:21
    the companies to be more amazing and
  • 00:01:24
    impactful in everything else so yeah I
  • 00:01:26
    think it's the best time yet big
  • 00:01:28
    companies have the edge when things are
  • 00:01:30
    like moving slowly and not that Dynamic
  • 00:01:32
    and then when something like this or
  • 00:01:34
    mobile or the internet or semiconductor
  • 00:01:37
    Revolution happens or probably like back
  • 00:01:39
    in the days of the Industrial Revolution
  • 00:01:41
    that was when upstarts had their have
  • 00:01:42
    their Edge so yeah this is like and it's
  • 00:01:45
    been a while since we've had one of
  • 00:01:46
    these so this is like pretty exciting in
  • 00:01:48
    the essay you actually say a really big
  • 00:01:50
    thing which is ASI super intelligence is
  • 00:01:54
    actually thousands of days away maybe I
  • 00:01:58
    mean that's our hope our guess whatever
  • 00:02:00
    uh but that's a very wild statement yeah
  • 00:02:05
    um tell us about it I mean that's that's
  • 00:02:07
    big that is really big I can see a path
  • 00:02:10
    where the work we are doing just keeps
  • 00:02:12
    compounding and the rate of progress
  • 00:02:15
    we've
  • 00:02:16
    made over the last three years
  • 00:02:19
    continuous for the next three or six or
  • 00:02:21
    nine or whatever um you know nine years
  • 00:02:24
    would be like 3500 days or whatever if
  • 00:02:27
    we can keep this rate of improvement or
  • 00:02:28
    even increase it
  • 00:02:30
    that system will be quite capable of
  • 00:02:31
    doing a lot of things I think already uh
  • 00:02:34
    even a system like o1 is capable of
  • 00:02:37
    doing like quite a lot of things from
  • 00:02:39
    just like a raw cognitive IQ on a closed
  • 00:02:42
    end well- defined task in a certain area
  • 00:02:45
    I'm like o1 is like a very smart
  • 00:02:47
    thing and I think we're nowhere near the
  • 00:02:49
    limit of progress I mean that was an
  • 00:02:51
    architecture shift that sort of unlocked
  • 00:02:53
    yeah a lot and what I'm sort of hearing
  • 00:02:56
    is that these things are going to
  • 00:02:57
    compound we could hit some like
  • 00:02:59
    unexpected wall or we could be
  • 00:03:01
    missing something but it looks to us
  • 00:03:03
    like there's a lot of compounding in
  • 00:03:05
    front of us still to happen I mean this
  • 00:03:07
    essay is probably the most
  • 00:03:08
    techno-optimist of almost anything I've seen
  • 00:03:11
    out there some of the things we get to
  • 00:03:13
    look forward to uh fixing the climate
  • 00:03:15
    establishing a space Colony the
  • 00:03:18
    discovery of all of physics uh near
  • 00:03:20
    Limitless intelligence and abundant
  • 00:03:23
    energy I do think all of those things
  • 00:03:26
    and probably a lot more we can't even
  • 00:03:27
    imagine are maybe not that far away
  • 00:03:30
    and one of I I think it's like
  • 00:03:33
    tremendously exciting that we can talk
  • 00:03:34
    about this even semi-seriously now one
  • 00:03:37
    of the things that I always have loved
  • 00:03:39
    the most about YC is it
  • 00:03:41
    encourages slightly implausible degrees
  • 00:03:44
    of techno optimism and just a belief
  • 00:03:47
    that like ah you can figure this out and
  • 00:03:50
    you know in a world that I think is like
  • 00:03:52
    sort of consistently telling people this
  • 00:03:54
    is not going to work you can't do this
  • 00:03:55
    thing you can't do that I think the kind
  • 00:03:56
    of early PG Spirit of just encouraging
  • 00:03:59
    Founders to like think a little bit
  • 00:04:00
    bigger is like it is a special thing in
  • 00:04:03
    the world the Abundant energy thing
  • 00:04:05
    seems like a pretty big deal you know
  • 00:04:07
    there's sort of path a and path B you
  • 00:04:10
    know if we do achieve abundant
  • 00:04:13
    energy seems like this is a real unlock
  • 00:04:16
    almost any work not just you know
  • 00:04:20
    knowledge work but actually like real
  • 00:04:22
    physical work yeah could be unlocked
  • 00:04:24
    with robotics and with language and
  • 00:04:27
    Intelligence on tap like there's a real
  • 00:04:29
    age of abundance I think these are like
  • 00:04:32
    the key to in the two key inputs to
  • 00:04:35
    everything else that we want there's a
  • 00:04:37
    lot of other stuff of course that
  • 00:04:39
    matters but the unlock that would happen
  • 00:04:42
    if we could just get truly abundant
  • 00:04:44
    intelligence truly abundant
  • 00:04:46
    energy what we'd be able to make happen
  • 00:04:49
    in the world like both like come up with
  • 00:04:51
    better ideas more quickly and then also
  • 00:04:53
    like make them happen in in the physical
  • 00:04:55
    world like to say nothing of it'd be
  • 00:04:57
    nice to be able to run lots of AI and
  • 00:04:59
    that takes energy too uh I think that
  • 00:05:02
    would be a huge unlock and the fact that
  • 00:05:03
    it's I'm not sure to be whether like
  • 00:05:06
    whether it be surprised that it's all
  • 00:05:07
    happening the same time or if this is
  • 00:05:08
    just like the natural effect of an
  • 00:05:11
    increasing rate of technological
  • 00:05:12
    progress but it's certainly very
  • 00:05:15
    exciting time to be alive and great time
  • 00:05:17
    to do a startup well so we sort of walk
  • 00:05:19
    through this age of abundance you know
  • 00:05:22
    maybe you robots can actually
  • 00:05:24
    manufacture do anything almost all
  • 00:05:27
    physical labor can then result in
  • 00:05:29
    material progress not just for the most
  • 00:05:31
    wealthy but for everyone you know what
  • 00:05:34
    happens if we don't unleash unlimited
  • 00:05:36
    energy if you know there's some physical
  • 00:05:39
    law that prevents us from exactly that
  • 00:05:43
    solar Plus Storage is on a good enough
  • 00:05:46
    trajectory that even if we don't get a
  • 00:05:48
    big nuclear
  • 00:05:49
    breakthrough we would be like
  • 00:05:52
    okayish but for sure it seems that
  • 00:05:55
    driving the cost of energy down the
  • 00:05:56
    abundance of it up has like a very
  • 00:05:59
    direct impact on quality of life
  • 00:06:03
    and eventually we'll solve every problem
  • 00:06:05
    in physics so we're going to figure this
  • 00:06:06
    out it's just a question of when and we
  • 00:06:09
    deserve it uh there's you know someday
  • 00:06:12
    we'll be talking not about Fusion or
  • 00:06:14
    whatever but about the Dyson Sphere and
  • 00:06:16
    that'll be awesome too yeah this is a
  • 00:06:18
    point in time whatever feels like
  • 00:06:19
    abundant energy to us will feel like not
  • 00:06:21
    nearly enough to our great-grandchildren
  • 00:06:23
    and there's a big universe out there
  • 00:06:25
    with a lot of matter yeah wanted to
  • 00:06:27
    switch gears a little bit to sort of
  • 00:06:29
    your earlier you were mentioning uh Paul
  • 00:06:31
    Graham who brought us all together
  • 00:06:33
    really created Y Combinator he likes
  • 00:06:36
    to tell the story of how you know how
  • 00:06:38
    you got into YC was actually you were a
  • 00:06:40
    Stanford freshman um and he said you
  • 00:06:43
    know what this is the very first YC
  • 00:06:45
    batch in 2005 and he said you know what
  • 00:06:49
    you're a freshman and we will still be
  • 00:06:52
    here uh next time you should just wait
  • 00:06:55
    and you said I'm a sophomore and I'm
  • 00:06:58
    coming and
  • 00:07:00
    widely known in our community as you
  • 00:07:02
    know one of the most formidable people
  • 00:07:04
    where do you think that came from that
  • 00:07:07
    one story I think I'd
  • 00:07:10
    be happy if that like drifted off into history
  • 00:07:12
    well now it's it's purely immortalized
  • 00:07:14
    here here it is my memory of that is
  • 00:07:18
    that like I needed to reschedule an
  • 00:07:20
    interview one day or something um and PG
  • 00:07:23
    tried to like say like I just do it next
  • 00:07:25
    year or whatever and then I think I said
  • 00:07:27
    some nicer version of I'm a sophomore
  • 00:07:29
    and I'm coming but yeah you know these
  • 00:07:32
    things get slightly apocryphal it's
  • 00:07:34
    funny I
  • 00:07:35
    don't and I say this with no false
  • 00:07:38
    modesty I don't like identify as a
  • 00:07:41
    formidable person at all in fact I think
  • 00:07:42
    there's a lot of ways in which I'm
  • 00:07:44
    really not I do have a little bit of a
  • 00:07:47
    just
  • 00:07:48
    like I don't see why things have to
  • 00:07:52
    be the way they are and so I'm just
  • 00:07:54
    going to like do this thing that from
  • 00:07:57
    first principles seems like fine and I
  • 00:07:59
    always felt a little bit weird about
  • 00:08:01
    that and then I I remember one of the
  • 00:08:03
    things I thought was so great about YC
  • 00:08:05
    and still that I care so much about YC
  • 00:08:06
    about is it was like a collection of the
  • 00:08:09
    weird people who are just like I'm just
  • 00:08:11
    going to do my thing the part of this
  • 00:08:13
    that does resonate as a like accurate
  • 00:08:16
    self-identity thing is I do think you
  • 00:08:18
    can just do stuff or try stuff a
  • 00:08:21
    surprising amount of the time and I
  • 00:08:24
    think more of that is a good thing and
  • 00:08:26
    then I think one of the things that both
  • 00:08:27
    of us found at YC was a bunch of people
  • 00:08:31
    who all believed that you could just do
  • 00:08:33
    stuff for a long time when I was trying
  • 00:08:35
    to like figure out what made YC so
  • 00:08:36
    special I thought that it was like okay
  • 00:08:39
    you have this like
  • 00:08:41
    very amazing person telling you I you
  • 00:08:46
    can do stuff I believe in you and as a
  • 00:08:49
    young founder that felt so special and
  • 00:08:51
    inspiring and of course it is but the
  • 00:08:53
    thing that I didn't understand until
  • 00:08:55
    much later was it was the peer group of
  • 00:08:57
    other people doing that and one of the
  • 00:09:00
    biggest pieces of advice I would give to
  • 00:09:02
    young people now is finding that peer
  • 00:09:05
    group as early as you can was so
  • 00:09:07
    important to me um and I didn't realize
  • 00:09:11
    it was something that mattered I kind of
  • 00:09:12
    thought ah like I have you know I'll
  • 00:09:14
    figure it out on my own but man being
  • 00:09:17
    around like inspiring
  • 00:09:19
    peers so so valuable what's funny is
  • 00:09:21
    both of us did spend time at Stanford I
  • 00:09:23
    actually did graduate which is I
  • 00:09:25
    probably shouldn't have done that but I
  • 00:09:27
    did Stanford it's great you pursued the
  • 00:09:30
    path of uh you know far greater return
  • 00:09:33
    uh by dropping out but you know that was
  • 00:09:35
    a community that purportedly had a lot
  • 00:09:38
    of these characteristics but I was still
  • 00:09:40
    Beyond surprised at how much more potent
  • 00:09:43
    it was with a room full of Founders it
  • 00:09:45
    was I was just going to say the same
  • 00:09:46
    thing actually I liked Stanford a lot
  • 00:09:48
    yeah
  • 00:09:49
    but I was I did not feel surrounded by
  • 00:09:53
    people that made me like want to be
  • 00:09:55
    better and more ambitious and whatever
  • 00:09:58
    else and to the degree did the thing you
  • 00:10:00
    were competing with your peers on was
  • 00:10:02
    like who was going to get the internship
  • 00:10:04
    at which Investment Bank which I'm
  • 00:10:06
    embarrassed to say I fell on that trap
  • 00:10:07
    this is like how powerful peer groups
  • 00:10:09
    are um it was a very easy decision to
  • 00:10:12
    not go back to school after like seeing
  • 00:10:14
    what the like YC Vibe was like yeah uh
  • 00:10:17
    there's a powerful quote by uh Carl
  • 00:10:19
    Jung that I really love um it's you
  • 00:10:21
    know the world will come and ask you who
  • 00:10:24
    you are and if you don't know it will
  • 00:10:27
    tell you it sounds like being very
  • 00:10:29
    intentional about who you want to be and
  • 00:10:31
    who you want to be around as early as
  • 00:10:33
    possible is very important yeah this was
  • 00:10:36
    definitely one of my takeaways at least
  • 00:10:38
    for myself is you no one is immune to
  • 00:10:40
    peer pressure and so all you can do is
  • 00:10:42
    like pick good peers yeah obviously you
  • 00:10:44
    know you went on to create Loopt you
  • 00:10:47
    know sell that go to Green Dot and then
  • 00:10:49
    we ended up getting to work together at
  • 00:10:51
    YC talk to me about like the early days
  • 00:10:53
    of YC research like one of the really
  • 00:10:55
    cool things that you brought to YC was
  • 00:10:58
    this experimentation and and you sort of
  • 00:11:01
    I mean I I remember you coming back to
  • 00:11:02
    partner rooms and talking about some of
  • 00:11:04
    the rooms that you were getting to sit
  • 00:11:06
    in with like the Larrys and Sergeys of the
  • 00:11:08
    world and that you know AI was some sort
  • 00:11:11
    of at the tip of everyone's tongue
  • 00:11:13
    because it felt so close and yet it was
  • 00:11:16
    you know that was 10 years ago the thing
  • 00:11:19
    I
  • 00:11:22
    always thought would be the coolest
  • 00:11:24
    retirement job was to get to like run a
  • 00:11:25
    research lab and it was not specifically
  • 00:11:29
    to AI at that time when we started
  • 00:11:32
    talking about YC research well not only
  • 00:11:34
    was it going to it it did end up funding
  • 00:11:35
    like a bunch of different efforts and I
  • 00:11:38
    wish I could tell the story of like oh
  • 00:11:39
    was obvious that AI was going to work
  • 00:11:41
    and be the thing but like we tried a lot
  • 00:11:43
    of bad things too it around that
  • 00:11:47
    time I read a few books on like the
  • 00:11:50
    history of Xerox PARC and Bell Labs and
  • 00:11:53
    stuff and I think there were a lot of
  • 00:11:54
    people like it was in the air of Silicon
  • 00:11:55
    Valley at the time that we need to like
  • 00:11:57
    have good research Labs again and I just
  • 00:11:59
    thought it would be so cool to do and it
  • 00:12:01
    was sort of similar to what YC does and
  • 00:12:04
    that you're going to like allocate
  • 00:12:05
    Capital to smart people and sometimes
  • 00:12:07
    it's going to work and sometimes it's
  • 00:12:08
    not going to
  • 00:12:11
    and I just wanted to try it AI for sure
  • 00:12:14
    was having a mini moment this was like
  • 00:12:17
    kind of late 2014 2015 early 2016 was
  • 00:12:21
    like the super intelligence discussion
  • 00:12:24
    like the book Superintelligence was
  • 00:12:25
    happening Bostrom yep yeah DeepMind had
  • 00:12:29
    a few like impressive results but a
  • 00:12:31
    little bit of a different direction you
  • 00:12:33
    know I had been an AI nerd forever so I
  • 00:12:35
    was like oh it' be so cool to try to do
  • 00:12:37
    something but it's very hard to say was
  • 00:12:38
    ImageNet out yet ImageNet was out yeah yeah
  • 00:12:41
    for a while at that point so you could
  • 00:12:43
    tell if it was a hot dog or not you
  • 00:12:45
    could sometimes yeah that was getting
  • 00:12:48
    there yeah you know how did you identify
  • 00:12:50
    the initial people you wanted involved
  • 00:12:53
    in you know YC research and open AI I
  • 00:12:56
    mean Greg Greg Brockman was early in
  • 00:12:58
    retrospect it feels like this movie
  • 00:13:00
    montage and there were like all of these
  • 00:13:02
    like you know at the beginning of like
  • 00:13:03
    the Bai movie when you're like driving
  • 00:13:05
    around to find the people and whatever
  • 00:13:07
    and and they're like you son of a
  • 00:13:09
    I'm in right right like Ilya I like
  • 00:13:13
    heard he was really smart and then I
  • 00:13:15
    watched some video of his and he's also
  • 00:13:17
    now he's extremely smart like true true
  • 00:13:19
    genuine genius and Visionary but also he
  • 00:13:21
    has this incredible presence and so I
  • 00:13:23
    watched this video of his on YouTube or
  • 00:13:25
    something I was like I got to meet that
  • 00:13:26
    guy and I emailed him and he didn't
  • 00:13:27
    respond so I just like went to some
  • 00:13:29
    conference he was speaking at and we met
  • 00:13:31
    up and then after that we started
  • 00:13:32
    talking a bunch and and then like Greg I
  • 00:13:35
    had known a little bit from the early
  • 00:13:36
    stripe days what was that conversation
  • 00:13:38
    like though it's like I really like what
  • 00:13:40
    your your ideas about Ai and I want to
  • 00:13:43
    start a lab yes and one of the things
  • 00:13:46
    that worked really well in
  • 00:13:48
    retrospect was we said from the very
  • 00:13:51
    beginning we were going to go after AGI
  • 00:13:53
    at a time when in the field you weren't
  • 00:13:55
    allowed to say that because that just
  • 00:13:58
    seemed impossibly crazy and you know
  • 00:14:02
    borderline irresponsible to talk so that
  • 00:14:03
    got his attention immediately it got all
  • 00:14:06
    of the good young people's attention and
  • 00:14:08
    the derision whatever that word
  • 00:14:10
    is of the mediocre old people and I felt
  • 00:14:13
    like somehow that was like a really good
  • 00:14:14
    sign and really powerful and we were
  • 00:14:16
    like this rag tag group of people I mean
  • 00:14:19
    I was the oldest by a decent amount I
  • 00:14:21
    was like I guess I was 30 then and so
  • 00:14:24
    you had like these people who were like
  • 00:14:26
    those are these irresponsible young kids
  • 00:14:28
    who don't know anything about anything
  • 00:14:29
    and they're like saying these ridiculous
  • 00:14:31
    things and the people who that was
  • 00:14:33
    really appealing to I guess are the same
  • 00:14:36
    kind of people who would have said like
  • 00:14:37
    it's a you know I'm a sophomore and I'm
  • 00:14:39
    coming or whatever and they were like
  • 00:14:40
    let's just do this thing let's take a
  • 00:14:41
    run at
  • 00:14:42
    it and so we kind of went around and met
  • 00:14:45
    people one by one and then in different
  • 00:14:47
    configurations of groups and it kind of
  • 00:14:49
    came together over the course of in fits
  • 00:14:53
    and starts but over the course of like
  • 00:14:54
    nine months and then it started h i mean
  • 00:14:57
    and then it started it started happening
  • 00:14:59
    and one of my favorite like memories of
  • 00:15:02
    all of OpenAI
  • 00:15:04
    was Ilya had some reason that with
  • 00:15:07
    Google or something that we couldn't
  • 00:15:08
    start in we announced in December of
  • 00:15:10
    2015 but we couldn't start until January
  • 00:15:11
    of 2016 so like January 3rd something
  • 00:15:14
    like that of 2016 like very early in the
  • 00:15:17
    Month people come back from the holidays
  • 00:15:19
    and we go to Greg's
  • 00:15:21
    apartment maybe there's 10 of us
  • 00:15:23
    something like that and we sit around
  • 00:15:26
    and it felt like we had done this
  • 00:15:27
    Monumental thing to get it started
  • 00:15:30
    and everyone's like so what do we do
  • 00:15:32
    now and what a great moment it reminded
  • 00:15:35
    me of when startup Founders work really
  • 00:15:38
    hard to like raise a round and they
  • 00:15:40
    think like oh I accomplished this great
  • 00:15:42
    we did it and then you sit down and say
  • 00:15:44
    like now we got to like figure out
  • 00:15:45
    what we're going to do it's not time for
  • 00:15:47
    popping champagne that was actually the
  • 00:15:49
    starting gun and now we got to run yeah
  • 00:15:51
    and you have no idea how hard the race
  • 00:15:53
    is going to be it took us a long time to
  • 00:15:55
    figure out what we're going to do um but
  • 00:15:58
    one of the things I'm really amazingly
  • 00:16:01
    impressed by Ilya in particular but
  • 00:16:03
    really all of the early people about is
  • 00:16:05
    although it took a lot of twist and
  • 00:16:06
    turns to get
  • 00:16:09
    here the big picture of the original
  • 00:16:11
    ideas was just so incredibly right and
  • 00:16:15
    so they were like up on like one of
  • 00:16:16
    those flip charts or whiteboards I don't
  • 00:16:18
    remember which in Greg's apartment and
  • 00:16:22
    then we went off and you know did some
  • 00:16:24
    other things that worked or didn't work
  • 00:16:26
    or whatever some of them did and
  • 00:16:27
    eventually now we have this like
  • 00:16:29
    system and it feels very crazy and very
  • 00:16:34
    improbable looking backwards that we
  • 00:16:36
    went from there to here with so many
  • 00:16:39
    detours on the way but got where we were
  • 00:16:40
    pointing was deep learning even on that
  • 00:16:42
    flip chart initially yeah uh I mean more
  • 00:16:45
    specifically than that like do a big
  • 00:16:47
    unsupervised model and then solve RL was
  • 00:16:49
    on that flip chart one of the flip
  • 00:16:51
    charts from a very this is before Greg's
  • 00:16:53
    apartment but from a very early offsite
  • 00:16:55
    I think this is right I believe there
  • 00:16:57
    were three goals for the for the effort
  • 00:16:59
    at the time it was like figure out how
  • 00:17:02
    to do unsupervised learning solve RL and
  • 00:17:04
    never get more than 120 people missed on
  • 00:17:07
    the third one but right the like the
  • 00:17:10
    predictive direction of the first two is
  • 00:17:12
    pretty good so deep learning then the
  • 00:17:16
    second big one sounded like scaling like
  • 00:17:18
    the idea that you could scale that was
  • 00:17:21
    another heretical idea that people
  • 00:17:24
    actually found even offensive you know I
  • 00:17:26
    remember a rash of criticism for you
  • 00:17:29
    guys at that moment when we started yeah
  • 00:17:33
    the core beliefs were deep learning
  • 00:17:35
    works and it gets better with
  • 00:17:37
    scale and I think those were both
  • 00:17:40
    somewhat heretical beliefs at the time
  • 00:17:42
    we didn't know how predictably better it
  • 00:17:43
    got with scale that didn't come for a
  • 00:17:44
    few years later it was a hunch first and
  • 00:17:47
    then you got the data to show how
  • 00:17:48
    predictable it was but but people
  • 00:17:50
    already knew that if you made these
  • 00:17:51
    neural networks bigger they got better
  • 00:17:53
    yeah um like that was we were sure of
  • 00:17:56
    that um before we started
  • 00:18:00
and the word that keeps
  • 00:18:03
    coming to mind is like religious level
  • 00:18:05
    of belief was that that wasn't going to
  • 00:18:08
    stop everybody had some reason of oh
  • 00:18:11
    it's not really learning it's not really
  • 00:18:13
reasoning it can't really do this it's
  • 00:18:15
    you know it's like a parlor trick and
  • 00:18:18
    these were like the eminent leaders of
  • 00:18:20
    the field and more than just saying
  • 00:18:23
    you're wrong they were like you're wrong
  • 00:18:25
    and this
  • 00:18:26
    is like a bad thing to believe or bad
  • 00:18:29
    thing to say it was that there's got to
  • 00:18:30
    you you know this is like you're going
  • 00:18:32
    to perpetuate an AI winter you're going
  • 00:18:34
    to do this you're going to do that and
  • 00:18:36
    we were just like looking at these
  • 00:18:37
    results and saying they keep getting
  • 00:18:39
    better then we got the scaling results
  • 00:18:41
    it just kind of breaks my intuition even
  • 00:18:44
    now and at some point you have to just
  • 00:18:47
look at the scaling laws and say we're
  • 00:18:50
    going to keep doing this and this is
  • 00:18:51
    what we think it'll do
  • 00:18:52
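The "look at the scaling laws" bet can be sketched numerically: if loss falls as a power law in compute, you can fit the exponent on small runs and extrapolate to a run you haven't done yet. A minimal illustrative sketch (all numbers are made up for illustration, not real measurements; Python is used here only as a convenient notation):

```python
import numpy as np

# Synthetic "training loss" points following a power law in compute,
# L(C) = a * C**(-b). These values are illustrative, not measured.
compute = np.array([1e3, 1e4, 1e5, 1e6, 1e7])
loss = 5.0 * compute ** -0.05

# Fit in log-log space: log L = log a - b * log C (polyfit slope = -b).
slope, log_a = np.polyfit(np.log(compute), np.log(loss), 1)
a_fit, b_fit = np.exp(log_a), -slope

# Extrapolate the fitted law to 100x more compute than the largest run.
predicted_loss = a_fit * (1e9) ** (-b_fit)
print(f"fitted exponent b = {b_fit:.3f}, "
      f"predicted loss at 1e9 compute: {predicted_loss:.3f}")
```

The point of fitting in log-log space is that a power law becomes a straight line there, which is what makes the extrapolation predictable once you have data from a few scales.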
    and it also it was starting to feel at
  • 00:18:55
    that time
  • 00:18:57
    like something about learning was just
  • 00:19:00
    this
  • 00:19:02
    emergent phenomenon that was really
  • 00:19:04
    important and even if we didn't
  • 00:19:06
    understand all of the details in
  • 00:19:08
    practice here which obviously we didn't
  • 00:19:09
    and still
  • 00:19:10
    don't that there was something really
  • 00:19:12
fundamental going on it was the PG-ism
  • 00:19:15
    for this is we had like discovered a new
  • 00:19:16
square in the periodic table yeah and so
  • 00:19:19
    it we just we really wanted to push on
  • 00:19:21
    that and we were far less resourced than
  • 00:19:25
DeepMind and others and so we said okay
  • 00:19:27
    they're going to try a lot of things and
  • 00:19:29
    we've just got to pick one and really
  • 00:19:30
    concentrate and that's how we can we can
  • 00:19:31
    win here which is totally the right
  • 00:19:33
    startup takeaway and so we said well we
  • 00:19:38
    don't know what we don't know we do know
  • 00:19:40
    this one thing works so we're going to
  • 00:19:42
    really concentrate on that and I think
  • 00:19:44
    some of the other efforts were trying to
  • 00:19:46
    outsmart themselves in too many ways and
  • 00:19:48
    we just said we'll just we'll do the
  • 00:19:50
    thing in front of us and keep pushing on
  • 00:19:51
    it scale is this thing that I've always
  • 00:19:53
    been interested in um at kind of just
  • 00:19:56
    the emergent properties of scale for
  • 00:19:58
    everything for startups turns out for
  • 00:20:00
    deep learning models for a lot of other
  • 00:20:02
    things I think it's a very
  • 00:20:04
    underappreciated property and thing to
  • 00:20:06
    go after and I think it's you know when
  • 00:20:08
    in doubt if you have something that
  • 00:20:10
    seems like it's getting better with
  • 00:20:11
    scale I think you should scale it up I
  • 00:20:12
    think people want things to be uh you
  • 00:20:14
    know less is more but actually more is
  • 00:20:16
    more more is more we believed in that we
  • 00:20:18
    wanted to push on it I think one thing
  • 00:20:20
    that is not maybe that well understood
  • 00:20:23
about OpenAI is we had just this even
  • 00:20:27
    when we were like pretty unknown
  • 00:20:29
    we had a crazy talented team of
  • 00:20:31
    researchers you know if you have like
  • 00:20:33
    the smartest people in the world you can
  • 00:20:34
    push on something really
  • 00:20:36
hard yeah and they're motivated and/or
  • 00:20:39
    you created sort of one of the sole
  • 00:20:40
    places in the world where they could do
  • 00:20:42
    that like one of the stories I heard is
  • 00:20:45
    just even getting access to compute
  • 00:20:47
    resources even today is this crazy thing
  • 00:20:51
    and embedded in some of the criticism
  • 00:20:54
    from maybe the Elders of the industry at
  • 00:20:56
    the moment was sort of that you know
  • 00:20:58
you're going to waste a lot of
  • 00:21:00
    resources and somehow that's going to
  • 00:21:02
    result in an AI winter like people won't
  • 00:21:04
    give resources anymore it's funny people
  • 00:21:06
    were never sure if we were going to
  • 00:21:09
    waste resources or if we were doing
  • 00:21:11
    something kind of vaguely immoral by
  • 00:21:14
    putting in too much resources and you
  • 00:21:15
    were supposed to spread it across lots
  • 00:21:17
    of bets rather than like conviction on
  • 00:21:19
    one most of the world still does not
  • 00:21:22
    understand the value of like a fairly
  • 00:21:23
    extreme level of conviction on one bet
  • 00:21:26
    and so we said okay we have this
  • 00:21:27
    evidence we believe in this
  • 00:21:29
    we're going to at a time when like the
  • 00:21:30
    normal thing was we're going to spread
  • 00:21:31
    against this bet and that bet and that
  • 00:21:33
    bet definite Optimist you're a definite
  • 00:21:35
    Optimist and I think across like many of
  • 00:21:38
    the successful YC startups you see a
  • 00:21:40
    version of that again and again yeah
  • 00:21:42
    that sounds right when the world gives
  • 00:21:43
    you sort of push back and the push back
  • 00:21:46
    doesn't make sense to you you should do
  • 00:21:47
    it anyway totally one of the many things
  • 00:21:50
    that I'm very grateful
  • 00:21:52
    about getting exposure to from the world
  • 00:21:54
    of startups is how many times you see
  • 00:21:57
    that again and again and again and
  • 00:21:58
    before I think before YC I I really had
  • 00:22:01
    this deep belief that somewhere in the
  • 00:22:04
    world there were adults in charge adults
  • 00:22:07
    in the room and they knew what was going
  • 00:22:09
    on and someone had all the answers and
  • 00:22:11
    you know if someone was pushing back on
  • 00:22:12
    you they probably knew what was going on
  • 00:22:15
    and the degree to which I Now understand
  • 00:22:18
    that you know to pick up the earlier
  • 00:22:21
    phrase you can just do stuff you can
  • 00:22:22
    just try stuff no one has all the
  • 00:22:24
    answers there are no like adults in the
  • 00:22:25
    room that are going to magically tell
  • 00:22:27
    you exactly what to do um and you just
  • 00:22:29
    kind of have to like iterate quickly and
  • 00:22:31
    find your way that was like a big unlock
  • 00:22:33
    in life for me to understand there is a
  • 00:22:35
    difference between being uh High
  • 00:22:37
    conviction just for the sake of it and
  • 00:22:40
    if you're wrong and you don't adapt and
  • 00:22:42
    you don't try to be like truth seeking
  • 00:22:44
    it still is
  • 00:22:46
    really not that effective the thing that
  • 00:22:49
    we tried to do was really just believe
  • 00:22:53
    whatever the results told us and really
  • 00:22:57
    kind of try to go do the thing in front
  • 00:22:58
    of us and there were a lot of things
  • 00:23:00
    that we were high conviction and wrong
  • 00:23:02
    on but as soon as we realized we were
  • 00:23:04
    wrong we tried to like fully embrace it
  • 00:23:07
    conviction is great until the moment you
  • 00:23:08
    have data one way or the other and there
  • 00:23:10
    are a lot of people who hold on it past
  • 00:23:11
    the moment of data so it's it's
  • 00:23:13
    iterative it's not just they're wrong
  • 00:23:15
    and I'm right you have to go show your
  • 00:23:18
    work but there is a long moment where
  • 00:23:20
    you have to be willing to operate
  • 00:23:21
    without
  • 00:23:22
    data and at that point you do have to
  • 00:23:24
    just sort of run on conviction yeah it
  • 00:23:26
    sounds like there's a focusing aspect
  • 00:23:28
    there too like you had to make a choice
  • 00:23:31
    and that choice had better you know you
  • 00:23:34
    didn't have infinite choices and so you
  • 00:23:37
    know the prioritization itself was an
  • 00:23:39
    exercise that made it much more likely
  • 00:23:41
    for you to succeed I wish I could go
  • 00:23:43
    tell you like oh we knew exactly what
  • 00:23:45
    was going to happen and it was you know
  • 00:23:47
    we had this idea for language models
  • 00:23:49
    from the beginning and you know we kind
  • 00:23:51
    of went right to this but obviously the
  • 00:23:54
story of OpenAI is that we did a lot
  • 00:23:55
    of things that helped us develop some
  • 00:23:57
    scientific understanding but we're not
  • 00:24:00
    on the short path if we knew then what
  • 00:24:03
    we know now we could have speedrun this
  • 00:24:05
    whole thing to like an incredible degree
  • 00:24:07
    doesn't work that way like you don't get
  • 00:24:08
    to be right at every guess and so we
  • 00:24:11
    started off with a lot of assumptions
  • 00:24:14
    both about the direction of Technology
  • 00:24:16
    but also what kind of company we were
  • 00:24:17
    going to be and how we were going to be
  • 00:24:18
    structured and how AGI was going to go
  • 00:24:20
    and all of these things and we have been
  • 00:24:24
    like humbled and badly wrong many many
  • 00:24:27
    many times and one of our strengths is
  • 00:24:32
    the ability to get punched in the face
  • 00:24:33
    and get back up and keep going this
  • 00:24:35
    happens for scientific bets for uh you
  • 00:24:38
    know being willing to be wrong about a
  • 00:24:40
    bunch of other things we thought about
  • 00:24:41
    how the world was going to work and what
  • 00:24:43
    the sort of shape of the product was
  • 00:24:44
    going to
  • 00:24:45
    be again we had no idea or I at least
  • 00:24:49
    had no idea maybe Alec Radford did I had
  • 00:24:50
    no idea that language models were going
  • 00:24:52
    to be the thing um you know we started
  • 00:24:54
working on robots and agents playing video
  • 00:24:56
    games and all these other things then a
  • 00:24:58
few years later GPT-3 happened that was
  • 00:25:02
    not so obvious at the time yeah it
  • 00:25:03
    sounded like there was a a key Insight
  • 00:25:06
    around positive or negative sentiment
  • 00:25:08
around GPT-1 even before GPT-1 Oh before
  • 00:25:12
    he I think the paper was called the
  • 00:25:14
unsupervised sentiment neuron and I think
  • 00:25:16
    Alec did it alone by the way Alec
  • 00:25:19
    is this unbelievable outlier of a human
  • 00:25:22
    and so he did this incredible work which
  • 00:25:28
    was just looking at he he noticed there
  • 00:25:30
    was one neuron that was flipping
  • 00:25:31
    positive or negative sentiment as it was
  • 00:25:33
    doing these generative Amazon reviews I
  • 00:25:36
    think other researchers might have hyped
  • 00:25:39
    it up more made a bigger deal out of it
  • 00:25:40
or whatever but you know it was Alec so
  • 00:25:42
    it took people a while to I think fully
  • 00:25:44
    internalize what a big deal it was and
  • 00:25:46
he then did GPT-1 and somebody else
  • 00:25:48
scaled it up into GPT-2 um but it was off
  • 00:25:51
    of this Insight that there
  • 00:25:54
    was something uh amazing happening
  • 00:25:59
    where and at at the time unsupervised
  • 00:26:01
    learning was just not really working so
  • 00:26:04
    he noticed this one really interesting
  • 00:26:06
    property which is there was a neuron
  • 00:26:08
    that was flipping positive or negative
  • 00:26:10
    with sentiment and yeah that led to the
  • 00:26:13
    GPT series I guess one of the things
  • 00:26:16
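The sentiment-neuron observation described above, one unit whose activation tracks positive versus negative sentiment, can be sketched as a simple probe over hidden activations. This is an illustrative toy with synthetic data, not the original mLSTM experiment; the unit index and all numbers are made up:

```python
import numpy as np

# Toy sketch of a "sentiment neuron" probe: given per-review hidden
# states from an unsupervised model, find the single unit whose sign
# best predicts positive vs. negative sentiment.
rng = np.random.default_rng(0)
n_reviews, n_units = 200, 64
labels = rng.integers(0, 2, size=n_reviews)      # 1 = positive review

# Fake activations: unit 17 carries sentiment, the others are noise.
acts = rng.normal(size=(n_reviews, n_units))
acts[:, 17] += 3.0 * (2 * labels - 1)

# Score each unit by how well thresholding its activation at zero
# classifies the reviews, then pick the best-scoring unit.
accuracy = ((acts > 0).astype(int) == labels[:, None]).mean(axis=0)
best = int(np.argmax(accuracy))
print(f"best unit: {best}, accuracy: {accuracy[best]:.2f}")
```

The striking part of the real result was that such a unit emerged purely from next-character prediction on Amazon reviews, with no sentiment labels in training; the probe above only illustrates how you would go looking for it.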
that Jake Heller from Casetext uh we I
  • 00:26:19
    think of him as maybe I mean not
  • 00:26:21
    surprisingly a YC Alum who got access to
  • 00:26:25
both uh 3.5 and 4 and he described
  • 00:26:29
    getting four as sort of the big moment
  • 00:26:32
    Revelation because 3.5 would still do
  • 00:26:36
    yeah I mean it would hallucinate more
  • 00:26:38
    than he could use in a legal setting and
  • 00:26:41
    then with four it reached the point
  • 00:26:44
    where if he chopped the prompts down
  • 00:26:46
    small enough into workflow he could get
  • 00:26:48
    it to do exactly what what he wanted and
  • 00:26:51
    he built you know huge test cases around
  • 00:26:54
    it and then sold that company for $650
  • 00:26:56
    million so it's uh you know I think of
  • 00:26:59
    him as like one of the first to
  • 00:27:01
commercialize GPT-4 in a relatively grand
  • 00:27:04
    fashion I remember that conversation
  • 00:27:06
with him yeah with GPT-4 like that was
  • 00:27:09
    one of the few moments in that thing
  • 00:27:11
    where I was like okay we have something
  • 00:27:12
    really great on our hands um when we
  • 00:27:15
first started trying to like sell GPT-3
  • 00:27:17
to Founders they would be like
  • 00:27:19
    it's cool it's doing something amazing
  • 00:27:21
    it's an incredible demo
  • 00:27:24
    but with the possible exception of
  • 00:27:28
copywriting no great businesses were
  • 00:27:30
built on GPT-3 and then 3.5 came along
  • 00:27:33
    and
  • 00:27:33
    people startups like YC startups in
  • 00:27:36
    particular started to do interest like
  • 00:27:37
    it no longer felt like we were pushing a
  • 00:27:39
    boulder uphill so like people actually
  • 00:27:40
    wanted to buy the thing we were selling
  • 00:27:41
    totally and then
  • 00:27:44
    four we kind of like got the like just
  • 00:27:47
    how many gpus can you give me oh yeah
  • 00:27:49
    moment like very quickly after giving
  • 00:27:51
    people access so we felt like okay we
  • 00:27:53
    got something like really good on our
  • 00:27:54
    hands so you you knew actually from your
  • 00:27:57
    users that totally like when the when
  • 00:27:59
    the uh model dropped itself and you got
  • 00:28:02
    your hands on it it was like well this
  • 00:28:04
    this is better we were totally impressed
  • 00:28:06
    then too we had all of these like tests
  • 00:28:09
    that we did on it that were very it like
  • 00:28:11
    looked great and it could just do these
  • 00:28:13
    things that we were all super impressed
  • 00:28:15
    by also like when we were all just
  • 00:28:17
    playing around with it and like getting
  • 00:28:19
    samples back I was like wow it's like it
  • 00:28:20
    can do this now and they were it can
  • 00:28:22
    rhyme and it can like tell a funny joke
  • 00:28:24
    slightly funny joke and it can like you
  • 00:28:26
    know do this and that and so it felt
  • 00:28:29
    really great but you know you never
  • 00:28:31
    really know if you have a hit product on
  • 00:28:33
    your hands until you like put it in
  • 00:28:35
    customer hands yeah you're always too
  • 00:28:36
    impressed with your own work yeah and
  • 00:28:39
    and so we were all excited about it we
  • 00:28:41
    were like oh this is really quite good
  • 00:28:43
    but until like the test happens it's
  • 00:28:46
    like the real test is yeah the real test
  • 00:28:48
    is users yeah so there's some anxiety
  • 00:28:50
    until that until that moment happens
  • 00:28:53
    yeah I wanted to switch gears a little
  • 00:28:54
    bit so before you created obviously one
  • 00:28:57
    of the craziest AI Labs ever to be
  • 00:29:00
    created um you started at 19 at YC with
  • 00:29:04
a company called Loopt which was uh
  • 00:29:07
basically find my friends
  • 00:29:09
    geolocation you know probably what 15
  • 00:29:12
    years before Apple ended up making it
  • 00:29:14
    too early in any case yeah yeah what
  • 00:29:16
    Drew you to that particular idea I was
  • 00:29:19
    like interested in Mobile phones and I
  • 00:29:22
    wanted to do something that got to like
  • 00:29:25
    use mobile phone this was when like
  • 00:29:26
    mobile was just starting was like you
  • 00:29:28
    know still 3 years or years before the
  • 00:29:30
    iPhone but it was clear that carrying
  • 00:29:33
    around
  • 00:29:34
    computers in our pockets was somehow a
  • 00:29:38
    very big deal I mean that's hard to
  • 00:29:39
    believe now that there was a moment when
  • 00:29:42
    phones were actually literally you just
  • 00:29:44
    they were just a phone they were an
  • 00:29:45
    actual phone yeah yeah I mean I try not
  • 00:29:47
    to use it as an actual phone ever really
  • 00:29:49
    I still remember the first phone I got
  • 00:29:52
    that had internet on it and it was this
  • 00:29:55
    horrible like text based mostly
  • 00:29:58
    text-based browser it was really slow
  • 00:30:00
    you could like you know do like you
  • 00:30:01
    could so painfully and so slowly check
  • 00:30:03
    your email um but I was like a I don't
  • 00:30:07
    know in high school sometime in high
  • 00:30:09
    school and I got a phone that could do
  • 00:30:10
    that versus like just text and call and
  • 00:30:13
    I was like hooked right then yeah I was
  • 00:30:15
    like ah this is this is not a phone this
  • 00:30:17
    is like a computer we can carry and
  • 00:30:19
    we're stuck with a dial pad for this
  • 00:30:20
    accident of history but this is going to
  • 00:30:22
    be awesome and I mean now you have
  • 00:30:25
    billions of people who they don't have a
  • 00:30:28
    computer like to us growing up you know
  • 00:30:30
    that that actually uh was your first
  • 00:30:32
    computer not physically is a replica or
  • 00:30:35
    like another copy of my first computer
  • 00:30:37
which is an LC II yeah so this is what a
  • 00:30:39
    computer was to us growing up and the
  • 00:30:41
    idea that you would carry this little
  • 00:30:43
    black mirror like kind of we've come a
  • 00:30:46
    long way unconscionable back then yeah
  • 00:30:49
    so you know even then you like
  • 00:30:50
    technology and what was going to come
  • 00:30:53
    was sort of in your brain yeah I was
  • 00:30:55
    like a real I mean I still am a real
  • 00:30:56
    tech nerd but I always that was what I
  • 00:31:00
    spent my Friday nights thinking about
  • 00:31:02
    and then uh one of the harder parts of
  • 00:31:04
    it was we didn't have the App Store the
  • 00:31:06
    iPhone didn't
  • 00:31:08
    exist uh you ended up being a big part
  • 00:31:10
    of that launch I think a small part but
  • 00:31:12
yes we did get to be a little part of it
  • 00:31:14
    it was a great experience for me to have
  • 00:31:16
    been through because I I kind of like
  • 00:31:18
    understood what it is like to go through
  • 00:31:21
    a platform shift and how messy the
  • 00:31:22
    beginning is and how much like little
  • 00:31:25
    things you do can shape the direction it
  • 00:31:26
    all goes I I was definitely on the other
  • 00:31:28
    side of it then like I was watching
  • 00:31:29
    somebody else create the platform shift
  • 00:31:32
    but it was a super valuable
  • 00:31:35
    experience to get to go through and sort
  • 00:31:37
    of just see what how it happens and how
  • 00:31:40
    quickly things change and how you adapt
  • 00:31:42
    through it what was that experience like
  • 00:31:44
    you ended up selling that company uh was
  • 00:31:47
    probably the first time you were
  • 00:31:49
    managing people and you know doing
  • 00:31:51
    Enterprise sales all of these things
  • 00:31:53
    were useful lessons from that first
  • 00:31:55
    experience I mean it obviously was not a
  • 00:31:57
successful company um it
  • 00:32:00
    was and so it's a very painful thing to
  • 00:32:03
    go through but the rate of experience
  • 00:32:04
    and education was incredible another
  • 00:32:08
    thing that PG said or quoted somebody
  • 00:32:09
    else saying but always stuck with me is
  • 00:32:10
    your 20s are always an apprenticeship
  • 00:32:12
    but you don't know for what and then you
  • 00:32:13
    do your real work later and I did learn
  • 00:32:16
    quite a lot and I'm very grateful for it
  • 00:32:18
    it was like a difficult experience and
  • 00:32:22
    we never found product Market fit really
  • 00:32:24
    and we also never like really found a
  • 00:32:26
    way to get to escape velocity which is
  • 00:32:27
    just always hard to do there is nothing
  • 00:32:30
    that I that I have ever heard of that
  • 00:32:32
    has a higher rate of generalized
  • 00:32:34
    learning than doing a startup so it was
  • 00:32:37
    great in that sense you know when you're
  • 00:32:39
    19 and 20 like riding the wave of some
  • 00:32:42
    other platform shift this shift from you
  • 00:32:44
    know dumb cell phones to smartphones and
  • 00:32:48
    mobile and you know here we are many
  • 00:32:51
    years later and your next ACT was
  • 00:32:53
    actually you know I mean I guess two
  • 00:32:55
    acts later literally spawning
  • 00:32:58
    one of the major platform sh we all get
  • 00:33:00
    old yeah but that's really what's
  • 00:33:02
    happening you know uh 18 20 year olds
  • 00:33:05
    are deciding that they could get their
  • 00:33:08
    degree but they're going to miss the
  • 00:33:10
    wave like cuz all of the stuff that's
  • 00:33:12
    great everything's happening right now
  • 00:33:14
    like proud do you have an intuitive
  • 00:33:16
    sense like speaking to even a lot of the
  • 00:33:19
you know really great billion-dollar
  • 00:33:21
    company Founders some of them are just
  • 00:33:24
not that aware of what's happening like
  • 00:33:26
    there're C to it's wild I think that's
  • 00:33:29
    why I'm so excited for startups right
  • 00:33:31
    now is because the world is still
  • 00:33:33
    sleeping on all of this to such an
  • 00:33:34
    astonishing degree yeah and then you
  • 00:33:36
    have like the YC Founders being like no
  • 00:33:38
    no I'm going to like do this amazing
  • 00:33:40
    thing and do it very quickly yeah it
  • 00:33:42
    reminds me of when um Facebook almost
  • 00:33:45
    missed mobile because they were making
  • 00:33:47
    web software and they were really good
  • 00:33:49
    at it yeah and um like they they I mean
  • 00:33:53
    they had to buy Instagram like Snapchat
  • 00:33:55
    right up yeah and WhatsApp so um it's
  • 00:33:58
    interesting the platform shift is always
  • 00:34:00
    built by the people who are young with
  • 00:34:03
    no prior knowledge it's it is I think
  • 00:34:07
    it's great so there's this other aspect
  • 00:34:10
    that's interesting in that I think
  • 00:34:12
    you're you know you and Elon and uh
  • 00:34:15
    Bezos and a bunch of people out there
  • 00:34:17
    like they sort of start their Journey as
  • 00:34:20
    Founders you know really you know
  • 00:34:24
whether it's Loopt or Zip2 or you
  • 00:34:26
know really in maybe pure software
  • 00:34:28
    like it's just a different thing that
  • 00:34:30
    they start and then later they you know
  • 00:34:32
    sort of get to level up you know is
  • 00:34:34
    there a path that you recommend at this
  • 00:34:36
    point if people are thinking you know I
  • 00:34:38
    want to work on the craziest hard tech
  • 00:34:40
    thing first should they just run towards
  • 00:34:42
    that to the extent they can or is there
  • 00:34:45
    value in you know sort of solving the
  • 00:34:47
    money problem first being able to invest
  • 00:34:49
    your own money like very deeply into the
  • 00:34:52
    next thing it's a really interesting
  • 00:34:55
    question it was definitely helpful
  • 00:34:58
    that I could just like write the early
  • 00:34:59
checks for OpenAI and I think it would
  • 00:35:02
    have been hard to get somebody else to
  • 00:35:03
    do that at the very beginning um and
  • 00:35:05
    then Elon did it a lot at much higher
  • 00:35:07
    scale which I'm very grateful for and
  • 00:35:09
    then other people did after that and and
  • 00:35:12
    there's other things that I've invested
  • 00:35:13
    in that I'm really happy to have been
  • 00:35:15
    able to support and I don't I think it
  • 00:35:17
    would have been hard to get other people
  • 00:35:18
    to to do it um so that's great for sure
  • 00:35:22
    and I did like we were talking about
  • 00:35:24
    earlier learn these extremely valuable
  • 00:35:28
    lessons but I also feel like I kind of
  • 00:35:30
    like was wasting my time for lack of a
  • 00:35:33
better phrase working on Loopt I don't
  • 00:35:35
    I definitely don't regret it it's like
  • 00:35:37
    all part of the tapestry of life and I
  • 00:35:38
    learned a ton and whatever else what
  • 00:35:41
    would you have done differently or what
  • 00:35:43
    would you tell yourself from like now to
  • 00:35:45
    in a Time cap in like time travel
  • 00:35:48
    capsule that would show up on your desk
  • 00:35:50
    at Stanford when you were 19 well it's
  • 00:35:52
    hard because AI was always the thing I
  • 00:35:54
    most wanted to do and AI just like I
  • 00:35:55
    went to school to study AI but at the
  • 00:35:58
    time I was working in the AI lab the one
  • 00:35:59
thing that they told you is definitely
  • 00:36:01
    don't work on neural networks we tried
  • 00:36:03
    that it doesn't work a long time ago I
  • 00:36:05
    think I could have picked a much better
  • 00:36:07
thing to work on than Loopt I don't know
  • 00:36:08
    exactly what it would have been but it
  • 00:36:10
    all works out it's fine yeah there's
  • 00:36:12
    this long history of people building
  • 00:36:14
    more technology to help improve other
  • 00:36:17
    people's lives and I I actually think
  • 00:36:19
    about this a lot like I think about the
  • 00:36:21
    people that made that computer and I
  • 00:36:23
    don't know them um you know they're many
  • 00:36:26
    of them probably long retired
  • 00:36:28
    but I am so grateful to them yeah and
  • 00:36:31
    some people worked super hard to make
  • 00:36:34
    this thing at the limits of technology I
  • 00:36:36
got a copy of that on my eighth birthday
  • 00:36:38
    and it totally changed my life yeah and
  • 00:36:41
    the lives of a lot of other people too
  • 00:36:43
    they worked super hard they never like
  • 00:36:45
got a thank you from me but I feel it to
  • 00:36:47
    them very
  • 00:36:48
    deeply
  • 00:36:50
    and it's really nice to get to like add
  • 00:36:53
    our brick to that long road of progress
  • 00:36:56
    yeah um is it's been a great year for
  • 00:36:58
    open AI not without some drama uh always
  • 00:37:01
    yeah we're good at that uh what did you
  • 00:37:04
learn from you know sort of the ouster
  • 00:37:06
    last fall and how do you feel about some
  • 00:37:08
    of the you know departures I mean teams
  • 00:37:11
do evolve but how are you doing man tired
  • 00:37:15
    but good yeah uh it's we've kind of like
  • 00:37:19
speedrun uh like medium-sized or even
  • 00:37:22
kind of like pretty big-sized tech
  • 00:37:24
company arc that would normally take
  • 00:37:26
like a decade in two years like ChatGPT is
  • 00:37:28
    less than two years old yeah and and
  • 00:37:30
    there's like a lot of painful stuff that
  • 00:37:32
    comes with that
  • 00:37:35
    um and there are you know any company as
  • 00:37:38
    it scales goes through management teams
  • 00:37:41
    at some rate uh and you have to sort of
  • 00:37:44
    the people who are really good at the
  • 00:37:45
    zero to one phase are not necessarily
  • 00:37:46
    people that are good at the 1 to 10 or
  • 00:37:48
    the 10 to the 100 phase we've also kind
  • 00:37:50
of like changed what we were going to be um
  • 00:37:53
    made plenty of mistakes along the way
  • 00:37:55
    done a few things really right and
  • 00:37:58
    that comes with a lot of change and I
  • 00:38:02
    think the goal
  • 00:38:04
    of the company uh the emerging AGI or
  • 00:38:09
    whatever however you want to think about
  • 00:38:10
    it is like just keep making the best
  • 00:38:13
    decisions we can at every stage but it
  • 00:38:15
    does lead to a lot of change I hope that
  • 00:38:18
    we are heading towards a period now of
  • 00:38:21
    more calm but I'm sure there will be
  • 00:38:23
    other periods in the future where things
  • 00:38:25
    are very Dynamic again so I guess how
  • 00:38:28
    does open AI actually work right now you
  • 00:38:30
    know I mean the quality and like the
  • 00:38:33
    pace that you're pushing right now I
  • 00:38:35
    think is like Beyond world class
  • 00:38:38
    compared to a lot of the other you know
  • 00:38:41
    really established software players like
  • 00:38:43
    who came
  • 00:38:44
    before this is the first time ever where
  • 00:38:47
    I felt like
  • 00:38:50
    we actually know what to do like I think
  • 00:38:52
    from here
  • 00:38:54
    to building an AGI will still take a
  • 00:38:57
    huge amount of work there are some known
  • 00:38:59
    unknowns but I think we basically know
  • 00:39:01
    what to go what to go do and it'll take
  • 00:39:03
    a while it'll be hard but that's
  • 00:39:05
    tremendously exciting I also think on
  • 00:39:09
    the product side there's more to figure
  • 00:39:12
    out but roughly we know what to shoot at
  • 00:39:14
    and what we want to optimize
  • 00:39:16
    for that's a really exciting time and
  • 00:39:18
    when you have that Clarity I think you
  • 00:39:20
    can go pretty fast yeah if you're
  • 00:39:22
    willing to say we're going to do these
  • 00:39:23
    few things we're going to try to do them
  • 00:39:24
    very well and our research path is
  • 00:39:28
    fairly clear our infrastructure path is
  • 00:39:29
    fairly clear our product path is getting
  • 00:39:33
    clearer you can Orient around that super
  • 00:39:37
    well we for a long time did not have
  • 00:39:39
    that we were a true research lab and
  • 00:39:42
    even when you know that it's hard to act
  • 00:39:44
    with the conviction on it because
  • 00:39:45
there's so many other good things you'd
  • 00:39:46
    like to do
  • 00:39:48
    yeah but the degree to which you can get
  • 00:39:51
    everybody aligned and pointed at the
  • 00:39:53
    same thing is a significant determinant
  • 00:39:56
    in how fast you can move
  • 00:39:58
I mean, it sounds like we went from level one to level two very recently, and that was really powerful. We actually just had our o1 hackathon at YC. That was so impressive; that was super fun. And weirdly, one of the teams that won, I think they came in third, was Camper, a CAD/CAM startup that did YC in the last year or two. During the hackathon they were able to build something that would iteratively improve an airfoil, from something that wouldn't fly to literally something that had a competitive amount of lift. That was awesome.
  • 00:40:35
And that sort of sounds like level four, which is the innovator stage. It's very funny you say that. I had been telling people for a while that the level two to level three jump was going to happen quickly, and that the level three to level four jump was somehow going to be much harder and require some medium-sized or larger new ideas. But that demo and a few others have convinced me that you can get a huge amount of innovation just by using these current models in really creative ways. Well, yeah.
  • 00:41:15
I mean, what's interesting is that Camper had already built the underlying software for CAD/CAM, and language is the interface to the large language model, which can then use that software, like tool use. If you combine that with the idea of code gen, that's kind of a scary, crazy idea, right? Not only can the large language model write code, it can create tools for itself and then compose those tools, similar to chain of thought with o1. Yeah, I think things are going to go a lot faster than people are appreciating right now. Well, it's an exciting time to be alive, honestly.
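The tool-creation-and-composition idea discussed here can be sketched as a toy, self-contained example. Everything below is a hypothetical illustration: `ToolRegistry`, `lift_coefficient`, and `tune_airfoil` are invented names, and the "model-generated" source strings are hard-coded stand-ins for what a large language model would actually emit.

```python
# Toy illustration of an agent that creates tools for itself and then
# composes them. In a real system the tool source code would come from
# an LLM; here it is hard-coded so the sketch is runnable.

class ToolRegistry:
    """Holds callable tools the 'agent' has built for itself."""

    def __init__(self):
        self.tools = {}

    def register_from_source(self, name, source):
        # Execute the (model-generated) source and keep the named function.
        namespace = {}
        exec(source, namespace)
        self.tools[name] = namespace[name]

    def call(self, name, *args):
        return self.tools[name](*args)


registry = ToolRegistry()

# Step 1: the model "writes" a primitive tool.
registry.register_from_source("lift_coefficient", """
def lift_coefficient(angle_of_attack_deg):
    # Crude linear model, plausible only for small angles of attack.
    return 0.11 * angle_of_attack_deg
""")

# Step 2: the model writes a second tool that composes the first one,
# iteratively improving the design until it meets a target lift.
registry.register_from_source("tune_airfoil", """
def tune_airfoil(target_cl, lift_fn):
    angle = 0.0
    while lift_fn(angle) < target_cl:
        angle += 0.5  # iterative improvement loop
    return angle
""")

best_angle = registry.call("tune_airfoil", 0.9,
                           registry.tools["lift_coefficient"])
print(best_angle)  # 8.5
```

The point of the sketch is the shape of the loop, not the aerodynamics: each generated tool becomes a first-class callable that later generated code can reuse and compose.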
  • 00:41:59
You know, you mentioned earlier that thing about discovering all of physics. I wanted to be a physicist, wasn't smart enough to be a good one, and had to contribute in this other way instead. But I really believe somebody else is now going to go solve all of physics with this stuff, and I'm so excited to be alive for that. Let's get to level four; I'm so happy for whoever that person is. Yeah. Do you want to talk about levels three, four, and five
  • 00:42:23
briefly? Yeah. We realized that AGI had become this badly overloaded word; people meant all kinds of different things by it, so we tried to say, okay, here's our best guess at the rough order of things. You have these level one systems, which are the chatbots. Then comes level two, the reasoners; we think we got there earlier this year with the o1 release. Level three is agents: the ability to go off and do longer-term tasks, maybe with multiple interactions with an environment, asking people for help when needed, working together, all of that. I think we're going to get there faster than people expect. Level four is innovators: that's like a scientist, the ability to go explore a not-well-understood phenomenon over a long period of time and just figure it out. And then level five is the slightly amorphous "do that, but at the scale of a whole company or a whole organization." That's going to be a pretty powerful thing. Yeah, and it feels kind of
  • 00:43:32
fractal, right? Even the things you had to do to get to level two sort of rhyme with level five, in that you have multiple agents that self-correct and work together. That kind of sounds like an organization to me, just at a very micro level. Do you think we'll have, I mean, you famously talked about it, I think Jake talks about it, companies that make billions of dollars per year with fewer than 100 employees? Maybe 50, maybe 20, maybe one? It does seem like that. I don't know what to make of it other than that it's a great time to be a startup founder, but it does feel like that's happening to me. Yeah, it's like one person plus 10,000 GPUs: pretty, pretty powerful. Sam, what
  • 00:44:17
advice do you have for people watching who are either about to start or have just started their startup? Bet on this tech trend. Bet on this trend. We are not near the saturation point; the models are going to get so much better, so quickly. What you can do as a startup founder with this versus what you could do without it is so wildly different. The big companies, even the medium-sized companies, even the startups that are a few years old, are already on, like, quarterly planning cycles, and Google is on a year or decade planning cycle; I don't know how they even do it anymore. But your advantage in speed and focus and conviction, and the ability to react to how fast the technology is moving, is the number one edge of a startup kind of ever, but especially right now. So I would definitely build something with AI, and I would definitely take advantage of the ability to see a new thing and build something that day, rather than put it into a quarterly planning cycle. I guess the other thing I would say
  • 00:45:20
is that when there's a new technology platform, it is easy to say, "because I'm doing something with AI, the laws of business don't apply to me. I have this magic technology, so I don't have to build a moat or a competitive edge or a better product; I'm doing AI and you're not, so that's all I need." That's obviously not true. What you can get are short-term explosions of growth by embracing a new technology more quickly than somebody else. But remembering not to fall for that, and that you still have to build something of enduring value, is a good thing to keep in mind too.
  • 00:45:59
Yeah, everyone can build an absolutely incredible demo right now, but building a business, man, that's the brass ring. The rules still apply; you can do it faster than ever before and better than ever before, but you still have to build a business. What are you excited about in
  • 00:46:13
2025? What's to come? AGI. Yeah, excited for that. What else am I excited for? We're having a kid; I'm more excited for that than anything. Congratulations! Incredible. Yeah, that's the thing I've been most excited for ever in my life. It changes your life completely, so I cannot wait. Well, here's to building that better world for our kids and, hopefully, the whole world. This was a lot of fun. Thanks for hanging out, Sam. Thank you.
Tags
  • AGI
  • AI
  • OpenAI
  • Sam Altman
  • Tech Optimism
  • Innovation
  • Deep Learning
  • Startups
  • Future Technology
  • Energy Abundance