How To Build The Future: Sam Altman
Summary
TL;DR: Sam Altman of OpenAI discusses the pursuit of Artificial General Intelligence (AGI), reflecting on how taboo the goal was in the field's earlier years. The conversation traces OpenAI's growth and success, highlighting technological advances and the value of extreme conviction on a single innovative bet. Altman is optimistic about technology, envisioning a future where AGI and abundant energy reshape our lives and help solve major challenges such as climate change and space colonization. Drawing on lessons from startups, he underscores the importance of a like-minded peer group and of staying focused on a singular objective, grounded in a deep belief in the scaling potential of deep learning. The discussion covers the impact of scale and AI's rapid progression toward reasoning and agent capabilities. Altman urges startups to embrace this technological shift and harness AI's transformative power while adhering to sound business practices. The talk paints a hopeful, if challenging, future defined by innovative potential and emerging technological paradigms.
Key takeaways
- 🚀 Pursuit of AGI is central to OpenAI's mission.
- 🤖 AI advancement is crucial for future technological solutions.
- 🎯 Focused conviction on a single bet aids startup success.
- 🔥 Abundant energy could unlock limitless innovation.
- 🌌 Future tech could solve massive challenges like climate change.
- 💡 Startups should leverage the rapid advancements in AI.
- 🔍 Scaling neural networks is a key strategy for OpenAI.
- 👥 Peer groups are crucial in fostering startup success.
- 📈 Growth of AI capabilities is exponential and transformative.
- ⏩ The startup environment is ripe for tech innovation.
Timeline
- 00:00:00 - 00:05:00
The discussion begins with a mention of the initial pursuit of Artificial General Intelligence (AGI) by OpenAI, despite skepticism in the field. The speaker expresses enthusiasm about current times being the best period for starting tech companies, due to the significant potential and changes brought by technological revolutions.
- 00:05:00 - 00:10:00
The conversation then shifts to the future of superintelligence (ASI), predicting it's thousands of days away. The participants discuss the compounding nature of progress and envision a future where climate issues are resolved and space colonies established. They exhibit optimism about the potential of near-limitless intelligence and energy.
- 00:10:00 - 00:15:00
There is a focus on energy abundance and its transformative impact on physical work and quality of life. The discussion acknowledges solar and nuclear energy advancements, stressing the importance of achieving significant energy breakthroughs.
- 00:15:00 - 00:20:00
The talk shifts to the speaker's early days with Y Combinator (YC) and his drive as a sophomore to join. The significance of surrounding oneself with ambitious peers is discussed, highlighting the role of collaborative environments in fostering innovation.
- 00:20:00 - 00:25:00
Key insights are shared on peer influence, innovation, and founding experiences with Loopt, Y Combinator, and OpenAI. The speaker compares early experiences at Stanford with the dynamic startup ecosystem.
- 00:25:00 - 00:30:00
Foundational insights about the formation of OpenAI are detailed. The speaker reflects on assembling a team passionate about AGI and overcoming skepticism from established figures in the AI field, favoring conviction and ambitious goals.
- 00:30:00 - 00:35:00
The strategic choice to focus on deep learning and scalability in OpenAI's development is emphasized. The narrative describes overcoming criticism and developing strong faith in scaling as a principle for technological advancement.
- 00:35:00 - 00:40:00
OpenAI's approach of focusing on scaling is discussed further, highlighting its success in pushing AI capabilities forward. The significant role of a highly talented research team is acknowledged as critical to OpenAI's progress.
- 00:40:00 - 00:46:52
The final segment discusses the rapid pace and future potential of AI development, emphasizing startup possibilities. The importance of adapting and scaling within this fast-moving landscape is stressed, with reflections on managing success and failures within OpenAI.
Video Q&A
What is AGI?
AGI stands for Artificial General Intelligence, aiming for machines with the ability to understand and learn tasks like a human.
Who is Sam Altman?
Sam Altman is a prominent entrepreneur and CEO of OpenAI, focused on advancing artificial intelligence.
What does ASI mean?
ASI stands for Artificial Superintelligence, a stage where machines surpass human intelligence capabilities.
What is the current state of AI progress?
AI is progressing rapidly, with advancements in deep learning and unsupervised models leading the way.
How does OpenAI plan to scale AI technology?
OpenAI focuses on leveraging deep learning and scaling neural networks to improve AI capabilities.
Why is conviction in one bet important for startups?
Focusing on one technology or idea allows startups to innovate deeply without spreading resources thin.
What are some future possibilities mentioned in the discussion?
Future possibilities include fixing climate issues, establishing space colonies, and achieving abundant energy.
How does Sam Altman feel about technological optimism?
He is very optimistic and encourages a techno-optimistic approach to solving big future challenges.
What significance does the peer group have in startups?
Having a supportive and inspiring peer group is crucial for staying motivated and achieving success.
What role did unsupervised learning have in OpenAI's progress?
Unsupervised learning was a pivotal step, especially with the development of models like GPT-2.
- 00:00:00we said from the very beginning we were
- 00:00:01going to go after AGI at a time when in
- 00:00:04the field you weren't allowed to say
- 00:00:05that because that just seemed impossibly
- 00:00:08crazy I remember a rash of criticism for
- 00:00:11you guys at that moment we really wanted
- 00:00:13to push on that and we were far less
- 00:00:17resourced than Deep Mind and others and
- 00:00:19so we said okay they're going to try a
- 00:00:21lot of things and we've just got to pick
- 00:00:22one and really concentrate and that's
- 00:00:23how we can we can win here most of the
- 00:00:25world still does not understand the
- 00:00:27value of like a fairly extreme level of
- 00:00:29conviction on one bet that's why I'm so
- 00:00:31excited for startups right now it is
- 00:00:33because the world is still sleeping on
- 00:00:35all this to such an astonishing
- 00:00:42degree we have a real treat for you
- 00:00:44today Sam Alman thanks for joining us
- 00:00:47thanks this is actually a reboot of your
- 00:00:50series how to build the future and so
- 00:00:53welcome back to the series that you
- 00:00:55started years ago I was trying to think
- 00:00:56about that something like that that's
- 00:00:57wild I'm glad it's being rebooted that's
- 00:00:59right let's talk about your newest essay
- 00:01:02on uh the age of intelligence you know
- 00:01:04is this the best time ever to be
- 00:01:07starting a technology company let's at
- 00:01:10least say it's the best time yet
- 00:01:11hopefully there'll be even better times
- 00:01:12in the future I sort of think with each
- 00:01:14successive major technological
- 00:01:16Revolution you've been able to do more
- 00:01:18than you could before and I would expect
- 00:01:21the companies to be more amazing and
- 00:01:24impactful in everything else so yeah I
- 00:01:26think it's the best time yet big
- 00:01:28companies have the edge when things are
- 00:01:30like moving slowly and not that Dynamic
- 00:01:32and then when something like this or
- 00:01:34mobile or the internet or semiconductor
- 00:01:37Revolution happens or probably like back
- 00:01:39in the days of the Industrial Revolution
- 00:01:41that was when upstarts had their have
- 00:01:42their Edge so yeah this is like and it's
- 00:01:45been a while since we've had one of
- 00:01:46these so this is like pretty exciting in
- 00:01:48the essay you actually say a really big
- 00:01:50thing which is ASI super intelligence is
- 00:01:54actually thousands of days away maybe I
- 00:01:58mean that's our hope our guess whatever
- 00:02:00uh but that's a very wild statement yeah
- 00:02:05um tell us about it I mean that's that's
- 00:02:07big that is really big I can see a path
- 00:02:10where the work we are doing just keeps
- 00:02:12compounding and the rate of progress
- 00:02:15we've
- 00:02:16made over the last three years
- 00:02:19continuous for the next three or six or
- 00:02:21nine or whatever um you know nine years
- 00:02:24would be like 3500 days or whatever if
- 00:02:27we can keep this rate of improvement or
- 00:02:28even increase it
- 00:02:30that system will be quite capable of
- 00:02:31doing a lot of things I think already uh
- 00:02:34even a system like A1 is capable of
- 00:02:37doing like quite a lot of things from
- 00:02:39just like a raw cognitive IQ on a closed
- 00:02:42end well- defined task in a certain area
- 00:02:45I'm like oh one is like a very smart
- 00:02:47thing and I think we're nowhere near the
- 00:02:49limit of progress I mean that was an
- 00:02:51architecture shift that sort of unlocked
- 00:02:53yeah a lot and what I'm sort of hearing
- 00:02:56is that these things are going to
- 00:02:57compound we could hit some like
- 00:02:59unexpected Ed wall or we could be
- 00:03:01missing something but it looks to us
- 00:03:03like there's a lot of compounding in
- 00:03:05front of us still to happen I mean this
- 00:03:07essay is probably the most techno
- 00:03:08Optimist of almost anything I've seen
- 00:03:11out there some of the things we get to
- 00:03:13look forward to uh fixing the climate
- 00:03:15establishing a space Colony the
- 00:03:18discovery of all of physics uh near
- 00:03:20Limitless intelligence and abundant
- 00:03:23energy I do think all of those things
- 00:03:26and probably a lot more we can't even
- 00:03:27imagine are maybe not that far away
- 00:03:30and one of I I think it's like
- 00:03:33tremendously exciting that we can talk
- 00:03:34about this even semi-seriously now one
- 00:03:37of the things that I always have loved
- 00:03:39the most about YC is it
- 00:03:41encourages slightly implausible degrees
- 00:03:44of techno optimism and just a belief
- 00:03:47that like ah you can figure this out and
- 00:03:50you know in a world that I think is like
- 00:03:52sort of consistently telling people this
- 00:03:54is not going to work you can't do this
- 00:03:55thing you can't do that I think the kind
- 00:03:56of early PG Spirit of just encouraging
- 00:03:59Founders to like think a little bit
- 00:04:00bigger is like it is a special thing in
- 00:04:03the world the Abundant energy thing
- 00:04:05seems like a pretty big deal you know
- 00:04:07there's sort of path a and path B you
- 00:04:10know if we do achieve abundant
- 00:04:13energy seems like this is a real unlock
- 00:04:16almost any work not just you know
- 00:04:20knowledge work but actually like real
- 00:04:22physical work yeah could be unlocked
- 00:04:24with ro Robotics and with language and
- 00:04:27Intelligence on tap like there's a real
- 00:04:29age of abundance I think these are like
- 00:04:32the key to in the two key inputs to
- 00:04:35everything else that we want there's a
- 00:04:37lot of other stuff of course that
- 00:04:39matters but the unlock that would happen
- 00:04:42if we could just get truly abundant
- 00:04:44intelligence truly abundant
- 00:04:46energy what we'd be able to make happen
- 00:04:49in the world like both like come up with
- 00:04:51better ideas more quickly and then also
- 00:04:53like make them happen in in the physical
- 00:04:55world like to say nothing of it'd be
- 00:04:57nice to be able to run lots of AI and
- 00:04:59that takes energy too uh I think that
- 00:05:02would be a huge unlock and the fact that
- 00:05:03it's I'm not sure to be whether like
- 00:05:06whether it be surprised that it's all
- 00:05:07happening the same time or if this is
- 00:05:08just like the natural effect of an
- 00:05:11increasing rate of technological
- 00:05:12progress but it's certainly very
- 00:05:15exciting time to be alive and great time
- 00:05:17to do a startup well so we sort of walk
- 00:05:19through this age of abundance you know
- 00:05:22maybe you robots can actually
- 00:05:24manufacture do anything almost all
- 00:05:27physical labor can then result in
- 00:05:29material progress not just for the most
- 00:05:31wealthy but for everyone you know what
- 00:05:34happens if we don't unleash unlimited
- 00:05:36energy if you know there's some physical
- 00:05:39law that prevents us from exactly that
- 00:05:43solar Plus Storage is on a good enough
- 00:05:46trajectory that even if we don't get a
- 00:05:48big nuclear
- 00:05:49breakthrough we would be like
- 00:05:52okayish but for sure it seems that
- 00:05:55driving the cost of energy down the
- 00:05:56abundance of it up has like a very
- 00:05:59direct impact on quality of life
- 00:06:03and eventually we'll solve every problem
- 00:06:05in physics so we're going to figure this
- 00:06:06out it's just a question of when and we
- 00:06:09deserve it uh there's you know someday
- 00:06:12we'll be talking not about Fusion or
- 00:06:14whatever but about the dys feere and
- 00:06:16that'll be awesome too yeah this is a
- 00:06:18point in time whatever feels like
- 00:06:19abundant energy to us will feel like not
- 00:06:21nearly enough to our great-grandchildren
- 00:06:23and there's a big universe out there
- 00:06:25with a lot of matter yeah wanted to
- 00:06:27switch gears a little bit to sort of
- 00:06:29your earlier you were mentioning uh Paul
- 00:06:31Graham who brought us all together
- 00:06:33really created why combinator he likes
- 00:06:36to tell the story of how you know how
- 00:06:38you got into YC was actually you were a
- 00:06:40Stanford freshman um and he said you
- 00:06:43know what this is the very first YY
- 00:06:45batch in 2005 and he said you know what
- 00:06:49you're a freshman and wey will still be
- 00:06:52here uh next time you should just wait
- 00:06:55and you said I'm a sophomore and I'm
- 00:06:58coming and
- 00:07:00widely known in our community as you
- 00:07:02know one of the most formidable people
- 00:07:04where do you think that came from that
- 00:07:07one story I think I I I would happy i'
- 00:07:10be happy if that like drifted off hisory
- 00:07:12well now it's it's purely immortalized
- 00:07:14here here it is my memory of that is
- 00:07:18that like I needed to reschedule an
- 00:07:20interview one day or something um and PG
- 00:07:23tried to like say like I just do it next
- 00:07:25year or whatever and then I think I said
- 00:07:27some nicer version of I'm a sophomore
- 00:07:29and I'm coming but yeah you know these
- 00:07:32things get slightly apocryphal it's
- 00:07:34funny I
- 00:07:35don't and I say this with no false
- 00:07:38modesty I don't like identify as a
- 00:07:41formidable person at all in fact I think
- 00:07:42there's a lot of ways in which I'm
- 00:07:44really not I do have a little bit of a
- 00:07:47just
- 00:07:48like I don't see why things have to
- 00:07:52be the way they are and so I'm just
- 00:07:54going to like do this thing that from
- 00:07:57first principles seems like fine and I
- 00:07:59always felt a little bit weird about
- 00:08:01that and then I I remember one of the
- 00:08:03things I thought was so great about YC
- 00:08:05and still that I care so much about YC
- 00:08:06about is it was like a collection of the
- 00:08:09weird people who are just like I'm just
- 00:08:11going to do my thing the part of this
- 00:08:13that does resonate as a like accurate
- 00:08:16self-identity thing is I do think you
- 00:08:18can just do stuff or try stuff a
- 00:08:21surprising amount of the time and I
- 00:08:24think more of that is a good thing and
- 00:08:26then I think one of the things that both
- 00:08:27of us found at YC was a bunch of people
- 00:08:31who all believed that you could just do
- 00:08:33stuff for a long time when I was trying
- 00:08:35to like figure out what made ycu so
- 00:08:36special I thought that it was like okay
- 00:08:39you have this like
- 00:08:41very amazing person telling you I you
- 00:08:46can do stuff I believe in you and as a
- 00:08:49young founder that felt so special and
- 00:08:51inspiring and of course it is but the
- 00:08:53thing that I didn't understand until
- 00:08:55much later was it was the peer group of
- 00:08:57other people doing that and one of the
- 00:09:00biggest pieces of advice I would give to
- 00:09:02young people now is finding that peer
- 00:09:05group as early as you can was so
- 00:09:07important to me um and I didn't realize
- 00:09:11it was something that mattered I kind of
- 00:09:12thought ah like I have you know I'll
- 00:09:14figure it out on my own but man being
- 00:09:17around like inspiring
- 00:09:19peers so so valuable what's funny is
- 00:09:21both of us did spend time at Stanford I
- 00:09:23actually did graduate which is I
- 00:09:25probably shouldn't have done that but I
- 00:09:27did sford it's great you pursued the
- 00:09:30path of uh you know far greater return
- 00:09:33uh by dropping out but you know that was
- 00:09:35a community that purportedly had a lot
- 00:09:38of these characteristics but I was still
- 00:09:40Beyond surprised at how much more potent
- 00:09:43it was with a room full of Founders it
- 00:09:45was I was just going to say the same
- 00:09:46thing actually I liked Samford a lot
- 00:09:48yeah
- 00:09:49but I was I did not feel surrounded by
- 00:09:53people that made me like want to be
- 00:09:55better and more ambitious and whatever
- 00:09:58else and to the degree did the thing you
- 00:10:00were competing with your peers on was
- 00:10:02like who was going to get the internship
- 00:10:04at which Investment Bank which I'm
- 00:10:06embarrassed to say I fell on that trap
- 00:10:07this is like how powerful peer groups
- 00:10:09are um it was a very easy decision to
- 00:10:12not go back to school after like seeing
- 00:10:14what the like YC Vibe was like yeah uh
- 00:10:17there's a powerful quote by uh Carl
- 00:10:19Young that I really love um it's you
- 00:10:21know the world will come and ask you who
- 00:10:24you are and if you don't know it will
- 00:10:27tell you it sounds like being very
- 00:10:29intentional about who you want to be and
- 00:10:31who you want to be around as early as
- 00:10:33possible is very important yeah this was
- 00:10:36definitely one of my takeaways at least
- 00:10:38for myself is you no one is immune to
- 00:10:40peer pressure and so all you can do is
- 00:10:42like pick good peers yeah obviously you
- 00:10:44know you went on to create looped you
- 00:10:47know sell that go to Green Dot and then
- 00:10:49we ended up getting to work together at
- 00:10:51YC talk to me about like the early days
- 00:10:53of YC research like one of the really
- 00:10:55cool things that you brought to YC was
- 00:10:58this experimentation and and you sort of
- 00:11:01I mean I I remember you coming back to
- 00:11:02partner rooms and talking about some of
- 00:11:04the rooms that you were getting to sit
- 00:11:06in with like the laran Sur gaze of the
- 00:11:08world and that you know AI was some sort
- 00:11:11of at the tip of everyone's tongue
- 00:11:13because it felt so close and yet it was
- 00:11:16you know that was 10 years ago the thing
- 00:11:19I
- 00:11:22always thought would be the coolest
- 00:11:24retirement job was to get to like run a
- 00:11:25research lab and it was not specifically
- 00:11:29to AI at that time when we started
- 00:11:32talking about YC research well not only
- 00:11:34was it going to it it did end up funding
- 00:11:35like a bunch of different efforts and I
- 00:11:38wish I could tell the story of like oh
- 00:11:39was obvious that AI was going to work
- 00:11:41and be the thing but like we tried a lot
- 00:11:43of bad things too it around that
- 00:11:47time I read a few books on like the
- 00:11:50history of zerox Park and Bell labs and
- 00:11:53stuff and I think there were a lot of
- 00:11:54people like it was in the air of Silicon
- 00:11:55Valley at the time that we need to like
- 00:11:57have good research Labs again and I just
- 00:11:59thought it would be so cool to do and it
- 00:12:01was sort of similar to what YC does and
- 00:12:04that you're going to like allocate
- 00:12:05Capital to smart people and sometimes
- 00:12:07it's going to work and sometimes it's
- 00:12:08not going to
- 00:12:11and I just wanted to try it AI for sure
- 00:12:14was having a mini moment this was like
- 00:12:17kind of late 2014 2015 early 2016 was
- 00:12:21like the super intelligence discussion
- 00:12:24like the book super intelligence was
- 00:12:25happening Bo yep yeah the Deep Mind had
- 00:12:29a few like impressive results but a
- 00:12:31little bit of a different direction you
- 00:12:33know I had been an AI nerd forever so I
- 00:12:35was like oh it' be so cool to try to do
- 00:12:37something but it's very hard to say was
- 00:12:38imet out yet imag net was out yeah yeah
- 00:12:41for a while at that point so you could
- 00:12:43tell if it was a hot dog or not you
- 00:12:45could sometimes yeah that was getting
- 00:12:48there yeah you know how did you identify
- 00:12:50the initial people you wanted involved
- 00:12:53in you know YC research and open AI I
- 00:12:56mean Greg Greg Brockman was early in
- 00:12:58retrospect it feels like this movie
- 00:13:00montage and there were like all of these
- 00:13:02like you know at the beginning of like
- 00:13:03the Bai movie when you're like driving
- 00:13:05around to find the people and whatever
- 00:13:07and and they're like you son of a
- 00:13:09I'm in right right like Ilia I like
- 00:13:13heard he was really smart and then I
- 00:13:15watched some video of his and he's ALS
- 00:13:17now he's extremely smart like true true
- 00:13:19genuine genius and Visionary but also he
- 00:13:21has this incredible presence and so I
- 00:13:23watched this video of his on YouTube or
- 00:13:25something I was like I got to meet that
- 00:13:26guy and I emailed him and he didn't
- 00:13:27respond so I just like went to some con
- 00:13:29conference he was speaking at and we met
- 00:13:31up and then after that we started
- 00:13:32talking a bunch and and then like Greg I
- 00:13:35had known a little bit from the early
- 00:13:36stripe days what was that conversation
- 00:13:38like though it's like I really like what
- 00:13:40your your ideas about Ai and I want to
- 00:13:43start a lab yes and one of the things
- 00:13:46that worked really well in
- 00:13:48retrospect was we said from the very
- 00:13:51beginning we were going to go after AGI
- 00:13:53at a time when in the FI you weren't
- 00:13:55allowed to say that because that just
- 00:13:58seemed possibly crazy and you know
- 00:14:02borderline irresponsible to talk so that
- 00:14:03got his attention immediately it got all
- 00:14:06of the good young people's attention and
- 00:14:08the derion derision whatever that word
- 00:14:10is of the mediocre old people and I felt
- 00:14:13like somehow that was like a really good
- 00:14:14sign and really powerful and we were
- 00:14:16like this rag tag group of people I mean
- 00:14:19I was the oldest by a decent amount I
- 00:14:21was like I guess I was 30 then and so
- 00:14:24you had like these people who were like
- 00:14:26those are these irresponsible young kids
- 00:14:28who don't know anything about anything
- 00:14:29and they're like saying these ridiculous
- 00:14:31things and the people who that was
- 00:14:33really appealing to I guess are the same
- 00:14:36kind of people who would have said like
- 00:14:37it's a you know I'm a sophomore and I'm
- 00:14:39coming or whatever and they were like
- 00:14:40let's just do this thing let's take a
- 00:14:41run at
- 00:14:42it and so we kind of went around and met
- 00:14:45people one by one and then in different
- 00:14:47configurations of groups and it kind of
- 00:14:49came together over the course of in fits
- 00:14:53and starts but over the course of like
- 00:14:54nine months and then it started h i mean
- 00:14:57and then it started it started happening
- 00:14:59and one of my favorite like memories of
- 00:15:02all of open eye
- 00:15:04was Ilia had some reason that with
- 00:15:07Google or something that we couldn't
- 00:15:08start in we announced in December of
- 00:15:102015 but we couldn't start until January
- 00:15:11of 2016 so like January 3rd something
- 00:15:14like that of 2016 like very early in the
- 00:15:17Month people come back from the holidays
- 00:15:19and we go to Greg's
- 00:15:21apartment maybe there's 10 of us
- 00:15:23something like that and we sit around
- 00:15:26and it felt like we had done this
- 00:15:27Monumental thing to get it started
- 00:15:30and everyone's like so what do we do
- 00:15:32now and what a great moment it reminded
- 00:15:35me of when startup Founders work really
- 00:15:38hard to like raise a round and they
- 00:15:40think like oh I accomplished this great
- 00:15:42we did it and then you sit down and say
- 00:15:44like now we got to like figure out
- 00:15:45what we're going to do it's not time for
- 00:15:47popping champagne that was actually the
- 00:15:49starting gun and now we got to run yeah
- 00:15:51and you have no idea how hard the race
- 00:15:53is going to be it took us a long time to
- 00:15:55figure out what we're going to do um but
- 00:15:58one of the things I'm really amazingly
- 00:16:01impressed by Ilia in particular but
- 00:16:03really all of the early people about is
- 00:16:05although it took a lot of twist and
- 00:16:06turns to get
- 00:16:09here the big picture of the original
- 00:16:11ideas was just so incredibly right and
- 00:16:15so they were like up on like one of
- 00:16:16those flip charts or whiteboards I don't
- 00:16:18remember which in Greg's apartment and
- 00:16:22then we went off and you know did some
- 00:16:24other things that worked or didn't work
- 00:16:26or whatever some of them did and
- 00:16:27eventually now we have this like
- 00:16:29system and it feels very crazy and very
- 00:16:34improbable looking backwards that we
- 00:16:36went from there to here with so many
- 00:16:39detours on the way but got where we were
- 00:16:40pointing was deep learning even on that
- 00:16:42flip chart initially yeah uh I mean more
- 00:16:45specifically than that like do a big
- 00:16:47unsupervised model and then solve RL was
- 00:16:49on that flip chart one of the flip
- 00:16:51charts from a very this is before Greg's
- 00:16:53apartment but from a very early offsite
- 00:16:55I think this is right I believe there
- 00:16:57were three goals for the for the effort
- 00:16:59at the time it was like figure out how
- 00:17:02to do unsupervised learning solve RL and
- 00:17:04never get more than 120 people missed on
- 00:17:07the third one but right the like d the
- 00:17:10predictive direction of the first two is
- 00:17:12pretty good so deep learning then the
- 00:17:16second big one sounded like scaling like
- 00:17:18the idea that you could scale that was
- 00:17:21another heretical idea that people
- 00:17:24actually found even offensive you know I
- 00:17:26remember a rash of criticism for you
- 00:17:29guys at that moment when we started yeah
- 00:17:33the core beliefs were deep learning
- 00:17:35works and it gets better with
- 00:17:37scale and I think those were both
- 00:17:40somewhat heretical beliefs at the time
- 00:17:42we didn't know how predictably better a
- 00:17:43got with scale that didn't come for a
- 00:17:44few years later it was a hunch first and
- 00:17:47then you got the data to show how
- 00:17:48predictable it was but but people
- 00:17:50already knew that if you made these
- 00:17:51neural networks bigger they got better
- 00:17:53yeah um like that was we were sure of
- 00:17:56that um before we started
- 00:18:00and what took the like where that keeps
- 00:18:03coming to mind is like religious level
- 00:18:05of belief was that that wasn't going to
- 00:18:08stop everybody had some reason of oh
- 00:18:11it's not really learning it's not really
- 00:18:13reasoning I can't really do this it's
- 00:18:15you know it's like a parlor trick and
- 00:18:18these were like the eminent leaders of
- 00:18:20the field and more than just saying
- 00:18:23you're wrong they were like you're wrong
- 00:18:25and this
- 00:18:26is like a bad thing to believe or bad
- 00:18:29thing to say it was that there's got to
- 00:18:30you you know this is like you're going
- 00:18:32to perpetuate an AI winter you're going
- 00:18:34to do this you're going to do that and
- 00:18:36we were just like looking at these
- 00:18:37results and saying they keep getting
- 00:18:39better then we got the scaling results
- 00:18:41it just kind of breaks my intuition even
- 00:18:44now and at some point you have to just
- 00:18:47look at the scaling loss and say we're
- 00:18:50going to keep doing this and this is
- 00:18:51what we think it'll do
- 00:18:52and it also it was starting to feel at
- 00:18:55that time
- 00:18:57like something about learning was just
- 00:19:00this
- 00:19:02emergent phenomenon that was really
- 00:19:04important and even if we didn't
- 00:19:06understand all of the details in
- 00:19:08practice here which obviously we didn't
- 00:19:09and still
- 00:19:10don't that there was something really
- 00:19:12fundamental going on it was the PG ISM
- 00:19:15for this is we had like discovered a new
- 00:19:16Square in the periodic table yeah and so
- 00:19:19it we just we really wanted to push on
- 00:19:21that and we were far less resourced than
- 00:19:25Deep Mind and others and so we said okay
- 00:19:27they're going to try a lot of things and
- 00:19:29we've just got to pick one and really
- 00:19:30concentrate and that's how we can we can
- 00:19:31win here which is totally the right
- 00:19:33startup takeaway and so we said well we
- 00:19:38don't know what we don't know we do know
- 00:19:40this one thing works so we're going to
- 00:19:42really concentrate on that and I think
- 00:19:44some of the other efforts were trying to
- 00:19:46outsmart themselves in too many ways and
- 00:19:48we just said we'll just we'll do the
- 00:19:50thing in front of us and keep pushing on
- 00:19:51it scale is this thing that I've always
- 00:19:53been interested in um at kind of just
- 00:19:56the emergent properties of scale for
- 00:19:58everything for startups turns out for
- 00:20:00deep learning models for a lot of other
- 00:20:02things I think it's a very
- 00:20:04underappreciated property and thing to
- 00:20:06go after and I think it's you know when
- 00:20:08in doubt if you have something that
- 00:20:10seems like it's getting better with
- 00:20:11scale I think you should scale it up I
- 00:20:12think people want things to be uh you
- 00:20:14know less is more but actually more is
- 00:20:16more more is more we believed in that we
- 00:20:18wanted to push on it I think one thing
- 00:20:20that is not maybe that well understood
- 00:20:23about open AI is we had just this even
- 00:20:27when we were like pretty unknown
- 00:20:29we had a crazy talented team of
- 00:20:31researchers you know if you have like
- 00:20:33the smartest people in the world you can
- 00:20:34push on something really
- 00:20:36hard yeah and they're motivated and/or
- 00:20:39you created sort of one of the sole
- 00:20:40places in the world where they could do
- 00:20:42that like one of the stories I heard is
- 00:20:45just even getting access to compute
- 00:20:47resources even today is this crazy thing
- 00:20:51and embedded in some of the criticism
- 00:20:54from maybe the Elders of the industry at
- 00:20:56the moment was sort of that you
- 00:20:58know you're going to waste a lot of
- 00:21:00resources and somehow that's going to
- 00:21:02result in an AI winter like people won't
- 00:21:04give resources anymore it's funny people
- 00:21:06were never sure if we were going to
- 00:21:09waste resources or if we were doing
- 00:21:11something kind of vaguely immoral by
- 00:21:14putting in too much resources and you
- 00:21:15were supposed to spread it across lots
- 00:21:17of bets rather than like conviction on
- 00:21:19one most of the world still does not
- 00:21:22understand the value of like a fairly
- 00:21:23extreme level of conviction on one bet
- 00:21:26and so we said okay we have this
- 00:21:27evidence we believe in this
- 00:21:29we're going to at a time when like the
- 00:21:30normal thing was we're going to spread
- 00:21:31against this bet and that bet and that
- 00:21:33bet definite Optimist you're a definite
- 00:21:35Optimist and I think across like many of
- 00:21:38the successful YC startups you see a
- 00:21:40version of that again and again yeah
- 00:21:42that sounds right when the world gives
- 00:21:43you sort of push back and the push back
- 00:21:46doesn't make sense to you you should do
- 00:21:47it anyway totally one of the many things
- 00:21:50that I'm very grateful
- 00:21:52about getting exposure to from the world
- 00:21:54of startups is how many times you see
- 00:21:57that again and again and again and
- 00:21:58before I think before YC I I really had
- 00:22:01this deep belief that somewhere in the
- 00:22:04world there were adults in charge adults
- 00:22:07in the room and they knew what was going
- 00:22:09on and someone had all the answers and
- 00:22:11you know if someone was pushing back on
- 00:22:12you they probably knew what was going on
- 00:22:15and the degree to which I Now understand
- 00:22:18that you know to pick up the earlier
- 00:22:21phrase you can just do stuff you can
- 00:22:22just try stuff no one has all the
- 00:22:24answers there are no like adults in the
- 00:22:25room that are going to magically tell
- 00:22:27you exactly what to do um and you just
- 00:22:29kind of have to like iterate quickly and
- 00:22:31find your way that was like a big unlock
- 00:22:33in life for me to understand there is a
- 00:22:35difference between being uh High
- 00:22:37conviction just for the sake of it and
- 00:22:40if you're wrong and you don't adapt and
- 00:22:42you don't try to be like truth seeking
- 00:22:44it still is
- 00:22:46really not that effective the thing that
- 00:22:49we tried to do was really just believe
- 00:22:53whatever the results told us and really
- 00:22:57kind of try to go do the thing in front
- 00:22:58of us and there were a lot of things
- 00:23:00that we were high conviction and wrong
- 00:23:02on but as soon as we realized we were
- 00:23:04wrong we tried to like fully embrace it
- 00:23:07conviction is great until the moment you
- 00:23:08have data one way or the other and there
- 00:23:10are a lot of people who hold on it past
- 00:23:11the moment of data so it's it's
- 00:23:13iterative it's not just they're wrong
- 00:23:15and I'm right you have to go show your
- 00:23:18work but there is a long moment where
- 00:23:20you have to be willing to operate
- 00:23:21without
- 00:23:22data and at that point you do have to
- 00:23:24just sort of run on conviction yeah it
- 00:23:26sounds like there's a focusing aspect
- 00:23:28there too like you had to make a choice
- 00:23:31and that choice had better you know you
- 00:23:34didn't have infinite choices and so you
- 00:23:37know the prioritization itself was an
- 00:23:39exercise that made it much more likely
- 00:23:41for you to succeed I wish I could go
- 00:23:43tell you like oh we knew exactly what
- 00:23:45was going to happen and it was you know
- 00:23:47we had this idea for language models
- 00:23:49from the beginning and you know we kind
- 00:23:51of went right to this but obviously the
- 00:23:54story of OpenAI is that we did a lot
- 00:23:55of things that helped us develop some
- 00:23:57scientific understanding but we're not
- 00:24:00on the short path if we knew then what
- 00:24:03we know now we could have speedrun this
- 00:24:05whole thing to like an incredible degree
- 00:24:07doesn't work that way like you don't get
- 00:24:08to be right at every guess and so we
- 00:24:11started off with a lot of assumptions
- 00:24:14both about the direction of Technology
- 00:24:16but also what kind of company we were
- 00:24:17going to be and how we were going to be
- 00:24:18structured and how AGI was going to go
- 00:24:20and all of these things and we have been
- 00:24:24like humbled and badly wrong many many
- 00:24:27many times and one of our strengths is
- 00:24:32the ability to get punched in the face
- 00:24:33and get back up and keep going this
- 00:24:35happens for scientific bets for uh you
- 00:24:38know being willing to be wrong about a
- 00:24:40bunch of other things we thought about
- 00:24:41how the world was going to work and what
- 00:24:43the sort of shape of the product was
- 00:24:44going to
- 00:24:45be again we had no idea or I at least
- 00:24:49had no idea maybe Alec Radford did I had
- 00:24:50no idea that language models were going
- 00:24:52to be the thing um you know we started
- 00:24:54working on robots and agents playing video
- 00:24:56games and all these other things then a
- 00:24:58few years later GPT-3 happened that was
- 00:25:02not so obvious at the time yeah it
- 00:25:03sounded like there was a key insight
- 00:25:06around positive or negative sentiment
- 00:25:08around GPT-1 even before GPT-1 Oh before
- 00:25:12yeah I think the paper was called the
- 00:25:14unsupervised sentiment neuron and I think
- 00:25:16Alec did it alone by the way Alec
- 00:25:19is this unbelievable outlier of a human
- 00:25:22and so he did this incredible work which
- 00:25:28was just looking at he he noticed there
- 00:25:30was one neuron that was flipping
- 00:25:31positive or negative sentiment as it was
- 00:25:33doing these generative Amazon reviews I
- 00:25:36think other researchers might have hyped
- 00:25:39it up more made a bigger deal out of it
- 00:25:40or whatever but you know it was Alec so
- 00:25:42it took people a while to I think fully
- 00:25:44internalize what a big deal it was and
- 00:25:46he then did gpt1 and somebody else
- 00:25:48scaled it up into gpt2 um but it was off
- 00:25:51of this Insight that there
- 00:25:54was something uh amazing happening
- 00:25:59where and at the time unsupervised
- 00:26:01learning was just not really working so
- 00:26:04he noticed this one really interesting
- 00:26:06property which is there was a neuron
- 00:26:08that was flipping positive or negative
- 00:26:10with sentiment and yeah that led to the
- 00:26:13GPT series I guess one of the things
- 00:26:16that Jake Heller from Casetext uh I
- 00:26:19think of him as maybe I mean not
- 00:26:21surprisingly a YC alum who got access to
- 00:26:25both uh 3.5 and four and he described
- 00:26:29getting four as sort of the big moment
- 00:26:32Revelation because 3.5 would still do
- 00:26:36yeah I mean it would hallucinate more
- 00:26:38than he could use in a legal setting and
- 00:26:41then with four it reached the point
- 00:26:44where if he chopped the prompts down
- 00:26:46small enough into workflow he could get
- 00:26:48it to do exactly what what he wanted and
- 00:26:51he built you know huge test cases around
- 00:26:54it and then sold that company for $650
- 00:26:56million so it's uh you know I think of
- 00:26:59him as like one of the first to
- 00:27:01commercialize GPT-4 in a relatively grand
- 00:27:04fashion I remember that conversation
- 00:27:06with him yeah with GPT-4 like that was
- 00:27:09one of the few moments in that thing
- 00:27:11where I was like okay we have something
- 00:27:12really great on our hands um when we
- 00:27:15first started trying to like sell GPT-3
- 00:27:17to Founders they would be like
- 00:27:19it's cool it's doing something amazing
- 00:27:21it's an incredible demo
- 00:27:24but with the possible exception of
- 00:27:28copywriting no great businesses were
- 00:27:30built on GPT-3 and then 3.5 came along
- 00:27:33and
- 00:27:33people startups YC startups in
- 00:27:36particular started to do interesting things
- 00:27:37it no longer felt like we were pushing a
- 00:27:39boulder uphill so like people actually
- 00:27:40wanted to buy the thing we were selling
- 00:27:41totally and then
- 00:27:44four we kind of like got the like just
- 00:27:47how many gpus can you give me oh yeah
- 00:27:49moment like very quickly after giving
- 00:27:51people access so we felt like okay we
- 00:27:53got something like really good on our
- 00:27:54hands so you you knew actually from your
- 00:27:57users that totally like when the when
- 00:27:59the uh model dropped itself and you got
- 00:28:02your hands on it it was like well this
- 00:28:04this is better we were totally impressed
- 00:28:06then too we had all of these like tests
- 00:28:09that we did on it that were very it like
- 00:28:11looked great and it could just do these
- 00:28:13things that we were all super impressed
- 00:28:15by also like when we were all just
- 00:28:17playing around with it and like getting
- 00:28:19samples back I was like wow it's like it
- 00:28:20can do this now and they were it can
- 00:28:22rhyme and it can like tell a funny joke
- 00:28:24slightly funny joke and it can like you
- 00:28:26know do this and that and so it felt
- 00:28:29really great but you know you never
- 00:28:31really know if you have a hit product on
- 00:28:33your hands until you like put it in
- 00:28:35customer hands yeah you're always too
- 00:28:36impressed with your own work yeah and
- 00:28:39and so we were all excited about it we
- 00:28:41were like oh this is really quite good
- 00:28:43but until like the test happens it's
- 00:28:46like the real test is yeah the real test
- 00:28:48is users yeah so there's some anxiety
- 00:28:50until that until that moment happens
- 00:28:53yeah I wanted to switch gears a little
- 00:28:54bit so before you created obviously one
- 00:28:57of the craziest AI Labs ever to be
- 00:29:00created um you started at 19 at YC with
- 00:29:04a company called Loopt which was uh
- 00:29:07basically Find My Friends
- 00:29:09geolocation you know probably what 15
- 00:29:12years before Apple ended up making it
- 00:29:14too early in any case yeah yeah what
- 00:29:16Drew you to that particular idea I was
- 00:29:19like interested in Mobile phones and I
- 00:29:22wanted to do something that got to like
- 00:29:25use mobile phones this was when like
- 00:29:26mobile was just starting it was like you
- 00:29:28know still years before the
- 00:29:30iPhone but it was clear that carrying
- 00:29:33around
- 00:29:34computers in our pockets was somehow a
- 00:29:38very big deal I mean that's hard to
- 00:29:39believe now that there was a moment when
- 00:29:42phones were actually literally you just
- 00:29:44they were just a phone they were an
- 00:29:45actual phone yeah yeah I mean I try not
- 00:29:47to use it as an actual phone ever really
- 00:29:49I still remember the first phone I got
- 00:29:52that had internet on it and it was this
- 00:29:55horrible like text based mostly
- 00:29:58text-based browser it was really slow
- 00:30:00you could like you know do like you
- 00:30:01could so painfully and so slowly check
- 00:30:03your email um but I was like a I don't
- 00:30:07know in high school sometime in high
- 00:30:09school and I got a phone that could do
- 00:30:10that versus like just text and call and
- 00:30:13I was like hooked right then yeah I was
- 00:30:15like ah this is this is not a phone this
- 00:30:17is like a computer we can carry and
- 00:30:19we're stuck with a dial pad for this
- 00:30:20accident of history but this is going to
- 00:30:22be awesome and I mean now you have
- 00:30:25billions of people who they don't have a
- 00:30:28computer like to us growing up you know
- 00:30:30that that actually uh was your first
- 00:30:32computer not physically it's a replica or
- 00:30:35like another copy of my first computer
- 00:30:37which is an LC II yeah so this is what a
- 00:30:39computer was to us growing up and the
- 00:30:41idea that you would carry this little
- 00:30:43black mirror like kind of we've come a
- 00:30:46long way unconscionable back then yeah
- 00:30:49so you know even then you like
- 00:30:50technology and what was going to come
- 00:30:53was sort of in your brain yeah I was
- 00:30:55like a real I mean I still am a real
- 00:30:56tech nerd but I always that was what I
- 00:31:00spent my Friday nights thinking about
- 00:31:02and then uh one of the harder parts of
- 00:31:04it was we didn't have the App Store the
- 00:31:06iPhone didn't
- 00:31:08exist uh you ended up being a big part
- 00:31:10of that launch I think a small part but
- 00:31:12yes we did get to be a little part of it
- 00:31:14it was a great experience for me to have
- 00:31:16been through because I I kind of like
- 00:31:18understood what it is like to go through
- 00:31:21a platform shift and how messy the
- 00:31:22beginning is and how much like little
- 00:31:25things you do can shape the direction it
- 00:31:26all goes I I was definitely on the other
- 00:31:28side of it then like I was watching
- 00:31:29somebody else create the platform shift
- 00:31:32but it was a super valuable
- 00:31:35experience to get to go through and sort
- 00:31:37of just see what how it happens and how
- 00:31:40quickly things change and how you adapt
- 00:31:42through it what was that experience like
- 00:31:44you ended up selling that company uh was
- 00:31:47probably the first time you were
- 00:31:49managing people and you know doing
- 00:31:51Enterprise sales all of these things
- 00:31:53were useful lessons from that first
- 00:31:55experience I mean it obviously was not a
- 00:31:57successful company um it
- 00:32:00was and so it's a very painful thing to
- 00:32:03go through but the rate of experience
- 00:32:04and education was incredible another
- 00:32:08thing that PG said or quoted somebody
- 00:32:09else saying but always stuck with me is
- 00:32:10your 20s are always an apprenticeship
- 00:32:12but you don't know for what and then you
- 00:32:13do your real work later and I did learn
- 00:32:16quite a lot and I'm very grateful for it
- 00:32:18it was like a difficult experience and
- 00:32:22we never found product Market fit really
- 00:32:24and we also never like really found a
- 00:32:26way to get to escape velocity which is
- 00:32:27just always hard to do there is nothing
- 00:32:30that I that I have ever heard of that
- 00:32:32has a higher rate of generalized
- 00:32:34learning than doing a startup so it was
- 00:32:37great in that sense you know when you're
- 00:32:3919 and 20 like riding the wave of some
- 00:32:42other platform shift this shift from you
- 00:32:44know dumb cell phones to smartphones and
- 00:32:48mobile and you know here we are many
- 00:32:51years later and your next ACT was
- 00:32:53actually you know I mean I guess two
- 00:32:55acts later literally spawning
- 00:32:58one of the major platform shifts we all get
- 00:33:00old yeah but that's really what's
- 00:33:02happening you know uh 18 20 year olds
- 00:33:05are deciding that they could get their
- 00:33:08degree but they're going to miss the
- 00:33:10wave like cuz all of the stuff that's
- 00:33:12great everything's happening right now
- 00:33:14like do you have an intuitive
- 00:33:16sense like speaking to even a lot of the
- 00:33:19you know really great billion-dollar
- 00:33:21company Founders some of them are just
- 00:33:24not that aware of what's happening like
- 00:33:26they're asleep to it it's wild I think that's
- 00:33:29why I'm so excited for startups right
- 00:33:31now is because the world is still
- 00:33:33sleeping on all of this to such an
- 00:33:34astonishing degree yeah and then you
- 00:33:36have like the YC Founders being like no
- 00:33:38no I'm going to like do this amazing
- 00:33:40thing and do it very quickly yeah it
- 00:33:42reminds me of when um Facebook almost
- 00:33:45missed mobile because they were making
- 00:33:47web software and they were really good
- 00:33:49at it yeah and um like they they I mean
- 00:33:53they had to buy Instagram like Snapchat
- 00:33:55right up yeah and WhatsApp so um it's
- 00:33:58interesting the platform shift is always
- 00:34:00built by the people who are young with
- 00:34:03no prior knowledge it's it is I think
- 00:34:07it's great so there's this other aspect
- 00:34:10that's interesting in that I think
- 00:34:12you're you know you and Elon and uh
- 00:34:15Bezos and a bunch of people out there
- 00:34:17like they sort of start their Journey as
- 00:34:20Founders you know really you know
- 00:34:24whether it's Loopt or Zip2 or you
- 00:34:26know really in maybe pure software
- 00:34:28like it's just a different thing that
- 00:34:30they start and then later they you know
- 00:34:32sort of get to level up you know is
- 00:34:34there a path that you recommend at this
- 00:34:36point if people are thinking you know I
- 00:34:38want to work on the craziest hard tech
- 00:34:40thing first should they just run towards
- 00:34:42that to the extent they can or is there
- 00:34:45value in you know sort of solving the
- 00:34:47money problem first being able to invest
- 00:34:49your own money like very deeply into the
- 00:34:52next thing it's a really interesting
- 00:34:55question it was definitely helpful
- 00:34:58that I could just like write the early
- 00:34:59checks for open Ai and I think it would
- 00:35:02have been hard to get somebody else to
- 00:35:03do that at the very beginning um and
- 00:35:05then Elon did it a lot at much higher
- 00:35:07scale which I'm very grateful for and
- 00:35:09then other people did after that and and
- 00:35:12there's other things that I've invested
- 00:35:13in that I'm really happy to have been
- 00:35:15able to support and I don't I think it
- 00:35:17would have been hard to get other people
- 00:35:18to to do it um so that's great for sure
- 00:35:22and I did like we were talking about
- 00:35:24earlier learn these extremely valuable
- 00:35:28lessons but I also feel like I kind of
- 00:35:30like was wasting my time for lack of a
- 00:35:33better phrase working on Loopt I don't
- 00:35:35I definitely don't regret it it's like
- 00:35:37all part of the tapestry of life and I
- 00:35:38learned a ton and whatever else what
- 00:35:41would you have done differently or what
- 00:35:43would you tell yourself from like now to
- 00:35:45in like a time-travel
- 00:35:48capsule that would show up on your desk
- 00:35:50at Stanford when you were 19 well it's
- 00:35:52hard because AI was always the thing I
- 00:35:54most wanted to do and AI just like I
- 00:35:55went to school to study AI but at the
- 00:35:58time I was working in the AI lab the one
- 00:35:59thing that I they told you is definitely
- 00:36:01don't work on neural networks we tried
- 00:36:03that it doesn't work a long time ago I
- 00:36:05think I could have picked a much better
- 00:36:07thing to work on than Loopt I don't know
- 00:36:08exactly what it would have been but it
- 00:36:10all works out it's fine yeah there's
- 00:36:12this long history of people building
- 00:36:14more technology to help improve other
- 00:36:17people's lives and I I actually think
- 00:36:19about this a lot like I think about the
- 00:36:21people that made that computer and I
- 00:36:23don't know them um you know they're many
- 00:36:26of them probably long retired
- 00:36:28but I am so grateful to them yeah and
- 00:36:31some people worked super hard to make
- 00:36:34this thing at the limits of technology I
- 00:36:36got a copy of that on my eighth birthday
- 00:36:38and it totally changed my life yeah and
- 00:36:41the lives of a lot of other people too
- 00:36:43they worked super hard they never like
- 00:36:45got a thank you from me but I feel it to
- 00:36:47them very
- 00:36:48deeply
- 00:36:50and it's really nice to get to like add
- 00:36:53our brick to that long road of progress
- 00:36:56yeah um is it's been a great year for
- 00:36:58open AI not without some drama uh always
- 00:37:01yeah we're good at that uh what did you
- 00:37:04learn from you know sort of the ouster
- 00:37:06last fall and how do you feel about some
- 00:37:08of the you know departures I mean teams
- 00:37:11do evolve but how are you doing man tired
- 00:37:15but good yeah uh it's we've kind of like
- 00:37:19speedrun uh like the medium-size or even
- 00:37:22kind of like pretty big-size tech
- 00:37:24company arc that would normally take
- 00:37:26like a decade in two years like ChatGPT is
- 00:37:28less than two years old yeah and and
- 00:37:30there's like a lot of painful stuff that
- 00:37:32comes with that
- 00:37:35um and there are you know any company as
- 00:37:38it scales goes through management teams
- 00:37:41at some rate uh and you have to sort of
- 00:37:44the people who are really good at the
- 00:37:45zero to one phase are not necessarily
- 00:37:46people that are good at the 1 to 10 or
- 00:37:48the 10 to the 100 phase we've also kind
- 00:37:50of like changed what we were going to be um
- 00:37:53made plenty of mistakes along the way
- 00:37:55done a few things really right and
- 00:37:58that comes with a lot of change and I
- 00:38:02think the goal
- 00:38:04of the company uh the emerging AGI or
- 00:38:09whatever however you want to think about
- 00:38:10it is like just keep making the best
- 00:38:13decisions we can at every stage but it
- 00:38:15does lead to a lot of change I hope that
- 00:38:18we are heading towards a period now of
- 00:38:21more calm but I'm sure there will be
- 00:38:23other periods in the future where things
- 00:38:25are very Dynamic again so I guess how
- 00:38:28does open AI actually work right now you
- 00:38:30know I mean the quality and like the
- 00:38:33pace that you're pushing right now I
- 00:38:35think is like Beyond world class
- 00:38:38compared to a lot of the other you know
- 00:38:41really established software players like
- 00:38:43who came
- 00:38:44before this is the first time ever where
- 00:38:47I felt like
- 00:38:50we actually know what to do like I think
- 00:38:52from here
- 00:38:54to building an AGI will still take a
- 00:38:57huge amount of work there are some known
- 00:38:59unknowns but I think we basically know
- 00:39:01what to go what to go do and it'll take
- 00:39:03a while it'll be hard but that's
- 00:39:05tremendously exciting I also think on
- 00:39:09the product side there's more to figure
- 00:39:12out but roughly we know what to shoot at
- 00:39:14and what we want to optimize
- 00:39:16for that's a really exciting time and
- 00:39:18when you have that Clarity I think you
- 00:39:20can go pretty fast yeah if you're
- 00:39:22willing to say we're going to do these
- 00:39:23few things we're going to try to do them
- 00:39:24very well and our research path is
- 00:39:28fairly clear our infrastructure path is
- 00:39:29fairly clear our product path is getting
- 00:39:33clearer you can Orient around that super
- 00:39:37well we for a long time did not have
- 00:39:39that we were a true research lab and
- 00:39:42even when you know that it's hard to act
- 00:39:44with the conviction on it because
- 00:39:45there's so many other good things You'
- 00:39:46like to do
- 00:39:48yeah but the degree to which you can get
- 00:39:51everybody aligned and pointed at the
- 00:39:53same thing is a significant determinant
- 00:39:56in how fast you can move
- 00:39:58I mean sounds like we went from level
- 00:39:59one to level two very recently and that
- 00:40:01was really powerful um and then we
- 00:40:04actually just had our o1 hackathon at YC
- 00:40:07that was so impressive that was super
- 00:40:08fun um and then weirdly one of the
- 00:40:12people who won I think they came in
- 00:40:13third uh was Camper a CAD/CAM
- 00:40:17startup you know did YC recently last
- 00:40:20year or two and uh they were able to
- 00:40:23during the hackathon build something
- 00:40:25that would iteratively improve an
- 00:40:28airfoil from something that wouldn't fly to
- 00:40:30literally something that had yeah that
- 00:40:32was awesome a competitive amount of lift
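The hackathon demo he describes is a propose-simulate-feedback cycle. Everything below is a toy stand-in I made up: `ask_model` replaces a real o1 API call with a fixed heuristic, and `lift` replaces a real airfoil solver, so the sketch shows only the shape of the loop, not the team's actual system:

```python
# Sketch of the propose-simulate-feedback loop behind the airfoil story.
# `ask_model` and `lift` are illustrative stand-ins, not real components.

def lift(camber: float) -> float:
    """Toy objective: lift peaks at camber 4.0 (illustrative numbers)."""
    return max(0.0, 1.5 - abs(camber - 4.0) * 0.5)

def ask_model(camber: float, measured_lift: float) -> float:
    """Stand-in for the LLM: given the last design and its simulated
    lift, propose a revised design. A real agent would receive the
    solver output as text; this stub just nudges the camber upward."""
    return camber + 0.5

def design_loop(camber: float, steps: int = 12) -> float:
    """Iterate: model proposes, simulator scores, best design is kept."""
    best_camber, best_lift = camber, lift(camber)
    cur = camber
    for _ in range(steps):
        cur = ask_model(cur, lift(cur))   # model proposes the next design
        score = lift(cur)                 # "tool use": run the simulator
        if score > best_lift:             # keep improvements, feed back
            best_camber, best_lift = cur, score
    return best_camber

# Start from a design that "wouldn't fly" (zero lift) and iterate.
print(design_loop(0.0))  # prints 4.0
```

The key property is that the model never needs to know aerodynamics up front; it only needs a tool it can call and the feedback that tool returns, which is what makes the demo feel like a step toward the "innovator" level discussed later.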
- 00:40:35and I mean that sort of sounds like
- 00:40:37level four which is uh you know the
- 00:40:40innovator stage it's very funny you say
- 00:40:43that I I had been telling people for a
- 00:40:45while I thought that the level two to
- 00:40:46level three jump was going to happen but
- 00:40:48then the level three to level four jump
- 00:40:50was level two to level three was going
- 00:40:52to happen quickly and then the level
- 00:40:53three to level four
- 00:40:55jump was somehow going to be much harder
- 00:40:57and
- 00:40:58require some medium-sized or larger new
- 00:41:03ideas and that demo and a few others
- 00:41:05have convinced me
- 00:41:08that you can get a huge amount of
- 00:41:10innovation just by using these current
- 00:41:12models in really creative ways well yeah
- 00:41:15I mean it's uh what's interesting is
- 00:41:17basically camper already built sort of
- 00:41:19the um underlying software for CAD Cam
- 00:41:24and then you know language is sort of
- 00:41:28the interface to the large language
- 00:41:29model that then which then can use the
- 00:41:32software like tool use and then if you
- 00:41:35combine that with the idea of code gen
- 00:41:38that's kind of a scary crazy idea right
- 00:41:40like not only can the uh large
- 00:41:43language model code but it can create
- 00:41:46tools for itself and then compose those
- 00:41:48tools similar to you know chain of
- 00:41:51thought with o1 yeah I think things are
- 00:41:53going to go a lot faster than people are
- 00:41:55appreciating right now yeah well it's a
- 00:41:57an exciting time to be alive honestly
- 00:41:59you know we you mentioned earlier that
- 00:42:01thing about discovering all of physics I uh
- 00:42:04I wanted to be a physicist wasn't
- 00:42:05smart enough to be a good one had to
- 00:42:06like contribute in this other way but
- 00:42:08the fact that somebody else I really
- 00:42:10believe is now going to go solve all the
- 00:42:11physics with the stuff like I'm so
- 00:42:14excited to be alive for that let's get
- 00:42:16to level four so happy for whoever that
- 00:42:18person is yeah do you want to talk about
- 00:42:21level three four and five
- 00:42:23briefly yeah so we realized that AGI had
- 00:42:26become this like
- 00:42:27badly overloaded word and people meant
- 00:42:29all kinds of different things and we
- 00:42:30tried to just say okay here's our best
- 00:42:32guess roughly of the order of things you
- 00:42:34have these level one systems which are
- 00:42:36these chat Bots there'd be level two
- 00:42:38that would come which would be these
- 00:42:39reasoners we think we got
- 00:42:41there earlier this year um with the o1
- 00:42:44release three is agents the ability to go
- 00:42:48off and do these longer term tasks uh
- 00:42:51you know maybe like multiple
- 00:42:52interactions with an environment asking
- 00:42:55people for help when they need it
- 00:42:56working together all of that and I I
- 00:42:59think we're going to get there faster
- 00:43:00than people expect four is innovators
- 00:43:03like that's like a scientist and you
- 00:43:05know that's ability to go explore like a
- 00:43:09not well understood
- 00:43:11phenomena over like a long period of
- 00:43:14time and understand what's just kind of
- 00:43:16go just figure it out and then and then
- 00:43:19level five this is the sort of slightly
- 00:43:22amorphous like do that but at the scale
- 00:43:24of the whole company or you know a whole
- 00:43:26organization or whatever
- 00:43:27that's going to be a pretty
- 00:43:29powerful thing yeah and it feels kind of
- 00:43:32fractal right like even the things you
- 00:43:34had to do to get to two sort of rhyme
- 00:43:36with level five in that you have
- 00:43:38multiple agents that then self-correct
- 00:43:40that work together I mean that kind of
- 00:43:42sounds like an organization to me just
- 00:43:44at like a very micro level do you think
- 00:43:46that we'll have I mean you famously
- 00:43:48talked about it I think Jake talks about
- 00:43:49it it's like you will have companies
- 00:43:52that make you know billions of dollars
- 00:43:54per year and have like less than 100
- 00:43:57employees maybe 50 maybe 20 employees
- 00:44:00maybe one it does seem like that I don't
- 00:44:03know what to make of that other than
- 00:44:04it's a great time to be a startup
- 00:44:05founder yeah but it does feel like
- 00:44:08that's happening to me yeah um you know
- 00:44:11it's like one person plus 10,000
- 00:44:14gpus pretty pretty powerful Sam what
- 00:44:17advice do you have for people watching
- 00:44:19who you know either about to start or
- 00:44:22just started their startup bet on this
- 00:44:26Tech trend bet on this trend it's this
- 00:44:28is we are not near the saturation point
- 00:44:31the models are going to get so much
- 00:44:32better so quickly what you can do as a
- 00:44:34startup founder with this versus what
- 00:44:37you could do without it is so wildly
- 00:44:38different and the big companies even the
- 00:44:41mediumsized companies even the startups
- 00:44:43that are a few years old they're already
- 00:44:44on like quarterly planning cycles and
- 00:44:47Google is on a year or decade planning
- 00:44:49cycle I don't know how they even do it
- 00:44:50anymore but your advantage with speed
- 00:44:54and focus and conviction and the ability
- 00:44:56to react to to how fast the technology
- 00:44:58is moving that is that is the number one
- 00:45:00edge of a startup kind of ever but
- 00:45:03especially right now so I would
- 00:45:06definitely like build something with AI
- 00:45:08and I would definitely like take
- 00:45:09advantage of the ability to see a new
- 00:45:12thing and build something that day
- 00:45:14rather than like put it into a quarterly
- 00:45:16planning cycle I guess the other thing I
- 00:45:18would say
- 00:45:20is it is easy when there's a new
- 00:45:22technology platform to say well because
- 00:45:25I'm doing something with AI the
- 00:45:27the rule the laws of business don't
- 00:45:29apply to me I have this magic technology
- 00:45:31and so I don't have to build uh a moat or
- 00:45:35a um you know competitive edge or a
- 00:45:38better product it's because you know I'm
- 00:45:39doing Ai and you're not so that's all I
- 00:45:41need and that's obviously not true but
- 00:45:44what you can get are these
- 00:45:45short-term explosions of growth by
- 00:45:49embracing a new technology more quickly
- 00:45:51than somebody
- 00:45:52else and remembering not to fall for
- 00:45:55that and that you still have to build
- 00:45:56something of enduring value that's I think
- 00:45:58that's a good thing to keep in mind too
- 00:45:59yeah everyone can build an absolutely
- 00:46:01incredible demo right now but everyone
- 00:46:03can build an incredible demo but
- 00:46:04building a business man that's the brass
- 00:46:07ring the rules still apply you can do it
- 00:46:09faster than ever before and better than
- 00:46:10ever before but you still have to build
- 00:46:11a business what are you excited about in
- 00:46:132025 what's to come AGI yeah uh excited
- 00:46:18for that uh what am I excited for um we're having
- 00:46:22a kid I'm more excited for that than
- 00:46:23congratulations than I've ever been for anything incredible
- 00:46:26yeah probably that that's going to be
- 00:46:27that's the thing I'm like most excited
- 00:46:29for ever in life
- 00:46:31life completely so I cannot wait well
- 00:46:34here's to building that better world for
- 00:46:37you know our kids and really hopefully
- 00:46:40the whole world this is a lot of fun
- 00:46:42thanks for hanging out Sam thank you
- 00:46:45[Music]
- AGI
- AI
- OpenAI
- Sam Altman
- Tech Optimism
- Innovation
- Deep Learning
- Startups
- Future Technology
- Energy Abundance