OpenAI's Altman and Makanju on Global Implications of AI
Summary
TL;DR: OpenAI announced new guidelines for the use of AI in elections, aiming to enhance transparency and prevent misuse. These guidelines include banning the use of ChatGPT in political campaigns and introducing cryptographic watermarks for AI-generated images to ensure authenticity. The company plans to enforce these guidelines through robust safety systems and strategic partnerships with authoritative bodies. The discussion highlights the need to address potential impacts of AI on democratic processes, emphasizing OpenAI's commitment to responsible AI use. The conversation also touches on the broader implications of AI, including its effects on jobs and societal inequalities. OpenAI expresses optimism about AI's role as a tool for enhancing productivity rather than displacing jobs. Additionally, the dialogue explores OpenAI's stance on publisher relations, copyright management, and collaborations with military entities on non-violent projects like cybersecurity. OpenAI maintains a focus on balancing innovation with ethical considerations, preparing for advancements in AI capabilities while respecting regulatory frameworks and societal norms.
Key Takeaways
- 📜 New guidelines ban ChatGPT use in political campaigns.
- 🔗 Cryptographic watermarks for AI images introduced.
- 🛡️ Strong safety systems to enforce guidelines.
- 🤝 Partnerships to provide credible voting information.
- 🗳️ AI's role in elections needs cautious monitoring.
- 🛑 AI impacts jobs by transforming roles rather than displacing them.
- 📚 Favor learning from high-quality data over quantity.
- 🤖 OpenAI collaborates on non-violent military projects.
- ⚙️ Explore innovative AI model developments carefully.
- 🚫 Focus on protecting copyright and data rights.
Timeline
- 00:00:00 - 00:05:00
OpenAI has announced new guidelines for the use of AI in elections, including banning the use of ChatGPT in political campaigns and introducing cryptographic watermarks for AI-generated images to ensure transparency and authenticity. Although larger platforms like Facebook and YouTube struggle with enforcement, OpenAI leverages its safety systems and partnerships, such as with the National Association of Secretaries of State, to effectively enforce these guidelines.
- 00:05:00 - 00:10:00
There is ongoing anxiety and focus within OpenAI regarding its role in upcoming elections, emphasizing the importance of monitoring and a proactive approach. The mention of past technological cycles suggests that existing dynamics, such as those seen with previous tech advancements, will inform their strategies. The seriousness with which OpenAI approaches the ethical development of AI is clear, drawing lessons from past incidents such as Cambridge Analytica.
- 00:10:00 - 00:15:00
The discussion extends to the implications of AI in the political sphere, recognizing that AI could become a significant social issue. The utilization of AI could exacerbate inequalities or technological divides, reflecting concerns similar to those previously seen with figures like Donald Trump. However, OpenAI views its technology as a productivity tool that could enhance human capabilities, and they express cautious optimism about its impact on jobs and society.
- 00:15:00 - 00:20:00
OpenAI maintains a stance against the use of its technology in military projects, except in areas like cybersecurity and assistance for veterans, reflecting ethical considerations. Adjustments in their policies are aimed at improving transparency and aligning with broader social good, resonating with their mission to prioritize democratic oversight in AI technology.
- 00:20:00 - 00:25:00
Collaboration with the Department of Defense is highlighted, focusing on non-destructive applications like cybersecurity. OpenAI is also exploring GPT Store, which invites creativity akin to the early mobile app stores. Issues of copyright and partnerships with publishers are being navigated to ensure cooperative content use, with efforts to respect copyright laws while advancing AI capabilities.
- 00:25:00 - 00:30:00
The conversation touches upon the challenges and opportunities in AI-related copyright issues, with OpenAI fostering partnerships and addressing the complexities of data usage. They remain committed to working with artists, considering solutions for concerns about style imitation. This reflects their vision to create beneficial tools through dialogue and collaboration.
- 00:30:00 - 00:38:47
OpenAI's unique corporate structure, which includes a nonprofit board, faced challenges during internal governance issues. While the return of Sam Altman as CEO was employee-driven, it highlights the importance of their mission-driven focus. In broader discussions, themes of regulation, safety, and addressing potential conflicts between AI progress and energy requirements are explored, with an emphasis on balancing innovation with responsible governance.
FAQ
What new guidelines were announced by OpenAI?
OpenAI announced new guidelines banning the use of ChatGPT in political campaigns and introducing cryptographic watermarks for AI-generated images.
How does OpenAI plan to enforce these AI guidelines?
OpenAI plans to leverage strong safety systems and partnerships with organizations like the National Association of Secretaries of State to enforce AI guidelines.
What are the potential impacts of AI on elections?
AI could impact the dissemination of information and campaign strategies, requiring careful monitoring and enforcement to maintain fair elections.
How does OpenAI view its role compared to distribution platforms like Facebook and TikTok?
OpenAI sees its role as distinct but complementary to distribution platforms, focusing on generating content while platforms manage distribution.
What is OpenAI's stance on AI's impact on job displacement?
OpenAI believes AI changes the nature of jobs rather than causing massive job displacement, offering tools that enhance productivity.
How important are publisher relations to OpenAI's business?
Publisher relations are important for content access and partnerships, though OpenAI is focusing on learning from smaller amounts of higher-quality data.
What are OpenAI's policies on military use of AI?
OpenAI prohibits the development of weapons but collaborates on cybersecurity tools and other non-violent applications with military agencies.
What are the anticipated challenges in regulating AI?
Regulating AI involves ensuring innovation isn't stifled while establishing effective safety and ethical guidelines across industries.
What does OpenAI expect from future AI model developments?
OpenAI anticipates more capable and efficient models, cautiously integrating advancements into society.
How does OpenAI handle copyright issues with AI models?
OpenAI focuses on not training models with unauthorized data and aims to respect publisher rights while exploring data partnerships.
Transcript
- 00:00:00you guys made some news today announcing
- 00:00:02some new guidelines around the use of AI
- 00:00:04in elections I'm sure it's all uh stuff
- 00:00:07that the Davos set loved to hear
- 00:00:11uh you banned the use of ChatGPT in
- 00:00:13political campaigns you introduced
- 00:00:16cryptographic watermarks for images
- 00:00:18created by DALL·E to create kind of
- 00:00:20provenance and transparency around the
- 00:00:23use of AI generated images I read it and
- 00:00:25I thought you know this is great some of
- 00:00:28these principles are shared by much
- 00:00:29larger platforms like Facebook and
- 00:00:31TikTok and YouTube and they have struggled
- 00:00:34to enforce it how do
- 00:00:36you make it
- 00:00:38real I mean these a lot of these are
- 00:00:40things that we've been doing for a long
- 00:00:42time and we have a really strong safety
- 00:00:44systems team that um not only sort of
- 00:00:48has monitoring but we're actually able
- 00:00:49to leverage our own tools in order to
- 00:00:51scale our enforcement which gives us I
- 00:00:53think a significant Advantage um but uh
- 00:00:56so there are also some
- 00:01:00really important Partnerships like with
- 00:01:01the National Association of
- 00:01:02Secretaries of State so we can surface
- 00:01:04authoritative voting information so we
- 00:01:06have quite a few ways that we are able
- 00:01:08to enforce this mean Sam are you does
- 00:01:10this put your mind at ease that we don't
- 00:01:13that that OpenAI doesn't move the
- 00:01:15needle in some 77 upcoming critical
- 00:01:19democratic elections in 2024 no we're
- 00:01:22quite focused on it uh and I think it's
- 00:01:24good that our mind is not at EAS I think
- 00:01:25it's good that we have a lot of anxiety
- 00:01:26and are going to do everything we can to
- 00:01:28get it as right as we can um I think our
- 00:01:31role is very different than the role of
- 00:01:33a distribution platform but still
- 00:01:35important we'll have to work with them
- 00:01:36too uh it'll you know it's like you
- 00:01:38generate here and distribute here uh and
- 00:01:41there needs to be a good conversation
- 00:01:43between them but we also have the
- 00:01:45benefit of having watched what's
- 00:01:47happened in previous Cycles with
- 00:01:49previous uh you know Technologies and I
- 00:01:53don't think this will be the same as
- 00:01:54before I I think it's always a mistake
- 00:01:55to try to fight the last war but we do
- 00:01:58get to take away some learnings from
- 00:01:59that
- 00:02:00and so I I wouldn't you know I I think
- 00:02:03it'd be terrible if I said oh yeah I'm
- 00:02:04not worried I feel great like we're
- 00:02:05going to have to watch this incredibly
- 00:02:07closely this year super tight monitoring
- 00:02:09super tight feedback loop Anna you you
- 00:02:11were at Facebook before
- 00:02:13OpenAI so I almost apologize for asking
- 00:02:16it this in this way uh probably a
- 00:02:18trigger phrase but do you worry about
- 00:02:20another Cambridge Analytica
- 00:02:22moment I think as Sam alluded to there
- 00:02:25are a lot of learnings that we can
- 00:02:29leverage but also OpenAI from its
- 00:02:29Inception has been a company that thinks
- 00:02:31about these issues that it was one of
- 00:02:33the reasons that it was founded so I
- 00:02:35think I'm a lot less concerned because
- 00:02:37these are issues that our teams have
- 00:02:38been thinking about from the beginning
- 00:02:40of uh our building of these tools Sam
- 00:02:44Donald Trump just won the Iowa caucus
- 00:02:47yesterday uh we are now sort of
- 00:02:49confronted with the reality of this
- 00:02:50upcoming election what do you think is
- 00:02:52at
- 00:02:53stake in the in the US election for for
- 00:02:57Tech and for the safe stewardship of AI
- 00:02:59do you feel like that's a a critical
- 00:03:01issue that voters should and will have
- 00:03:04to consider in this election I think the
- 00:03:05now confronted is part of the problem uh
- 00:03:07I actually think most people who come to
- 00:03:09Davos say that again I didn't quite get that
- 00:03:10I think part of the problem is we're
- 00:03:11saying we're now confronted you know it
- 00:03:13never occurred to us that what Trump is
- 00:03:15saying might be resonating with a lot of
- 00:03:16people and now all of a sudden after
- 00:03:18this performance in Iowa oh man um it's
- 00:03:22a very like Davos Centric you know um
- 00:03:25I've been here for two days I guess
- 00:03:28just uh so I I would love if we had a
- 00:03:32lot more reflection and if we started it
- 00:03:34a lot sooner um about and we didn't feel
- 00:03:37now confronted but uh I think there's a
- 00:03:39lot at stake at this election I think
- 00:03:41elections are you know huge deals I
- 00:03:44believe that America is going to be fine
- 00:03:47no matter what happens in this election
- 00:03:49I believe that AI is going to be fine no
- 00:03:51matter what happens in this election and
- 00:03:52we will have to work very hard to make
- 00:03:54it so um but this is not you know no one
- 00:03:59wants to sit up here and like hear me
- 00:04:01rant about politics I'm going to stop
- 00:04:02after this um but I think there has been
- 00:04:07a real failure to sort of learn lessons
- 00:04:11about what what's kind of like working
- 00:04:13for the citizens of America and what's
- 00:04:15not Anna I want to ask you the same
- 00:04:17question uh um you know taking your
- 00:04:19political background into account what
- 00:04:21do you feel like for Silicon Valley for
- 00:04:24AI is at stake in the US election I
- 00:04:27think what has struck me and has been
- 00:04:29really remarkable is that the
- 00:04:30conversation around AI has remained very
- 00:04:34bipartisan and so you know I think that
- 00:04:37the one concern I have is that somehow
- 00:04:39both parties hate
- 00:04:42it no but you know this is like an area
- 00:04:45where um
- 00:04:46you Republicans tend to of course have a
- 00:04:50an approach where they are not as in
- 00:04:52favor of Regulation but on this I think
- 00:04:54there's agreement on both parties that
- 00:04:55they believe that
- 00:04:57something is needed on this technology
- 00:05:00you know Senator Schumer has this
- 00:05:01bipartisan effort that he is running
- 00:05:03with his Republican counterparts again
- 00:05:05uh when we speak to people in DC on both
- 00:05:08sides of the aisle for now it seems like
- 00:05:11they're on the same page and do you feel
- 00:05:13like all the existing campaigns are
- 00:05:15equally articulate about the about the
- 00:05:18issues relating to AI I don't know that AI
- 00:05:20has really been a campaign issue to date
- 00:05:22so it will be interesting to see how
- 00:05:24that if we're right about what's going
- 00:05:25to happen here this is like bigger than
- 00:05:28just a technological revolution in some
- 00:05:30sense I mean sort of like all
- 00:05:31technological revolutions or societal
- 00:05:33revolutions but this one feels like it
- 00:05:35can be much more of that than usual and
- 00:05:39so it it is going to become uh a social
- 00:05:43issue a political issue um it already
- 00:05:45has in some ways but I think it is
- 00:05:48strange to both of us that it's not more
- 00:05:50of that already but with what we expect
- 00:05:52to happen this year not with the
- 00:05:53election but just with the the increase
- 00:05:55in the capabilities of the products uh
- 00:05:58and as people really
- 00:06:00catch up with what's going to happen
- 00:06:02what is happening what's already
- 00:06:04happened uh there's like a lot of
- 00:06:05nervousness in society well I mean there are
- 00:06:07political figures in the US and around
- 00:06:08the world like Donald Trump who have
- 00:06:11successfully tapped into a feeling of
- 00:06:13yeah
- 00:06:14dislocation uh anger of the working
- 00:06:17class the feeling of you know
- 00:06:19exacerbating inequality or technology
- 00:06:22leaving people behind is there the
- 00:06:24danger that uh you know AI furthers
- 00:06:27those Trends yes for sure I think that's
- 00:06:29something to think about but one of the
- 00:06:32things that surprised us very pleasantly
- 00:06:34on the upside uh cuz you know when you
- 00:06:36start building a technology you start
- 00:06:37doing research you you kind of say well
- 00:06:39we'll follow where the science leads us
- 00:06:40and when you put a product you'll say
- 00:06:42this is going to co-evolve with society
- 00:06:43and we'll follow where users lead us but
- 00:06:46it's not you get you get to steer it but
- 00:06:48only somewhat there's some which is just
- 00:06:50like this is what the technology can do
- 00:06:53this is how people want to use it and
- 00:06:55this is what it's capable of and this
- 00:06:57has been much more of a tool than I
- 00:06:59think we expected it is not yet and
- 00:07:02again in the future it'll it'll get
- 00:07:04better but it's not yet like replacing
- 00:07:06jobs in the way to the degree that
- 00:07:08people thought it was going to it is
- 00:07:10this incredible tool for productivity
- 00:07:13and you can see people magnifying what
- 00:07:14they can do um by a factor of two or
- 00:07:17five or in some way that doesn't even
- 00:07:19talk to makes sense to talk about a
- 00:07:20number because they just couldn't do the
- 00:07:21things at all before and that is I think
- 00:07:25quite exciting this this new vision of
- 00:07:28the future that we didn't really see
- 00:07:30when we started we kind of didn't know
- 00:07:31how it was going to go and very thankful
- 00:07:33the technology did go in this direction
- 00:07:35but where this is a tool that magnifies
- 00:07:37what humans do lets people do their jobs
- 00:07:39better lets the AI do parts of jobs and
- 00:07:42of course jobs will change and of course
- 00:07:43some jobs will totally go away but the
- 00:07:46human drives are so strong and the sort
- 00:07:48of way that Society works is so strong
- 00:07:50that I think and I can't believe I'm
- 00:07:52saying this because it would have
- 00:07:54sounded like an ungrammatical sentence
- 00:07:56to me at some point but I think AGI will
- 00:07:58get developed in the reasonably
- 00:08:00close-ish future and it'll change the
- 00:08:02world much less than we all think it'll
- 00:08:03change jobs much less than we all think
- 00:08:06and again that sounds I may be wrong
- 00:08:08again now but that wouldn't have even
- 00:08:10compiled for me as a sentence at some
- 00:08:11point given my conception then of how
- 00:08:13AGI was going to go as you've watched
- 00:08:15the technology develop have you both
- 00:08:17changed your views on how significant
- 00:08:19the job dislocation and disruption will
- 00:08:22be as AGI comes into Focus so this is
- 00:08:25actually an area that we know we have a
- 00:08:26policy research team that studies this
- 00:08:28and they've seen pretty significant
- 00:08:30impact in terms of changing the way
- 00:08:31people do jobs rather than job
- 00:08:33dislocation and I think that's actually
- 00:08:35going to accelerate and that it's going
- 00:08:36to change more people's jobs um but as
- 00:08:39Sam said so far it hasn't been a
- 00:08:41significant replacement of jobs you
- 00:08:44know you hear a coder say okay I'm like
- 00:08:46two times more productive three times
- 00:08:48more productive whatever than they used
- 00:08:49to be and I like can never code again
- 00:08:50without this tool you mostly hear that
- 00:08:52from the younger ones but
- 00:08:54um it turns out and I think this will be
- 00:08:57true for a lot of Industries the world
- 00:08:58just needs a lot more code than we have
- 00:09:00people to write right now and so it's
- 00:09:02not like we run out of demand it's that
- 00:09:04people can just do more expectations go
- 00:09:06up but ability goes up
too I want to ask you about
- 00:09:10another news report today that suggested
- 00:09:13that open AI was relaxing its
- 00:09:15restrictions around the use of AI in
- 00:09:18military projects and developing weapons
- 00:09:21can you say more about that and you what
- 00:09:24work are you doing with the US
- 00:09:25Department of Defense and other military
- 00:09:27agencies so a lot of these policies were
- 00:09:30written um before we even knew what
- 00:09:32these people would use our tools for so
- 00:09:34what this was not actually just the
- 00:09:37adjustment of the military use case
- 00:09:38policies but across the board to make it
- 00:09:40more clear so that people understand
- 00:09:42what is possible what is not possible
- 00:09:43but specifically on this um area we
- 00:09:47actually still prohibit the development
- 00:09:49of weapons um the destruction of
- 00:09:51property harm to individuals but for
- 00:09:53example we've been doing work with the
- 00:09:55Department of Defense on um cyber
- 00:09:57security tools for uh open source
- 00:09:59software that secures critical
- 00:10:01infrastructure we've been exploring
- 00:10:02whether it can assist with veteran
- 00:10:04suicide prevention and because we previously had
- 00:10:06what was essentially a blanket
- 00:10:07prohibition on military use many people felt
- 00:10:10like that would have prohibited any of
- 00:10:12these use cases which we think are very
- 00:10:13much aligned with what we want to see in
- 00:10:15the world has the US government asked
- 00:10:17you to restrict the level of cooperation
- 00:10:20with uh militaries in other
- 00:10:23countries um they haven't asked us but
- 00:10:25we certainly are not you know right for
- 00:10:28now actually our discussions are focused
- 00:10:29on um United States national security
- 00:10:32agencies and um you know I think we have
- 00:10:35always believed that democracies need to
- 00:10:37be in the lead on this technology uh Sam
- 00:10:39changing topics uh give us an update on
- 00:10:41the GPT store and are you seeing maybe
- 00:10:44probably explain it briefly and are you
- 00:10:45seeing the same kind of explosion of
- 00:10:47creativity we saw in the early days of
- 00:10:50the mobile app stores yeah the same
- 00:10:52level of creativity and the same level
- 00:10:53of crap but it I mean that happens in
- 00:10:56the early days as people like feel out a
- 00:10:57technology there's some incredible stuff
- 00:10:59in there too um give us an example the
- 00:11:01GPTs should I say what GPTs are first
- 00:11:04yeah sure um so GPTs are a way to do a
- 00:11:06very lightweight customization of
- 00:11:08ChatGPT and if you want it to behave in a
- 00:11:11particular way to use particular data to
- 00:11:13be able to call out to an external
- 00:11:14service um you can make this thing and
- 00:11:17you can do all sorts of like uh great
- 00:11:19stuff with it um and then we just
- 00:11:21recently launched a store where you can
- 00:11:23see what other people have built and you
- 00:11:24can share it and um I mean personally
- 00:11:27one that I have loved is AllTrails I
- 00:11:29have this like every other weekend I
- 00:11:31would like to like go for a long hike
- 00:11:33and there's always like the version of
- 00:11:34Netflix that other people have where
- 00:11:35it's like takes an hour to figure out
- 00:11:37what to watch it takes me like two hours
- 00:11:38to figure out what hike to do and the
- 00:11:40AllTrails thing to like say I want this
- 00:11:42I want that you know I've already done
- 00:11:44this one and like here's a great hike
- 00:11:46it's been I it's sounds silly but I love
- 00:11:48that one have you added any GPTs of your
- 00:11:50own have I made any yeah um I have not
- 00:11:53put any in the store maybe I will great
- 00:11:57um can you give us an update on the
- 00:11:58volume or or the pace at which you're
- 00:12:00seeing new GPTs um the number I know is
- 00:12:03that there had been 3 million created
- 00:12:04before we launched the store I have been
- 00:12:05in the middle of this trip around the
- 00:12:07world that has been quite hectic and I
- 00:12:08have not been doing my normal daily
- 00:12:10metrics tracking so I don't know how
- 00:12:12it's gone since launch but I'll tell you
- 00:12:13by the slowness of ChatGPT it's
- 00:12:15probably doing really
well um I want to ask you about
- 00:12:20OpenAI's copyright issues uh how important
- 00:12:22are publisher relations to OpenAI's
- 00:12:25business considering for example the
- 00:12:27lawsuit filed last month against OpenAI
- 00:12:29by the New York Times They are important
- 00:12:32but not for the reason people think um
- 00:12:34there is this belief held by some people
- 00:12:36that man you need all of my training
- 00:12:38data and my training data is so valuable
- 00:12:40and actually uh that is generally not
- 00:12:43the case we do not want to train on the
- 00:12:45New York Times data for example um and
- 00:12:48all more generally we're getting to a
- 00:12:49world where it's been like data data
- 00:12:51data you just need more you need more
- 00:12:53you need more you're going to run out of
- 00:12:54that at some point anyway so a lot of
- 00:12:55our research has been how can we learn
- 00:12:57more from smaller amounts of very high
- 00:12:59quality data and I think the world is
- 00:13:01going to figure that out what we want to
- 00:13:02do with Publishers if they want is when
- 00:13:05one of our users says what happened to
- 00:13:08Davos today be able to say here's an
- 00:13:10article from blueberg here's an article
- 00:13:11from New York Times and here you know
- 00:13:12here's like a little snippet or probably
- 00:13:14not a snippet there's probably some
- 00:13:15cooler thing that we can do with the
- 00:13:16technology and you know some people want
- 00:13:18to partner with us some people don't
- 00:13:20we've been striking a lot of great
- 00:13:21Partnerships and we have a lot more
- 00:13:23coming um and then you know some people
- 00:13:26don't want to uh we'd rather they
- 00:13:28just say we don't want to do that rather
- 00:13:30than Sue us but like we'll defend
- 00:13:32ourselves that's fine too I just heard
- 00:13:34you say you don't want to train on the
- 00:13:36New York Times does that mean given the
- 00:13:38the legal exposure you would have done
- 00:13:40things differently as you trained your
- 00:13:41model here's a tricky thing about that
- 00:13:43um people the web is a big thing and
- 00:13:46there are people who like copy from The
- 00:13:47New York Times and put an article
- 00:13:49without attribution up on some website
- 00:13:51and you don't know that's a New York
- 00:13:52Times article if the New York Times
- 00:13:54wants to give us a database of all their
- 00:13:55articles or someone else does and say
- 00:13:57hey don't put anything out that's like a
- 00:13:58match for this we can probably do a
- 00:14:00pretty good job and um solve we don't
- 00:14:03want to regurgitate someone else's
- 00:14:05content um but the problem is not as
- 00:14:07easy as it sounds in a vacuum I think we
- 00:14:10can get that number down and down and
- 00:14:11down have it be quite low and that seems
- 00:14:13like a super reasonable thing to
- 00:14:15evaluate us on you know if you have
- 00:14:17copyrighted content whether or not it
- 00:14:20got put into someone else's thing
- 00:14:22without our knowledge and you're willing
- 00:14:23to show us what it is and say don't
- 00:14:25don't put this stuff as a direct
- 00:14:26response we should be able to do that
- 00:14:29um again it won't like thousand you know
- 00:14:32monkeys thousand typewriters whatever it
- 00:14:33is once in a while the model will just
- 00:14:35generate something very close but on the
- 00:14:36whole we should be able to do a great
- 00:14:38job with this um so there's like there's
- 00:14:41all the negatives of this people like ah
- 00:14:43you know don't don't do this but the
- 00:14:44positives are I think there's going to
- 00:14:46be great new ways to consume and
- 00:14:49monetize news and other published
- 00:14:51content and for every one New York Times
- 00:14:54situation we have we have many more
- 00:14:56Super productive things about people
- 00:14:57that are excited to to build the future
- 00:14:59and not do their theatrics and and what
- 00:15:03and what about DALL·E I mean there have
- 00:15:05been artists who have been upset with
- 00:15:07DALL·E 2 DALL·E 3 what has that
- 00:15:09taught you and how will you do things
- 00:15:11differently we engage with the artist
- 00:15:12Community a lot and uh you know we we
- 00:15:15try to like do the requests so one is
- 00:15:16don't don't generate in my style um even
- 00:15:20if you're not training on my data super
- 00:15:22reasonable so we you know Implement
- 00:15:23things like that
- 00:15:25um you know let me opt out of training
- 00:15:27even if my images are all over the
- 00:15:28Internet and you don't know what they
- 00:15:29are what I'm and so there's a lot of
- 00:15:31other things too what I'm really excited
- 00:15:32to do and the technology isn't here yet
- 00:15:34but get to a point where rather than the
- 00:15:36artist say I don't want this thing for
- 00:15:38these reasons be able to deliver
- 00:15:40something where an artist can make a
- 00:15:41great version of DALL·E in their style
- 00:15:44sell access to that if they want don't
- 00:15:46if they don't want just use it for
- 00:15:47themselves uh or get some sort of
- 00:15:49economic benefit or otherwise when
- 00:15:52someone does use their stuff um and it's
- 00:15:54not just training on their images it
- 00:15:55really is like you know it really is
- 00:15:59about style uh and and that's that's the
- 00:16:02thing that at least in the artist
- 00:16:03conversations I've had that people are
- 00:16:05super interested in so for now it's like
- 00:16:07all right let's know what people don't
- 00:16:08want make sure that we respect that um
- 00:16:11of course you can't make everybody happy
- 00:16:12but try to like make the community feel
- 00:16:14like we're being a good partner um but
- 00:16:17what what I what I think will be better
- 00:16:18and more exciting is when we can do
- 00:16:20things that artists are like that's
awesome Anna you are OpenAI's ambassador
- 00:16:27to Washington and other capitals around the
- 00:16:30world I I'm curious what you've taken
- 00:16:32from your experience in Facebook what
- 00:16:34you've taken from the tense relations
- 00:16:37between a lot of tech companies and
- 00:16:39governments and Regulators over the past
- 00:16:42few decades and how you're putting that
- 00:16:43to use now at OpenAI I mean so I
- 00:16:46think one thing that I really learned
- 00:16:48working in government and of course I
- 00:16:49worked in the White House during the
- 00:16:512016 Russia election interference and
- 00:16:53people think that that was the first
- 00:16:54time we'd ever heard of it but it was
- 00:16:56something that we had actually been
- 00:16:57working on for years and thinking you
- 00:16:59know we know that this happens what do
- 00:17:01we do about it and one thing I never did
- 00:17:03during that period is go out and talk to
- 00:17:05the companies because it's not actually
- 00:17:07typical thing you do in government and
- 00:17:08was much more rare back then especially
- 00:17:11with you know these emerging tools and I
- 00:17:13thought about that a lot as I entered
- 00:17:15the tech space that I regretted that and
- 00:17:16that I wanted governments to be able to
- 00:17:18really understand the technology and how
- 00:17:19the decisions are made by these
- 00:17:21companies and also just honestly when I
- 00:17:23first joined OpenAI no one of course had
- 00:17:25heard of OpenAI in government for the
- 00:17:27most part
- 00:17:28and I thought every time I used it I
- 00:17:31thought my God if I'd had this for the 8
- 00:17:33years I was in the administration I
- 00:17:35could have gotten 10 times more done so
- 00:17:37for me it was really how do I get my
- 00:17:38colleagues to use it um especially with
- 00:17:40OpenAI's mission to make sure these
- 00:17:42tools benefit everyone I don't think
- 00:17:44that'll ever happen unless governments
- 00:17:45are incorporating it to serve citizens
- 00:17:47more efficiently and faster and so this
- 00:17:49is actually one of the things I've been
- 00:17:51most excited about is to just really get
- 00:17:53governments to use it for everyone's
- 00:17:55benefit I mean I'm hearing like a lot of
- 00:17:56sincerity in that pitch are Regulators
- 00:17:59receptive to it it feels like a lot are
- 00:18:01coming to the conversation probably with
- 00:18:04a good deal of skepticism because of
- 00:18:06past interactions with Silicon Valley I
- 00:18:08think I mostly don't even really get to
- 00:18:09talk about it because for the most part
- 00:18:11people are interested in governance and
- 00:18:12Regulation and I think that they know um
- 00:18:16theoretically that there is a lot of
- 00:18:17benefit but many governments
- 00:18:19are not quite ready to incorporate I
- 00:18:20mean there are exceptions obviously
- 00:18:22people who are really at the Forefront
- 00:18:24so it's not you know I think often I
- 00:18:26just don't even really get to that
- 00:18:27conversation
- 00:18:29so I want to ask you both about the
- 00:18:31dramatic turn of events in uh November
- 00:18:34Sam one day the window on these
- 00:18:36questions will close um that is not you
- 00:18:39think they
- 00:18:40will I think at some point they probably
- 00:18:43will but it hasn't happened yet so it
- 00:18:45doesn't doesn't matter um I guess my
- 00:18:47question is is you know have you
- 00:18:50addressed the governance
- 00:18:52issues the very unique uh corporate
- 00:18:56structure at open AI with the nonprofit
- 00:18:58board and the capped-profit arm that led to
- 00:19:02your ouster we're going to focus first on
- 00:19:05putting a great full board in place um I
- 00:19:08expect us to make a lot of progress on
- 00:19:09that in the coming months uh and then
- 00:19:11after that the new board uh will take a
- 00:19:13look at the governance structure but I
- 00:19:15think we debated both what does that
- 00:19:17mean is it should open AI be a
- 00:19:19traditional Silicon Valley for-profit
- 00:19:21company we'll never be a traditional
- 00:19:23company but the structure I I think we
- 00:19:25should take a look at the structure
- 00:19:27maybe the answer we have now is right
- 00:19:28but I think we should be willing to
- 00:19:30consider other things but I think this
- 00:19:32is not the time for it and the focus on
- 00:19:33the board first and then we'll go look
- 00:19:35at it from all angles I mean presumably
- 00:19:37you have investors including Microsoft
- 00:19:40including uh your Venture Capital
- 00:19:42supporters um your employees who uh over
- 00:19:46the long term are seeking a return on
- 00:19:48their investment um I think one of the
- 00:19:51things that's difficult to express about
- 00:19:54OpenAI is the degree to which our team
- 00:19:57and the people around us investors
- 00:19:58Microsoft whatever are committed to this
- 00:20:01Mission um in the middle of that crazy
- 00:20:04few days uh at one point I think like 97
- 00:20:09something like that 98% of the company
- 00:20:11signed uh a letter saying you know we're
- 00:20:14all going to resign and go to something
- 00:20:16else and that would have torched
- 00:20:18everyone's equity and for a lot of our
- 00:20:20employees like this is all or the great
- 00:20:22majority of their wealth and people
- 00:20:24being willing to go do that I think is
- 00:20:27quite unusual our investors who also
- 00:20:29were about to like watch their Stakes go
- 00:20:31to zero were just like how can we
- 00:20:33support you and whatever is best for
- 00:20:35the mission Microsoft too um I feel very
- 00:20:37very fortunate about that uh of course
- 00:20:40also would like to make all of our
- 00:20:42shareholders a bunch of money but it was
- 00:20:44very clear to me what people's
- 00:20:45priorities were and uh that meant a lot
- 00:20:47I I I sort of smiled because you came to
- 00:20:49the Bloomberg Tech Conference in last
- 00:20:51June and Emily Chang asked uh it was
- 00:20:54something along the lines of why
- 00:20:56should we trust you and you very
- 00:20:58candidly said you shouldn't and you said
- 00:21:00the board should be able to fire me if
- 00:21:02if they want and of course then they did
- 00:21:05and you quite uh adeptly orchestrated
- 00:21:08your return actually let me tell you
- 00:21:09something um I the board did that I was
- 00:21:12like I think this is wild super confused
- 00:21:16super caught off guard but this is the
- 00:21:17structure and I immediately just went to
- 00:21:19go thinking about what I was going to do
- 00:21:20next it was not until some board members
- 00:21:22called me the next morning that I even
- 00:21:24thought about really coming back um when
- 00:21:27they asked don't you want to
- 00:21:28come back uh you want to talk about that
- 00:21:31but like the board did have all of the
- 00:21:33Power there now you know what I'm not
- 00:21:36going to say that next thing but I I I
- 00:21:39think you should continue I think I no I
- 00:21:42would I would also just say that I think
- 00:21:44that there's a lot of narratives out
- 00:21:45there it's like oh well this was
- 00:21:46orchestrated by all these other forces
- 00:21:48it's not accurate I mean it was the
- 00:21:50employees of open AI that wanted this
- 00:21:54and that thought that it was the right
- 00:21:55thing for Sam to be back the you know
- 00:21:57like yeah the thing I will say is uh I
- 00:21:59think it's important that I have an
- 00:22:01entity that like can fire me but that
- 00:22:04entity has got to have some
- 00:22:05accountability too and that is a clear
- 00:22:08issue with what happened right Anna you
- 00:22:11wrote a remarkable letter to employees
- 00:22:13during The Saga and one of the many
- 00:22:15reasons I was excited to to have you on
- 00:22:17stage today was just to ask you what
- 00:22:20were those five days like for you and
- 00:22:22why did you step up and write that uh
- 00:22:25Anna can clearly answer this if she
- 00:22:26wants to but like is really what you
- 00:22:28want to spend our time on like the soap
- 00:22:30opera rather than like what AI is going
- 00:22:32to do I mean I'm wrapping it up but but
- 00:22:35um I mean go I think people are
- 00:22:36interested okay well we can leave it
- 00:22:38here if you want no no yeah let's let's
- 00:22:40answer that question and we'll we'll we
- 00:22:42can move on I would just say uh for
- 00:22:45color that it happened the day before
- 00:22:46the entire company was supposed to take
- 00:22:48a week off so we were all on Friday uh
- 00:22:50preparing to you know have a restful
- 00:22:52week after an insane year so then you
- 00:22:54know many of us slept on the floor of
- 00:22:56the office for a week right there's a
- 00:22:58question here that I think is a a really
- 00:23:00good one we are at Davos climate change
- 00:23:03is on the agenda um the question is does
- 00:23:06do well I'm going to give it a different
- 00:23:08spin considering the compute costs and
- 00:23:12the the need for chips does the
- 00:23:14development of AI and the path to AGI
- 00:23:16threaten to take us in the opposite
- 00:23:19direction on the climate
- 00:23:23um we do need way more energy in the
- 00:23:27world than I think we thought we needed
- 00:23:29before my my whole model of the world is
- 00:23:32that the two important currencies of the
- 00:23:34future are compute/intelligence and
- 00:23:37energy um you know the ideas that we
- 00:23:40want and the ability to make stuff
- 00:23:42happen and uh the ability to like run
- 00:23:44the compute and I think we still don't
- 00:23:47appreciate the energy needs of this
- 00:23:50technology um the good news to the
- 00:23:53degree there's good news is there's no
- 00:23:55way to get there without a breakthrough
- 00:23:57we need Fusion or we need like radically
- 00:23:59cheaper solar Plus Storage or something
- 00:24:02at massive scale like a scale that no
- 00:24:04one is really planning for um so
- 00:24:08we it's totally fair to say that AI is
- 00:24:11going to need a lot of energy but it
- 00:24:13will force us I think to invest more in
- 00:24:16the technologies that can deliver this
- 00:24:18none of which are the ones that are
- 00:24:19burning the carbon like all
- 00:24:21those unbelievable numbers of fuel
- 00:24:23trucks and by the way you back one or
- 00:24:25more nuclear yeah I I personally think
- 00:24:29that
- 00:24:31is either the most likely or the second
- 00:24:33most likely approach feel like the world
- 00:24:36is more receptive to that technology now
- 00:24:38certainly historically not in the US um
- 00:24:40I think the world is still
- 00:24:43unfortunately pretty negative on fission
- 00:24:46super positive on Fusion it's a much
- 00:24:48easier story um but I wish the world
- 00:24:51would embrace fission much more I look I
- 00:24:55I may be too optimistic about this but I
- 00:24:56think
- 00:24:58I I think we have paths now to
- 00:25:02massive a massive energy transition away
- 00:25:05from burning carbon it'll take a while
- 00:25:07those cars are going to keep driving
- 00:25:08there you know there's all the transport
- 00:25:10stuff it'll be a while till there's like
- 00:25:12a fusion reactor in every cargo ship um
- 00:25:14but if if we can drop the cost of energy
- 00:25:16as dramatically as I hope we can then
- 00:25:19the math on carbon captur just so
- 00:25:22changes uh I still expect unfortunately
- 00:25:26the world is on a path where we're going
- 00:25:27to have to do something dramatic with
- 00:25:29climate like geoengineering as
- 00:25:32a Band-Aid as a stopgap but I
- 00:25:34think we do now see a path to the
- 00:25:36long-term solution so I I want to just
- 00:25:38go back to my question in terms of
- 00:25:40moving in the opposite direction it
- 00:25:42sounds like the answer is potentially
- 00:25:44yes on the demand side unless we take
- 00:25:49drastic action on the supply side but
- 00:25:51there is no I see no way to
- 00:25:54supply this to manage the supply
- 00:25:56side without
- 00:25:58a really big breakthrough right which is
- 00:26:01this is does this frighten you guys
- 00:26:03because um you know the world hasn't
- 00:26:05been that versatile when it comes to
- 00:26:08supply but AI as you know you have
- 00:26:10pointed out is not going to take its
- 00:26:12time until we start generating enough
- 00:26:14power it motivates us to go invest more
- 00:26:16in fusion and invest more in new
- 00:26:18storage and not only the technology
- 00:26:21but what it's going to take to deliver
- 00:26:23this at the scale that AI needs and that
- 00:26:26the whole globe needs so I think it
- 00:26:28would be not helpful for us to just sit
- 00:26:30there and be nervous um we're just like
- 00:26:32hey we see what's coming with very high
- 00:26:34conviction it's coming how can we use
- 00:26:37our
- 00:26:38abilities uh our Capital our whatever
- 00:26:40else to do this and in the process of
- 00:26:42that hopefully deliver a solution for
- 00:26:44the rest of the world not just AI
- 00:26:46training workloads or inference
- 00:26:47workloads Anna it felt like in 2023 we
- 00:26:50had the beginning of a almost
- 00:26:53hypothetical conversation about
- 00:26:54regulating AI what what should we expect
- 00:26:58in 2024 and you know does it do do do
- 00:27:02governments act does it does it become
- 00:27:04real and what is what is AI safety look
- 00:27:06like so I think it is becoming real
- 00:27:09you know the EU is uh on the cusp of
- 00:27:12actually finalizing this regulation
- 00:27:14which is going to be quite extensive and
- 00:27:16the Biden Administration uh wrote the
- 00:27:18longest executive order I think in the
- 00:27:20history of executive orders uh covering
- 00:27:22this technology and is being implemented
- 00:27:24in 2024 because they gave agencies you
- 00:27:26know a bunch of homework for how to
- 00:27:29implement this and govern this
- 00:27:30technology and and it's happening so I
- 00:27:32think it is really moving forward um but
- 00:27:35what exactly safety looks like of what
- 00:27:37it even is I think this is still a
- 00:27:38conversation we haven't bottomed out on
- 00:27:41you know we founded this Frontier Model
- 00:27:42Forum in part yeah maybe explain what
- 00:27:44that is so this is um for now this is um
- 00:27:47Microsoft OpenAI Anthropic and um Google
- 00:27:50but it will I think expand to other
- 00:27:52Frontier Labs but really right now all
- 00:27:55of us are working on safety we all red
- 00:27:57team our models um we all do a lot of this
- 00:28:00work but we really don't have even a
- 00:28:01common vocabulary um or a standardized
- 00:28:04approach and to the extent that people
- 00:28:06think like well this is just industry
- 00:28:08but uh this is in part in response to
- 00:28:10many governments that have asked us for
- 00:28:12this very thing so like what is it
- 00:28:14across industry that you think are
- 00:28:16viable best practices is there a risk
- 00:28:20that regulation starts to discourage
- 00:28:23entrepreneurial activity in in AI I mean
- 00:28:26I think people are terrified of this um
- 00:28:28this is why I think Germany and France
- 00:28:30and Italy interjected into the EU um
- 00:28:34AI act discussion because they are
- 00:28:36really concerned about their own
- 00:28:37domestic Industries being sort of
- 00:28:39undercut before they've even had a
- 00:28:41chance to develop were you satisfied
- 00:28:44with your old boss's executive order and
- 00:28:46was was there anything in there that uh
- 00:28:48you had lobbied against no and in fact
- 00:28:52you know I think it's it was really good
- 00:28:54in that it wasn't just these are the
- 00:28:56restrictions it's like and then also
- 00:28:58please go and think about how your
- 00:29:00agency will actually leverage this to do
- 00:29:02your work better so I was really
- 00:29:04encouraged that they actually did have a
- 00:29:07balanced
- 00:29:08approach um Sam first time at Davos
- 00:29:11first time okay is um uh you mentioned
- 00:29:15that uh you'd prefer to spend more of our
- 00:29:16time here on stage talking about AGI
- 00:29:19what is the message you're bringing to
- 00:29:21political leaders and other Business
- 00:29:22Leaders here if you could distill it
- 00:29:24thank you
- 00:29:26um
- 00:29:28so I think 2023 was a year where the
- 00:29:31world woke up to the possibility of
- 00:29:34these systems becoming increasingly
- 00:29:36capable and increasingly General but
- 00:29:39GPT-4 I think is best understood as a
- 00:29:42preview and it was more over the bar
- 00:29:46than we expected of utility for more
- 00:29:48people in more ways but you know it's
- 00:29:51easy to point out the limitations and
- 00:29:53again we're thrilled that people love it
- 00:29:55and use it as much as they do but
- 00:29:58the progress here is not linear and this
- 00:30:01is the thing that I think is really
- 00:30:03tricky humans have horrible intuition
- 00:30:06for exponentials at least speaking for
- 00:30:07myself but it seems like a common part
- 00:30:09of the human condition um what does it
- 00:30:12mean if GPT-5 is as much better than GPT-4
- 00:30:15as four was to three and six is to five
- 00:30:17and what does it mean if we're just on
- 00:30:18this trajectory now um what you know on
- 00:30:23the question of Regulation I think it's
- 00:30:24great that different countries are going
- 00:30:25to try different things some countries
- 00:30:27will probably ban AI some countries will
- 00:30:29probably say no guard rails at all both
- 00:30:31of those I think will turn out to be
- 00:30:32suboptimal and we'll get to see
- 00:30:34different things work but as these
- 00:30:36systems become more powerful um as they
- 00:30:41as they become more deeply integrated
- 00:30:42into the economy as they become
- 00:30:43something we all used to do our work and
- 00:30:45then as things beyond that happen as
- 00:30:47they become capable of discovering new
- 00:30:50scientific knowledge for
- 00:30:52Humanity even as they become capable of
- 00:30:54doing AI research at some point um the
- 00:30:57world is going
- 00:30:58to change more slowly and then more
- 00:31:01quickly than we might imagine but
- 00:31:03the world is going to change um this is
- 00:31:06you know a thing I I always say to
- 00:31:08people is no one knows what happens next
- 00:31:09and I really believe that and I think
- 00:31:10keeping the humility about that is
- 00:31:12really important you can see a few steps
- 00:31:14in front of you but not too many
- 00:31:17um but when the cost
- 00:31:20of cognition falls by a factor of a
- 00:31:23thousand or a million when the
- 00:31:24capability of it becomes uh it augments
- 00:31:28Us in ways we can't even imagine you
- 00:31:30know uh like one example I I try to give
- 00:31:33to people is what if everybody in the
- 00:31:35world had a really competent company of
- 00:31:3810,000 great virtual employees experts
- 00:31:41in every area they never fought with
- 00:31:42each other they didn't need to rest they
- 00:31:45got really smart they got smarter at
- 00:31:46this rapid Pace what would we be able to
- 00:31:48create for each other what would that do
- 00:31:50to the world that we experience and the
- 00:31:52answer is none of us know of course and
- 00:31:55none of us have strong intuitions for
- 00:31:56that I can imagine it sort of but it's
- 00:31:59not like a clear picture um and this is
- 00:32:03going to happen uh it doesn't mean we
- 00:32:06don't get to steer it it doesn't mean we
- 00:32:07don't get to work really hard to make it
- 00:32:09safe and to do it in a responsible way
- 00:32:11but we are going to go to the Future and
- 00:32:13I think the best way to get there in a
- 00:32:15way that works
- 00:32:17is the level of Engagement we now have
- 00:32:20part of the reason a big part of the
- 00:32:21reason we believe in iterative
- 00:32:23deployment of our technology is that
- 00:32:25people need time to gradually get used
- 00:32:28to it to understand it we need time to
- 00:32:30make mistakes while the stakes are low
- 00:32:32governments need time to make some
- 00:32:33policy mistakes and also technology and
- 00:32:36Society have to co-evolve in a case like
- 00:32:39this uh so technology is going to change
- 00:32:41with each iteration but so is the way
- 00:32:43Society works and that's got to be this
- 00:32:45interactive iterative process um and we
- 00:32:48need to embrace it but have caution
- 00:32:51without fear and how long do we have for
- 00:32:53this iterative process to play out I think
- 00:32:56it's surprisingly continuous I don't
- 00:32:58like if I try to think about
- 00:33:00discontinuities I can sort of see one
- 00:33:02when AI can do really good AI research
- 00:33:05um and I can see a few others too but
- 00:33:07that's like an evocative example um but
- 00:33:09on the whole I don't think it's about
- 00:33:12like Crossing this one line I think it's
- 00:33:14about this continuous exponential curve
- 00:33:17we climb together and so how long do we
- 00:33:19have like no time at all and
- 00:33:24infinite I saw GPT-5 trending on X
- 00:33:28earlier this week and I clicked and I
- 00:33:30you know couldn't I it sounded uh you
- 00:33:33know probably misinformed but what what
- 00:33:35can you tell us about GPT-5 and is it an
- 00:33:40exponential uh you know improvement over
- 00:33:43what we've seen look I don't know what
- 00:33:44we're going to call our next model um I
- 00:33:45don't know when are you going to get
- 00:33:46creative with the uh the naming process
- 00:33:49uh I don't want to be like shipping
- 00:33:52iPhone
- 00:33:5327 um so you know it's not my style
- 00:33:57quite uh but I I think the next model we
- 00:34:02release uh I expect it to be very
- 00:34:04impressive to do new things that were
- 00:34:06not possible with GPT-4 to do a lot of
- 00:34:08things better and I expect us to like
- 00:34:10take our time and make sure we can
- 00:34:11launch something that we feel good about
- 00:34:14and responsible about within open AI
- 00:34:16some employees consider themselves to be
- 00:34:20quote building God is that true I haven't
- 00:34:23heard that okay is um I mean I've heard
- 00:34:27like people say that facetiously but uh I
- 00:34:31think almost all employees would say
- 00:34:34they're building a tool more so than
- 00:34:36they thought they were going to be which
- 00:34:38they're thrilled about you know this
- 00:34:39confusion in the industry of Are We
- 00:34:41building a creature are we building a
- 00:34:42tool um I think we're much more building
- 00:34:45a tool and that's much
- 00:34:46better uh to transition to something
- 00:34:49yeah go ahead no no no no you finish your
- 00:34:51thought oh I was just going to say like
- 00:34:54we think of ourselves as tool
- 00:34:57Builders um AI is much more of a tool
- 00:35:01than a product and much much more of a
- 00:35:03tool than this like entity and uh one of
- 00:35:08the most wonderful things about last
- 00:35:10year was seeing just how much people
- 00:35:12around the world could do with that tool
- 00:35:14and they astonished us and I think we'll
- 00:35:16just see more and more and human
- 00:35:17creativity uh and ability to like do
- 00:35:21more with better tools is remarkable and
- 00:35:23and before we have to start wrapping up
- 00:35:25you know there was a report that you
- 00:35:26were working with Jony Ive on an AI
- 00:35:29powered device either within open AI
- 00:35:32perhaps as a separate company you know I
- 00:35:34bring it up because CES was earlier this
- 00:35:37month and AI powered devices were the
- 00:35:39the talk of of the conference you know
- 00:35:42can you give us an update on that and
- 00:35:44are we approach does AI bring us to the
- 00:35:46beginning of the end of the smartphone
- 00:35:48era smartphones are fantastic I don't
- 00:35:51think smartphones are going anywhere uh
- 00:35:53I think what they do they do really
- 00:35:54really well and they're very general if
- 00:35:56if there is a new thing to make uh I
- 00:35:59don't think it replaces a smartphone in
- 00:36:01the way that I don't think smartphones
- 00:36:02replace computers but if there's a new
- 00:36:04thing to make that helps us do more
- 00:36:06better you know in a in a new way given
- 00:36:08that we have this unbelievable change
- 00:36:11like I don't think we
- 00:36:13spend enough time marveling
- 00:36:14at the fact that we can now talk to
- 00:36:16computers and they understand us and do
- 00:36:18stuff for us like it is a new affordance
- 00:36:20a new way to use a computer and if we
- 00:36:22can do something great there uh a new
- 00:36:25kind of computer we should do that and
- 00:36:27if it turns out that the smartphone's
- 00:36:28really good and this is all software
- 00:36:29then fine but I bet there is something
- 00:36:32great to be done and um the partnership
- 00:36:35with Jony Ive is that an OpenAI effort is
- 00:36:38that another company I have not heard
- 00:36:39anything official about a partnership
- 00:36:41with
- 00:36:42Jony okay um Anna I'm going to give
- 00:36:46you the last word as you and Sam meet
- 00:36:48with business and world leaders here at
- 00:36:50Davos what's the message you want to
- 00:36:52leave them
- 00:36:54with um I think that there is a
- 00:36:58trend where people feel more fear than
- 00:37:01excitement about this technology and I
- 00:37:03understand that we have to work very
- 00:37:04hard to make sure that the best version
- 00:37:06of this technology is realized but I do
- 00:37:08think that many people are engaging with
- 00:37:11this via the leaders here and that they
- 00:37:13really have a responsibility to make
- 00:37:15sure that um they are sending a balanced
- 00:37:17message so that um people can really
- 00:37:20actually engage with it and realize the
- 00:37:22benefit of this technology can I have 20
- 00:37:24seconds absolutely one one of the things
- 00:37:26that I think OpenAI has not always done
- 00:37:28right in the field hasn't either is find
- 00:37:30a way to build these tools in a way uh
- 00:37:33and also talk about them that don't
- 00:37:36don't get that kind of response I think
- 00:37:38ChatGPT one of the best things it did
- 00:37:39is it shifted the conversation to the
- 00:37:41positive not because we said trust us
- 00:37:43it'll be great but because people used
- 00:37:44it and are like oh I get this I use this
- 00:37:47in a very natural way the smartphone was
- 00:37:48cool cuz I didn't even have to use a
- 00:37:49keyboard and phone I could use it more
- 00:37:51naturally talking is even more natural
- 00:37:53um speaking of Jony Jony is a genius
- 00:37:55and one of the things that I think he
- 00:37:57has done again and again about computers
- 00:37:59is figuring out a way to make them very
- 00:38:03human compatible and I think that's
- 00:38:05super important with this technology
- 00:38:07making this feel like uh you know not
- 00:38:10this mystical thing from sci-fi not this
- 00:38:11scary thing from sci-fi but this this
- 00:38:14new way to use a computer that you love
- 00:38:16and that really feels like I still
- 00:38:18remember the first iMac I got and what
- 00:38:21that felt like to me
- 00:38:23relative it was heavy but the fact that
- 00:38:25it had that handle even though as
- 00:38:26a kid it was very heavy to carry um
- 00:38:29it did mean that I was like I had a
- 00:38:31different relationship with it because
- 00:38:32of that handle and because of the way it
- 00:38:34looked I was like oh I can move this
- 00:38:36thing around I could unplug it and throw
- 00:38:38it out the window if it tried to like
- 00:38:39wake up and take over that's nice um and
- 00:38:42I think the way we design our technology
- 00:38:44and our products really does matter
- AI guidelines
- Elections
- AI in politics
- ChatGPT ban
- Cryptographic watermarks
- OpenAI
- AI enforcement
- Job displacement
- Publisher relations
- AI regulation