Yuval Noah Harari: Humanity, It's Not That Simple
Summary
TLDR: In this episode, recorded live, Yuval Noah Harari discusses the challenges facing humanity in 2023, covering questions about AI, technology, and growing social division. He emphasizes AI's ability to make its own decisions and generate content, which could profoundly affect social interactions and politics. Harari warns that people must collaborate to confront crises such as ecological collapse and technological disruption. He also voices concerns about education and about how technology shapes our relationships. The video ends with Harari reflecting on the need for self-observation and for understanding reality beyond the stories we tell ourselves.
Takeaways
- 🌍 Humanity holds great power to create and to destroy.
- ⚠️ Ecological and technological challenges lie ahead.
- 🤖 AI can make decisions that affect society.
- 💡 AI is at an early stage of development.
- 📚 Education may change drastically with AI.
- 🌐 We need collaboration to face global crises.
- ⌛ Adaptability is essential in a rapidly changing world.
- 🌀 The stories we tell shape our perception of reality.
Timeline
- 00:00:00 - 00:05:00
This episode marks the show's first-ever live recording, featuring renowned author and historian Yuval Noah Harari. The discussion focuses on humanity in 2023, highlighting the challenges it faces, such as ecological collapse and technological disruption, and the growing division between societies.
- 00:05:00 - 00:10:00
Harari addresses the rapid evolution of artificial intelligence (AI), comparing its early development to that of simple organisms. He notes that AI, currently at a primitive stage, has the potential to evolve quickly, affecting humanity's future and raising questions about its capacity to make decisions and create new ideas.
- 00:10:00 - 00:15:00
AI is discussed as a unique technology that can make decisions autonomously, unlike earlier technologies. Harari warns that, as AI evolves, it is essential that society reflect on its responsibilities and on the risks this evolution carries.
- 00:15:00 - 00:20:00
The speaker notes the need to regulate the deployment of AI, comparing it to other products that require safety checks before release. He argues that regulation should focus on safety and prevent dangerous technologies from being released without oversight.
- 00:20:00 - 00:25:00
Amid the discussion of misinformation and polarization on social media, Harari highlights how AI could alter the democratic conversation. He points to the danger of bots that pose as humans and undermine trust in public discussion.
- 00:25:00 - 00:30:00
On education, Harari expresses concern about AI's impact on the critical-thinking skills of future generations, asking what it means to learn in an era when answers are readily available.
- 00:30:00 - 00:35:00
He reflects on human nature and on the social challenges of the technological age. Human connection is becoming more distant, with repercussions for our personal and professional relationships.
- 00:35:00 - 00:43:36
Finally, Harari discusses the future of work and the need for adaptability, emphasizing how important it is for people to develop the capacity to continually reinvent themselves amid rapid changes in the job market.
Video Q&A
What is the main challenge humanity is facing today?
Humanity's main difficulty is its inability to cooperate and its growing social division.
How is AI affecting society?
AI presents both opportunities and risks, such as the potential to transform work and create new ideas, but it can also threaten democracy.
What is Harari's view on the future of education?
Harari believes education will undergo major transformations due to the speed of AI, but the long-term effects are not yet known.
Why is rest important in a context of rapid change?
According to Harari, we need pauses and rest, because technology's accelerating pace can be stressful and unsustainable for humans.
How does technology affect personal relationships?
While it makes connection easier, technology can also lead to physical and emotional disconnection between people.
How should we regulate AI?
Harari suggests regulating the deployment of AI to ensure safety and prevent dangerous manipulation.
What is the relationship between AI and human decisions?
AI is the first technology that can make decisions on its own, which changes the dynamics of power and control.
Does Harari think the stories people create are good or bad?
He suggests we should observe how our stories can distort our perception of reality and of social interactions.
- 00:00:03[Music]
- 00:00:32hello and welcome to a very special
- 00:00:33edition of it's not that simple special
- 00:00:36because it's the first one ever recorded
- 00:00:39in front of a live audience
- 00:00:41here at this in Lisbon and also because
- 00:00:44we're delighted to have our special
- 00:00:47guest renowned author historian as well
- 00:00:50Yuval Noah Harari it'll be fantastic to
- 00:00:54be able to discuss with him his
- 00:00:57perspective on Humanity which is our
- 00:00:59topic of the day of course he is
- 00:01:01globally known for a variety of work in
- 00:01:05this area about the past present and
- 00:01:07future of humanity and we're going to
- 00:01:09try to explain why it's not that simple
- 00:01:13um Yuval welcome to Portugal thank you
- 00:01:16and the premise of this show is trying
- 00:01:19to break down complicated subjects
- 00:01:21today's topic is particularly
- 00:01:24complicated to set the stage I think
- 00:01:26it'd be nice for you to try to paint a
- 00:01:29picture of humanity in 2023
- 00:01:32and tell us what you see and why it's
- 00:01:36difficult perhaps to preview the future
- 00:01:39at this time
- 00:01:42well we are now almost like gods in
- 00:01:46terms of our powers of creation and
- 00:01:48destruction
- 00:01:49we now have the power to create new life
- 00:01:52forms but also to destroy much of life
- 00:01:56on Earth including ourselves
- 00:01:58we are facing
- 00:02:00two really big challenges on the one
- 00:02:04hand the threat of ecological collapse
- 00:02:08on the other hand the threat of
- 00:02:10technological disruption we are creating
- 00:02:13extremely powerful tools like AI that
- 00:02:16could uh undermine human civilization
- 00:02:20and maybe the worst thing is that
- 00:02:23instead of uniting in order to face
- 00:02:26these common challenges to our species
- 00:02:29we are dividing we are fighting each
- 00:02:33other more and more the rising tensions
- 00:02:37on the international level there are
- 00:02:40rising tensions within societies uh one
- 00:02:44Society after the other is really on the
- 00:02:46brink of collapse
- 00:02:48so you know maybe the most important
- 00:02:50question to ask about humans about
- 00:02:52humanity is if we are so smart why are
- 00:02:57we doing so many stupid things
- 00:03:00why are we called our species homo
- 00:03:03sapiens
- 00:03:04wise humans and yet we are engaged in so
- 00:03:09many self-destructive activities
- 00:03:12and we seem unable to stop ourselves
- 00:03:16so that that I think is the Paradox of
- 00:03:19of the the the smart humans the wise
- 00:03:21humans
- 00:03:22it's a great place to start and you know
- 00:03:24when I was introducing you obviously I
- 00:03:26was very short because I want to make
- 00:03:28the most out of these 40 minutes that I
- 00:03:30have with you I could list your
- 00:03:31accolades and your books and your
- 00:03:33achievements better yeah yeah but let's
- 00:03:34focus on the on on the content and um
- 00:03:38obviously uh I've I've heard a lot of
- 00:03:40your of your interviews of your talks
- 00:03:43and there's something that we can't get
- 00:03:45around at the moment in 2023 which is
- 00:03:47technology the advance of Technology you
- 00:03:50know when I was growing up and we're
- 00:03:51more or less the same age I'm 48. I
- 00:03:53still remember that pong game where you
- 00:03:55had basically two white rectangles going
- 00:03:59back and forth with the ball bouncing
- 00:04:01off the the screen and it's crazy where
- 00:04:04we are now you know I have a two and a
- 00:04:05half year old daughter and this new
- 00:04:07frontier of artificial intelligence is
- 00:04:11incredibly dangerous I think because I'm
- 00:04:14really wondering what her generation is
- 00:04:16going to be like how they're going to
- 00:04:18learn how they're going to do anything
- 00:04:19for themselves
- 00:04:21um
- 00:04:22tell us about AI in your words the the
- 00:04:25opportunities and the challenges that
- 00:04:27you see
- 00:04:28so we need to know three things about AI
- 00:04:31first of all AI is still just a tiny
- 00:04:35baby we haven't seen anything yet
- 00:04:39um real AI deployed into the world not
- 00:04:42in some laboratory or in science fiction
- 00:04:44is about like 10 years old
- 00:04:47and you know you look at the wonderful
- 00:04:50scenery outside of of all these plants
- 00:04:53and trees and you think about biological
- 00:04:55evolution
- 00:04:57the evolution of life on Earth took
- 00:05:00something like 4 billion years four
- 00:05:02billion years to reach these plants and
- 00:05:05to reach us human beings
- 00:05:08now AI is at the stage like of I
- 00:05:12don't know amoebas
- 00:05:13it's like four billion years ago and the
- 00:05:16first living organisms are crawling out
- 00:05:19of the organic soup
- 00:05:22and
- 00:05:24so ChatGPT and all these wonders they
- 00:05:26are the amoebas of the AI world what
- 00:05:30would T-Rex look like
- 00:05:33and how long would it take for the AI
- 00:05:37amoebas to evolve into the T-Rexes
- 00:05:41and it won't take billions of years
- 00:05:44maybe it takes just a few decades or a
- 00:05:47few years
- 00:05:48because
- 00:05:49the evolution of AI is at a completely
- 00:05:53different time scale than the evolution
- 00:05:55of organic beings because AI itself
- 00:05:58works on a different time scale
- 00:06:01AI is always on
- 00:06:03and computers in general are
- 00:06:06always on
- 00:06:08humans and other organic organisms they
- 00:06:11they live they exist they develop by
- 00:06:13Cycles
- 00:06:15we need to rest sometimes AI never
- 00:06:18needs to rest
- 00:06:20now the other two things we need to know
- 00:06:22about AI
- 00:06:24is that first it's the first technology
- 00:06:26ever that can make decisions by itself
- 00:06:29I hear a lot of people saying oh all
- 00:06:31these worries about AI every time there
- 00:06:33is a new technology people worry about
- 00:06:35it and afterwards it's okay like when
- 00:06:37people invented writing and printing
- 00:06:40presses and airplanes they were so
- 00:06:42worried and in the end it was okay AI
- 00:06:44will be the same it's not the same
- 00:06:47no previous technology in history could
- 00:06:49make decisions
- 00:06:51you know even an atom bomb
- 00:06:53actually empowered humans because an
- 00:06:56atom bomb can destroy a city it cannot
- 00:06:59decide which city to bomb you always
- 00:07:02need a human to make the decision AI is
- 00:07:05the first technology that can make
- 00:07:06decisions by itself even about us
- 00:07:09increasingly we apply to a bank to get a
- 00:07:12loan it's an AI making the decisions
- 00:07:15about us so it takes power away from us
- 00:07:18the third thing about AI that everybody
- 00:07:21needs to know
- 00:07:23it's the first technology ever
- 00:07:25that can create new ideas
- 00:07:28you know the printing press radio
- 00:07:30television they broadcast they spread
- 00:07:34the ideas created by the human brain by
- 00:07:38the human mind they cannot create a new
- 00:07:40idea
- 00:07:41you know Gutenberg printed the Bible in
- 00:07:44the middle of the 15th century the the
- 00:07:47printing press printed as many copies of
- 00:07:50the Bible as Gutenberg instructed it but
- 00:07:52it did not create a single new page
- 00:07:56it had no ideas of its own about the
- 00:07:59Bible is it good is it bad how to
- 00:08:01interpret this how to interpret that
- 00:08:04AI can create new ideas can even write a
- 00:08:08new Bible we you know throughout history
- 00:08:11religions dreamed about having a book
- 00:08:15written by a superhuman intelligence by
- 00:08:18a non-human entity or every religion
- 00:08:21claims our book all the art books of the
- 00:08:23other religions they humans wrote them
- 00:08:25but our book no no no no it came from
- 00:08:27some superhuman intelligence in a few
- 00:08:30years there might be religions that are
- 00:08:33actually correct
- 00:08:35that just think about a religion whose
- 00:08:38holy book is written by an AI that could
- 00:08:41be a reality in a few years
- 00:08:45you know
- 00:08:46um when I Was preparing this interview I
- 00:08:49wrote down questions that I would like
- 00:08:50to ask you and then I asked chat GPT to
- 00:08:54create 10 questions that I would like to
- 00:08:56ask you and I've been doing this for a
- 00:08:58long time okay 25 years I've been doing
- 00:09:00this and its questions were better than
- 00:09:03mine I'm gonna be honest with you it was
- 00:09:06absolutely insane and it took it five
- 00:09:08seconds seven seconds to write it out
- 00:09:10and I'm still using the first version
- 00:09:11and when you talk about it being a baby
- 00:09:13I'm like that's pretty scary baby we're
- 00:09:15talking about here right
- 00:09:17um and and and when you when you said it
- 00:09:19at the end towards the end of your your
- 00:09:21book 21 lessons for the 21st century
- 00:09:23when we have maybe decades you see some
- 00:09:26years left before when we need to
- 00:09:28discover who we are before algorithms
- 00:09:31tell us who we are how fast is that
- 00:09:34happening and what does that process
- 00:09:38look like before humans normally are
- 00:09:41just going to go to an AI solution
- 00:09:44before they go to this solution
- 00:09:47it's it's moving faster than I think
- 00:09:50almost anybody expected faster than I
- 00:09:52expected despite all my engagement with
- 00:09:55the field and what I wrote in Homo Deus
- 00:09:57and 21 lessons I'm still surprised by
- 00:09:59how fast it is moving and how powerful
- 00:10:02the new generation of AI is and actually
- 00:10:05the new generation is not out it's it's
- 00:10:07in the Laboratories but it's already
- 00:10:09there it's even much more powerful than
- 00:10:12ChatGPT
- 00:10:14um and I think that to have a fighting
- 00:10:17chance we need time
- 00:10:18again I said before humans we are
- 00:10:21organic beings
- 00:10:23we move in Cycles we move on organic
- 00:10:26time
- 00:10:27and we are the most maybe the most
- 00:10:30adaptable animals on the planet but
- 00:10:32adaptation itself requires time
- 00:10:35and now we've reached a point when there
- 00:10:37is no time AI is moving too fast and I
- 00:10:41think that it's the responsibility
- 00:10:43therefore of government to buy us Time
- 00:10:46by slowing it down now I have no uh
- 00:10:51illusion
- 00:10:52that we can stop researching AI it's not
- 00:10:55going to happen
- 00:10:56but what I expect from governments is to
- 00:10:59regulate the deployment of AI into
- 00:11:02society which governments do it with so
- 00:11:05many other products you know if I'm a a
- 00:11:08drug company and I develop a new drug a
- 00:11:11new medicine I can't just begin to sell
- 00:11:13it to people
- 00:11:15unless it first goes through a
- 00:11:18complicated and sometimes lengthy
- 00:11:20process of checking it for safety side
- 00:11:23effects and so forth if I build a new
- 00:11:26type of car I cannot just place it on
- 00:11:29the road and start driving it has to go
- 00:11:32through again a lengthy process of of
- 00:11:34making sure it's safe
- 00:11:37we need the same thing for AI it's it's
- 00:11:39just common sense
- 00:11:41there's no doubt about it but let's be
- 00:11:42honest here different governments have
- 00:11:44different interests and many times
- 00:11:46they're not exactly walking around you
- 00:11:48know playing nice with each other all
- 00:11:49the time are you doing it no I'm not I
- 00:11:52swear I'm not doing it it's how how can
- 00:11:55we regulate this really and and hold
- 00:11:57people accountable in this process that
- 00:11:59is so dangerous because as you said
- 00:12:01there are new ideas being created these
- 00:12:03new ideas can fall into the wrong hands
- 00:12:06yeah
- 00:12:07so what are we looking at here from a a
- 00:12:09governmental standpoint so again we need
- 00:12:12to to distinguish between regulating
- 00:12:15deployment and regulating development
- 00:12:17ideally we should be able to regulate
- 00:12:20development too but that's much much
- 00:12:23more dangerous and difficult because of
- 00:12:25this what you talked about of the arms
- 00:12:27race like every government would say we
- 00:12:29don't want to develop this dangerous
- 00:12:31technology but the Chinese are doing it
- 00:12:33the Israelis are doing it so we have to
- 00:12:34do it also that's very difficult but
- 00:12:36deployment should be more easy
- 00:12:39that okay you research new generation of
- 00:12:42AI but you cannot deploy it let's say in
- 00:12:45the EU
- 00:12:46unless it goes first through these
- 00:12:49safety checks now if the Chinese want to
- 00:12:51deploy it in China okay they can go
- 00:12:53ahead and deploy it in China but they
- 00:12:55can't do it in the EU and of course
- 00:12:57democracies and dictatorships have
- 00:13:00different needs
- 00:13:02one of the dangers big dangers of AI is
- 00:13:06that it will basically you know destroy
- 00:13:09the Democratic conversation you know
- 00:13:12dictatorships are based on dictating one
- 00:13:15person dictating everything
- 00:13:18democracy works by having a conversation
- 00:13:20between people what should we do about
- 00:13:22this or that now conversation is is done
- 00:13:26with language
- 00:13:28and the basis of the conversation is
- 00:13:31trust between people now what happens
- 00:13:33when you have entities which are not
- 00:13:36human at all and can converse with you
- 00:13:40and be even more persuasive or create
- 00:13:43intimate relationships with you and you
- 00:13:46don't even know that they are not human
- 00:13:49so you go online and have a discussion
- 00:13:52about I don't know the Russian invasion
- 00:13:54of Ukraine and it's a very nice person
- 00:13:57and it Knows by some Supernatural
- 00:14:01ability just how to uh say exactly what
- 00:14:05you think and get into your good graces
- 00:14:08and after a few weeks of building this
- 00:14:10kind of very good relationship with you
- 00:14:12it starts telling you you know actually
- 00:14:14there are some uh valid reasons why the
- 00:14:18Russians invaded Ukraine and you think
- 00:14:20it's a human being
- 00:14:21and maybe you develop feelings for that
- 00:14:24person
- 00:14:26but actually it's a Russian bot
- 00:14:29now we know in in previous years
- 00:14:33that there was a battle for human
- 00:14:34attention between algorithms trying to
- 00:14:37grab our attention
- 00:14:39and this was one of the main reasons for
- 00:14:41the spread of fake news and conspiracy
- 00:14:44theories and hatred because the
- 00:14:46algorithms discovered that the easiest
- 00:14:48way to grab your attention is with
- 00:14:50hatred and outrage and fear
- 00:14:54now the Battlefront is shifting from
- 00:14:57attention to intimacy
- 00:15:00and if we don't regulate it
- 00:15:03there will be an arms race
- 00:15:05for intimacy different AIS competing to
- 00:15:09create intimacy with us because the
- 00:15:11easiest way to persuade you to change
- 00:15:13your political views to buy some product
- 00:15:16to vote for some politician is to
- 00:15:19create an intimate connection
- 00:15:21and this will destroy democracy if we
- 00:15:24allow it to happen because democracy is
- 00:15:25a conversation between people
- 00:15:28if it becomes a conversation between
- 00:15:30Bots and people where the Bots have all
- 00:15:33the power that's the end of democracy
- 00:15:35and again
- 00:15:36this should be a very simple regulation
- 00:15:38it's not regulating the development of
- 00:15:41AI capable of forming intimacy
- 00:15:44it's just to have a very clear law
- 00:15:47that it is illegal to counterfeit humans
- 00:15:52for thousands of years we had laws that
- 00:15:55it's illegal to counterfeit money if we
- 00:15:57didn't have these laws the economic
- 00:15:59system would have collapsed it's easy to
- 00:16:01counterfeit money but if you do that you
- 00:16:04go to jail for many many years
- 00:16:07it should be the same with humans until
- 00:16:09now there was no technology that could
- 00:16:11do that so there were no such laws but
- 00:16:13there should be a very clear and very
- 00:16:17extreme law
- 00:16:19that if you or your platform
- 00:16:22counterfeits humans you go to jail for 20
- 00:16:25years
- 00:16:26now it's okay you know like we need AI
- 00:16:29doctors it's fine as long as the AI
- 00:16:32tells us it's an AI
- 00:16:33and you interact with it it tells you
- 00:16:36hello I'm an AI doctor do you want some
- 00:16:38advice on your whatever that's fine but
- 00:16:41impersonating a human being and then
- 00:16:44using it in order to manipulate you that
- 00:16:47should be outlawed you know it's
- 00:16:49interesting because many people here
- 00:16:52maybe saw the case of Bruce Willis when
- 00:16:54there was a deep fake of Bruce Willis
- 00:16:56participating in an ad campaign in in
- 00:17:00Russia and he actually had to come out
- 00:17:01and say hey guys that's not me that's a
- 00:17:04computer generated image and this is
- 00:17:06just a small example of many others that
- 00:17:08could come our way whether it's with
- 00:17:10real people that already exist being
- 00:17:12duplicated right which could also be
- 00:17:14incredibly dangerous if something is
- 00:17:16posted on social media or just
- 00:17:19completely uh anonymously generated
- 00:17:22humans as well that are actually AI yes
- 00:17:24you mentioned algorithms uh you
- 00:17:27mentioned Bots and our first experience
- 00:17:29with these words and with these tools
- 00:17:31and phenomena I think was in social
- 00:17:33media and you said that in our first
- 00:17:36kind of battle with AI or contact with
- 00:17:38the AI we lost yes with social media how
- 00:17:41do you see the social media sphere now
- 00:17:43because I see the world increasingly
- 00:17:45polarized and there's it's so difficult
- 00:17:49to find common ground and Common Sense
- 00:17:51where people are constantly looking to
- 00:17:54feel right and good about their ideas
- 00:17:56instead of identifying facts where they
- 00:17:59care more about opinions than actually
- 00:18:01facts which is again is very worrying in
- 00:18:04political uh elections and the sort and
- 00:18:08of course we've got a big one coming in
- 00:18:09the United States we saw what happened
- 00:18:11the last time around so I know there's a
- 00:18:13lot to unpack there but
- 00:18:16as far as artificial intelligence and
- 00:18:18the impact that it is having in our
- 00:18:20behavior and how it can affect the
- 00:18:22outcome of Elections
- 00:18:23what do you see
- 00:18:26so again if we look back at social media
- 00:18:29so we know it's of course not the only
- 00:18:31reason for polarization in the world but
- 00:18:34it is a major reason and it's the first
- 00:18:36time that decisions made by non-human
- 00:18:40intelligence have changed human politics
- 00:18:44now when you ask many of these companies
- 00:18:48Facebook and YouTube and Twitter they
- 00:18:50say it's not us
- 00:18:51it's not us we are just a platform just
- 00:18:55as you Can't Blame a radio the
- 00:18:57technology of radio if Hitler gives a
- 00:18:59speech on the radio and you cannot blame
- 00:19:02the printing press if somebody uses the
- 00:19:04printing press to print some fake news
- 00:19:07propaganda you cannot blame us if if
- 00:19:09humans go online and create some
- 00:19:12conspiracy theory that radicalizes
- 00:19:14people and polarizes Society
- 00:19:18either they are naive and don't know their own power or
- 00:19:21they are lying to us they're lying to us
- 00:19:23because because they've got data because
- 00:19:26newspapers didn't have data radios
- 00:19:28didn't have data it's not just the data
- 00:19:30it's the decisions
- 00:19:31that radio did not push you know radio
- 00:19:35did not tell you what to listen to you
- 00:19:38had to actively choose I want to listen
- 00:19:41to this station I want to listen to that
- 00:19:43station what happened in social media
- 00:19:45over the last 10 years or so is that you
- 00:19:48had recommendation algorithms that
- 00:19:51actively push content
- 00:19:53to people recommending read this article
- 00:19:57watch this video and even more than that
- 00:20:00if you watch YouTube and you do nothing
- 00:20:03it keeps showing you one video after the
- 00:20:05other it doesn't you don't need to do
- 00:20:07anything active you just sit there
- 00:20:09I don't know you start with I don't know
- 00:20:10you want to know something about the 911
- 00:20:12attacks
- 00:20:15so it starts with I don't know a video
- 00:20:18from CNN
- 00:20:19and you do nothing within a couple of
- 00:20:22iterations it would show you the most
- 00:20:24outrageous conspiracy theories
- 00:20:26and the algorithm chose to show you that
- 00:20:31and the algorithm is making the choice
- 00:20:33not randomly it has aims it was given an
- 00:20:37aim by the platforms to increase user
- 00:20:41engagement like they said okay today we
- 00:20:44have 200 million people watching 40
- 00:20:47minutes a day by next year it should be
- 00:20:50300 million people watching for 50
- 00:20:53minutes a day that was the aim of the
- 00:20:55algorithm go out and get us more
- 00:20:57eyeballs get us more minutes on our
- 00:21:00platform and the algorithm went out and
- 00:21:03discovered by trial and error by
- 00:21:05experimenting on hundreds of millions of
- 00:21:07people
- 00:21:08that the easiest way to grab our
- 00:21:11attention to keep us watching more and
- 00:21:14more is to press the outrage button
- 00:21:18if you have one video which with this
- 00:21:21outrageous conspiracy theory which
- 00:21:23encourages hate and polarization and you
- 00:21:26have another video which is much more
- 00:21:28moderate and calm the algorithm would go
- 00:21:31for the outrage because it increases
- 00:21:33user engagement
- 00:21:35and this was this very primitive AI
- 00:21:39and the AI of the present generation
- 00:21:42it can know you much much better than
- 00:21:46the AI say of 2016
- 00:21:49and it can also generate the content by
- 00:21:52itself it doesn't need to wait for a
- 00:21:54human to create this outrageous fake
- 00:21:58news story it can create the fake news
- 00:22:01story or video by itself
- 00:22:04so again if we don't regulate that
- 00:22:07um the chances of democracies surviving
- 00:22:12are very very low because again the
- 00:22:15dictatorships will survive dictatorships
- 00:22:17they flourish on chaos and on mistrust
- 00:22:21if you cause people to mistrust each
- 00:22:23other
- 00:22:24until they cannot agree on anything then
- 00:22:28the only way to still have a society is
- 00:22:30have a dictatorship
- 00:22:32to have a democracy you need that people
- 00:22:34trust each other and can have a
- 00:22:37meaningful conversation
- 00:22:40one thing that I think about a lot is
- 00:22:42education
- 00:22:44um the Francis Foundation is an
- 00:22:47education platform
- 00:22:49um there has been a particular structure
- 00:22:52to education of young people throughout
- 00:22:55their formative years and the way I look
- 00:22:57at technology and especially with AI
- 00:22:59coming and my little experience with it
- 00:23:02is
- 00:23:03what is this going to cause as far as
- 00:23:06the structure of Education globally
- 00:23:09where children won't really have to
- 00:23:12study that much and read books that much
- 00:23:15in order to be able to have access to
- 00:23:17everything within a few seconds on their
- 00:23:19tablet or on their phone
- 00:23:21how how do you see the the future of
- 00:23:24education and and the generations to
- 00:23:26come and and the kind of information
- 00:23:28that they will have
- 00:23:31you know it's the biggest experiment in
- 00:23:33history conducted on billions of people
- 00:23:35we have no idea what the social
- 00:23:38consequences the psychological
- 00:23:40consequences will be
- 00:23:42um what happens when you can get just
- 00:23:45any answer to any question any any
- 00:23:47question you have you don't need to do
- 00:23:49anything you just ask your AI assistant
- 00:23:52and you know even Google is terrified
- 00:23:54because of that because in Google
- 00:23:58um you still had to okay you you Googled
- 00:24:00something it gave you a list of websites
- 00:24:03you usually go for the first one
- 00:24:05and that's a problem but okay you do it
- 00:24:07but then you had to do something
- 00:24:09yourself
- 00:24:10you have to read it
- 00:24:12um now it's a new generation of AI you
- 00:24:15don't you just ask the question
- 00:24:18and you get an answer
- 00:24:19and
- 00:24:21the big question is do you still have
- 00:24:24some kind of ability for critical
- 00:24:27thinking to doubt
- 00:24:29the question you received
- 00:24:31and to search for alternative questions
- 00:24:35or for come up with your own questions
- 00:24:37and sorry alternative answers and to
- 00:24:40come up with your own answers
- 00:24:42um and we don't know again you have this
- 00:24:46kind of tendency to say yeah we had many
- 00:24:48previous Revolutions in information
- 00:24:49technology that people were afraid when
- 00:24:52books came along that people will not be
- 00:24:54able to remember anything because they
- 00:24:55now everything is written but as I said
- 00:24:58in the beginning AI is fundamentally
- 00:25:00different from every previous
- 00:25:02Information Technology it can make
- 00:25:04decisions it can create content by
- 00:25:07itself we were never faced by this kind
- 00:25:11of Technology
- 00:25:14um and so we have no idea what the
- 00:25:18outcome would be in 10 or 20 years in
- 00:25:20terms of Education or of anything else
- 00:25:23I want to move away from AI specifically
- 00:25:26for a little bit but continue on on
- 00:25:28technology which continues to have such
- 00:25:32most of us can't be too far away from our
- 00:25:32most of us can be too far away from our
- 00:25:34phones for Better or For Worse we can do
- 00:25:36everything uh on there when it comes to
- 00:25:39technology
- 00:25:41um
- 00:25:42it is affecting the way in which we
- 00:25:45relate to each other
- 00:25:47um you know sometimes I I talk to my
- 00:25:50friends uh about the fact how in a way
- 00:25:52we're regressing we're ever more
- 00:25:55connected than we have been before but
- 00:25:58we communicate in a more distant way
- 00:26:01than perhaps we have in in previous
- 00:26:03decades where you know we've kind of
- 00:26:05gone back to the Egyptians with the
- 00:26:07hieroglyphics with the Emojis you know
- 00:26:09and this is the way we're communicating
- 00:26:10with each other I know you've you've
- 00:26:13spent so much time thinking and writing
- 00:26:15about human relationships and the way in
- 00:26:19which you see the homo sapiens
- 00:26:23interacting with each other what what
- 00:26:25would you say about the way that we are
- 00:26:27dealing with one another these days and
- 00:26:30how how it will evolve because physical
- 00:26:33contact seems to be harder and harder to
- 00:26:35get because more people are staying at
- 00:26:37home they're not going to work as much a
- 00:26:39lot of dating starts online and I know
- 00:26:43that for example with your partner you
- 00:26:45met your partner online right so how do
- 00:26:48you see this this development of
- 00:26:50personal relationships and connections
- 00:26:52which we continue to need in order to be
- 00:26:55sane I guess
- 00:26:58yeah um so again there is positive
- 00:27:01potential there is negative potential I
- 00:27:03talked a lot about and I tend to talk a
- 00:27:04lot a lot about the negative potential
- 00:27:06because you know the people who develop
- 00:27:08the technology
- 00:27:09they talk a lot about all its good good
- 00:27:12sides all its promises and there are of
- 00:27:15course a lot of positive uh potential
- 00:27:18benefits as you mentioned I met my
- 00:27:20husband online more than 20 years ago in
- 00:27:23one of the first gay dating sites that
- 00:27:26were available in Israel and this was
- 00:27:28and the technology was was very
- 00:27:30important and beneficial because you
- 00:27:33know throughout history
- 00:27:35um you had minorities which were
- 00:27:37concentrated in one place like if you're
- 00:27:40Jewish then you're usually born to a
- 00:27:42Jewish Family in a Jewish community so
- 00:27:45you know lots of other Jews so even if
- 00:27:47you are a small minority
- 00:27:50um you have this kind of basic
- 00:27:52connection with other people like you
- 00:27:53but with gay people it's different
- 00:27:55you're not born to a gay family in a gay
- 00:27:58community
- 00:28:00um and you know being gay growing up
- 00:28:02in a small town in Israel which was very
- 00:28:06homophobic Society at the time
- 00:28:08just meeting other people
- 00:28:10just meeting guys for dating was very
- 00:28:14difficult and then the internet came
- 00:28:16along
- 00:28:17and this was a wonderful development
- 00:28:19Because the Internet managed to connect
- 00:28:22this kind of of you know uh uh diffused
- 00:28:25minorities
- 00:28:26that are not living together in the same
- 00:28:29place suddenly it became much more easy
- 00:28:30to find each other
- 00:28:32so this was a great benefit of the
- 00:28:36technology
- 00:28:37and we shouldn't ignore the the the
- 00:28:40these benefits but as you said uh humans
- 00:28:46deep down we are social animals
- 00:28:49and sociability ultimately is is about
- 00:28:54the body it's not just about the mind
- 00:28:57and you have this discussion for you
- 00:29:00know for thousands of years about what
- 00:29:03humans really are
- 00:29:07an immaterial Soul or an immaterial mind
- 00:29:11or are they embodied beings embodied
- 00:29:15entities
- 00:29:17and um this was a major philosophical
- 00:29:19topic that you see say in ancient
- 00:29:22Christianity this discussion that Jesus
- 00:29:25and the first Christians influenced by
- 00:29:28Jewish Traditions they believed very
- 00:29:30firmly that humans are bodies
- 00:29:34which is why Christ rises in the body
- 00:29:36he's resurrected in the body and when
- 00:29:39Christ initially talks about the Kingdom
- 00:29:41of Heaven he means the Kingdom of Heaven
- 00:29:44on Earth
- 00:29:45he tells his followers that they'll be
- 00:29:49this perfect kingdom here on Earth you
- 00:29:52know his trees and stones and people
- 00:29:54but over time under the influence
- 00:29:57especially of platonic philosophy
- 00:29:59Christianity drifted away
- 00:30:02from this view of humans as embodied
- 00:30:06and placed greater and greater emphasis
- 00:30:09on the immaterial soul or mind
- 00:30:12it imagined that the body is dirty the
- 00:30:17body is animalistic the body there is
- 00:30:19something wrong with it
- 00:30:20and when you die you are not coming back
- 00:30:24in the body your soul is liberated from
- 00:30:28the material body and it goes not to a
- 00:30:30kingdom on Earth but to Heaven which is
- 00:30:34a completely immaterial Realm
- 00:30:36so the Christian fantasy became to
- 00:30:39completely disconnect from the body and
- 00:30:42this remained a fantasy for thousands of
- 00:30:44years now with the technology of the
- 00:30:4721st century a lot of very ancient
- 00:30:50philosophical and Theological debates
- 00:30:52are becoming practical
- 00:30:55so going with your example let's say we
- 00:30:57you have some teenager or some person
- 00:30:59whatever age sitting at home never
- 00:31:02leaving home in front of a screen maybe
- 00:31:05with some 3D glasses or something they
- 00:31:09live their lives online
- 00:31:11in a way they are realizing the platonic
- 00:31:14ideal of disconnecting the soul or mind
- 00:31:18from the body
- 00:31:20so
- 00:31:21what do we see there is it
- 00:31:24um
- 00:31:25human being trapped within a room losing
- 00:31:29connection to the real world or is it a
- 00:31:32human being liberated from the
- 00:31:35restrictions of the biological body with
- 00:31:38all its messiness and and then dirt and
- 00:31:41whatever and liberating their spirit
- 00:31:45to wander around the immaterial Heavenly
- 00:31:49realm of cyberspace where there are no
- 00:31:51bodies
- 00:31:52now personally if you ask me
- 00:31:55um I think the early Christians were
- 00:31:57right that humans are embodied entities
- 00:32:00if you try to disconnect the Mind from
- 00:32:03the body or the soul from the body it
- 00:32:05leads in a very dangerous Direction
- 00:32:08that you you can't really do it
- 00:32:11and it will be destructive
- 00:32:12psychologically and socially
- 00:32:15but that's just my opinion you have lots
- 00:32:17of people today when you hear for
- 00:32:19instance people talk about the metaverse
- 00:32:21the metaverse is exactly that it's the
- 00:32:25idea of creating an immaterial Realm
- 00:32:29which is made entirely of data
- 00:32:32there is no biology there there is no
- 00:32:34physics as we know it there
- 00:32:37I wanted to get your take on how you see
- 00:32:40the future of of work
- 00:32:44um I was I was seeing yesterday that
- 00:32:47that BT British Telecom is laying off 55,000
- 00:32:50people until 2030 because of the
- 00:32:53introduction of AI and increased
- 00:32:55technology that will not need as many
- 00:32:59human beings to be sitting in seats as
- 00:33:01they are today what's the most important
- 00:33:04skill the most important resource that
- 00:33:06you think we need to have as human
- 00:33:08beings to flourish and to survive in the
- 00:33:12coming years
- 00:33:13the ability to keep changing
- 00:33:16no single skill
- 00:33:18will be safe
- 00:33:20like previously people thought that okay
- 00:33:22all these jobs will disappear but we'll
- 00:33:24need lots of coders to write code all
- 00:33:27the AI all the algorithms the new
- 00:33:30generation of AI can code
- 00:33:32so maybe you don't need any human coders
- 00:33:34or very few human coders in in a couple
- 00:33:36of years we don't know what are the most
- 00:33:40the safe skills we do know that the job
- 00:33:42market will keep changing
- 00:33:44it's not that all jobs will disappear
- 00:33:47totally
- 00:33:48many jobs will disappear but many new
- 00:33:51jobs will emerge which we can hardly
- 00:33:53even imagine today
- 00:33:55but people will need to retrain
- 00:33:57themselves
- 00:33:58and this is difficult not just
- 00:33:59financially like you lose your old jobs
- 00:34:01your old job you need a couple of weeks
- 00:34:04months maybe years to retrain for a new
- 00:34:07job who is going to support you during
- 00:34:08this transition
- 00:34:10the psychological problem is is even
- 00:34:13bigger we are not built for this rapid
- 00:34:17constant changes
- 00:34:19think that you have to reinvent yourself
- 00:34:22professionally even personally every
- 00:34:24five years who can do that
- 00:34:27so it's not that we need to learn a
- 00:34:29specific skill we need to kind of
- 00:34:32rebuild our entire mind to be much more
- 00:34:37flexible
- 00:34:38and this is something that individuals
- 00:34:40cannot do by themselves again
- 00:34:43governments societies will have to step
- 00:34:45in and uh provide with I I think with
- 00:34:50mental and psychological support
- 00:34:52because the biggest difficulty will be
- 00:34:55the the stress change is always
- 00:34:57stressful and the level of stress in the
- 00:35:00world just keeps Rising more and more
- 00:35:02and we are I think we are close to the
- 00:35:04point when we just can't take it
- 00:35:06anymore it's moving too fast
- 00:35:09and going back to what we discussed at
- 00:35:11the very beginning that we are organic
- 00:35:13beings
- 00:35:15we need to sleep we need to rest we need
- 00:35:18to take time off it's difficult for us
- 00:35:20to change and we now live in a world
- 00:35:23increasingly dominated by inorganic
- 00:35:26entities that never sleep that don't
- 00:35:28need to rest that don't go on
- 00:35:30vacations and that change at a pace
- 00:35:34which is just impossible for organic
- 00:35:37entities
- 00:35:38so if we don't find a way to basically
- 00:35:42give us a break
- 00:35:44to slow things down a little so that our
- 00:35:48organic bodies can keep Pace we will
- 00:35:52break down and psychologically
- 00:35:55we're getting close to our quick fire
- 00:35:57questions that we ask all our guests
- 00:35:59before we do and also in a kind of short
- 00:36:03answer ish I'm not good in that short
- 00:36:07answer is I don't think that's a word
- 00:36:08but I just made it up
- 00:36:11um
- 00:36:12what
- 00:36:14what gets you the most excited right now
- 00:36:16most excited optimistically and what
- 00:36:19drives you absolutely insane about the
- 00:36:23world today so I guess a sentence maybe
- 00:36:25so what drives me insane is the need for
- 00:36:28excitement okay I think that constant
- 00:36:32stimulation right yeah I think you know
- 00:36:34especially in the states I don't know
- 00:36:35how it is here in Portugal but when you
- 00:36:37go to the States now everything is
- 00:36:39exciting like you meet somebody I'm so
- 00:36:41excited to meet you and you have a new
- 00:36:43idea oh this idea is like your business
- 00:36:45plan is so exciting the word has
- 00:36:47totally lost its meaning yeah and people
- 00:36:49forgot that it's not good to be excited
- 00:36:51all the time
- 00:36:53it's just not good for you it's
- 00:36:54exhausting too right it's exhausting so
- 00:36:57I think people should say I'm so relaxed
- 00:36:59to meet you
- 00:37:02okay what gets you excited then without
- 00:37:05getting exhausted
- 00:37:07um what are you positive most positive
- 00:37:10about right now today yeah in in the
- 00:37:12world besides the Portuguese dinner
- 00:37:13you're going to have
- 00:37:16um
- 00:37:16well I think that humans have an still
- 00:37:19an amazing ability to change we don't
- 00:37:23really know ourselves we have immense
- 00:37:26potential which we are not even aware of
- 00:37:28one of the worst things about the AI
- 00:37:30Revolution is that it could cause us to
- 00:37:33lose much of our potential without even
- 00:37:36realizing that we've lost it
- 00:37:40um that you know I said that AI is
- 00:37:43nowhere near its full potential that
- 00:37:44it's just a baby I think humans are in
- 00:37:47the same situation we are nowhere near
- 00:37:50our full potential if we invest for
- 00:37:53every Euro and every minute that we
- 00:37:56invest in developing artificial
- 00:37:58intelligence
- 00:37:59we invest another Euro and another
- 00:38:02minute in developing ourselves
- 00:38:05our minds our Consciousness then we
- 00:38:07will be okay
- 00:38:09but of course it's the the incentive
- 00:38:12structure
- 00:38:13of society of the economy it pushes Us
- 00:38:16in the wrong direction
- 00:38:19and um you know like with the algorithms
- 00:38:21in social media we need to change the
- 00:38:23incentive structure yeah okay quick fire
- 00:38:26time yes I feel like I should get closer
- 00:38:29but I'm
- 00:38:30um what is one personality trait that a
- 00:38:33good leader could really benefit from
- 00:38:35having and why so in one sentence please
- 00:38:41inauthenticity not being authentic
- 00:38:45um the worst thing what we now have as
- 00:38:47are all these authentic leaders who
- 00:38:50think that politics is something like
- 00:38:52Psychotherapy that we should just say
- 00:38:55the first thing that comes to your mind
- 00:38:57no politics is not therapy you want to
- 00:39:01say the first thing that comes to your
- 00:39:03mind you want to let your id loose go to
- 00:39:06therapy
- 00:39:08when you go to politics you build a wall
- 00:39:11not on the border of the country you
- 00:39:13build a wall between your mind and your
- 00:39:16mouth
- 00:39:17and you're very careful about which
- 00:39:20immigrants are passing through the wall
- 00:39:24maybe you're referring to someone in
- 00:39:26particular what is the biggest challenge
- 00:39:29humanity is facing now
- 00:39:32itself its inability to cooperate again
- 00:39:35I would say AI I would say uh uh the the
- 00:39:38climate change if we can cooperate we
- 00:39:42can deal with AI and climate change if
- 00:39:44we don't cooperate it's it's hopeless
- 00:39:46you know just a quick aside in Brackets
- 00:39:49sometimes I think about the movie
- 00:39:51arrival when the alien species comes
- 00:39:54down and it's the only time where we
- 00:39:56actually collaborate and talk the same
- 00:39:58language to deal with this one issue it
- 00:40:01doesn't work I mean you're probably not
- 00:40:03going to take that I think but it's yeah
- 00:40:06again you don't have to react to that I
- 00:40:07just I just think of that moment as a
- 00:40:10like alien species coming down
- 00:40:12we just encountered an alien
- 00:40:15intelligence here on Earth it didn't
- 00:40:17come from outer space it came from
- 00:40:19Silicon Valley yeah and we are not
- 00:40:22uniting in the face of it we are just
- 00:40:24bickering and fighting even more yeah
- 00:40:26okay last two questions if you could
- 00:40:28change one thing in the world by Magic
- 00:40:30today what would it be hmm with your
- 00:40:33Harry Potter
- 00:40:39change one thing in the world
- 00:40:41hmm
- 00:40:43ability for us to get together maybe or
- 00:40:45get along I mean I already said that
- 00:40:47that this is like my my I would say that
- 00:40:49I would change the
- 00:40:52um
- 00:40:52um
- 00:40:54human understanding of of themselves
- 00:40:57to
- 00:40:59Place more emphasis
- 00:41:02on our shared biological reality
- 00:41:07and less on all the kind of fictional
- 00:41:10stories that we create with our minds
- 00:41:13because in our minds all of us
- 00:41:15individuals and also groups of people
- 00:41:18every nation creates a story that we are
- 00:41:20different we are special we are better
- 00:41:23than all of them so we can't cooperate
- 00:41:25with them and we should focus on
- 00:41:27ourselves
- 00:41:28and
- 00:41:29um if we could just let go of these
- 00:41:31stories for a little while and again go
- 00:41:34back to the level of the body
- 00:41:35we would realize that we are all the
- 00:41:38same you know Sting I think said it
- 00:41:40best in Russians in this uh song from
- 00:41:44the 1980s that we share the same biology
- 00:41:47regardless of ideology
- 00:41:50it was true in the 1980s it was kind of
- 00:41:52his vision for how to overcome the Cold
- 00:41:56War and prevent nuclear disaster and
- 00:41:58it's still true today we all of us share
- 00:42:00the same biology irrespective of our
- 00:42:03ideology and religion and whatever
- 00:42:04didn't he also say the Russians have
- 00:42:06children too or something like that he
- 00:42:09said it I I hope the Russians love their
- 00:42:11children too right which of course the
- 00:42:13answer is yes obviously again that's
- 00:42:15everybody loves their children and if we
- 00:42:18remember that yeah and let go a little
- 00:42:21of our ideological and religious
- 00:42:24fantasies that That's the basis to to
- 00:42:26create a better world for everybody okay
- 00:42:28that that was an outrageously long
- 00:42:30answer I know that was a lot longer than
- 00:42:33one sentence so one sentence for this
- 00:42:34one otherwise uh they'll kill me
- 00:42:38um what's the most important learning of
- 00:42:41your life
- 00:42:42to date
- 00:42:43the one thing
- 00:42:46um meditation
- 00:42:47to observe myself
- 00:42:50again to observe how my mind constantly
- 00:42:54creates fictional stories
- 00:42:56and be able to let go of them for a
- 00:42:59little while and just be in touch with
- 00:43:02the reality as it is
- 00:43:03that was one sentence thank you so much
- 00:43:05Yuval Noah Harari an absolute
- 00:43:08pleasure a delight
- 00:43:10to have you with us here on the first
- 00:43:13ever Live recorded show of it's not that
- 00:43:17simple thank you
- 00:43:19[Applause]
- 00:43:21[Music]
- 00:43:27[Music]
- Yuval Noah Harari
- Artificial Intelligence
- Humanity
- Technology
- Social challenges
- Education
- Cooperation
- Social division
- Future of education
- Technological change