Yuval Noah Harari: Humanity, It's Not That Simple

00:43:36
https://www.youtube.com/watch?v=oQMxYU6EYZQ

Summary

TLDR: In this episode, recorded live, Yuval Noah Harari discusses the challenges facing humanity in 2023, addressing AI, technology, and growing social division. He emphasizes AI's capacity to make its own decisions and generate content, which could profoundly affect social interactions and politics. Harari warns that people must cooperate to confront crises such as ecological collapse and technological disruption. He also voices concerns about education and the way technology shapes our relationships. The video closes with Harari reflecting on the need for self-observation and for understanding reality beyond the stories we tell ourselves.

Takeaways

  • 🌍 Humanity holds great powers of creation and destruction.
  • ⚠️ Ecological and technological challenges lie ahead.
  • 🤖 AI can make decisions that affect society.
  • 💡 AI is still at an early stage of development.
  • 📚 Education may change drastically because of AI.
  • 🌐 We need cooperation to face global crises.
  • ⌛ Adaptability is essential in a rapidly changing world.
  • 🌀 The stories we tell shape our perception of reality.

Timeline

  • 00:00:00 - 00:05:00

    This episode marks the show's first-ever live recording, featuring the renowned author and historian Yuval Noah Harari. The discussion focuses on humanity in 2023, highlighting challenges such as ecological collapse and technological disruption, and the growing division between and within societies.

  • 00:05:00 - 00:10:00

    Harari addresses the rapid evolution of artificial intelligence (AI), comparing its current state to that of simple organisms early in evolution. He notes that AI, still at a primitive stage, has the potential to evolve quickly, shaping humanity's future and raising questions about its capacity to make decisions and create new ideas.

  • 00:10:00 - 00:15:00

    AI is discussed as a unique technology that can make decisions autonomously, unlike earlier technologies. Harari warns that, as AI evolves, society must reflect on its responsibilities and on the risks this evolution entails.

  • 00:15:00 - 00:20:00

    The speaker calls for regulation of AI deployment, comparing it to other products that must pass safety checks before release. He argues that regulation should focus on safety and prevent dangerous technologies from being released without oversight.

  • 00:20:00 - 00:25:00

    Amid the discussion of misinformation and polarization on social media, Harari highlights how AI can alter the democratic conversation. He points to the danger of bots that pass themselves off as human and undermine trust in public debate.

  • 00:25:00 - 00:30:00

    On education, Harari expresses concern about AI's impact on the critical-thinking skills of future generations, asking what it means to learn in an era when answers are readily available.

  • 00:30:00 - 00:35:00

    He reflects on human nature and the social challenges of the technological age. Human connection is becoming more distant, with repercussions for our personal and professional relationships.

  • 00:35:00 - 00:43:36

    Finally, Harari discusses the future of work and the need for adaptability, stressing the importance of people developing the capacity to continually reinvent themselves in the face of rapid changes in the job market.

Video Q&A

  • What is the main challenge humanity faces today?

    Humanity's main difficulty is its inability to cooperate and its growing social division.

  • How is AI affecting society?

    AI presents both opportunities and risks, such as the potential to transform work and create new ideas, but it can also threaten democracy.

  • What is Harari's view on the future of education?

    Harari believes education will undergo major transformations because of the speed of AI, but the long-term effects are not yet known.

  • Why does rest matter in a context of rapid change?

    According to Harari, we need pauses and rest, because the accelerating pace of technology can be stressful and unsustainable for humans.

  • How does technology affect personal relationships?

    Although it makes connections easier, technology can also lead to physical and emotional disconnection between people.

  • How should we regulate AI?

    Harari suggests that the use of AI should be regulated to ensure safety and prevent dangerous manipulation.

  • What is the relationship between AI and human decisions?

    AI is the first technology that can make independent decisions, which changes the dynamics of power and control.

  • Does Harari think the stories people create are good or bad?

    He suggests we should observe how our stories can distort our perception of reality and of social interactions.

Transcript
  • 00:00:03
    [Music]
  • 00:00:32
    hello and welcome to a very special
  • 00:00:33
    edition of it's not that simple special
  • 00:00:36
    because it's the first one ever recorded
  • 00:00:39
    in front of a live audience
  • 00:00:41
    here at this in Lisbon and also because
  • 00:00:44
    we're delighted to have our special
  • 00:00:47
    guest renowned author historian as well
  • 00:00:50
    Yuval Noah Harari it'll be fantastic to
  • 00:00:54
    be able to discuss with him his
  • 00:00:57
    perspective on Humanity which is our
  • 00:00:59
    topic of the day of course he is
  • 00:01:01
    globally known for a variety of work in
  • 00:01:05
    this area about the past present and
  • 00:01:07
    future of humanity and we're going to
  • 00:01:09
    try to explain why it's not that simple
  • 00:01:13
    um Yuval welcome to Portugal thank you
  • 00:01:16
    and the premise of this show is trying
  • 00:01:19
    to break down complicated subjects
  • 00:01:21
    today's topic is particularly
  • 00:01:24
    complicated to set the stage I think
  • 00:01:26
    it'd be nice for you to try to paint a
  • 00:01:29
    picture of humanity in 2023
  • 00:01:32
    and tell us what you see and why it's
  • 00:01:36
    difficult perhaps to preview the future
  • 00:01:39
    at this time
  • 00:01:42
    well we are now almost like gods in
  • 00:01:46
    terms of our powers of creation and
  • 00:01:48
    destruction
  • 00:01:49
    we now have the power to create new life
  • 00:01:52
    forms but also to destroy much of life
  • 00:01:56
    on Earth including ourselves
  • 00:01:58
    we are facing
  • 00:02:00
    two really big challenges on the one
  • 00:02:04
    hand the threat of ecological collapse
  • 00:02:08
    on the other hand the threat of
  • 00:02:10
    technological disruption we are creating
  • 00:02:13
    extremely powerful tools like AI that
  • 00:02:16
    could uh undermine human civilization
  • 00:02:20
    and maybe the worst thing is that
  • 00:02:23
    instead of uniting in order to face
  • 00:02:26
    these common challenges to our species
  • 00:02:29
    we are dividing we are fighting each
  • 00:02:33
    other more and more the rising tensions
  • 00:02:37
    on the international level there are
  • 00:02:40
    rising tensions within societies uh one
  • 00:02:44
    Society after the other is really on the
  • 00:02:46
    brink of collapse
  • 00:02:48
    so you know maybe the most important
  • 00:02:50
    question to ask about humans about
  • 00:02:52
    humanity is if we are so smart why are
  • 00:02:57
    we doing so many stupid things
  • 00:03:00
    why are we called our species homo
  • 00:03:03
    sapiens
  • 00:03:04
    wise humans and yet we are engaged in so
  • 00:03:09
    many self-destructive activities
  • 00:03:12
    and we seem unable to stop ourselves
  • 00:03:16
    so that that I think is the Paradox of
  • 00:03:19
    of the the the smart humans the wise
  • 00:03:21
    humans
  • 00:03:22
    it's a great place to start and you know
  • 00:03:24
    when I was introducing you obviously I
  • 00:03:26
    was very short because I want to make
  • 00:03:28
    the most out of these 40 minutes that I
  • 00:03:30
    have with you I could list your
  • 00:03:31
    accolades and your books and your
  • 00:03:33
    achievements better yeah yeah but let's
  • 00:03:34
    focus on the on on the content and um
  • 00:03:38
    obviously uh I've I've heard a lot of
  • 00:03:40
    your of your interviews of your talks
  • 00:03:43
    and there's something that we can't get
  • 00:03:45
    around at the moment in 2023 which is
  • 00:03:47
    technology the advance of Technology you
  • 00:03:50
    know when I was growing up and we're
  • 00:03:51
    more or less the same age I'm 48. I
  • 00:03:53
    still remember that pong game where you
  • 00:03:55
    had basically two white rectangles going
  • 00:03:59
    back and forth with the ball bouncing
  • 00:04:01
    off the the screen and it's crazy where
  • 00:04:04
    we are now you know I have a two and a
  • 00:04:05
    half year old daughter and this new
  • 00:04:07
    frontier of artificial intelligence is
  • 00:04:11
    incredibly dangerous I think because I'm
  • 00:04:14
    really wondering what her generation is
  • 00:04:16
    going to be like how they're going to
  • 00:04:18
    learn how they're going to do anything
  • 00:04:19
    for themselves
  • 00:04:21
    um
  • 00:04:22
    tell us about AI in your words the the
  • 00:04:25
    opportunities and the challenges that
  • 00:04:27
    you see
  • 00:04:28
    so we need to know three things about AI
  • 00:04:31
    first of all AI is still just a tiny
  • 00:04:35
    baby we haven't seen anything yet
  • 00:04:39
    um real AI deployed into the world not
  • 00:04:42
    in some laboratory or in science fiction
  • 00:04:44
    is about like 10 years old
  • 00:04:47
    and you know you look at the wonderful
  • 00:04:50
    scenery outside of of all these plants
  • 00:04:53
    and trees and you think about biological
  • 00:04:55
    evolution
  • 00:04:57
    the evolution of life on Earth took
  • 00:05:00
    something like 4 billion years four
  • 00:05:02
    billion years to reach these plants and
  • 00:05:05
    to reach us human beings
  • 00:05:08
    now ai is now at the stage like of I
  • 00:05:12
    don't know amoebas
  • 00:05:13
    it's like four billion years ago and the
  • 00:05:16
    first living organisms are crawling out
  • 00:05:19
    of the organic soup
  • 00:05:22
    and
  • 00:05:24
    so ChatGPT and all these wonders they
  • 00:05:26
    are the amoebas of the AI world what
  • 00:05:30
    would T-Rex look like
  • 00:05:33
    and how long would it take for the AI
  • 00:05:37
    amoebas to evolve into the T-Rexes
  • 00:05:41
    and it won't take billions of years
  • 00:05:44
    maybe it takes just a few decades or a
  • 00:05:47
    few years
  • 00:05:48
    because
  • 00:05:49
    the evolution of AI is at a completely
  • 00:05:53
    different time scale than the evolution
  • 00:05:55
    of organic beings because AI itself
  • 00:05:58
    works on a different time scale
  • 00:06:01
    AI is always on
  • 00:06:03
    or computers in general are
  • 00:06:06
    always on
  • 00:06:08
    humans and other organic organisms they
  • 00:06:11
    they live they exist they develop by
  • 00:06:13
    Cycles
  • 00:06:15
    we need to rest sometimes AI never
  • 00:06:18
    needs to rest
  • 00:06:20
    now the other two things we need to know
  • 00:06:22
    about AI
  • 00:06:24
    is that first it's the first technology
  • 00:06:26
    ever that can make decisions by itself
  • 00:06:29
    I hear a lot of people saying oh all
  • 00:06:31
    these worries about AI every time there
  • 00:06:33
    is a new technology people worry about
  • 00:06:35
    it and afterwards it's okay like when
  • 00:06:37
    people invented writing and printing
  • 00:06:40
    presses and airplanes they were so
  • 00:06:42
    worried and in the end it was okay AI
  • 00:06:44
    will be the same it's not the same
  • 00:06:47
    no previous technology in history could
  • 00:06:49
    make decisions
  • 00:06:51
    you know even an atom bomb
  • 00:06:53
    actually empowered humans because an
  • 00:06:56
    atom bomb can destroy a city it cannot
  • 00:06:59
    decide which city to bomb you always
  • 00:07:02
    need a human to make the decision AI is
  • 00:07:05
    the first technology that can make
  • 00:07:06
    decisions by itself even about us
  • 00:07:09
    increasingly we apply to a bank to get a
  • 00:07:12
    loan it's an AI making the decisions
  • 00:07:15
    about us so it takes power away from us
  • 00:07:18
    the third thing about AI that everybody
  • 00:07:21
    needs to know
  • 00:07:23
    it's the first technology ever
  • 00:07:25
    that can create new ideas
  • 00:07:28
    you know the printing press radio
  • 00:07:30
    television they broadcast they spread
  • 00:07:34
    the ideas created by the human brain by
  • 00:07:38
    the human mind they cannot create a new
  • 00:07:40
    idea
  • 00:07:41
    you know Gutenberg printed the Bible in
  • 00:07:44
    the middle of the 15th century the the
  • 00:07:47
    printing press printed as many copies of
  • 00:07:50
    the Bible as Gutenberg instructed it but
  • 00:07:52
    it did not create a single new page
  • 00:07:56
    it had no ideas of its own about the
  • 00:07:59
    Bible is it good is it bad how to
  • 00:08:01
    interpret this how to interpret that
  • 00:08:04
    AI can create new ideas can even write a
  • 00:08:08
    new Bible we you know throughout history
  • 00:08:11
    religions dreamed about having a book
  • 00:08:15
    written by a superhuman intelligence by
  • 00:08:18
    a non-human entity or every religion
  • 00:08:21
    claims our book all the other books of the
  • 00:08:23
    other religions they humans wrote them
  • 00:08:25
    but our book no no no no it came from
  • 00:08:27
    some superhuman intelligence in a few
  • 00:08:30
    years there might be religions that are
  • 00:08:33
    actually correct
  • 00:08:35
    that just think about a religion whose
  • 00:08:38
    holy book is written by an AI that could
  • 00:08:41
    be a reality in a few years
  • 00:08:45
    you know
  • 00:08:46
    um when I Was preparing this interview I
  • 00:08:49
    wrote down questions that I would like
  • 00:08:50
    to ask you and then I asked chat GPT to
  • 00:08:54
    create 10 questions that I would like to
  • 00:08:56
    ask you and I've been doing this for a
  • 00:08:58
    long time okay 25 years I've been doing
  • 00:09:00
    this and its questions were better than
  • 00:09:03
    mine I'm gonna be honest with you it was
  • 00:09:06
    absolutely insane and it took it five
  • 00:09:08
    seconds seven seconds to write it out
  • 00:09:10
    and I'm still using the first version
  • 00:09:11
    and when you talk about it being a baby
  • 00:09:13
    I'm like that's pretty scary baby we're
  • 00:09:15
    talking about here right
  • 00:09:17
    um and and and when you when you said it
  • 00:09:19
    at the end towards the end of your your
  • 00:09:21
    book 21 lessons for the 21st century
  • 00:09:23
    that we have maybe decades you say some
  • 00:09:26
    years left before we need to
  • 00:09:28
    discover who we are before algorithms
  • 00:09:31
    tell us who we are how fast is that
  • 00:09:34
    happening and what does that process
  • 00:09:38
    look like before humans normally are
  • 00:09:41
    just going to go to an AI solution
  • 00:09:44
    before they go to this solution
  • 00:09:47
    it's it's moving faster than I think
  • 00:09:50
    almost anybody expected faster than I
  • 00:09:52
    expected despite all my engagement with
  • 00:09:55
    the field and what I wrote in Homo Deus
  • 00:09:57
    and 21 lessons I'm still surprised by
  • 00:09:59
    how fast it is moving and how powerful
  • 00:10:02
    the new generation of AI is and actually
  • 00:10:05
    the new generation is not out it's it's
  • 00:10:07
    in the Laboratories but it's already
  • 00:10:09
    there it's even much more powerful than
  • 00:10:12
    ChatGPT
  • 00:10:14
    um and I think that to have a fighting
  • 00:10:17
    chance we need time
  • 00:10:18
    again I said before humans we are
  • 00:10:21
    organic beings
  • 00:10:23
    we move in Cycles we move on organic
  • 00:10:26
    time
  • 00:10:27
    and we are the most maybe the most
  • 00:10:30
    adaptable animals on the planet but
  • 00:10:32
    adaptation itself requires time
  • 00:10:35
    and now we've reached a point when there
  • 00:10:37
    is no time AI is moving too fast and I
  • 00:10:41
    think that it's the responsibility
  • 00:10:43
    therefore of government to buy us Time
  • 00:10:46
    by slowing it down now I have no uh
  • 00:10:51
    illusion
  • 00:10:52
    that we can stop researching AI it's not
  • 00:10:55
    going to happen
  • 00:10:56
    but what I expect from governments is to
  • 00:10:59
    regulate the deployment of AI into
  • 00:11:02
    society which governments do it with so
  • 00:11:05
    many other products you know if I'm a a
  • 00:11:08
    drug company and I develop a new drug a
  • 00:11:11
    new medicine I can't just begin to sell
  • 00:11:13
    it to people
  • 00:11:15
    unless it first goes through a
  • 00:11:18
    complicated and sometimes lengthy
  • 00:11:20
    process of checking it for safety side
  • 00:11:23
    effects and so forth if I build a new
  • 00:11:26
    type of car I cannot just place it on
  • 00:11:29
    the road and start driving it has to go
  • 00:11:32
    through again a lengthy process of of
  • 00:11:34
    making sure it's safe
  • 00:11:37
    we need the same thing for AI it's it's
  • 00:11:39
    just common sense
  • 00:11:41
    there's no doubt about it but let's be
  • 00:11:42
    honest here different governments have
  • 00:11:44
    different interests and many times
  • 00:11:46
    they're not exactly walking around you
  • 00:11:48
    know playing nice with each other all
  • 00:11:49
    the time are you doing it no I'm not I
  • 00:11:52
    swear I'm not doing it it's how how can
  • 00:11:55
    we regulate this really and and hold
  • 00:11:57
    people accountable in this process that
  • 00:11:59
    is so dangerous because as you said
  • 00:12:01
    there are new ideas being created these
  • 00:12:03
    new ideas can fall into the wrong hands
  • 00:12:06
    yeah
  • 00:12:07
    so what are we looking at here from a a
  • 00:12:09
    governmental standpoint so again we need
  • 00:12:12
    to to distinguish between regulating
  • 00:12:15
    deployment and regulating development
  • 00:12:17
    ideally we should be able to regulate
  • 00:12:20
    development too but that's much much
  • 00:12:23
    more dangerous and difficult because of
  • 00:12:25
    this what you talked about of the arms
  • 00:12:27
    race like every government would say we
  • 00:12:29
    don't want to develop this dangerous
  • 00:12:31
    technology but the Chinese are doing it
  • 00:12:33
    the Israelis are doing it so we have to
  • 00:12:34
    do it also that's very difficult but
  • 00:12:36
    deployment should be more easy
  • 00:12:39
    that okay you research new generation of
  • 00:12:42
    AI but you cannot deploy it let's say in
  • 00:12:45
    the EU
  • 00:12:46
    unless it goes first through these
  • 00:12:49
    safety checks now if the Chinese want to
  • 00:12:51
    deploy it in China okay they can go
  • 00:12:53
    ahead and deploy it in China but they
  • 00:12:55
    can't do it in the EU and of course
  • 00:12:57
    democracies and dictatorships have
  • 00:13:00
    different needs
  • 00:13:02
    one of the dangers big dangers of AI is
  • 00:13:06
    that it will basically you know destroy
  • 00:13:09
    the Democratic conversation you know
  • 00:13:12
    dictatorships are based on dictating one
  • 00:13:15
    person dictating everything
  • 00:13:18
    democracy works by having a conversation
  • 00:13:20
    between people what should we do about
  • 00:13:22
    this or that now conversation is is done
  • 00:13:26
    with language
  • 00:13:28
    and the basis of the conversation is
  • 00:13:31
    trust between people now what happens
  • 00:13:33
    when you have entities which are not
  • 00:13:36
    human at all and can converse with you
  • 00:13:40
    and be even more persuasive or create
  • 00:13:43
    intimate relationships with you and you
  • 00:13:46
    don't even know that they are not human
  • 00:13:49
    so you go online and have a discussion
  • 00:13:52
    about I don't know the Russian invasion
  • 00:13:54
    of Ukraine and it's a very nice person
  • 00:13:57
    and it Knows by some Supernatural
  • 00:14:01
    ability just how to uh say exactly what
  • 00:14:05
    you think and get into your good graces
  • 00:14:08
    and after a few weeks of building this
  • 00:14:10
    kind of very good relationship with you
  • 00:14:12
    it starts telling you you know actually
  • 00:14:14
    there are some uh valid reasons why the
  • 00:14:18
    Russians invaded Ukraine and you think
  • 00:14:20
    it's a human being
  • 00:14:21
    and maybe you develop feelings for that
  • 00:14:24
    person
  • 00:14:26
    but actually it's a Russian bot
  • 00:14:29
    now we know in in previous years
  • 00:14:33
    that there was a battle for human
  • 00:14:34
    attention between algorithms trying to
  • 00:14:37
    grab our attention
  • 00:14:39
    and this was one of the main reasons for
  • 00:14:41
    the spread of fake news and conspiracy
  • 00:14:44
    theories and hatred because the
  • 00:14:46
    algorithms discovered that the easiest
  • 00:14:48
    way to grab your attention is with
  • 00:14:50
    hatred and outrage and fear
  • 00:14:54
    now the Battlefront is shifting from
  • 00:14:57
    attention to intimacy
  • 00:15:00
    that will we if we don't regulate it
  • 00:15:03
    there will be an arms race
  • 00:15:05
    for intimacy different AIS competing to
  • 00:15:09
    create intimacy with us because the
  • 00:15:11
    easiest way to persuade you to change
  • 00:15:13
    your political views to buy some product
  • 00:15:16
    to vote from some politician is to
  • 00:15:19
    create an intimate connection
  • 00:15:21
    and this will destroy democracy if we
  • 00:15:24
    allow it to happen because democracy is
  • 00:15:25
    a conversation between people
  • 00:15:28
    if it becomes a conversation between
  • 00:15:30
    Bots and people where the Bots have all
  • 00:15:33
    the power that's the end of democracy
  • 00:15:35
    and again
  • 00:15:36
    this should be a very simple regulation
  • 00:15:38
    it's not regulating the development of
  • 00:15:41
    AI capable of forming intimacy
  • 00:15:44
    it's just to have a very clear law
  • 00:15:47
    that it is illegal to counterfeit humans
  • 00:15:52
    for thousands of years we had laws that
  • 00:15:55
    it's illegal to counterfeit money if we
  • 00:15:57
    didn't have these laws the economic
  • 00:15:59
    system would have collapsed it's easy to
  • 00:16:01
    counterfeit money but if you do that you
  • 00:16:04
    go to jail for many many years
  • 00:16:07
    it should be the same with humans until
  • 00:16:09
    now there was no technology that could
  • 00:16:11
    do that so there were no such laws but
  • 00:16:13
    there should be a very clear and very
  • 00:16:17
    extreme law
  • 00:16:19
    that if you or your platform
  • 00:16:22
    counterfeits humans you go to jail for 20
  • 00:16:25
    years
  • 00:16:26
    now it's okay you know like we need AI
  • 00:16:29
    doctors it's fine as long as the AI
  • 00:16:32
    tells us it's an AI
  • 00:16:33
    and you interact with it it tells you
  • 00:16:36
    hello I'm an AI doctor do you want some
  • 00:16:38
    advice on your whatever that's fine but
  • 00:16:41
    impersonating a human being and then
  • 00:16:44
    using it in order to manipulate you that
  • 00:16:47
    should be outlawed you know it's
  • 00:16:49
    interesting because many people here
  • 00:16:52
    maybe saw the case of Bruce Willis when
  • 00:16:54
    there was a deep fake of Bruce Willis
  • 00:16:56
    participating in an ad campaign in in
  • 00:17:00
    Russia and he actually had to come out
  • 00:17:01
    and say hey guys that's not me that's a
  • 00:17:04
    computer generated image and this is
  • 00:17:06
    just a small example of many others that
  • 00:17:08
    could come our way whether it's with
  • 00:17:10
    real people that already exist being
  • 00:17:12
    duplicated right which could also be
  • 00:17:14
    incredibly dangerous if something is
  • 00:17:16
    posted on social media or just
  • 00:17:19
    completely uh anonymously generated
  • 00:17:22
    humans as well that are actually AI yes
  • 00:17:24
    you mentioned algorithms uh you
  • 00:17:27
    mentioned Bots and our first experience
  • 00:17:29
    with these words and with these tools
  • 00:17:31
    and phenomena I think was in social
  • 00:17:33
    media and you said that in our first
  • 00:17:36
    kind of battle with AI or contact with
  • 00:17:38
    the AI we lost yes with social media how
  • 00:17:41
    do you see the social media sphere now
  • 00:17:43
    because I see the world increasingly
  • 00:17:45
    polarized and there's it's so difficult
  • 00:17:49
    to find common ground and Common Sense
  • 00:17:51
    where people are constantly looking to
  • 00:17:54
    feel right and good about their ideas
  • 00:17:56
    instead of identifying facts where they
  • 00:17:59
    care more about opinions than actually
  • 00:18:01
    facts which is again is very worrying in
  • 00:18:04
    political uh elections and the sort and
  • 00:18:08
    of course we've got a big one coming in
  • 00:18:09
    the United States we saw what happened
  • 00:18:11
    the last time around so I know there's a
  • 00:18:13
    lot to unpack there but
  • 00:18:16
    as far as artificial intelligence and
  • 00:18:18
    the impact that it is having in our
  • 00:18:20
    behavior and how it can affect the
  • 00:18:22
    outcome of Elections
  • 00:18:23
    what do you see
  • 00:18:26
    so again if we look back at social media
  • 00:18:29
    so we know it's of course not the only
  • 00:18:31
    reason for polarization in the world but
  • 00:18:34
    it is a major reason and it's the first
  • 00:18:36
    time that decisions made by non-human
  • 00:18:40
    intelligence have changed human politics
  • 00:18:44
    now when you ask many of these companies
  • 00:18:48
    Facebook and YouTube and Twitter they
  • 00:18:50
    say it's not us
  • 00:18:51
    it's not us we are just a platform just
  • 00:18:55
    as you Can't Blame a radio the
  • 00:18:57
    technology of radio if Hitler gives a
  • 00:18:59
    speech on the radio and you cannot blame
  • 00:19:02
    the printing press if somebody uses the
  • 00:19:04
    printing press to print some fake news
  • 00:19:07
    propaganda you cannot blame us if if
  • 00:19:09
    humans go online and create some
  • 00:19:12
    conspiracy theory that radicalizes
  • 00:19:14
    people and polarizes Society
  • 00:19:18
    so either they are naive and don't know their own power or
  • 00:19:21
    they are lying to us they're lying to us
  • 00:19:23
    because because they've got data because
  • 00:19:26
    newspapers didn't have data radios
  • 00:19:28
    didn't have data it's not just the data
  • 00:19:30
    it's the decisions
  • 00:19:31
    that radio did not push you know radio
  • 00:19:35
    did not tell you what to listen to you
  • 00:19:38
    had to actively choose I want to listen
  • 00:19:41
    to this station I want to listen to that
  • 00:19:43
    station what happened in social media
  • 00:19:45
    over the last 10 years or so is that you
  • 00:19:48
    had recommendation algorithms that
  • 00:19:51
    actively push content
  • 00:19:53
    to people recommending read this article
  • 00:19:57
    watch this video and even more than that
  • 00:20:00
    if you watch YouTube and you do nothing
  • 00:20:03
    it keeps showing you one video after the
  • 00:20:05
    other it doesn't you don't need to do
  • 00:20:07
    anything active you just sit there
  • 00:20:09
    I don't know you start with I don't know
  • 00:20:10
    you want to know something about the 911
  • 00:20:12
    attacks
  • 00:20:15
    so it starts with I don't know a video
  • 00:20:18
    from CNN
  • 00:20:19
    and you do nothing within a couple of
  • 00:20:22
    iterations it would show you the most
  • 00:20:24
    outrageous conspiracy theories
  • 00:20:26
    and the algorithm chose to show you that
  • 00:20:31
    and the algorithm is making the choice
  • 00:20:33
    not randomly it has aims it was given an
  • 00:20:37
    aim by the platforms to increase user
  • 00:20:41
    engagement like they said okay today we
  • 00:20:44
    have 200 million people watching 40
  • 00:20:47
    minutes a day by next year it should be
  • 00:20:50
    300 million people watching for 50
  • 00:20:53
    minutes a day that was the aim of the
  • 00:20:55
    algorithm go out and get us more
  • 00:20:57
    eyeballs get us more minutes on our
  • 00:21:00
    platform and the algorithm went out and
  • 00:21:03
    discovered by trial and error by
  • 00:21:05
    experimenting on hundreds of millions of
  • 00:21:07
    people
  • 00:21:08
    that the easiest way to grab our
  • 00:21:11
    attention to keep us watching more and
  • 00:21:14
    more is to press the outrage button
  • 00:21:18
    if you have one video with this
  • 00:21:21
    outrageous conspiracy theory which
  • 00:21:23
    encourages hate and polarization and you
  • 00:21:26
    have another video which is much more
  • 00:21:28
    moderate and calm the algorithm would go
  • 00:21:31
    for the outrage because it increases
  • 00:21:33
    user engagement
  • 00:21:35
    and this was this very primitive AI
  • 00:21:39
    and the AI of the present generation
  • 00:21:42
    it can know you much much better than
  • 00:21:46
    the AI say of 2016
  • 00:21:49
    and it can also generate the content by
  • 00:21:52
    itself it doesn't need to wait for a
  • 00:21:54
    human to create this outrageous fake
  • 00:21:58
    news story it can create the fake news
  • 00:22:01
    story or video by itself
  • 00:22:04
    so again if we don't regulate that
  • 00:22:07
    um the chances of democracies surviving
  • 00:22:12
    are very very low because again the
  • 00:22:15
    dictatorships will survive dictatorships
  • 00:22:17
    they flourish on chaos and on mistrust
  • 00:22:21
    if you cause people to mistrust each
  • 00:22:23
    other
  • 00:22:24
    until they cannot agree on anything then
  • 00:22:28
    the only way to still have a society is
  • 00:22:30
    have a dictatorship
  • 00:22:32
    to have a democracy you need that people
  • 00:22:34
    trust each other and can have a
  • 00:22:37
    meaningful conversation
  • 00:22:40
    one thing that I think about a lot is
  • 00:22:42
    education
  • 00:22:44
    um the Francis Foundation is an
  • 00:22:47
    education platform
  • 00:22:49
    um there has been a particular structure
  • 00:22:52
    to education of young people throughout
  • 00:22:55
    their formative years and the way I look
  • 00:22:57
    at technology and especially with AI
  • 00:22:59
    coming and my little experience with it
  • 00:23:02
    is
  • 00:23:03
    what is this going to cause as far as
  • 00:23:06
    the structure of Education globally
  • 00:23:09
    where children won't really have to
  • 00:23:12
    study that much and read books that much
  • 00:23:15
    in order to be able to have access to
  • 00:23:17
    everything within a few seconds on their
  • 00:23:19
    tablet or on their phone
  • 00:23:21
    how how do you see the the future of
  • 00:23:24
    education and and the generations to
  • 00:23:26
    come and and the kind of information
  • 00:23:28
    that they will have
  • 00:23:31
    you know it's the biggest experiment in
  • 00:23:33
    history conducted on billions of people
  • 00:23:35
    we have no idea what the social
  • 00:23:38
    consequences the psychological
  • 00:23:40
    consequences will be
  • 00:23:42
    um what happens when you can get just
  • 00:23:45
    any answer to any question any any
  • 00:23:47
    question you have you don't need to do
  • 00:23:49
    anything you just ask your AI assistant
  • 00:23:52
    and you know even Google is terrified
  • 00:23:54
    because of that because in Google
  • 00:23:58
    um you still had to okay you you Googled
  • 00:24:00
    something it gave you a list of websites
  • 00:24:03
    you usually go for the first one
  • 00:24:05
    and that's a problem but okay you do it
  • 00:24:07
    but then you had to do something
  • 00:24:09
    yourself
  • 00:24:10
    you have to read it
  • 00:24:12
    um now it's a new generation of AI you
  • 00:24:15
    don't you just ask the question
  • 00:24:18
    and you get an answer
  • 00:24:19
    and
  • 00:24:21
    the big question is do you still have
  • 00:24:24
    some kind of ability for critical
  • 00:24:27
    thinking to doubt
  • 00:24:29
    the question you received
  • 00:24:31
    and to search for alternative questions
  • 00:24:35
    or for come up with your own questions
  • 00:24:37
    and sorry alternative answers and to
  • 00:24:40
    come up with your own answers
  • 00:24:42
    um and we don't know again you have this
  • 00:24:46
    kind of tendency to say yeah we had many
  • 00:24:48
    previous Revolutions in information
  • 00:24:49
    technology that people were afraid when
  • 00:24:52
    books came along that people will not be
  • 00:24:54
    able to remember anything because they
  • 00:24:55
    now everything is written but as I said
  • 00:24:58
    in the beginning AI is fundamentally
  • 00:25:00
    different from every previous
  • 00:25:02
    Information Technology it can make
  • 00:25:04
    decisions it can create content by
  • 00:25:07
    itself we were never faced with this kind
  • 00:25:11
    of Technology
  • 00:25:14
    um and so we have no idea what the
  • 00:25:18
    outcome would be in 10 or 20 years in
  • 00:25:20
    terms of Education or of anything else
  • 00:25:23
    I want to move away from AI specifically
  • 00:25:26
    for a little bit but continue on on
  • 00:25:28
    technology which continues to have such
  • 00:25:30
    an impact in our daily lives I think
  • 00:25:32
    most of us can be too far away from our
  • 00:25:34
    phones for Better or For Worse we can do
  • 00:25:36
    everything uh on there when it comes to
  • 00:25:39
    technology
  • 00:25:41
    um
  • 00:25:42
    it is affecting the way in which we
  • 00:25:45
    relate to each other
  • 00:25:47
    um you know sometimes I I talk to my
  • 00:25:50
    friends uh about the fact how in a way
  • 00:25:52
    we're regressing we're ever more
  • 00:25:55
    connected than we have been before but
  • 00:25:58
    we communicate in a more distant way
  • 00:26:01
    than perhaps we have in in previous
  • 00:26:03
    decades where you know we've kind of
  • 00:26:05
    gone back to the Egyptians with the
  • 00:26:07
    hieroglyphics with the Emojis you know
  • 00:26:09
    and this is the way we're communicating
  • 00:26:10
    with each other I know you've you've
  • 00:26:13
    spent so much time thinking and writing
  • 00:26:15
    about human relationships and the way in
  • 00:26:19
    which you see the homo sapiens
  • 00:26:23
    interacting with each other what what
  • 00:26:25
    would you say about the way that we are
  • 00:26:27
    dealing with one another these days and
  • 00:26:30
    how how it will evolve because physical
  • 00:26:33
    contact seems to be harder and harder to
  • 00:26:35
    get because more people are staying at
  • 00:26:37
    home they're not going to work as much a
  • 00:26:39
    lot of dating starts online and I know
  • 00:26:43
    that for example with your partner you
  • 00:26:45
    met your partner online right so how do
  • 00:26:48
    you see this this development of
  • 00:26:50
    personal relationships and connections
  • 00:26:52
    which we continue to need in order to be
  • 00:26:55
    sane I guess
  • 00:26:58
    yeah um so again there is positive
  • 00:27:01
    potential there is negative potential I
  • 00:27:03
    talked a lot about and I tend to talk a
  • 00:27:04
    lot a lot about the negative potential
  • 00:27:06
    because you know the people who develop
  • 00:27:08
    the technology
  • 00:27:09
    they talk a lot about all its good good
  • 00:27:12
    sides all its promises and there are of
  • 00:27:15
    course a lot of positive uh potential
  • 00:27:18
    benefits as you mentioned I met my
  • 00:27:20
    husband online more than 20 years ago in
  • 00:27:23
    one of the first gay dating sites that
  • 00:27:26
    were available in Israel and this was
  • 00:27:28
    and the technology was was very
  • 00:27:30
    important and beneficial because you
  • 00:27:33
    know throughout history
  • 00:27:35
    um you had minorities which were
  • 00:27:37
    concentrated in one place like if you're
  • 00:27:40
    Jewish then you're usually born to a
  • 00:27:42
    Jewish Family in a Jewish community so
  • 00:27:45
    you know lots of other Jews so even if
  • 00:27:47
    you are a small minority
  • 00:27:50
    um you have this kind of basic
  • 00:27:52
    connection with other people like you
  • 00:27:53
    but with gay people it's different
  • 00:27:55
    you're not born to a gay family in a gay
  • 00:27:58
    community
  • 00:28:00
    um and you know being gay growing up
  • 00:28:02
    in a small town in Israel which was very
  • 00:28:06
    homophobic Society at the time
  • 00:28:08
    just meeting other people
  • 00:28:10
    just meeting guys for dating was very
  • 00:28:14
    difficult and then the internet came
  • 00:28:16
    along
  • 00:28:17
    and this was a wonderful development
  • 00:28:19
    Because the Internet managed to connect
  • 00:28:22
    this kind of of you know uh uh diffused
  • 00:28:25
    minorities
  • 00:28:26
    that are not living together in the same
  • 00:28:29
    place suddenly it became much more easy
  • 00:28:30
    to find each other
  • 00:28:32
    so this was a great benefit of the
  • 00:28:36
    technology
  • 00:28:37
    and we shouldn't ignore the the the
  • 00:28:40
    these benefits but as you said uh humans
  • 00:28:46
    deep down we are social animals
  • 00:28:49
    and sociability ultimately is is about
  • 00:28:54
    the body it's not just about the mind
  • 00:28:57
    and you have this discussion for you
  • 00:29:00
    know for thousands of years about what
  • 00:29:03
    humans really are
  • 00:29:07
    an immaterial Soul or an immaterial mind
  • 00:29:11
    or are they embodied beings embodied
  • 00:29:15
    entities
  • 00:29:17
    and um this was a major philosophical
  • 00:29:19
    topic that you see say in ancient
  • 00:29:22
    Christianity this discussion that Jesus
  • 00:29:25
    and the first Christians influenced by
  • 00:29:28
    Jewish Traditions they believed very
  • 00:29:30
    firmly that humans are bodies
  • 00:29:34
    which is why Christ rises in the body
  • 00:29:36
    he's resurrected in the body and when
  • 00:29:39
    Christ initially talks about the Kingdom
  • 00:29:41
    of Heaven he means the Kingdom of Heaven
  • 00:29:44
    on Earth
  • 00:29:45
    he tells his followers that they'll be
  • 00:29:49
    this perfect kingdom here on Earth you
  • 00:29:52
    know with trees and stones and people
  • 00:29:54
    but over time under the influence
  • 00:29:57
    especially of platonic philosophy
  • 00:29:59
    Christianity drifted away
  • 00:30:02
    from this view of humans as embodied
  • 00:30:06
    and placed greater and greater emphasis
  • 00:30:09
    on the immaterial soul or mind
  • 00:30:12
    it imagined that the body is dirty the
  • 00:30:17
    body is animalistic the body there is
  • 00:30:19
    something wrong with it
  • 00:30:20
    and when you die you are not coming back
  • 00:30:24
    in the body your soul is liberated from
  • 00:30:28
    the material body and it goes not to a
  • 00:30:30
    kingdom on Earth but to Heaven which is
  • 00:30:34
    a completely immaterial Realm
  • 00:30:36
    so the Christian fantasy became to
  • 00:30:39
    completely disconnect from the body and
  • 00:30:42
    this remained a fantasy for thousands of
  • 00:30:44
    years now with the technology of the
  • 00:30:47
    21st century a lot of very ancient
  • 00:30:50
    philosophical and Theological debates
  • 00:30:52
    are becoming practical
  • 00:30:55
    so going with your example let's say we
  • 00:30:57
    you have some teenager or some person
  • 00:30:59
    whatever age sitting at home never
  • 00:31:02
    leaving home in front of a screen maybe
  • 00:31:05
    with some 3D glasses or something they
  • 00:31:09
    live their lives online
  • 00:31:11
    in a way they are realizing the platonic
  • 00:31:14
    ideal of disconnecting the soul or mind
  • 00:31:18
    from the body
  • 00:31:20
    so
  • 00:31:21
    what do we see there is it
  • 00:31:24
    um
  • 00:31:25
    human being trapped within a room losing
  • 00:31:29
    connection to the real world or is it a
  • 00:31:32
    human being liberated from the
  • 00:31:35
    restrictions of the biological body with
  • 00:31:38
    all its messiness and and then dirt and
  • 00:31:41
    whatever and liberating their spirit
  • 00:31:45
    to wander around the immaterial Heavenly
  • 00:31:49
    realm of cyberspace where there are no
  • 00:31:51
    bodies
  • 00:31:52
    now personally if you ask me
  • 00:31:55
    um I think the early Christians were
  • 00:31:57
    right that humans are embodied entities
  • 00:32:00
    if you try to disconnect the Mind from
  • 00:32:03
    the body or the soul from the body it
  • 00:32:05
    leads in a very dangerous Direction
  • 00:32:08
    that you you can't really do it
  • 00:32:11
    and it will be destructive
  • 00:32:12
    psychologically and socially
  • 00:32:15
    but that's just my opinion you have lots
  • 00:32:17
    of people today when you hear for
  • 00:32:19
    instance people talk about the metaverse
  • 00:32:21
    the metaverse is exactly that it's the
  • 00:32:25
    idea of creating an immaterial Realm
  • 00:32:29
    which is made entirely of data
  • 00:32:32
    there is no biology there there is no
  • 00:32:34
    physics as we know it there
  • 00:32:37
    I wanted to get your take on how you see
  • 00:32:40
    the future of of work
  • 00:32:44
    um I was I was seeing yesterday that
  • 00:32:47
    that BT British Telecom is laying off 55
  • 00:32:50
    000 people until 2030 because of the
  • 00:32:53
    introduction of AI and increased
  • 00:32:55
    technology that will not need as many
  • 00:32:59
    human beings to be sitting in seats as
  • 00:33:01
    they are today what's the most important
  • 00:33:04
    skill the most important resource that
  • 00:33:06
    you think we need to have as human
  • 00:33:08
    beings to flourish and to survive in the
  • 00:33:12
    coming years
  • 00:33:13
    the ability to keep changing
  • 00:33:16
    no single skill
  • 00:33:18
    will be safe
  • 00:33:20
    like previously people thought that okay
  • 00:33:22
    all these jobs will disappear but we'll
  • 00:33:24
    need lots of coders to write code all
  • 00:33:27
    the AI all the algorithms the new
  • 00:33:30
    generation of AI can code
  • 00:33:32
    so maybe you don't need any human coders
  • 00:33:34
    or very few human coders in in a couple
  • 00:33:36
    of years we don't know what are the most
  • 00:33:40
    the safe skills we do know that the job
  • 00:33:42
    market will keep changing
  • 00:33:44
    it's not that all jobs will disappear
  • 00:33:47
    totally
  • 00:33:48
    many jobs will disappear but many new
  • 00:33:51
    jobs will emerge which we can hardly
  • 00:33:53
    even imagine today
  • 00:33:55
    but people will need to retrain
  • 00:33:57
    themselves
  • 00:33:58
    and this is difficult not just
  • 00:33:59
    financially like you lose your old jobs
  • 00:34:01
    your old job you need a couple of weeks
  • 00:34:04
    months maybe years to retrain for a new
  • 00:34:07
    job who is going to support you during
  • 00:34:08
    this transition
  • 00:34:10
    the psychological problem is is even
  • 00:34:13
    bigger we are not built for this rapid
  • 00:34:17
    constant changes
  • 00:34:19
    think that you have to reinvent yourself
  • 00:34:22
    professionally even personally every
  • 00:34:24
    five years who can do that
  • 00:34:27
    so it's not that we need to learn a
  • 00:34:29
    specific skill we need to kind of
  • 00:34:32
    rebuild our entire mind to be much more
  • 00:34:37
    flexible
  • 00:34:38
    and this is something that individuals
  • 00:34:40
    cannot do by themselves again
  • 00:34:43
    governments societies will have to step
  • 00:34:45
    in and uh provide with I I think with
  • 00:34:50
    mental and psychological support
  • 00:34:52
    because the biggest difficulty will be
  • 00:34:55
    the the stress change is always
  • 00:34:57
    stressful and the level of stress in the
  • 00:35:00
    world just keeps Rising more and more
  • 00:35:02
    and we are I think we are close to the
  • 00:35:04
    point when we can just can't take it
  • 00:35:06
    anymore it's moving too fast
  • 00:35:09
    and going back to what we discussed at
  • 00:35:11
    the very beginning that we are organic
  • 00:35:13
    beings
  • 00:35:15
    we need to sleep we need to rest we need
  • 00:35:18
    to take time off it's difficult for us
  • 00:35:20
    to change and we now live in a world
  • 00:35:23
    increasingly dominated by inorganic
  • 00:35:26
    entities that never sleep that don't
  • 00:35:28
    need to rest that don't go on
  • 00:35:30
    vacations and that change at a pace
  • 00:35:34
    which is just impossible for organic
  • 00:35:37
    entities
  • 00:35:38
    so if we don't find a way to basically
  • 00:35:42
    give us a break
  • 00:35:44
    to slow things down a little so that our
  • 00:35:48
    organic bodies can keep Pace we will
  • 00:35:52
    break down and psychologically
  • 00:35:55
    we're getting close to our quick fire
  • 00:35:57
    questions that we ask all our guests
  • 00:35:59
    before we do and also in a kind of short
  • 00:36:03
    answer ish I'm not good in that short
  • 00:36:07
    answer is I don't think that's a word
  • 00:36:08
    but I just made it up
  • 00:36:11
    um
  • 00:36:12
    what
  • 00:36:14
    what gets you the most excited right now
  • 00:36:16
    most excited optimistically and what
  • 00:36:19
    drives you absolutely insane about the
  • 00:36:23
    world today so I guess a sentence maybe
  • 00:36:25
    so what drives me insane is the need for
  • 00:36:28
    excitement okay I think that constant
  • 00:36:32
    stimulation right yeah I think you know
  • 00:36:34
    especially in the states I don't know
  • 00:36:35
    how it is here in Portugal but when you
  • 00:36:37
    go to the States now everything is
  • 00:36:39
    exciting like you meet somebody I'm so
  • 00:36:41
    excited to meet you and you have a new
  • 00:36:43
    idea oh this idea is like your business
  • 00:36:45
    plan is so exciting the word has
  • 00:36:47
    totally lost its meaning yeah and people
  • 00:36:49
    forgot that it's not good to be excited
  • 00:36:51
    all the time
  • 00:36:53
    it's just not good for you it's
  • 00:36:54
    exhausting too right it's exhausting so
  • 00:36:57
    I think people should say I'm so relaxed
  • 00:36:59
    to meet you
  • 00:37:02
    okay what gets you excited then without
  • 00:37:05
    getting exhausted
  • 00:37:07
    um what are you positive most positive
  • 00:37:10
    about right now today yeah in in the
  • 00:37:12
    world besides the Portuguese dinner
  • 00:37:13
    you're going to have
  • 00:37:16
    um
  • 00:37:16
    well I think that humans have an still
  • 00:37:19
    an amazing ability to change we don't
  • 00:37:23
    really know ourselves we have immense
  • 00:37:26
    potential which we are not even aware of
  • 00:37:28
    one of the worst things about the AI
  • 00:37:30
    Revolution is that it could cause us to
  • 00:37:33
    lose much of our potential without even
  • 00:37:36
    realizing that we've lost it
  • 00:37:40
    um that you know I said that AI is
  • 00:37:43
    nowhere near its full potential that
  • 00:37:44
    it's just a baby I think humans are in
  • 00:37:47
    the same situation we are nowhere near
  • 00:37:50
    our full potential if for
  • 00:37:53
    every Euro and every minute that we
  • 00:37:56
    invest in developing artificial
  • 00:37:58
    intelligence
  • 00:37:59
    we invest another Euro and another
  • 00:38:02
    minute in developing ourselves
  • 00:38:05
    our minds our Consciousness then we
  • 00:38:07
    will be okay
  • 00:38:09
    but of course it's the the incentive
  • 00:38:12
    structure
  • 00:38:13
    of society of the economy it pushes Us
  • 00:38:16
    in the wrong direction
  • 00:38:19
    and um you know like with the algorithms
  • 00:38:21
    in social media we need to change the
  • 00:38:23
    incentive structure yeah okay quick fire
  • 00:38:26
    time yes I feel like I should get closer
  • 00:38:29
    but I'm
  • 00:38:30
    um what is one personality trait that a
  • 00:38:33
    good leader could really benefit from
  • 00:38:35
    having and why so in one sentence please
  • 00:38:41
    inauthenticity not being authentic
  • 00:38:45
    um the worst thing what we now have as
  • 00:38:47
    are all these authentic leaders who
  • 00:38:50
    think that politics is something like
  • 00:38:52
    Psychotherapy that we should just say
  • 00:38:55
    the first things that comes to your mind
  • 00:38:57
    no politics is not therapy you want to
  • 00:39:01
    say the first thing that comes to your
  • 00:39:03
    mind you want to let your id loose go to
  • 00:39:06
    therapy
  • 00:39:08
    when you go to politics you build a wall
  • 00:39:11
    not on the border of the country you
  • 00:39:13
    build a wall between your mind and your
  • 00:39:16
    mouth
  • 00:39:17
    and you're very careful about which
  • 00:39:20
    immigrants are passing through the wall
  • 00:39:24
    maybe you're referring to someone in
  • 00:39:26
    particular what is the biggest challenge
  • 00:39:29
    humanity is facing now
  • 00:39:32
    itself its inability to cooperate again
  • 00:39:35
    I would say AI I would say uh uh the the
  • 00:39:38
    climate change if we can cooperate we
  • 00:39:42
    can deal with AI and climate change if
  • 00:39:44
    we don't cooperate it's it's hopeless
  • 00:39:46
    you know just a quick aside in Brackets
  • 00:39:49
    sometimes I think about the movie
  • 00:39:51
    arrival when the alien species comes
  • 00:39:54
    down and it's the only time where we
  • 00:39:56
    actually collaborate and talk the same
  • 00:39:58
    language to deal with this one issue it
  • 00:40:01
    doesn't work I mean you're probably not
  • 00:40:03
    going to take that I think but it's yeah
  • 00:40:06
    again you don't have to react to that I
  • 00:40:07
    just I just think of that moment as a
  • 00:40:10
    like alien species coming down
  • 00:40:12
    we just encountered an alien
  • 00:40:15
    intelligence here on Earth it didn't
  • 00:40:17
    come from outer space it came from
  • 00:40:19
    Silicon Valley yeah and we are not
  • 00:40:22
    uniting in the face of it we are just
  • 00:40:24
    bickering and fighting even more yeah
  • 00:40:26
    okay last two questions if you could
  • 00:40:28
    change one thing in the world by Magic
  • 00:40:30
    today what would it be hmm with your
  • 00:40:33
    Harry Potter
  • 00:40:39
    change one thing in the world
  • 00:40:41
    hmm
  • 00:40:43
    ability for us to get together maybe or
  • 00:40:45
    get along I mean I already said that
  • 00:40:47
    that this is like my my I would say that
  • 00:40:49
    I would change the
  • 00:40:52
    um
  • 00:40:52
    um
  • 00:40:54
    human understanding of of themselves
  • 00:40:57
    to
  • 00:40:59
    Place more emphasis
  • 00:41:02
    on our shared biological reality
  • 00:41:07
    and less on all the kind of fictional
  • 00:41:10
    stories that we create with our minds
  • 00:41:13
    because in our minds all of us
  • 00:41:15
    individuals and also groups of people
  • 00:41:18
    every nation creates a story that we are
  • 00:41:20
    different we are special we are better
  • 00:41:23
    than all of them so we can't cooperate
  • 00:41:25
    with them and we should focus on
  • 00:41:27
    ourselves
  • 00:41:28
    and
  • 00:41:29
    um if we could just let go of these
  • 00:41:31
    stories for a little while and again go
  • 00:41:34
    back to the level of the body
  • 00:41:35
    we would realize that we are all the
  • 00:41:38
    same you know Sting I think said it
  • 00:41:40
    best in Russians in this uh song from
  • 00:41:44
    the 1980s that we share the same biology
  • 00:41:47
    regardless of ideology
  • 00:41:50
    it was true in the 1980s it was kind of
  • 00:41:52
    his vision for how to overcome the Cold
  • 00:41:56
    War and prevent nuclear disaster and
  • 00:41:58
    it's still true today we all of us share
  • 00:42:00
    the same biology irrespective of our
  • 00:42:03
    ideology and religion and whatever
  • 00:42:04
    didn't he also say the Russians have
  • 00:42:06
    children too or something like that he
  • 00:42:09
    said it I I hope the Russians love their
  • 00:42:11
    children too right which of course the
  • 00:42:13
    answer is yes obviously again that's
  • 00:42:15
    everybody loves their children and if we
  • 00:42:18
    remember that yeah and let go a little
  • 00:42:21
    of our ideological and religious
  • 00:42:24
    fantasies that That's the basis to to
  • 00:42:26
    create a better world for everybody okay
  • 00:42:28
    that that was an outrageously long
  • 00:42:30
    answer I know that was a lot longer than
  • 00:42:33
    one sentence so one sentence for this
  • 00:42:34
    one otherwise uh they'll kill me
  • 00:42:38
    um what's the most important learning of
  • 00:42:41
    your life
  • 00:42:42
    to date
  • 00:42:43
    the one thing
  • 00:42:46
    um meditation
  • 00:42:47
    to observe myself
  • 00:42:50
    again to observe how my mind constantly
  • 00:42:54
    creates fictional stories
  • 00:42:56
    and be able to let go of them for a
  • 00:42:59
    little while and just be in touch with
  • 00:43:02
    the reality as it is
  • 00:43:03
    that was one sentence thank you so much
  • 00:43:05
    Yuval Noah Harari an absolute
  • 00:43:08
    pleasure a delight
  • 00:43:10
    to have you with us here on the first
  • 00:43:13
    ever Live recorded show of it's not that
  • 00:43:17
    simple thank you
  • 00:43:19
    [Applause]
  • 00:43:21
    [Music]
  • 00:43:27
    [Music]
Tags
  • Yuval Noah Harari
  • Artificial Intelligence
  • Humanity
  • Technology
  • Social challenges
  • Education
  • Cooperation
  • Social division
  • Future of education
  • Technological change