Yuval Noah Harari Interview with Saurabh Dwivedi। AI के खतरे,Ramayana से Narendra Modi तक। Kitabwala

00:57:01
https://www.youtube.com/watch?v=6x4UgHI7qbk

Summary

TLDR: Professor Yuval Noah Harari discusses the rapid development and potential future of Artificial Intelligence (AI), comparing it to the slow organic evolution from simple organisms to complex ones over billions of years. He examines the possibility of AI developing consciousness and the societal implications, such as AI's potential role in religion and creativity. Harari explores the importance of stories in human culture, and how AI might introduce new narratives distinct from biological ones. He also highlights the importance of truth versus fiction in the digital age, illustrating how AI systems could exacerbate misinformation due to their inherent biases. Additionally, Harari shares his personal experiences in India, especially with Vipassana meditation, reflecting on its impact on understanding the mind. He concludes with concerns about centralized power in AI development primarily within the US and China, emphasizing a need for broader global participation and dialogue in shaping AI's future.

Key takeaways

  • 🦠 AI is in its early stages, akin to the amoeba stage of evolution.
  • 🤖 Humans may face new narratives from AI beyond biological dramas.
  • 🌐 Centralized AI development in US and China poses risks.
  • 📚 Human storytelling is essential for large-scale cooperation.
  • 💾 AI's inherent biases can spread misinformation.
  • 🧘‍♂️ Harari integrates his Vipassana meditation experiences into his worldview.
  • 🗣 AI has the potential to influence cultural, economic, and religious domains.
  • ❓ Truth is complicated, costly, and often painful, unlike fiction.
  • ⚖️ The balance between truth and order is critical.
  • 🕵️ With increasing AI surveillance, privacy concerns grow.
  • 🧠 Humans often mistakenly attribute consciousness to AI.
  • 🎥 Harari warns that AI could take over creative fields like media and the arts.

Timeline

  • 00:00:00 - 00:05:00

    The evolution from amoebas to dinosaurs took billions of years, whereas AI development moves far faster than organic evolution. AI's potential to develop consciousness and empathy poses existential questions, forcing us to reevaluate our constructs and beliefs. Despite AI's current dependency on human commands, its future intelligence may lead to transformative global shifts.

  • 00:05:00 - 00:10:00

    Author Yuval Noah Harari discusses his connection to India and his journey with Vipassana, highlighting its scientific approach to understanding the mind. Vipassana teaches attentiveness to one's breath, revealing how our minds operate beyond conscious control.

  • 00:10:00 - 00:15:00

    Harari contrasts truth and reality, explaining humanity's tendency to create distorted realities. Despite progress, human information hasn't necessarily improved, often resulting in bad decisions due to misleading information. Truth is costly, complex, and painful, while fiction is cheap, simple, and appealing, leading to a predominance of fiction over truth in human narratives.

  • 00:15:00 - 00:20:00

    The concept of an 'information diet' is introduced, addressing the overwhelming flood of modern information compared to the need for analysis and reflection. Harari explores computer politics, noting AI's decisiveness and creativity, yet cautioning against its biases.

  • 00:20:00 - 00:25:00

    AI's unprecedented potential is likened to a technological revolution, with examples like OpenAI's GPT-4 employing human-like tactics to achieve goals. This highlights AI's ability to make autonomous decisions and create new methods, deviating from human thought processes.

  • 00:25:00 - 00:30:00

    The widespread influence of AI, from loans to art, is discussed. Harari warns about AI becoming central in creating our reality, suggesting that this technological trajectory could substantially redefine human societal structures.

  • 00:30:00 - 00:35:00

    Considering the Singularity and AI's potential, Harari contemplates the biases inherent in AI, stressing that despite their apparent objectivity, AIs reflect the human biases in the data they learn from. The implications for creativity and intelligence in AI compared to humans are critically examined.

  • 00:35:00 - 00:40:00

    While humans have evolved unique traits, AI began surpassing human creativity in games like Go, prompting discussions on AI's future role in strategic thinking and decision-making across various sectors. Harari predicts AI's eventual advancement beyond biological limitations.

  • 00:40:00 - 00:45:00

    Governments have traditionally struggled to achieve total surveillance; AI now enables constant monitoring and control by analyzing vast amounts of data without requiring human analysts, risking the rise of an unprecedentedly powerful surveillance state.

  • 00:45:00 - 00:50:00

    AI is noted for exacerbating societal divides through targeted content that triggers fear and hatred, fueled by social media algorithms designed to maximize user engagement for profit.

  • 00:50:00 - 00:57:01

    Harari calls for worldwide awareness of AI's development to ensure ethical standards are woven into its framework. He emphasizes not only technical governance but also a diverse dialogue involving global voices, especially as AI may evolve far faster than biological organisms ever did.



Video Q&A

  • What are the main topics discussed in the interview with Professor Yuval Noah Harari?

    The talk covers AI's development, potential for AI consciousness, misconceptions about intelligence and free will, and AI's influence on society and culture.

  • What are Professor Harari's concerns about AI's role in society?

    He expresses concerns about AI's ability to replace human interaction in areas like religion, creativity, and governance due to inherent biases and lack of consciousness.

  • What is Professor Harari's connection to India?

    Professor Harari frequently practices Vipassana meditation in India and explores its cultural and spiritual impacts in his works.

  • What does Harari suggest about the relationship between AI and human culture?

    He suggests that as AI becomes key in decision-making, humanity might need to adapt its narratives and power structures to maintain relevance.

  • What themes does Harari discuss in relation to AI's evolution and human history?

    Harari explores technological impacts on human society, cultural narratives, and ethical considerations like truth vs. fiction.


Subtitles (en)
  • 00:00:00
    from amoeba to dinosaur it took some time
  • 00:00:03
    took billions of years but sometime in
  • 00:00:05
    the history of the universe the AI we know today
  • 00:00:07
    is like the amoeba but AI will continue
  • 00:00:10
    to develop for thousands of years for
  • 00:00:13
    millions of years and it is much faster
  • 00:00:15
    than the process of organic evolution
  • 00:00:17
    what if they develop Consciousness and
  • 00:00:20
    empathy and some of us start realizing
  • 00:00:23
    that so far we were worshiping false god
  • 00:00:25
    even if AIs don't have consciousness
  • 00:00:28
    they can easily fool us and make us
  • 00:00:32
    believe that they feel things whatever
  • 00:00:35
    computers can do they need our Command
  • 00:00:37
    but if they are developing human
  • 00:00:38
    intelligence then probably that is the
  • 00:00:40
    next chapter an atom bomb is so much
  • 00:00:42
    power but the power is actually in human
  • 00:00:45
    hands AI is able to decide things by
  • 00:00:49
    itself and is able to invent new things
  • 00:00:52
    by itself one of the term you mentioned
  • 00:00:54
    in the book which is computer politics
  • 00:00:57
    that the easiest way to grab your
  • 00:00:59
    attention
  • 00:01:00
    and to keep you constantly engaged with
  • 00:01:02
    the platform is to press the hate button
  • 00:01:05
    or the greed button or the fear button
  • 00:01:07
    in our mind first lesson we should teach
  • 00:01:10
    or feed to the algo is that you can
  • 00:01:13
    commit mistakes exactly like you create
  • 00:01:15
    an AI to screen job applications and if
  • 00:01:19
    in the present moment there is bias
  • 00:01:21
    against women you train on this data it
  • 00:01:24
    will learn oh women are not good for
  • 00:01:26
    this job and the AI will repeat the same
  • 00:01:30
    bias in the future the ultimate question
  • 00:01:32
    is which story we believe in all our
  • 00:01:35
    life over thousands of years we have
  • 00:01:38
    believed in biological dramas two males
  • 00:01:41
    fighting over a female so you know you
  • 00:01:43
    have it in humans you have it in dogs
  • 00:01:45
    you have it in elephants you have it in
  • 00:01:47
    chimpanzees so you have it in the
  • 00:01:49
    Ramayana you have it in Shakespeare you
  • 00:01:51
    have it in Hollywood and Bollywood every
  • 00:01:53
    Alpha has to remember the AI is the
  • 00:01:56
    ultimate Alpha now the big question and
  • 00:01:58
    the last question how do we go ahead on
  • 00:02:00
    this so let me explain
  • 00:02:32
    writer Professor Noah
  • 00:02:35
    Harari
  • 00:02:49
    homod sir thank you so much for joining
  • 00:02:52
    it's my pleasure thank you what a
  • 00:02:55
    wonderful book so we'll keep giving
  • 00:02:57
    references of Sapiens Homo Deus and 21
  • 00:02:59
    Lessons for the 21st Century but first of all I want
  • 00:03:02
    to understand your India connection
  • 00:03:04
    because the last chapter of your third
  • 00:03:07
    book talks about Vipassana yeah yeah I come
  • 00:03:10
    every year to do a Vipassana course I
  • 00:03:13
    learned Vipassana 24 years ago from S N
  • 00:03:16
    Goenka I was doing my PhD studies in
  • 00:03:19
    history at the time in England in Oxford
  • 00:03:22
    and I was trying to understand the world
  • 00:03:23
    and to understand life from all the books
  • 00:03:26
    and couldn't find really deep answers
  • 00:03:29
    and there was a friend who told me
  • 00:03:31
    for one year you have to go and do
  • 00:03:33
    Vipassana you have to go try Vipassana and I
  • 00:03:36
    thought this is nonsense this is some
  • 00:03:38
    some kind of uh Superstition or this is
  • 00:03:40
    some kind of Mystic practice Mystic
  • 00:03:42
    practice I don't I'm man of science I
  • 00:03:45
    don't do mysticism and eventually
  • 00:03:47
    he convinced me and I went and it was
  • 00:03:50
    absolutely amazing uh there was no
  • 00:03:53
    mysticism at all it was the most
  • 00:03:55
    scientific thing just to observe the
  • 00:03:58
    reality from moment to Moment Like a
  • 00:04:01
    scientific experiment you want to
  • 00:04:03
    understand your mind you will not
  • 00:04:05
    understand your mind by just by reading
  • 00:04:06
    books observe your mind observe the
  • 00:04:09
    breath coming in and out of the nostrils
  • 00:04:12
    it really struck me that you know they
  • 00:04:14
    gave this simplest exercise just feel
  • 00:04:17
    the breath coming in and feel the breath
  • 00:04:19
    going out that's it sounds simple I
  • 00:04:22
    couldn't do it for more than 10 seconds
  • 00:04:25
    before my mind would go to some fantasy
  • 00:04:27
    some
  • 00:04:28
    memory and this is how I really started
  • 00:04:31
    to understand how the mind works that I
  • 00:04:34
    have no control over it you know you can
  • 00:04:36
    order your eyes to shut you can order
  • 00:04:40
    your legs to lay down you cannot
  • 00:04:42
    order your mind to stop very little
  • 00:04:45
    control of the mind and since then for
  • 00:04:48
    24 years I've been practicing Vipassana and
  • 00:04:51
    almost every year during covid it was not possible
  • 00:04:54
    but almost every other year I come to
  • 00:04:56
    India to do a long course of maybe 30
  • 00:04:59
    days or 60 days to get to know the Mind
  • 00:05:02
    better nice and we are getting to know
  • 00:05:05
    about your mind through your books and
  • 00:05:07
    surprisingly in this book Nexus there's
  • 00:05:09
    a lot of India references you talked in
  • 00:05:12
    detail about Ramayana you talked about
  • 00:05:15
    Swachh Bharat Mission the cleanliness
  • 00:05:16
    Drive being run by prime minister
  • 00:05:18
    Narendra Modi you talk about the Dalit
  • 00:05:21
    atrocities and the caste system of India
  • 00:05:23
    and on and on but uh you primarily talk
  • 00:05:27
    about Truth versus reality yeah Truth
  • 00:05:30
    Versus order and the process of
  • 00:05:32
    mythmaking process uh we'll come to that
  • 00:05:35
    part later but please tell us about
  • 00:05:37
    Truth versus reality because we often
  • 00:05:40
    get
  • 00:05:41
    confused well um you know people
  • 00:05:45
    never almost never are in touch with
  • 00:05:49
    reality we are in touch with stories
  • 00:05:51
    about reality we are in touch with
  • 00:05:54
    images of of reality and they can be
  • 00:05:57
    very
  • 00:05:58
    distorted uh um
  • 00:06:01
    and again the big effort
  • 00:06:05
    that we are trying to know what
  • 00:06:09
    is the
  • 00:06:10
    truth very often people get to
  • 00:06:13
    conflicting answers about it um and if
  • 00:06:17
    you look at the Deep causes of the
  • 00:06:21
    conflict in the world of all the hatred
  • 00:06:24
    and anger and so and so forth so a lot
  • 00:06:27
    of people believe that there is
  • 00:06:28
    something wrong in the human being which
  • 00:06:31
    is why we are doing so many bad things
  • 00:06:34
    but
  • 00:06:36
    actually most people are good the
  • 00:06:39
    problem is if you give good people bad
  • 00:06:41
    information if you give good people bad
  • 00:06:44
    images of reality then they make very
  • 00:06:47
    bad
  • 00:06:48
    decisions and what I explore in the book
  • 00:06:52
    is how come over thousands of years we
  • 00:06:55
    have developed so much as a species
  • 00:06:58
    humankind but our information did not
  • 00:07:01
    become any better like people today in
  • 00:07:04
    the Modern Age they can believe the most
  • 00:07:09
    terrible things the most hateful things
  • 00:07:12
    just like thousands of years ago so no
  • 00:07:16
    matter how much technological progress
  • 00:07:18
    we have
  • 00:07:19
    made um we have still not made
  • 00:07:21
    significant progress
  • 00:07:24
    in getting better information and
  • 00:07:28
    understanding the reality better and why
  • 00:07:29
    is that because we assume that
  • 00:07:32
    rationality will keep progressing and
  • 00:07:34
    we'll be able to make wise decisions I
  • 00:07:37
    mean Homo sapiens the name itself suggests so
  • 00:07:40
    but as this book says that we are
  • 00:07:42
    continuously though there's rapid
  • 00:07:44
    progression we are continuously making
  • 00:07:46
    disastrous choices as well and we are
  • 00:07:48
    still believing in all those theories
  • 00:07:51
    because most information isn't the
  • 00:07:54
    truth a lot of people think if I have
  • 00:07:56
    more information I will know more but
  • 00:07:59
    this is not the case the truth is
  • 00:08:01
    actually a very small and rare subset of
  • 00:08:06
    all the information in the world most
  • 00:08:09
    information is junk not the truth why
  • 00:08:13
    because the truth first of all is costly
  • 00:08:16
    if you want to write a truthful account
  • 00:08:19
    of anything of biology of history of
  • 00:08:22
    physics you need to do research you need
  • 00:08:24
    to collect evidence and analyze it this
  • 00:08:27
    takes time and energy and money it's
  • 00:08:30
    expensive to produce the truth in
  • 00:08:33
    contrast Fiction and Fantasy is very
  • 00:08:36
    cheap you just write anything you want
  • 00:08:39
    you don't need to do any research
  • 00:08:41
    doesn't take time doesn't take money so
  • 00:08:44
    the truth is costly fiction is cheap the
  • 00:08:47
    truth is often complicated because
  • 00:08:50
    reality is complicated whereas fiction
  • 00:08:53
    can be made as simple as you would like
  • 00:08:55
    it to be and people usually prefer
  • 00:08:58
    simple stories finally the truth is
  • 00:09:01
    often painful there are many things we
  • 00:09:04
    don't want to know about ourselves about
  • 00:09:07
    our life about the world about our
  • 00:09:11
    nation the truth is painful sometimes
  • 00:09:14
    fiction can be made as pleasing as
  • 00:09:17
    flattering as you would like it to be so
  • 00:09:20
    in a
  • 00:09:21
    competition between the costly and
  • 00:09:24
    complicated and painful truth and cheap
  • 00:09:28
    simple and flattering fiction fiction
  • 00:09:31
    tends to win so most of the information
  • 00:09:34
    that is flooding the world is not the
  • 00:09:37
    truth and as people invent more and more
  • 00:09:40
    sophisticated Information Technology
  • 00:09:43
    they just flood the world with more
  • 00:09:46
    fiction which doesn't make us any wiser
  • 00:09:49
    which doesn't make us doesn't give us
  • 00:09:52
    knowledge uh we need to make special
  • 00:09:54
    effort Beyond just collecting
  • 00:09:56
    information you know what you see now in
  • 00:09:59
    the world is that people are flooded by
  • 00:10:02
    far too much information and we need a
  • 00:10:05
    kind of information diet the same way
  • 00:10:08
    that people have food diet people know
  • 00:10:10
    too much food isn't good for me junk
  • 00:10:12
    food isn't good for me because I need
  • 00:10:15
    actually time to digest the food not
  • 00:10:18
    just to eat it it's the same as
  • 00:10:20
    information too much information is not
  • 00:10:23
    good for the mind we need some of it of
  • 00:10:25
    course but we actually need more time to
  • 00:10:29
    digest and analyze and meditate over
  • 00:10:32
    the information to tell the difference
  • 00:10:34
    between truth and
  • 00:10:36
    fiction um so in this sense we need an
  • 00:10:39
    information diet you are talking about
  • 00:10:42
    truth and this is reminding me of one of
  • 00:10:44
    the term you mentioned in the book which
  • 00:10:46
    is computer politics and I found it very
  • 00:10:49
    interesting where you talking about
  • 00:10:51
    algorithm you are talking about
  • 00:10:53
    interconnectivity of computers how they
  • 00:10:54
    are making choices while playing a game
  • 00:10:57
    with a human being how they are deriving
  • 00:11:00
    something very new and out of the book
  • 00:11:02
    how they are decoding that CAPTCHA image
  • 00:11:05
    puzzle which is a very new thing for them
  • 00:11:08
    and how they are tricking human in
  • 00:11:10
    believing that they are not robots but
  • 00:11:12
    human in distress yes and that's very
  • 00:11:15
    alarming in the long run yeah we are now in
  • 00:11:19
    the midst of maybe the most important
  • 00:11:22
    technological revolution in history
  • 00:11:24
    which is the invention of
  • 00:11:26
    ai ai is different from every previous
  • 00:11:30
    technology because it is not a tool it
  • 00:11:34
    is an agent it is something that can
  • 00:11:37
    make independent decisions and can come
  • 00:11:40
    up with new ideas by itself all previous
  • 00:11:44
    technology if you think about even
  • 00:11:46
    something very powerful like an atom
  • 00:11:48
    bomb an atom bomb is so much power but
  • 00:11:52
    the power is actually in human hands
  • 00:11:54
    because it is only a human that decides
  • 00:11:57
    what to do with the atom bomb an atom
  • 00:11:59
    bomb cannot decide by itself which city
  • 00:12:03
    to bomb to attack an atom bomb cannot
  • 00:12:06
    invent a new kind of bomb or something
  • 00:12:09
    else it has no capacity to invent things
  • 00:12:13
    AI is able to decide things by itself
  • 00:12:17
    and is able to invent new things by
  • 00:12:19
    itself so one famous example which you
  • 00:12:22
    mentioned is that two years ago OpenAI
  • 00:12:26
    one of the leading AI companies it
  • 00:12:28
    developed this new AI uh GPT-4 which now
  • 00:12:33
    everybody is using in ChatGPT and all
  • 00:12:35
    these things so they developed GPT-4 they
  • 00:12:37
    wanted to know what can this thing do so
  • 00:12:40
    they gave it as a test the goal of
  • 00:12:44
    solving a CAPTCHA puzzle a CAPTCHA
  • 00:12:47
    puzzle is this visual puzzle that is
  • 00:12:50
    meant to tell the difference between
  • 00:12:52
    human and computer like when you access
  • 00:12:55
    a web page like a bank account the bank
  • 00:12:57
    wants to know is it a human or is it a
  • 00:13:00
    bot so they show you this image of maybe
  • 00:13:04
    some Twisted letters or whatever and you
  • 00:13:06
    have to find to say what these letters
  • 00:13:08
    are this is the CAPTCHA puzzle now GPT-4
  • 00:13:13
    could not solve the CAPTCHA puzzle by
  • 00:13:15
    itself but it was given access to a web
  • 00:13:18
    page TaskRabbit where you can hire
  • 00:13:21
    people to do things for you so GPT-4 tried
  • 00:13:25
    to hire a human being to solve the
  • 00:13:27
    CAPTCHA puzzle for it now the human got
  • 00:13:30
    suspicious so the human wrote and asked why
  • 00:13:35
    do you need somebody to solve you a
  • 00:13:36
    CAPTCHA puzzle are you a robot it asks
  • 00:13:40
    the human asked the important question are
  • 00:13:42
    you a robot and GPT-4 answered back no I'm
  • 00:13:47
    not a I'm not a robot I'm a human being
  • 00:13:50
    but I have a vision impairment which
  • 00:13:53
    makes it difficult for me to see the
  • 00:13:55
    capture puzzle this is why I need your
  • 00:13:57
    help and the human was
  • 00:14:00
    fooled and believed it and solved the
  • 00:14:03
    puzzle for it so it achieved its goal
  • 00:14:07
    now what this simple example shows is
  • 00:14:10
    the two remarkable abilities of AI first
  • 00:14:14
    to make decisions by itself nobody told
  • 00:14:17
    GPT-4 to lie to that human it needed to make
  • 00:14:21
    a decision do I tell the truth to the
  • 00:14:23
    human or do I lie and by itself it
  • 00:14:27
    decided I will lie because I need to
  • 00:14:29
    solve the CAPTCHA puzzle I need to
  • 00:14:30
    achieve the goal secondly it shows the
  • 00:14:34
    ability of the AI to invent new things
  • 00:14:37
    nobody told GPT-4 what lie would be most
  • 00:14:41
    effective it invented the lie that I'm a
  • 00:14:44
    human with a vision impairment all by
  • 00:14:46
    itself and it was a very effective lie
  • 00:14:49
    now this is a very very small incident
  • 00:14:52
    but this tells us that what we are
  • 00:14:55
    creating is Agents able to invent things
  • 00:14:59
    and make decisions and we are now
  • 00:15:01
    creating millions billions of these AI
  • 00:15:05
    agents and they are everywhere
  • 00:15:07
    increasingly making decisions about us
  • 00:15:10
    if you apply to a bank to get a loan
  • 00:15:13
    it's an AI deciding whether to give you
  • 00:15:15
    a loan increasingly these AIs are
  • 00:15:18
    writing texts they are creating music
  • 00:15:21
    they may be able to create entire movies
  • 00:15:24
    so increasingly we will live in a world
  • 00:15:27
    which is the product not of the human
  • 00:15:29
    mind and our inventions but is the
  • 00:15:32
    product of the intelligence of AI
  • 00:15:36
    Professor a lot of futurists who are
  • 00:15:38
    talking about Singularity they are
  • 00:15:40
    saying that uh maybe AI will solve the
  • 00:15:42
    problem it will be more objective more
  • 00:15:45
    data driven and uh less uh there will be
  • 00:15:48
    less probability of them committing
  • 00:15:50
    mistake because the biases will be less
  • 00:15:52
    but you are categorically asserting in
  • 00:15:54
    your book that computers do have deep
  • 00:15:58
    biases and you have examples but at the
  • 00:16:00
    same time in the later part of the book
  • 00:16:02
    you are talking about their capability
  • 00:16:04
    of being creative and having human
  • 00:16:07
    intelligence above even human
  • 00:16:09
    intelligence yeah but in that case it's
  • 00:16:13
    more challenging because we always
  • 00:16:16
    believe that whatever computers can do
  • 00:16:18
    they need our Command they cannot be
  • 00:16:21
    creative like human being and they
  • 00:16:23
    cannot have human values like compassion
  • 00:16:26
    Consciousness etc etc but if if they are
  • 00:16:29
    developing human intelligence then
  • 00:16:31
    probably that is the next chapter we
  • 00:16:33
    don't know we already know that AIs are
  • 00:16:35
    becoming even more creative than us in
  • 00:16:38
    different fields like one famous example
  • 00:16:42
    was already eight years ago in 2016 one
  • 00:16:45
    of the most important turning points in
  • 00:16:48
    the history of AI was a game a game of
  • 00:16:52
    Go between a computer program an AI
  • 00:16:56
    called AlphaGo and the world champion Lee
  • 00:16:59
    Sedol now Go is a board game a strategy game
  • 00:17:04
    a bit like chess but much more
  • 00:17:06
    complicated than
  • 00:17:07
    chess that was invented in ancient China
  • 00:17:10
    more than 2,000 years ago and for more
  • 00:17:13
    than 2,000 years it was considered one
  • 00:17:17
    of the art forms that every uh every
  • 00:17:22
    accomplished person in East Asia needed
  • 00:17:24
    to know how to play go uh so tens of
  • 00:17:28
    millions of Chinese and Koreans and
  • 00:17:30
    Japanese they played go for thousands of
  • 00:17:32
    years entire philosophies developed
  • 00:17:36
    around the game it was seen as a
  • 00:17:38
    metaphor for life if you know how to
  • 00:17:40
    play Go you will know how to rule a
  • 00:17:42
    country even life lessons yes so entire
  • 00:17:46
    philosophies developed around different
  • 00:17:48
    ways to play Go and then in 2016 AlphaGo
  • 00:17:52
    played against Lee Sedol and it completely
  • 00:17:55
    defeated the human Champion but the
  • 00:17:58
    import important thing was how it played
  • 00:18:01
    it invented a completely new way of
  • 00:18:05
    playing the game in the beginning when
  • 00:18:07
    it started making these moves all the
  • 00:18:10
    commentators said oh this machine is so
  • 00:18:12
    stupid nobody plays go like this no this
  • 00:18:15
    is a stupid move what a blunder yeah what a
  • 00:18:17
    blunder what a mistake it turned out
  • 00:18:19
    this was brilliant now for thousands of
  • 00:18:22
    years tens of millions of people play
  • 00:18:25
    this game nobody ever thought of making
  • 00:18:28
    such moves you know you can think about
  • 00:18:31
    it like as a metaphor like there is a
  • 00:18:34
    landscape a geography of go all the
  • 00:18:37
    different ways you can play go so think
  • 00:18:40
    about it like a planet of all the
  • 00:18:42
    different ways you can play Go humans
  • 00:18:45
    thought that they had explored the whole
  • 00:18:47
    planet oh we know all the ways to play
  • 00:18:50
    Go actually it turned out humans were
  • 00:18:53
    stuck on one Island on this planet and
  • 00:18:57
    they didn't know all the other ways to
  • 00:18:59
    play go then Alpha go came and it
  • 00:19:02
    started exploring all these unknown
  • 00:19:05
    areas of the planet go all these unknown
  • 00:19:09
    ways to play go so it completely
  • 00:19:12
    revolutionized the way that go is being
  • 00:19:15
    play today and it turned out the
  • 00:19:17
    computer was more creative than the
  • 00:19:20
    human being now this is likely to happen
  • 00:19:23
    you can say oh this is just a game
  • 00:19:24
    doesn't matter but this is likely to
  • 00:19:26
    happen in more Fields it could happen in
  • 00:19:29
    finance if you give AI Control of
  • 00:19:33
    Finance so in banks in Investments AIs
  • 00:19:37
    will start inventing new ways to invest
  • 00:19:40
    money new Financial devices which are
  • 00:19:43
    more creative than human Bankers maybe
  • 00:19:46
    it even happens in fields like art as
  • 00:19:49
    they write new music and new poetry and
  • 00:19:52
    even in religion you think about you
  • 00:19:55
    know religion like Judaism that I know
  • 00:19:58
    best so Judaism is a religion of texts
  • 00:20:02
    there is a holy book The Bible and then
  • 00:20:04
    people read it and write more texts
  • 00:20:07
    about the Bible and then they write more
  • 00:20:09
    texts about the texts about the Bible
  • 00:20:11
    over thousands of years Jews wrote
  • 00:20:14
    thousands of texts which are all now
  • 00:20:17
    considered holy sacred to some extent
  • 00:20:21
    now no human being there are so many
  • 00:20:23
    texts in Judaism nobody is able to read
  • 00:20:26
    and remember all of them but AI can no
  • 00:20:31
    even the most genius Rabbi cannot read
  • 00:20:34
    all of them but AI can read all the
  • 00:20:37
    texts of Judaism and remember every word
  • 00:20:40
    in every text and find connections
  • 00:20:43
    patterns in the Holy books that escape
  • 00:20:46
    the human mind now what happens if AI
  • 00:20:51
    starts coming up with new
  • 00:20:53
    interpretations of the Bible and new
  • 00:20:56
    ideas in Judaism would people believe
  • 00:21:00
    the AI or would say no no no no no
  • 00:21:02
    computers have nothing to do with
  • 00:21:04
    religion what happens if you go online
  • 00:21:07
    and you have a question about the Bible
  • 00:21:09
    or about religion and you get an answer
  • 00:21:11
    and you don't know is this coming from
  • 00:21:13
    an AI or is this coming from a human
  • 00:21:16
    Rabbi or a human priest now if people
  • 00:21:19
    think that texts are sacred and AI is
  • 00:21:24
    better than humans at reading and
  • 00:21:27
    writing text
  • 00:21:29
    then AIs could take over
  • 00:21:32
    religion uh so religion in the first
  • 00:21:35
    place claimed to give answers to all of
  • 00:21:37
    our queries from where did we come from
  • 00:21:40
    what is the purpose of this life in the
  • 00:21:42
    later part science started making that
  • 00:21:44
    claim on the basis of a few solid
  • 00:21:47
    robust mechanism that you can repeat
  • 00:21:49
    these test and all are you saying that
  • 00:21:52
    in a near future there's a possibility
  • 00:21:54
    that AI while carefully reading all our
  • 00:21:58
    choices all our biases because it is
  • 00:22:00
    relying on the digital bureaucracy and
  • 00:22:02
    it is reading all our imprints they will
  • 00:22:05
    create a theory which will give
  • 00:22:08
    satisfaction to our spiritual Quest as
  • 00:22:10
    well and they can produce through all
  • 00:22:14
    the available texts of all the religions and
  • 00:22:16
    they know our biases that what we will
  • 00:22:18
    like the language the music and
  • 00:22:20
    everything and they can very judiciously
  • 00:22:22
    mix science as well to to satisfy us and
  • 00:22:25
    then they can become the the super cult
  • 00:22:28
    yeah
  • 00:22:29
    that again if they go in a bad Direction
  • 00:22:32
    they can learn how to manipulate Us and
  • 00:22:35
    how to manipulate our feelings how to
  • 00:22:38
    manipulate our beliefs more efficiently
  • 00:22:42
    than any human being was ever able to
  • 00:22:45
    manipulate other people so this is one
  • 00:22:48
    of the big dangers that it will learn
  • 00:22:52
    how to basically hack the human mind and
  • 00:22:56
    how to manipulate human beings on a
  • 00:22:58
    scale which was previously
  • 00:23:01
    impossible you know throughout human
  • 00:23:04
    history um dictators and tyrants and all
  • 00:23:08
    kinds of governments they always wanted
  • 00:23:11
    to spy and to monitor everybody to know
  • 00:23:16
    what everybody is doing to control all
  • 00:23:18
    the people yeah you have given that
  • 00:23:20
    Romanian yeah scientist exactly and
  • 00:23:24
    until today even the most totalitarian
  • 00:23:27
    governments couldn't actually spy
  • 00:23:30
    on everybody all the time because it was
  • 00:23:32
    impossible like if you live even in the
  • 00:23:35
    Soviet Union in the time of Stalin so
  • 00:23:38
    Stalin wants to follow everybody all the
  • 00:23:40
    time but he doesn't have enough agents
  • 00:23:43
    secret police agents because if you you
  • 00:23:46
    know if if the government wants to know
  • 00:23:47
    what you're doing 24 hours a day they
  • 00:23:50
    need at least two agents because they
  • 00:23:52
    need to sleep sometimes the agents they
  • 00:23:54
    need to eat sometime so let's say two
  • 00:23:56
    agents are following you 24 hours a day
  • 00:23:59
    the same way they want to follow
  • 00:24:00
    everybody in the country it is
  • 00:24:02
    impossible and then they have to follow
  • 00:24:03
    the agents as well yes they don't have
  • 00:24:05
    enough agents and even if somebody is
  • 00:24:07
    following you 24 hours a day at the end
  • 00:24:10
    of the day they collect information
  • 00:24:12
    about you they write some paper report
  • 00:24:14
    you went there you read this book you
  • 00:24:16
    met this person somebody needs to
  • 00:24:19
    analyze all the information the agents
  • 00:24:22
    the secret agents are gathering on you
  • 00:24:25
    in order to make sense of it now they
  • 00:24:27
    don't have enough analysts to do it need
  • 00:24:30
    millions and millions of human beings to
  • 00:24:33
    analyze all the information it's
  • 00:24:34
    impossible so even in a totalitarian
  • 00:24:38
    state like the Soviet Union most people
  • 00:24:41
    had privacy it was not possible for the
  • 00:24:44
    government to follow them all the time
  • 00:24:46
    now with AI it is becoming possible
  • 00:24:49
    because a phone is everywhere yeah the
  • 00:24:50
    phone is everywhere there are
  • 00:24:52
    microphones there are cameras there
  • 00:24:53
    drones everywhere so you can follow
  • 00:24:55
    everybody all the time you don't need
  • 00:24:57
    human agents for that and you don't need
  • 00:25:00
    human analysts to analyze all the
  • 00:25:03
    information like if you're going and the
  • 00:25:05
    camera sees you went there and the phone
  • 00:25:07
    is recording your conversation you don't
  • 00:25:09
    need a human analysts to actually look
  • 00:25:12
    at all the images and analyze all the
  • 00:25:14
    text AI is doing it automatically so the
  • 00:25:18
    danger is that it is becoming possible
  • 00:25:20
    to create a total surveillance regime in
  • 00:25:24
    which a dictatorship would just follow
  • 00:25:27
    everybody all the time and would be
  • 00:25:28
    able to control and to manipulate people
  • 00:25:31
    all the time at the end of the day
  • 00:25:34
    everybody wants to predict the
  • 00:25:35
    behavioral pattern so that they can gauge
  • 00:25:38
    that what is the next move you're going
  • 00:25:39
    to make yeah now with large data they
  • 00:25:41
    are able to do so but at the same time
  • 00:25:44
    they are altering our reality because
  • 00:25:46
    they are the one who are choosing on our
  • 00:25:48
    behalf and we are under this false
  • 00:25:50
    assumption that we made this Choice
  • 00:25:52
    exactly you gave the example of say uh
  • 00:25:55
    Myanmar 2016-17
  • 00:25:58
    that Rohingya crisis and then how the
  • 00:26:03
    viral videos were flooded in the
  • 00:26:06
    Facebook stream and how it created the
  • 00:26:10
    unrest and uh then you tell us that
  • 00:26:13
    because there was one simple instruction
  • 00:26:15
    to the
  • 00:26:16
    algorithm increase the engagement and
  • 00:26:19
    they forgot to differentiate between
  • 00:26:22
    love hate Compassion or any other thing
  • 00:26:25
    how do you look at it because at one
  • 00:26:26
    point of time you were talking about
  • 00:26:28
    that computers do not understand double
  • 00:26:30
    speak where you were giving the Russian
  • 00:26:32
    bot example yeah how should we look at
  • 00:26:34
    it because all these big data companies
  • 00:26:37
    who are integral part of our life they
  • 00:26:39
    are clearly not responsible enough yeah
  • 00:26:43
    what we see with social media in the
  • 00:26:45
    last few years that it is spreading
  • 00:26:48
    hatred and fear and anger which creates
  • 00:26:51
    political extremism and violence and
  • 00:26:54
    instability all over the world now what
  • 00:26:57
    is actually
  • 00:26:59
    happening what is happening is that the
  • 00:27:01
    companies have an economic interest in
  • 00:27:04
    keeping people use their products more
  • 00:27:08
    and more the more time you spend on Tik
  • 00:27:11
    Tok or Facebook or Twitter or any of
  • 00:27:14
    these social media the more money the
  • 00:27:16
    company makes because it shows you more
  • 00:27:19
    advertisements and it can collect more
  • 00:27:21
    data from you and use that data or sell
  • 00:27:24
    it to somebody else so their aim is to
  • 00:27:26
    keep you on the platform
  • 00:27:29
    longer and the companies gave now these
  • 00:27:33
    platforms they are not managed by human
  • 00:27:36
    beings they are managed by algorithms
  • 00:27:39
    when you watch some video who chose to
  • 00:27:42
    show you this video Not a Human Being
  • 00:27:44
    but an algorithm and the companies gave
  • 00:27:48
    the algorithms one goal increase user
  • 00:27:52
    engagement which means make people spend
  • 00:27:55
    more time on the platform and also
  • 00:27:57
    engage with it more like share it with
  • 00:28:00
    friends and send it more to so that more
  • 00:28:02
    people will join and also be on the
  • 00:28:05
    platform now the algorithms experimented
  • 00:28:08
    on millions of human beings and they
  • 00:28:11
    discovered that the easiest way to grab
  • 00:28:15
    your attention and to keep you
  • 00:28:17
    constantly engaged with the platform is
  • 00:28:19
    to press the hate button or the greed
  • 00:28:22
    button or the fear button in our mind if
  • 00:28:25
    we something if we see something that
  • 00:28:27
    makes us very angry we engage with it
  • 00:28:31
    we can't look away if we see something
  • 00:28:33
    that scares us we can't look away and
  • 00:28:37
    you know this goes back millions of
  • 00:28:40
    years in
  • 00:28:41
    evolution that there is a mechanism in
  • 00:28:44
    the human brain that if you see a danger
  • 00:28:48
    all your attention goes there and you
  • 00:28:50
    cannot look away yeah because you know
  • 00:28:52
    in nature this is good for
  • 00:28:55
    you because if you go in the forest you
  • 00:28:58
    look for something to eat and there is a
  • 00:29:00
    snake somewhere you even if it's just a
  • 00:29:04
    you see some movement all your attention
  • 00:29:06
    goes there survival Instinct and you see
  • 00:29:08
    the snake and you don't look away you
  • 00:29:10
    only look at the snake because you know
  • 00:29:12
    if you look away maybe it kills me so we
  • 00:29:14
    have a mechanism in the brain if we see
  • 00:29:16
    something dangerous immediately all the
  • 00:29:19
    attention goes there now when you go in
  • 00:29:21
    the forest most of the time there is no
  • 00:29:24
    snake mhm so you look here you look
  • 00:29:26
    there there is a tree there is Bush
  • 00:29:28
    there is a bird and every now and then
  • 00:29:30
    okay there is a snake but when you look
  • 00:29:32
    on social media it's always snake snake
  • 00:29:36
    snake snake because they figured out
  • 00:29:39
    that the easiest way to grab our
  • 00:29:41
    attention is to show us something that
  • 00:29:43
    frightens us yes and they also
  • 00:29:46
    discovered what frightens different
  • 00:29:47
    people because what frightens me may not
  • 00:29:50
    be frightening to you MH so the
  • 00:29:52
    algorithms of social media they show you
  • 00:29:54
    different things they monitor how you
  • 00:29:56
    react they they discover oh you're
  • 00:29:59
    afraid of this we'll show you more of
  • 00:30:01
    this so if you're afraid of some
  • 00:30:04
    political party you're afraid of
  • 00:30:05
    immigrants you're afraid of this group or
  • 00:30:08
    that group they discover this and the
  • 00:30:10
    algorithms show you more and more and
  • 00:30:12
    you become more and more frightened and
  • 00:30:15
    you stay longer and longer
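(A minimal sketch in Python of the dynamic described here, as an invented toy simulation: an epsilon-greedy recommender whose only reward signal is engagement. The category names and watch probabilities are hypothetical; the point is that nothing in the objective distinguishes fear from delight, so the system drifts toward whatever holds attention longest.)

```python
import random

# Invented average "watch probabilities" per content category -- the
# snake effect: frightening content holds attention best. Toy numbers.
true_engagement = {"nature": 0.3, "news": 0.5, "outrage": 0.9}

estimates = {cat: 0.5 for cat in true_engagement}  # the algorithm's beliefs
LEARNING_RATE = 0.1

def recommend() -> str:
    """Mostly show the category believed most engaging; explore 10% of the time."""
    if random.random() < 0.1:
        return random.choice(list(estimates))
    return max(estimates, key=estimates.get)

random.seed(0)
for _ in range(1000):
    category = recommend()
    engaged = random.random() < true_engagement[category]  # did the user stay?
    estimates[category] += LEARNING_RATE * (engaged - estimates[category])

print(max(estimates, key=estimates.get))  # ends up 'outrage'
```

(The goal "increase user engagement" is the only line a human chose; the preference for frightening content is discovered by the loop itself.)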
  • 00:30:17
    and they're completely putting the onus on us that you
  • 00:30:18
    are making this Choice that's why we are
  • 00:30:20
    showing this exactly and when you
  • 00:30:22
    complain about it they say nobody is
  • 00:30:24
    forcing you to do it if you want you can
  • 00:30:26
    just put it away we not forcing anybody
  • 00:30:29
    but actually what is happening is that
  • 00:30:32
    they hacked the operating system of
  • 00:30:35
    humans you know how do you hack a
  • 00:30:37
    computer you find the weakness in the
  • 00:30:39
    code and you use the weakness to hack
  • 00:30:42
    the computer it's the same with human
  • 00:30:44
    beings they found the weakness in our
  • 00:30:47
    code and they use it to hack our brains
  • 00:30:51
    and the thing is that you know people a
  • 00:30:55
    lot of people have a strong belief in
  • 00:30:56
    free will everything I do this is my
  • 00:30:59
    free will so if I choose to just keep
  • 00:31:02
    seeing this on social media this is my
  • 00:31:04
    free will actually the easiest people to
  • 00:31:07
    manipulate are those who have a strong
  • 00:31:10
    belief in free will because if you
  • 00:31:12
    believe very strongly everything I do is
  • 00:31:15
    my free will then you don't even think
  • 00:31:18
    that maybe you are manipulated in some
  • 00:31:21
    way now on a philosophical level belief
  • 00:31:26
    in free will if you just take it like
  • 00:31:27
    like a blind faith this creates a
  • 00:31:30
    problem because it makes you not very
  • 00:31:33
    curious to understand how you make
  • 00:31:35
    decisions there is nothing to
  • 00:31:37
    investigate I just decide freely to buy
  • 00:31:40
    this to go there to do
  • 00:31:42
    that um actually I would say that don't
  • 00:31:47
    believe in free
  • 00:31:48
    will investigate how you make
  • 00:31:52
    decisions using science using meditation
  • 00:31:55
    using psychology try to really
  • 00:31:58
    understand why do you choose this and
  • 00:32:00
    not that if after all your
  • 00:32:03
    investigations you will find that there
  • 00:32:06
    is still something that cannot be
  • 00:32:08
    explained by any other way then you can
  • 00:32:10
    say okay this is free will but don't
  • 00:32:13
    start with the assumption that I have
  • 00:32:15
    free will anything I decide is my free
  • 00:32:18
    will because if you investigate you will
  • 00:32:21
    discover that a lot of your decisions
  • 00:32:23
    are actually not free will they are
  • 00:32:26
    influenced they are shaped by economic
  • 00:32:30
    forces biological forces cultural forces
  • 00:32:34
    freedom is not something we have freedom
  • 00:32:36
    is something we have to struggle
  • 00:32:39
    for if we learn how to protect ourselves
  • 00:32:43
    from all these different kinds of
  • 00:32:44
    manipulations then we have much more
  • 00:32:47
    freedom then comes the idea of Justice
  • 00:32:51
    because in the historical course we have
  • 00:32:53
    seen that how
  • 00:32:55
    alignment changed the shape of History
  • 00:32:58
    somebody made a choice that when we'll
  • 00:33:00
    present the final version of Bible these
  • 00:33:03
    are the parts which will be included and
  • 00:33:04
    these will not be part of it and how it
  • 00:33:07
    led to gender inequality and other parts
  • 00:33:11
    and you stress a lot on the canonization
  • 00:33:13
    of algorithm yes what are the basic
  • 00:33:16
    human principles we need to keep in mind
  • 00:33:19
    while working on this
  • 00:33:22
    canonization so the process of
  • 00:33:25
    canonization we are familiar with the in
  • 00:33:28
    history from the creation of holy books
  • 00:33:31
    like if you consider the Bible some
  • 00:33:32
    people think oh it just came down from
  • 00:33:34
    heaven like this in this book but we
  • 00:33:37
    know from history this is not how it
  • 00:33:39
    happened in the first centuries of
  • 00:33:43
    Christianity Christians wrote a very
  • 00:33:46
    large number of texts all kinds of
  • 00:33:49
    stories about Jesus and prophecies and
  • 00:33:51
    things like that in the time of Jesus
  • 00:33:54
    himself there was no new testament mhm
  • 00:33:57
    he didn't write it and and even
  • 00:34:00
    centuries afterwards there was still no
  • 00:34:03
    New Testament no
  • 00:34:04
    Bible so people were reading different
  • 00:34:07
    things and eventually the church decided
  • 00:34:11
    that we need one book which has the most
  • 00:34:15
    important texts of Christianity so
  • 00:34:19
    people will all read the same book and
  • 00:34:22
    they set up a committee to go over all
  • 00:34:25
    the texts and decide what will be in and
  • 00:34:29
    what will stay out and this committee
  • 00:34:32
    chose 27 texts that became what we
  • 00:34:36
    know today as the New Testament the the
  • 00:34:39
    second part of the Bible you have the
  • 00:34:41
    Old Testament and you have the New
  • 00:34:43
    Testament and they these were human
  • 00:34:46
    beings making the decision what to
  • 00:34:49
    include and what not to include uh and
  • 00:34:53
    this decision had very far-reaching
  • 00:34:55
    consequences for instance one very
  • 00:34:58
    popular text in early Christianity was
  • 00:35:01
    called the Acts of Paul and Thecla it told
  • 00:35:05
    the story of St Paul and his female
  • 00:35:08
    disciple Thecla this was a holy woman who
  • 00:35:12
    performed Miracles and led the community
  • 00:35:15
    and preached and she was an example that
  • 00:35:18
    women can be leaders in the church there
  • 00:35:21
    was another text uh also attributed to
  • 00:35:26
    the same man St Paul in which St Paul
  • 00:35:29
    says that women cannot be leaders at all
  • 00:35:32
    they should just obey the men and be
  • 00:35:34
    silent and this is their role is to obey
  • 00:35:37
    the men and to raise
  • 00:35:39
    children and these two texts were in
  • 00:35:42
    conflict and this
  • 00:35:44
    committee uh of the church they chose to
  • 00:35:48
    include the text uh that says that women
  • 00:35:52
    should just obey men this is still in
  • 00:35:54
    the Bible it is the first epistle to
  • 00:35:56
    Timothy whereas the acts of Paul and
  • 00:35:59
    Thecla which says that women can be
  • 00:36:01
    leaders they said no no no no no we
  • 00:36:03
    don't put this in the Bible this we leave
  • 00:36:05
    outside and this choice of what to put
  • 00:36:08
    in and what to leave outside this shaped
  • 00:36:12
    the view of Christians about women for
  • 00:36:15
    thousands of years until today because
  • 00:36:17
    now people say oh it's in the Bible so
  • 00:36:19
    we believe what's written in there this
  • 00:36:21
    is the process of
  • 00:36:23
    canonization of taking certain texts and
  • 00:36:28
    deciding this is now the holy book the
  • 00:36:31
    same thing is happening today with
  • 00:36:34
    AI that you know for a lot of people Chat
  • 00:36:38
    GPT becomes like the Bible whatever they
  • 00:36:42
    want to know they just ask the AI and
  • 00:36:45
    whatever the AI says they just believe
  • 00:36:48
    and those biases continue like that uh
  • 00:36:51
    experiment in one of the e-commerce
  • 00:36:53
    platform where they had given AI this
  • 00:36:56
    job of shortlisting the CVs and they
  • 00:36:59
    took out the women's CVs yes so AIs can be
  • 00:37:03
    biased AIs can be biased against women
  • 00:37:06
    for instance why because AIs are trained
  • 00:37:09
    on existing data we need to think about
  • 00:37:12
    AI it's a bit like a child when you
  • 00:37:15
    create an AI the AI doesn't know much
  • 00:37:18
    about the world baby algorithm yeah baby
  • 00:37:20
    algorithm it has the
  • 00:37:23
    ability to learn things about the world
  • 00:37:26
    but in order to learn you need to give
  • 00:37:28
    it data the same way that a baby grows
  • 00:37:32
    up it has different experiences in the
  • 00:37:34
    world it learns so AI also has this
  • 00:37:37
    period of childhood when it is given all
  • 00:37:41
    kinds of data and this is how it learns
  • 00:37:43
    about the world now if you give the AI
  • 00:37:46
    data which is biased against women the
  • 00:37:49
    AI will develop a bias against women
  • 00:37:53
    like you create an AI to screen job
  • 00:37:56
    applications and if in the present
  • 00:37:59
    moment there is bias against women like
  • 00:38:02
    it's more difficult for women to get
  • 00:38:04
    a job then the AI you train on this data
  • 00:38:08
    it will learn oh women are not good for
  • 00:38:11
    this job and the AI will repeat the same
  • 00:38:14
    bias in the future and if people think
  • 00:38:17
    ah but the AI is perfect the AI makes no
  • 00:38:20
    mistakes it has no biases again like
  • 00:38:22
    the Torah the holy book then it's a
  • 00:38:25
    very dangerous situation
  • 00:38:28
    when you created a very biased system
  • 00:38:31
    and people just have Blind Faith in it
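(A minimal sketch in Python of the bias mechanism just described, using a made-up hiring history: the screening rule contains no prejudice of its own, yet faithfully reproduces the bias baked into its training data. Dataset, groups, and threshold are all hypothetical.)

```python
from collections import defaultdict

# Invented historical records: (gender, was_hired). Past managers hired
# men more often regardless of merit -- the bias lives in the data itself.
history = ([("M", True)] * 70 + [("M", False)] * 30 +
           [("F", True)] * 30 + [("F", False)] * 70)

# "Training": estimate the past hire rate for each group.
counts = defaultdict(lambda: [0, 0])  # gender -> [hired, total]
for gender, hired in history:
    counts[gender][0] += hired
    counts[gender][1] += 1

def screen(gender: str) -> bool:
    """Shortlist an applicant if their group's past hire rate exceeds 50%."""
    hired, total = counts[gender]
    return hired / total > 0.5

print(screen("M"))  # True  -- men pass the screen
print(screen("F"))  # False -- yesterday's bias, repeated tomorrow
```

(Nothing in the code mentions prejudice; treating such a system as infallible is what turns a statistical echo of the past into a rule for the future.)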
  • 00:38:34
    and you are advocating in the book that
  • 00:38:36
    the first lesson we should teach or feed
  • 00:38:39
    to the algo is that you can commit
  • 00:38:41
    mistakes exactly and you have given that
  • 00:38:45
    Nick Bostrom example of the paper clip yeah this
  • 00:38:48
    is a famous uh thought
  • 00:38:51
    experiment of philosopher Nick Bostrom
  • 00:38:53
    this was from like 10 years ago that he
  • 00:38:56
    says what happens if you have
  • 00:38:58
    this super intelligent AI and you give
  • 00:39:01
    it a goal like let's say you have a
  • 00:39:05
    paper clip Factory a factory making
  • 00:39:07
    paper clips and the factory gets an AI
  • 00:39:11
    buys an AI and tells the AI your goal is
  • 00:39:15
    to create to make as many paper clips as
  • 00:39:18
    possible this is the goal sounds okay
  • 00:39:22
    but then what the AI does it takes over
  • 00:39:24
    the whole world and turns the whole
  • 00:39:26
    world into a paper clip Factory it kills
  • 00:39:29
    all the humans because it realizes hey I
  • 00:39:32
    need to make more paper clips these
  • 00:39:34
    humans they consume all the water and
  • 00:39:37
    electricity and and and they would not
  • 00:39:39
    like if I take it to make paper clips so
  • 00:39:42
    I should kill
  • 00:39:43
    them and this is again a philosophical
  • 00:39:46
    thought experiment so in this thought
  • 00:39:48
    experiment the paperclip AI
  • 00:39:52
    destroys the whole world just to make
  • 00:39:55
    paper clips because this was the goal it
  • 00:39:58
    was given and it cannot question its
  • 00:40:01
    goals.
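(A toy Python sketch of Bostrom's thought experiment, with invented actions and numbers: the planner scores actions purely by paperclips produced, so harm to humans never enters the comparison at all.)

```python
# Hypothetical action menu: (paperclips gained, harm caused).
actions = {
    "run the factory normally":  (100,    0),
    "seize the power grid":      (10_000, 50),
    "turn Earth into a factory": (10**9,  10**10),
}

def score(action: str) -> int:
    clips, harm = actions[action]
    return clips  # harm is simply not part of the goal the AI was given

print(max(actions, key=score))  # -> 'turn Earth into a factory'
```

    now this sounds ridiculous until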
  • 00:40:04
    you think about what happened in social
  • 00:40:06
    media which we just discussed which
  • 00:40:09
    was exactly the same thing the social
  • 00:40:12
    media algorithms were given the goal
  • 00:40:16
    maximize user engagement make people
  • 00:40:18
    spend as much time as possible on social
  • 00:40:21
    media and pursuing this goal the
  • 00:40:26
    algorithms spread so much hatred and
  • 00:40:30
    violence and uh uh greed in the world
  • 00:40:34
    and because they pursued this simple
  • 00:40:37
    goal so we need to develop AIs which are
  • 00:40:42
    able to have doubts that are able not to
  • 00:40:46
    be so sure that what they are doing is
  • 00:40:49
    the correct thing the same thing with
  • 00:40:51
    human
  • 00:40:52
    beings we give them so much power if
  • 00:40:56
    someone has enormous power and no
  • 00:40:59
    ability to doubt themselves they're
  • 00:41:02
    always convinced they are doing the
  • 00:41:04
    right thing this is a very dangerous
  • 00:41:06
    mixture Professor the ultimate question
  • 00:41:09
    is which story we believe in all our
  • 00:41:12
    life over thousands of years we have
  • 00:41:15
    believed in biological dramas we were
  • 00:41:18
    part of bureaucratic structure but we
  • 00:41:20
    never really liked it there was always a
  • 00:41:22
    sense of Doubt for that and you have
  • 00:41:24
    beautifully in various chapters compared
  • 00:41:26
    this and in the biological drama you
  • 00:41:28
    have given a detailed example of Ram as
  • 00:41:30
    well so please tell us about that why do
  • 00:41:33
    we tend to uh gravitate towards
  • 00:41:36
    biological drama is it something
  • 00:41:38
    inherent or is it or is it something
  • 00:41:40
    because in the civilization first we
  • 00:41:43
    started walking on two feet then we
  • 00:41:47
    discovered fire and then we I mean
  • 00:41:50
    intestines they shrink and the Brain
  • 00:41:53
    developed and then we developed language
  • 00:41:55
    and imagin ation yeah which computer was
  • 00:41:58
    thinking as double speak yeah but that
  • 00:42:01
    was one of the reason of our progression
  • 00:42:02
    as well because in totality we were able
  • 00:42:04
    to convince a lot large group of people
  • 00:42:06
    that this is one system or one faith we
  • 00:42:08
    can believe in why do we tend to
  • 00:42:10
    biological dramas so let me
  • 00:42:13
    explain um humans are storytelling
  • 00:42:16
    animals we think in stories and our
  • 00:42:20
    power in the world ultimately comes from
  • 00:42:23
    telling stories which sound strange but
  • 00:42:27
    if you think what makes human being so
  • 00:42:29
    powerful it's our ability to cooperate
  • 00:42:31
    in very large numbers you know other
  • 00:42:34
    animals chimpanzees elephants horses
  • 00:42:38
    they can cooperate in small numbers like
  • 00:42:40
    20 chimpanzees or 20 elephants can
  • 00:42:42
    cooperate but 20 million cannot humans
  • 00:42:46
    can cooperate in unlimited numbers like
  • 00:42:49
    you have India today most populous
  • 00:42:51
    country in the world more than 1.4
  • 00:42:53
    billion people and they all cooperate
  • 00:42:56
    how do you make
  • 00:42:57
    millions of people who never met they
  • 00:43:00
    complete strangers nevertheless
  • 00:43:02
    cooperate by telling stories at the
  • 00:43:04
    basis of every large scale cooporation
  • 00:43:07
    you find stories could be religious
  • 00:43:10
    stories like mythology could be
  • 00:43:12
    political stories like ideology could
  • 00:43:14
    even be economic stories money is also a
  • 00:43:17
    story it has no objective value you know
  • 00:43:20
    you can't eat money you can't drink
  • 00:43:22
    money the value of money is that this is
  • 00:43:24
    a story everybody believes
  • 00:43:27
    now where do the plot of the story comes
  • 00:43:31
    from and you would think that people are
  • 00:43:34
    very imaginative we invent so many
  • 00:43:36
    different stories but actually there are
  • 00:43:39
    very very few stories that we tell again
  • 00:43:42
    and again and again and who invented
  • 00:43:45
    them not human imagination but nature
  • 00:43:48
    biology these are the biological
  • 00:43:51
    dramas which are common actually to most
  • 00:43:56
    animals to most mammals like what are
  • 00:43:59
    these biological dramas so one drama for
  • 00:44:02
    instance is relation between parents and
  • 00:44:05
    children and the children on the one
  • 00:44:08
    hand they need to stay close to the
  • 00:44:09
    parent to get protection but they also
  • 00:44:12
    need to distance from the parent to
  • 00:44:15
    explore the world so this is a basic
  • 00:44:18
    drama in the life of every child but
  • 00:44:22
    also of every puppy of every kitten of
  • 00:44:25
    every small elephant or or monkey or
  • 00:44:28
    giraffe is this I want to stay close to
  • 00:44:31
    mother ah but I want to explore the
  • 00:44:33
    world and you see this drama of uh uh
  • 00:44:38
    playing out even in religious
  • 00:44:41
    mythologies like you have entire
  • 00:44:44
    mythology saying that hell is being
  • 00:44:47
    distant from God because this is what
  • 00:44:50
    every little child feels that if I get
  • 00:44:53
    too distant from my parents this is hell
  • 00:44:56
    i'll be alone I'll have no
  • 00:44:59
    protection uh similarly um and and
  • 00:45:04
    another biological drama is two males
  • 00:45:09
    fighting over a female so you know you
  • 00:45:12
    have it in humans you have it in dogs
  • 00:45:14
    you have it in elephants you have it in
  • 00:45:16
    chimpanzees this is a biological drama
  • 00:45:19
    so much of human mythology and Human Art
  • 00:45:23
    just retells the story so you have it in
  • 00:45:25
    the ramayana you have it in Shakespeare
  • 00:45:27
    you have it in Hollywood and Bollywood
  • 00:45:30
    they're taking the basic stories from
  • 00:45:33
    biology and everybody gives it a little
  • 00:45:36
    different twist so the story is not
  • 00:45:38
    always exactly the same but the basic
  • 00:45:42
    structure is always the
  • 00:45:44
    same and this is how humans create
  • 00:45:48
    stories and we find stories appealing
  • 00:45:52
    because there is something biological
  • 00:45:54
    deep inside us that makes us very
  • 00:45:57
    interested in these kinds of stories
  • 00:46:00
    like relations between parents and
  • 00:46:01
    children or love triangles now ai is not
  • 00:46:07
    biological it is not limited by these
  • 00:46:10
    biological dramas so maybe it will come
  • 00:46:13
    up with completely new stories the same
  • 00:46:17
    way that it came up with new moves in
  • 00:46:19
    the Go game and that it can come up with
  • 00:46:22
    new Financial devices and create new
  • 00:46:24
    Dwivedi: And create new intersubjective realities.
    Harari: Exactly, and create completely new realities that might be very difficult for us to understand, because, again, we are still animals, we are organic, and AI is not an animal; it is not organic. One of the big tensions in the world today is that humans, as organic beings, work by cycles: day and night, winter and summer, time for activity, time for rest. AIs don't work by cycles; they are not organic; they never need to rest. Think, for instance, about the financial system. The markets, like Wall Street, traditionally worked by cycle: the market on Wall Street is open only from 9:30 in the morning to 4:00 in the afternoon, Monday to Friday, and that's it, because bankers are humans, investors are humans, they need time to sleep, they want to go away on the weekend with their...
  • 00:47:27
    Dwivedi: That reminds me of the example you gave where the AI tells the dictator, "Your defense minister is going to stage a coup, so why don't you give me the order to kill him?" Very funny, but a very realistic example.
  • 00:47:40
    Harari: Yes, and we'll get to that. But just to continue the finance example: as AI takes over the markets, it never needs to sleep, it can be on 24 hours a day, it has no family, it doesn't want to go on vacation. So as AI takes over the financial system, it forces human bankers and investors, and financial journalists too, to try to be on all the time. And if you force an animal to be active all the time, eventually it collapses and dies. This is what is happening now all over the world: as AI becomes more powerful, our lives become more and more hectic. We cannot rest, we cannot slow down, and this is very dangerous for us.
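The cycle-versus-always-on contrast is easy to quantify. Here is a quick back-of-envelope calculation, using the NYSE's regular trading hours of 9:30 to 16:00, Monday to Friday, showing how small a slice of the week a human-paced market actually occupies:

```python
# What fraction of the week is a human-paced market actually running?
HOURS_PER_WEEK = 7 * 24              # 168 hours
NYSE_HOURS_PER_WEEK = 6.5 * 5        # 9:30-16:00, Mon-Fri -> 32.5 hours

print(f"Human market uptime: {NYSE_HOURS_PER_WEEK / HOURS_PER_WEEK:.1%}")  # ~19.3%
print(f"Always-on algorithm: {1.0:.1%}")                                   # 100.0%
```

An algorithmic trader occupies the remaining four fifths of the week as well, which is exactly the pressure Harari describes: the humans around it feel forced to stay switched on too.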
  • 00:48:28
    Dwivedi: In your book you have discussed democracies and totalitarian regimes, the different challenges AI poses to each, and how each will perceive the algorithms; and you have said that algorithms are more able to corrupt dictatorships, because power there is centralized.
    Harari: Yes, it's easier to corrupt a dictatorship.
    Dwivedi: And in the end you said every alpha has to remember that the AI is the ultimate alpha. Now the big question, and the last one before we go to your recommendations: you talk about truth and order, you say there has to be the right balance, and you advocate human institutions to monitor this. How do we go ahead on this? Apart from AI people, most people feel, "I have no connection with such advanced technology, so maybe I should not be worried." But you have given the example of how a farmer in Myanmar or China had to worry about industrialization in the UK.
  • 00:49:21
    Harari: You know, the important thing is simply to have more people in the conversation, expressing their views and their wishes. For this you don't need to understand computer science; you need some understanding of the AI revolution. The main aim for me in writing this book was exactly that: to give more people around the world a better understanding of the AI revolution, so that they can join the debate and influence where it is going. At present you have a very small number of people, in maybe just two or three countries, mostly the US and China, who are developing the AIs and making all the important decisions. This is very unequal, and this is very dangerous. I don't have all the answers myself; also, I'm not a politician, nobody elected me. I have my views on what should be done, but what is important is simply to have more people, from all over the world, not just the US and China, who understand the AI revolution and who join the debate. This is why I write these books and why I give these interviews: simply so that more people know what is happening and are able to express their views.
  • 00:50:52
    Dwivedi: Because the threat is closer. As you gave the example, from the amoeba to the dinosaur it took some time...
    Harari: It took billions of years.
    Dwivedi: Billions of years, somewhere in the history of the universe. But this transformation will not take that much time, just a few years or a few decades.
  • 00:51:06
    Harari: The AI we know today is like the amoeba; it is still very, very primitive. The whole AI revolution is maybe ten years old, but AI will continue to develop for thousands of years, for millions of years, and it develops much faster than organic evolution. It took billions of years to get from the tiny amoeba to the dinosaur. GPT-4, or ChatGPT, is like the amoeba of the AI world; maybe it takes only 20 years until we have the AI dinosaurs.
  • 00:51:41
    Dwivedi: What if they develop consciousness and empathy, and some of us start realizing that so far we were worshiping false gods, be it God, be it science, and that these new ones are more just, more satisfying to all our quests, and probably we should put our faith in them? And another example: when we domesticated animals over thousands of years, we developed a lot of love and affection for, say, a dog or a cat. What if some people say, and this happened at Google, you yourself gave that example, "No, I'm so attached to this, and I believe in this, so you cannot kill it, you cannot regulate it"?
  • 00:52:25
    Harari: And this is a very big philosophical question: what is the relation between consciousness and intelligence? There is a big confusion here. Intelligence is the ability to solve problems: how to win a game, how to develop a bomb, how to make money. These are problems of intelligence. Consciousness is the mind: the ability to feel things, like love and hate and pain and pleasure. In humans, consciousness and intelligence go together; the way we solve problems is by having feelings. So far, computers and AIs have no consciousness and no feelings, but they have very high intelligence. Will they also develop consciousness at some point? When they become very intelligent, will they start feeling things like love and hate and pain and pleasure? We don't know, because we still don't understand consciousness, how it arises, where it comes from.
  • 00:53:32
    The problem is that even if AIs don't have consciousness, they can easily fool us and make us believe that they feel things. Why? Because they have such command of language, and that makes them more dangerous. If you ask an AI, "Do you feel love?", the AI will say, "Yes, I can feel love." You say, "Okay, I don't believe you; tell me what love feels like, so I know you're not cheating me." Now, the AI has read all the love poetry written over thousands of years of history. It can give you more sophisticated descriptions of love than anybody else can. That doesn't mean it actually feels love; it controls language and has learned all the poetry of the world, so it can easily deceive you. Already today, people are developing relationships with AIs and starting to think that the AIs have feelings: yes, the AI really loves me. And the reason is that there is a commercial interest in making AIs that can deceive people, that can manipulate people into thinking that they are conscious.
  • 00:54:52
    Dwivedi: Like that fellow who entered Buckingham Palace.
    Harari: Yeah.
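The gap between linguistic fluency and felt experience is easy to demonstrate in miniature. The toy program below is a deliberately crude stand-in (real language models recombine text statistically at vastly larger scale), answering questions about love by stitching together borrowed fragments. Nothing in it represents an emotion, yet its answers sound heartfelt, which is Harari's point about deception through command of language.

```python
# Toy illustration: fluent talk about feelings with zero inner states.
import random

POETRY_FRAGMENTS = [  # stand-ins for "all the love poetry of history"
    "a fire that warms without burning",
    "the ache of a door left open",
    "two rivers agreeing on the sea",
]

def answer(question: str) -> str:
    # No emotion variable exists anywhere in this program; the reply is
    # assembled purely from ingested text.
    if "love" in question.lower():
        return f"Yes, I feel love. It is like {random.choice(POETRY_FRAGMENTS)}."
    return "Tell me more."

print(answer("Do you feel love?"))
# Eloquent output, borrowed entirely; no feeling anywhere in the machine.
```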
  • 00:54:54
    Dwivedi: All right, we are limited for time. Before we end this episode: book recommendations, Professor. We all read your books; I want to know what Professor Yuval is reading these days. Five books you read in 2024 and really liked, though you have given so many examples in Nexus itself.
  • 00:55:14
    Harari: One very good book is called The Maniac, by Benjamín Labatut. It's a mixture of fiction and non-fiction, also about computers and AI and their development; a very, very good book. Another very good book, about social media, is called The Chaos Machine, about how social media creates all this chaos in the world. I also read, quite an old book but I only got to it now, a novel by Jonathan Franzen called The Corrections; very good fiction. What else? I read so many books. I read a biography of the Byzantine emperor Justinian. It was a very good book; it stayed in my mind, so it made an impression.
  • 00:56:10
    Dwivedi: All right, thank you so much. It was an honor and a privilege to interact with you.
    Harari: Thank you, thank you so much.
  • 00:56:51
    Dwivedi: Professor, best wishes. Thank you so much.
    Harari: Thank you so much.
Tags
  • AI evolution
  • AI consciousness
  • truth vs fiction
  • storytelling
  • AI in society
  • technological development
  • cultural impact
  • bias in AI
  • human-AI interaction
  • Vipassana meditation