Yuval Noah Harari Interview with Saurabh Dwivedi | The Dangers of AI, from the Ramayana to Narendra Modi | Kitabwala
Summary
TL;DR: Professor Yuval Noah Harari discusses the rapid development and potential future of Artificial Intelligence (AI), comparing it to the slow organic evolution from simple organisms to complex ones over billions of years. He examines the possibility of AI developing consciousness and the societal implications, such as AI's potential role in religion and creativity. Harari explores the importance of stories in human culture, and how AI might introduce new narratives distinct from biological ones. He also highlights the importance of truth versus fiction in the digital age, illustrating how AI systems could exacerbate misinformation due to their inherent biases. Additionally, Harari shares his personal experiences in India, especially with Vipassana meditation, reflecting on its impact on understanding the mind. He concludes with concerns about centralized power in AI development primarily within the US and China, emphasizing a need for broader global participation and dialogue in shaping AI's future.
Key Takeaways
- 🦠 AI is in its early stages, akin to the amoeba in evolution.
- 🤖 Humans may face new narratives from AI beyond biological dramas.
- 🌐 Centralized AI development in US and China poses risks.
- 📚 Human storytelling is essential for large-scale cooperation.
- 💾 AI's inherent biases can spread misinformation.
- 🧘‍♂️ Harari integrates his Vipassana meditation experiences into his worldview.
- 🗣 AI has the potential to influence cultural, economic, and religious domains.
- ❓ Truth is complicated, costly, and often painful, unlike fiction.
- ⚖️ The balance between truth and order is critical.
- 🕵️ With increasing AI surveillance, privacy concerns grow.
- 🧠 Humans often mistakenly attribute consciousness to AI.
- 🎥 Harari warns about AI's growing role in creative fields like media and the arts.
Timeline
- 00:00:00 - 00:05:00
The evolution from amoebas to dinosaurs took billions of years, whereas AI development is much faster than organic evolution. AI's potential to develop consciousness and empathy poses existential questions, such as reevaluating our constructs and beliefs. Despite AI's current dependency on human commands, its future intelligence may lead to transformative global shifts.
- 00:05:00 - 00:10:00
Author Yuval Noah Harari discusses his connection to India and his journey with vipassana, highlighting its scientific approach in understanding the mind. Vipassana teaches attentiveness to one's breath, revealing how our minds operate beyond conscious control.
- 00:10:00 - 00:15:00
Harari contrasts truth and reality, explaining humanity's tendency to create distorted realities. Despite progress, human information hasn't necessarily improved, often resulting in bad decisions due to misleading information. Truth is costly, complex, and painful, while fiction is cheap, simple, and appealing, leading to a predominance of fiction over truth in human narratives.
- 00:15:00 - 00:20:00
The concept of an 'information diet' is introduced, addressing the overwhelming flood of modern information compared to the need for analysis and reflection. Harari explores computer politics, noting AI's decisiveness and creativity, yet cautioning against its biases.
- 00:20:00 - 00:25:00
AI's unprecedented potential is likened to a technological revolution, with examples like OpenAI's GPT-4 employing human-like tactics to achieve goals. This highlights AI's ability to make autonomous decisions and create new methods, deviating from human thought processes.
- 00:25:00 - 00:30:00
The widespread influence of AI, from loans to art, is discussed. Harari warns about AI becoming central in creating our reality, suggesting that this technological trajectory could substantially redefine human societal structures.
- 00:30:00 - 00:35:00
Considering Singularity and AI's potential, Harari contemplates the biases inherent in AI, stressing that despite their apparent objectivity, AIs reflect human biases from their learned data. The implications of this on creativity and intelligence among AI compared to humans are critically examined.
- 00:35:00 - 00:40:00
While humans have evolved unique traits, AI began surpassing human creativity in games like Go, prompting discussions on AI's future role in strategic thinking and decision-making across various sectors. Harari predicts AI's eventual advancement beyond biological limitations.
- 00:40:00 - 00:45:00
As governments traditionally struggled with total surveillance, AI now enables constant monitoring and control by analyzing vast amounts of data without requiring human analysts, risking the rise of an unprecedentedly powerful surveillance state.
- 00:45:00 - 00:50:00
Despite AI's power to engage it's noted for exacerbating societal divides through targeted content that triggers fear and hatred, fueled by social media algorithms designed to maximize user engagement for profit.
- 00:50:00 - 00:57:01
Harari calls for worldwide awareness in AI's development to ensure ethical standards are woven into its framework. Emphasizing not only technical governance but also a diverse dialogue involving global voices, especially as AI evolves potentially faster than biological organisms ever did.
Video Questions and Answers
What are the main topics discussed in the interview with Professor Yuval Noah Harari?
The talk covers AI's development, potential for AI consciousness, misconceptions about intelligence and free will, and AI's influence on society and culture.
What are Professor Harari's concerns about AI's role in society?
He expresses concerns about AI's ability to replace human interaction in areas like religion, creativity, and governance due to inherent biases and lack of consciousness.
What is Professor Harari's connection to India?
Professor Harari frequently practices Vipassana meditation in India and explores its cultural and spiritual impacts in his works.
What does Harari suggest about the relationship between AI and human culture?
He suggests that as AI becomes key in decision-making, humanity might need to adapt its narratives and power structures to maintain relevance.
What themes does Harari discuss in relation to AI's evolution and human history?
His book Nexus explores technological impacts on human society, cultural narratives, and ethical considerations like truth vs. fiction.
- 00:00:00 [Opening montage] From amoeba to dinosaur it took some time, it took billions of years of the history of the universe. The AI we know today is like the amoeba, but AI will continue to develop for thousands of years, for millions of years, and it is much faster than the process of organic evolution. What if they develop consciousness and empathy, and some of us start realizing that so far we were worshiping a false god? Even if AIs don't have consciousness, they can easily fool us and make us believe that they feel things. Whatever computers can do, they need our command; but if they are developing human intelligence, then probably that is the next chapter. An atom bomb is so much power, but the power is actually in human hands. AI is able to decide things by itself and is able to invent new things by itself. One of the terms you mentioned in the book is "computer politics": the easiest way to grab your attention and to keep you constantly engaged with the platform is to press the hate button or the greed button or the fear button in our mind. The first lesson we should teach, or feed to the algorithm, is that you can commit mistakes. Exactly. Like, you create an AI to screen job applications, and if in the present moment there is bias against women and you train on this data, it will learn, "Oh, women are not good for this job," and the AI will repeat the same bias in the future. The ultimate question is which story we believe in. All our life, over thousands of years, we have believed in biological dramas: two males fighting over a female. You have it in humans, you have it in dogs, in elephants, in chimpanzees; you have it in the Ramayana, in Shakespeare, in Hollywood and Bollywood. Every alpha has to remember: AI is the ultimate alpha. Now the big question, and the last question: how do we go ahead on this? So let me explain.
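One of the montage quotes above describes how an AI trained to screen job applications on biased historical data simply repeats that bias. As a rough illustration (a toy sketch, not code from the interview; the data and function names are hypothetical):

```python
# Toy sketch: an "AI" screener that learns from past hiring decisions.
# If the historical data is biased against one group, the learned rule
# reproduces that bias -- exactly the dynamic Harari describes.

from collections import defaultdict

# Hypothetical historical decisions: (gender, was_hired).
# Skills are assumed identical; past managers simply hired fewer women.
history = ([("M", True)] * 80 + [("M", False)] * 20 +
           [("F", True)] * 30 + [("F", False)] * 70)

# "Training": estimate P(hired | gender) from the biased record.
counts = defaultdict(lambda: [0, 0])  # gender -> [times hired, total seen]
for gender, hired in history:
    counts[gender][1] += 1
    counts[gender][0] += hired

def screen(gender, threshold=0.5):
    """Recommend an interview if the historical hire rate clears the bar."""
    hired, total = counts[gender]
    return hired / total >= threshold

print(screen("M"))  # True  -- men pass the screen (80% historical rate)
print(screen("F"))  # False -- women rejected (30% rate): bias repeated
```

The point of the sketch: the model is not "objective"; it faithfully reproduces whatever pattern, fair or unfair, is present in its training data.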
- 00:02:32 [Title card] Writer, Professor Yuval Noah Harari
- 00:02:49 Dwivedi: Sir, thank you so much for joining us.
Harari: It's my pleasure, thank you.
Dwivedi: What a wonderful book. We'll keep giving references to Sapiens, Homo Deus and 21 Lessons for the 21st Century, but first of all I want to understand your India connection, because the last chapter of your third book talks about Vipassana.
Harari: Yeah, I come every year to do a Vipassana course. I learned Vipassana 24 years ago from S. N. Goenka. I was doing my PhD studies in history at the time, in England, in Oxford, and I was trying to understand the world and to understand life from all the books, and I couldn't find really deep answers. And there was a friend who told me, for one year, "You have to go and do Vipassana." And I thought, this is nonsense, this is some kind of superstition, some kind of mystic practice. I'm a man of science, I don't do mysticism. Eventually he convinced me, and I went, and it was absolutely amazing. There was no mysticism at all; it was the most scientific thing, just to observe the reality from moment to moment, like a scientific experiment. You want to understand your mind? You will not understand your mind just by reading books: observe your mind, observe the breath coming in and out of the nostrils. It really struck me that they gave this simplest exercise, just feel the breath coming in and feel the breath going out, that's it. It sounds simple; I couldn't do it for more than ten seconds before my mind would go to some fantasy, some memory. And this is how I really started to understand how the mind works, that I have no control over it. You can order your eyes to shut, you can order your legs to lie down; you cannot order your mind to stop. We have very little control of the mind. And since then, for 24 years, I've been practicing Vipassana, and almost every year (during Covid it was not possible) I come to India to do a long course of maybe 30 or 60 days, to get to know the mind better.
Dwivedi: Nice, and we are getting to know
about your mind through your books. And surprisingly, in this book, Nexus, there are a lot of India references. You talked in detail about the Ramayana, you talked about the Swachh Bharat Mission, the cleanliness drive being run by Prime Minister Narendra Modi, you talk about Dalit atrocities and the caste system of India, and on and on. But you primarily talk about truth versus reality, truth versus order, and the process of myth-making. We'll come to that part later, but please tell us about truth versus reality, because we often get confused.
Harari: Well, you know, people almost never are in touch with reality. We are in touch with stories about reality, we are in touch with images of reality, and they can be very distorted. And again, despite the big effort we make to know what the truth is, very often people come to conflicting answers about it. And if you look at the deep causes of the conflict in the world, of all the hatred and anger and so forth: a lot of people believe that there is something wrong in the human being, which is why we are doing so many bad things. But actually most people are good. The problem is, if you give good people bad information, if you give good people bad images of reality, then they make very bad decisions. And what I explore in the book is how come, over thousands of years, we have developed so much as a species, humankind, but our information did not become any better. People today, in the modern age, can believe the most terrible things, the most hateful things, just like thousands of years ago. So no matter how much technological progress we have made, we have still not made significant progress in getting better information and understanding the reality better.
Dwivedi: And why is that? Because we assume that
rationality will keep progressing and we'll be able to make wise decisions. I mean, "Homo sapiens": the nomenclature suggests so. But as this book says, even though there is rapid progression, we are continuously making disastrous choices as well, and we are still believing in all those theories.
Harari: Because most information isn't the truth. A lot of people think, "If I have more information, I will know more," but this is not the case. The truth is actually a very small and rare subset of all the information in the world. Most information is junk, not the truth. Why? Because the truth, first of all, is costly. If you want to write a truthful account of anything, of biology, of history, of physics, you need to do research, you need to collect evidence and analyze it. This takes time and energy and money; it's expensive to produce the truth. In contrast, fiction and fantasy are very cheap: you just write anything you want. You don't need to do any research; it doesn't take time, doesn't take money. So the truth is costly, fiction is cheap. The truth is often complicated, because reality is complicated, whereas fiction can be made as simple as you would like it to be, and people usually prefer simple stories. Finally, the truth is often painful. There are many things we don't want to know about ourselves, about our life, about the world, about our nation. The truth is painful. Sometimes fiction can be made as pleasing, as flattering, as you would like it to be. So in a competition between the costly, complicated and painful truth and the cheap, simple and flattering fiction, fiction tends to win. So most of the information that is flooding the world is not the truth, and as people invent more and more sophisticated information technology, they just flood the world with more fiction, which doesn't make us any wiser, which doesn't give us knowledge. We need to make a special effort beyond just collecting information. You know, what you see now in the world is that people are flooded by far too much information, and we need a kind of information diet. The same way that people have a food diet: people know too much food isn't good for me, junk food isn't good for me, because I actually need time to digest the food, not just to eat it. It's the same with information. Too much information is not good for the mind. We need some of it, of course, but we actually need more time to digest and analyze and meditate over the information, to tell the difference between truth and fiction. So in this sense we need an information diet.
Dwivedi: You are talking about
truth, and this is reminding me of one of the terms you mentioned in the book, "computer politics," which I found very interesting. You are talking about algorithms, you are talking about the interconnectivity of computers: how they are making choices while playing a game with a human being, how they are deriving something very new and out of the box, how they are decoding that CAPTCHA image puzzle, which is a very new thing for them, and how they are tricking a human into believing that they are not robots but a human in distress. And that's very alarming in the long run.
Harari: Yeah. We are now in the midst of maybe the most important technological revolution in history, which is the invention of AI. AI is different from every previous technology because it is not a tool, it is an agent. It is something that can make independent decisions and can come up with new ideas by itself. All previous technology, if you think about even something very powerful like an atom bomb: an atom bomb is so much power, but the power is actually in human hands, because it is only a human that decides what to do with the atom bomb. An atom bomb cannot decide by itself which city to attack; an atom bomb cannot invent a new kind of bomb or something else. It has no capacity to invent things. AI is able to decide things by itself and is able to invent new things by itself. So one famous example, which you mentioned, is that two years ago OpenAI, one of the leading AI companies, developed this new AI, GPT-4, which now everybody is using in ChatGPT and all these things. So they developed GPT-4 and they wanted to know what this thing can do, so they gave it, as a test, the goal of solving a CAPTCHA puzzle. A CAPTCHA puzzle is a visual puzzle that is meant to tell the difference between a human and a computer. Like, when you access a web page, like a bank account, the bank wants to know: is it a human or is it a bot? So they show you this image of maybe some twisted letters or whatever, and you have to say what these letters are. This is the CAPTCHA puzzle. Now, GPT-4 could not solve the CAPTCHA puzzle by itself, but it was given access to a web page, TaskRabbit, where you can hire people to do things for you. So GPT-4 tried to hire a human being to solve the CAPTCHA puzzle for it. Now the human got suspicious, so the human wrote back and asked, "Why do you need somebody to solve a CAPTCHA puzzle for you? Are you a robot?" The human asked the important question: are you a robot? And GPT-4 answered back, "No, I'm not a robot, I'm a human being, but I have a vision impairment which makes it difficult for me to see the CAPTCHA puzzle. This is why I need your help." And the human was fooled, and believed it, and solved the puzzle for it. So it achieved its goal.
- 00:14:07 Now what this simple example shows is the two remarkable abilities of AI. First, to make decisions by itself: nobody told GPT-4, "Lie to that human." It needed to make a decision, do I tell the truth to the human or do I lie, and by itself it decided, "I will lie, because I need to solve the CAPTCHA puzzle, I need to achieve the goal." Secondly, it shows the ability of the AI to invent new things: nobody told GPT-4 what lie would be most effective. It invented the lie that "I'm a human with a vision impairment" all by itself, and it was a very effective lie. Now, this is a very, very small incident, but it tells us that what we are creating is agents able to invent things and make decisions. And we are now creating millions, billions of these AI agents, and they are everywhere, increasingly making decisions about us. If you apply to a bank to get a loan, it's an AI deciding whether to give you a loan. Increasingly these AIs are writing texts, they are creating music, they may be able to create entire movies. So increasingly we will live in a world which is the product not of the human mind and our inventions, but the product of the intelligence of AI.
Dwivedi: Professor, a lot of futurists who talk about the Singularity are saying that maybe AI will solve the problem: it will be more objective, more data-driven, and there will be less probability of it committing mistakes, because the biases will be fewer. But you are categorically asserting in your book that computers do have deep biases, and you have examples. At the same time, in the later part of the book, you are talking about their capability of being creative and having human intelligence, even above human intelligence. But in that case it's more challenging, because we always believed that whatever computers can do, they need our command; they cannot be creative like a human being, and they cannot have human values like compassion, consciousness, etc. But if they are developing human intelligence, then probably that is the next chapter; we don't know.
Harari: We already know that AIs are
becoming even more creative than us in different fields. One famous example was already eight years ago, in 2016. One of the most important turning points in the history of AI was a game of Go between a computer program, an AI called AlphaGo, and the world champion, Lee Sedol. Now, Go is a board game, a strategy game, a bit like chess but much more complicated than chess, that was invented in ancient China more than 2,000 years ago. And for more than 2,000 years it was considered one of the art forms that every accomplished person in East Asia needed to know: how to play Go. So tens of millions of Chinese and Koreans and Japanese played Go for thousands of years; entire philosophies developed around the game. It was seen as a metaphor for life: if you know how to play Go, you will know how to rule a country.
Dwivedi: Even life lessons.
Harari: Yes. So entire philosophies developed around different ways to play Go. And then, in 2016, AlphaGo played against Lee Sedol, and it completely defeated the human champion. But the important thing was how it played: it invented a completely new way of playing the game. In the beginning, when it started making these moves, all the commentators said, "Oh, this machine is so stupid, nobody plays Go like this, this is a stupid move."
Dwivedi: What a blunder.
Harari: Yeah, "what a blunder, what a mistake." It turned out this was brilliant. Now, for thousands of years, tens of millions of people played this game, and nobody ever thought of making such moves. You can think about it like a metaphor: there is a landscape, a geography of Go, all the different ways you can play Go. Think about it like a planet of all the different ways you can play Go. Humans thought that they had explored the whole planet: "Oh, we know all the ways to play Go." Actually, it turned out humans were stuck on one island on this planet, and they didn't know all the other ways to play Go. Then AlphaGo came and started exploring all these unknown areas of the planet Go, all these unknown ways to play Go. So it completely revolutionized the way that Go is being played today, and it turned out the computer was more creative than the human being. Now this is likely to happen
in more fields. You can say, "Oh, this is just a game, it doesn't matter," but this is likely to happen in more fields. It could happen in finance: if you give AI control of finance, in banks, in investments, AIs will start inventing new ways to invest money, new financial devices, which are more creative than human bankers. Maybe it even happens in fields like art, as they write new music and new poetry. And even in religion. Think about, you know, a religion like Judaism, which I know best. Judaism is a religion of texts: there is a holy book, the Bible, and then people read it and write more texts about the Bible, and then they write more texts about the texts about the Bible. Over thousands of years, Jews wrote thousands of texts which are all now considered holy, sacred to some extent. Now, there are so many texts in Judaism that no human being is able to read and remember all of them, but AI can. Even the most genius rabbi cannot read all of them, but AI can read all the texts of Judaism, remember every word in every text, and find connections, patterns, in the holy books that escape the human mind. Now, what happens if AI starts coming up with new interpretations of the Bible and new ideas in Judaism? Would people believe the AI, or would they say, "No, no, no, computers have nothing to do with religion"? What happens if you go online and you have a question about the Bible or about religion, and you get an answer, and you don't know: is this coming from an AI, or is this coming from a human rabbi or a human priest? Now, if people think that texts are sacred, and AI is better than humans at reading and writing texts, then AIs could take over religion.
Dwivedi: Religion in the first place claimed to give answers to all of our queries, from where did we come from to what is the purpose of this life. Later, science started making that claim, on the basis of a few solid, robust mechanisms: that you can repeat these tests and so on. Are you saying that in the near future there's a possibility that AI, while carefully reading all our choices, all our biases — because it is relying on the digital bureaucracy and it is reading all our imprints — will create a theory which will give satisfaction to our spiritual quest as well? They can produce it from all the available texts of all the religions, and they know our biases, what we will like, the language, the music and everything, and they can very judiciously mix in science as well, to satisfy us, and then they can become the super cult.
Harari: Yeah.
- 00:22:29that again if they go in a bad Direction
- 00:22:32they can learn how to manipulate Us and
- 00:22:35how to manipulate our feelings how to
- 00:22:38manipulate our beliefs more efficiently
- 00:22:42than any human being was ever able to
- 00:22:45manipulate other people so this is one
- 00:22:48of the big dangers that it will learn
- 00:22:52how to basically hack the human mind and
- 00:22:56how to manipulate human beings on a
- 00:22:58scale which was previously
- 00:23:01impossible you know throughout human
- 00:23:04history um dictators and tyrants and all
- 00:23:08kinds of governments they always wanted
- 00:23:11to spy and to monitor everybody to know
- 00:23:16what everybody is doing to control all
- 00:23:18the people yeah you have given that
- 00:23:20Romanian yeah scientist exactly and
- 00:23:24until today even the most totalitarian
- 00:23:27governments could wouldn't actually spy
- 00:23:30on everybody all the time because it was
- 00:23:32impossible like if you live even in the
- 00:23:35Soviet Union in the time of Stalin so
- 00:23:38Stalin wants to follow everybody all the
- 00:23:40time but he doesn't have enough agents
- 00:23:43secret police agents because if you you
- 00:23:46know if if the government wants to know
- 00:23:47what you're doing 24 hours a day they
- 00:23:50need at least two agents because they
- 00:23:52need to sleep sometimes the agents they
- 00:23:54need to eat sometime so let's say two
- 00:23:56agents are following you 24 hours a day
- 00:23:59the same way they want to follow
- 00:24:00everybody in the country it is
- 00:24:02impossible and then they have to follow
- 00:24:03the agents as well yes they don't have
- 00:24:05enough agents and even if somebody is
- 00:24:07following you 24 hours a day at the end
- 00:24:10of the day they collect information
- 00:24:12about you they write some paper report
- 00:24:14you went there you read this book you
- 00:24:16met this person somebody needs to
- 00:24:19analyze all the information the agents
- 00:24:22the secret agents are gathering on you
- 00:24:25in order to make sense of it now they
- 00:24:27don't have enough analysts to do it need
- 00:24:30millions and millions of human beings to
- 00:24:33analyze all the information it's
- 00:24:34impossible so even in a totalitarian
- 00:24:38state like the Soviet Union most people
- 00:24:41had privacy it was not possible for the
- 00:24:44government to follow them all the time
- 00:24:46now with AI it is becoming possible
- 00:24:49because a phone is everywhere yeah the
- 00:24:50phone is everywhere there are
- 00:24:52microphones there are cameras there
- 00:24:53drones everywhere so you can follow
- 00:24:55everybody all the time you don't need
- 00:24:57human agents for that and you don't need
- 00:25:00human analysts to analyze all the
- 00:25:03information like if you're going and the
- 00:25:05camera sees you went there and the phone
- 00:25:07is recording your conversation you don't
- 00:25:09need a human analyst to actually look
- 00:25:12at all the images and analyze all the
- 00:25:14text AI is doing it automatically so the
- 00:25:18danger is that it is becoming possible
- 00:25:20to create a total surveillance regime in
- 00:25:24which a dictatorship would just follow
- 00:25:27everybody all the time and would be
- 00:25:28able to control and to manipulate people
- 00:25:31all the time at the end of the day
- 00:25:34everybody wants to predict the
- 00:25:35behavioral pattern so that they can gauge
- 00:25:38what is the next move you're going
- 00:25:39to make yeah now with large data they
- 00:25:41are able to do so but at the same time
- 00:25:44they are altering our reality because
- 00:25:46they are the one who are choosing on our
- 00:25:48behalf and we are under this false
- 00:25:50assumption that we made this Choice
- 00:25:52exactly you gave the example of say uh
- 00:25:55Myanmar 2016-17
- 00:25:58the Rohingya crisis and then how the
- 00:26:03hate-filled videos flooded the
- 00:26:06Facebook stream and how it created the
- 00:26:10unrest and uh then you tell us that
- 00:26:13because there was one simple instruction
- 00:26:15to the
- 00:26:16algorithm increase the engagement and
- 00:26:19they forgot to differentiate between
- 00:26:22love hate Compassion or any other thing
- 00:26:25how do you look at it because at one
- 00:26:26point of time you were talking about
- 00:26:28that computers do not understand
- 00:26:30doublespeak where you were giving the Russian
- 00:26:32bot example yeah how should we look at
- 00:26:34it because all these big data companies
- 00:26:37who are integral part of our life they
- 00:26:39are clearly not responsible enough yeah
- 00:26:43what we see with social media in the
- 00:26:45last few years that it is spreading
- 00:26:48hatred and fear and anger which creates
- 00:26:51political extremism and violence and
- 00:26:54instability all over the world now what
- 00:26:57is actually
- 00:26:59happening what is happening is that the
- 00:27:01companies have an economic interest in
- 00:27:04keeping people use their products more
- 00:27:08and more the more time you spend on Tik
- 00:27:11Tok or Facebook or Twitter or any of
- 00:27:14these social media the more money the
- 00:27:16company makes because it shows you more
- 00:27:19advertisements and it can collect more
- 00:27:21data from you and use that data or sell
- 00:27:24it to somebody else so their aim is to
- 00:27:26keep you on the platform
- 00:27:29longer now these
- 00:27:33platforms they are not managed by human
- 00:27:36beings they are managed by algorithms
- 00:27:39when you watch some video who chose to
- 00:27:42show you this video Not a Human Being
- 00:27:44but an algorithm and the companies gave
- 00:27:48the algorithms one goal increase user
- 00:27:52engagement which means make people spend
- 00:27:55more time on the platform and also
- 00:27:57engage with it more like share it with
- 00:28:00friends and send it to others so that more
- 00:28:02people will join and also be on the
- 00:28:05platform now the algorithms experimented
- 00:28:08on millions of human beings and they
- 00:28:11discovered that the easiest way to grab
- 00:28:15your attention and to keep you
- 00:28:17constantly engaged with the platform is
- 00:28:19to press the hate button or the greed
- 00:28:22button or the fear button in our mind if
- 00:28:25we see something that
- 00:28:27makes us very angry we engage with it
- 00:28:31we can't look away if we see something
- 00:28:33that scares us we can't look away and
- 00:28:37you know this goes back millions of
- 00:28:40years in
- 00:28:41evolution that there is a mechanism in
- 00:28:44the human brain that if you see a danger
- 00:28:48all your attention goes there and you
- 00:28:50cannot look away yeah because you know
- 00:28:52in nature this is good for
- 00:28:55you because if you go in the forest you
- 00:28:58look for something to eat and there is a
- 00:29:00snake somewhere even if it's just
- 00:29:04some movement you see all your attention
- 00:29:06goes there survival Instinct and you see
- 00:29:08the snake and you don't look away you
- 00:29:10only look at the snake because you know
- 00:29:12if you look away maybe it kills me so we
- 00:29:14have a mechanism in the brain if we see
- 00:29:16something dangerous immediately all the
- 00:29:19attention goes there now when you go in
- 00:29:21the forest most of the time there is no
- 00:29:24snake mhm so you look here you look
- 00:29:26there there is a tree there is Bush
- 00:29:28there is a bird and every now and then
- 00:29:30okay there is a snake but when you look
- 00:29:32on social media it's always snake snake
- 00:29:36snake snake because they figured out
- 00:29:39that the easiest way to grab our
- 00:29:41attention is to show us something that
- 00:29:43frightens us yes and they also
- 00:29:46discovered what frightens different
- 00:29:47people because what frightens me may not
- 00:29:50be frightening to you MH so the
- 00:29:52algorithms of social media they show you
- 00:29:54different things they monitor how you
- 00:29:56react they discover oh you're
- 00:29:59afraid of this we'll show you more of
- 00:30:01this so if you're afraid of some
- 00:30:04political party you're afraid of
- 00:30:05imigrants you're afraid of this group of
- 00:30:08that group they discover this and the
- 00:30:10algorithms show you more and more and
- 00:30:12you become more and more frightened and
- 00:30:15you stay longer and longer and they
- 00:30:17completely put the onus on us that you
- 00:30:18are making this Choice that's why we are
- 00:30:20showing this exactly and when you
- 00:30:22complain about it they say nobody is
- 00:30:24forcing you to do it if you want you can
- 00:30:26just put it away we are not forcing anybody
- 00:30:29but actually what is happening is that
- 00:30:32they hacked the operating system of
- 00:30:35humans you know how do you hack a
- 00:30:37computer you find the weakness in the
- 00:30:39code and you use the weakness to hack
- 00:30:42the computer it's the same with human
- 00:30:44beings they found the weakness in our
- 00:30:47code and they use it to hack our brains
- 00:30:51and the thing is that you know people a
- 00:30:55lot of people have a strong belief in
- 00:30:56free will everything I do this is my
- 00:30:59free will so if I choose to just keep
- 00:31:02seeing this on social media this is my
- 00:31:04free will actually the easiest people to
- 00:31:07manipulate are those who have a strong
- 00:31:10belief in free will because if you
- 00:31:12believe very strongly everything I do is
- 00:31:15my free will then you don't even think
- 00:31:18that maybe you are manipulated in some
- 00:31:21way now on a philosophical level belief
- 00:31:26in free will if you just take it like
- 00:31:27like a blind faith this creates a
- 00:31:30problem because it makes you not very
- 00:31:33curious to understand how you make
- 00:31:35decisions there is nothing to
- 00:31:37investigate I just decide freely to buy
- 00:31:40this to go there to do
- 00:31:42that um actually I would say that don't
- 00:31:47believe in free
- 00:31:48will investigate how you make
- 00:31:52decisions using science using meditation
- 00:31:55using psychology try to really
- 00:31:58understand why do you choose this and
- 00:32:00not that if after all your
- 00:32:03investigations you will find that there
- 00:32:06is still something that cannot be
- 00:32:08explained by any other way then you can
- 00:32:10say okay this is free will but don't
- 00:32:13start with the assumption that I have
- 00:32:15free will anything I decide is my free
- 00:32:18will because if you investigate you will
- 00:32:21discover that a lot of your decisions
- 00:32:23are actually not free will they are
- 00:32:26influenced they are shaped by economic
- 00:32:30forces biological forces cultural forces
- 00:32:34freedom is not something we have freedom
- 00:32:36is something we have to struggle
- 00:32:39for if we learn how to protect ourselves
- 00:32:43from all these different kinds of
- 00:32:44manipulations then we have much more
- 00:32:47freedom then comes the idea of Justice
- 00:32:51because in the historical course we have
- 00:32:53seen how
- 00:32:55alignment changed the shape of history
- 00:32:58somebody made a choice that when we'll
- 00:33:00present the final version of the Bible these
- 00:33:03are the parts which will be included and
- 00:33:04these will not be part of it and how it
- 00:33:07led to gender inequality and other parts
- 00:33:11and you stress a lot on the canonization
- 00:33:13of algorithm yes what are the basic
- 00:33:16human principles we need to keep in mind
- 00:33:19while working on this
- 00:33:22canonization so the process of
- 00:33:25canonization we are familiar with the in
- 00:33:28history from the creation of holy books
- 00:33:31like if you consider the Bible some
- 00:33:32people think oh it just came down from
- 00:33:34heaven like this in this book but we
- 00:33:37know from history this is not how it
- 00:33:39happened in the first centuries of
- 00:33:43Christianity Christians wrote a very
- 00:33:46large number of texts all kinds of
- 00:33:49stories about Jesus and prophecies and
- 00:33:51things like that in the time of Jesus
- 00:33:54himself there was no new testament mhm
- 00:33:57he didn't write it and even
- 00:34:00centuries afterwards there was still no
- 00:34:03New Testament no
- 00:34:04Bible so people were reading different
- 00:34:07things and eventually the church decided
- 00:34:11that we need one book which has the most
- 00:34:15important texts of Christianity so
- 00:34:19people will all read the same book and
- 00:34:22they set up a committee to go over all
- 00:34:25the texts and decide what will be in and
- 00:34:29what will stay out and this committee
- 00:34:32chose 27 texts that became what we
- 00:34:36know today as the New Testament the
- 00:34:39second part of the Bible you have the
- 00:34:41Old Testament and you have the New
- 00:34:43Testament and these were human
- 00:34:46beings making the decision what to
- 00:34:49include and what not to include uh and
- 00:34:53this decision had very far-reaching
- 00:34:55consequences for instance one very
- 00:34:58popular text in early Christianity was
- 00:35:01called the Acts of Paul and Thecla it told
- 00:35:05the story of St Paul and his female
- 00:35:08disciple Thecla this was a holy woman who
- 00:35:12performed Miracles and led the community
- 00:35:15and preached and she was an example that
- 00:35:18women can be leaders in the church there
- 00:35:21was another text uh also attributed to
- 00:35:26the same man St Paul in which St Paul
- 00:35:29says that women cannot be leaders at all
- 00:35:32they should just obey the men and be
- 00:35:34silent and this is their role is to obey
- 00:35:37the men and to raise
- 00:35:39children and these two texts were in
- 00:35:42conflict and this
- 00:35:44committee uh of the church they chose to
- 00:35:48include the text uh that says that women
- 00:35:52should just obey men this is still in
- 00:35:54the Bible it is the first epistle to
- 00:35:56Timothy whereas the Acts of Paul and
- 00:35:59Thecla which says that women can be
- 00:36:01leaders they said no no no no no we
- 00:36:03don't put this in the Bible this we leave
- 00:36:05outside and this choice of what to put
- 00:36:08in and what to leave outside this shaped
- 00:36:12the view of Christians about women for
- 00:36:15thousands of years until today because
- 00:36:17now people say oh it's in the Bible so
- 00:36:19we believe what's written in there this
- 00:36:21is the process of
- 00:36:23canonization of taking certain texts and
- 00:36:28deciding this is now the holy book the
- 00:36:31same thing is happening today with
- 00:36:34AI that you know for a lot of people
- 00:36:38ChatGPT becomes like the Bible whatever they
- 00:36:42want to know they just ask the AI and
- 00:36:45whatever the AI says they just believe
- 00:36:48and those biases continue like that uh
- 00:36:51experiment in one of the e-commerce
- 00:36:53platforms where they had given AI this
- 00:36:56job of shortlisting the CVs and they
- 00:36:59took out the women's CVs yes so AIs can be
- 00:37:03biased AIS can be biased against women
- 00:37:06for instance why because AIS are trained
- 00:37:09on existing data we need to think about
- 00:37:12AI it's a bit like a child when you
- 00:37:15create an AI the AI doesn't know much
- 00:37:18about the world baby algorithm yeah baby
- 00:37:20algorithm it has the
- 00:37:23ability to learn things about the world
- 00:37:26but in order to learn you need to give
- 00:37:28it data the same way that a baby grows
- 00:37:32up it has different experiences in the
- 00:37:34world it learns so AI also has this
- 00:37:37period of childhood when it is given all
- 00:37:41kinds of data and this is how it learns
- 00:37:43about the world now if you give the AI
- 00:37:46data which is biased against women the
- 00:37:49AI will develop a bias against women
- 00:37:53like you create an AI to screen job
- 00:37:56applications and if in the present
- 00:37:59moment there is bias against women like
- 00:38:02it's more difficult to for women to get
- 00:38:04a job then the AI you train on this data
- 00:38:08it will learn oh women are not good for
- 00:38:11this job and the AI will repeat the same
- 00:38:14bias in the future and if people think
- 00:38:17ah but the AI is perfect the AI makes no
- 00:38:20mistakes it has no biases again like
- 00:38:22like the three the holy book then it's a
- 00:38:25very dangerous situation
- 00:38:28when you created a very biased system
- 00:38:31and people just have Blind Faith in it
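[Editor's note] The CV-screening failure described here can be reproduced in miniature. A hypothetical sketch, assuming deliberately synthetic data and a trivially simple model — not the e-commerce company's actual system: a "baby algorithm" trained on past hiring decisions that were biased against women simply repeats that bias in the future.

```python
# Synthetic records: (years_experience, is_woman, was_hired). In the
# historical data, equally qualified men were hired but women were not.
training_data = [
    (5, 0, 1), (6, 0, 1), (4, 0, 1),   # men: hired
    (5, 1, 0), (6, 1, 0), (4, 1, 0),   # equally qualified women: rejected
]

def train(data):
    """Estimate P(hired | is_woman) directly from the historical record."""
    rates = {}
    for group in (0, 1):
        outcomes = [hired for _, woman, hired in data if woman == group]
        rates[group] = sum(outcomes) / len(outcomes)
    return rates

def screen(model, is_woman):
    # The model just repeats the pattern it was trained on: True = shortlist.
    return model[is_woman] >= 0.5

model = train(training_data)
print(screen(model, is_woman=0))  # True: men get shortlisted
print(screen(model, is_woman=1))  # False: equally qualified women do not
```

The model never saw a rule "reject women"; it inferred one from biased data — which is why treating its output as infallible, like a holy book, is the dangerous step.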
- 00:38:34and you are advocating in the book that
- 00:38:36the first lesson we should teach or feed
- 00:38:39to the algorithm is that you can commit
- 00:38:41mistakes exactly and you have given that
- 00:38:45example of Nick's paper clip yeah this
- 00:38:48is a famous thought
- 00:38:51experiment of philosopher Nick Bostrom
- 00:38:53this was from like 10 years ago that he
- 00:38:56says what happens if you have
- 00:38:58this super intelligent Ai and you give
- 00:39:01it a goal like let's say you have a
- 00:39:05paper clip Factory a factory making
- 00:39:07paper clips and the factory gets an AI
- 00:39:11buys an AI and tells the AI your goal is
- 00:39:15to create to make as many paper clips as
- 00:39:18possible this is the goal sounds okay
- 00:39:22but then what the AI does it takes over
- 00:39:24the whole world and turns the whole
- 00:39:26world into a paper clip Factory it kills
- 00:39:29all the humans because it realizes hey I
- 00:39:32need to make more paper clips these
- 00:39:34humans they consume all the water and
- 00:39:37electricity and and and they would not
- 00:39:39like if I take it to make paper clips so
- 00:39:42I should kill
- 00:39:43them and this is again a philosophical
- 00:39:46thought experiment so in this thought
- 00:39:48experiment the paperclip AI
- 00:39:52destroys the whole world just to make
- 00:39:55paper clips because this was the goal it
- 00:39:58was given and it cannot question its
- 00:40:01goals now this sounds ridiculous until
- 00:40:04you think about what happened in social
- 00:40:06media which we just discussed which
- 00:40:09was exactly the same thing the social
- 00:40:12media algorithms were given the goal
- 00:40:16maximize user engagement make people
- 00:40:18spend as much time as possible on social
- 00:40:21media and pursuing this goal the
- 00:40:26algorithms spread so much hatred and
- 00:40:30violence and greed in the world
- 00:40:34because they pursued this simple
- 00:40:37goal so we need to develop AIS which are
- 00:40:42able to have doubts that are able not to
- 00:40:46be so sure that what they are doing is
- 00:40:49the correct thing the same thing with
- 00:40:51human
- 00:40:52beings we give them so much power if
- 00:40:56someone has enormous power and no
- 00:40:59ability to doubt themselves they're
- 00:41:02always convinced they are doing the
- 00:41:04right thing this is a very dangerous
- 00:41:06mixture Professor the ultimate question
- 00:41:09is which story we believe in all our
- 00:41:12life over thousands of years we have
- 00:41:15believed in biological dramas we were
- 00:41:18part of bureaucratic structure but we
- 00:41:20never really liked it there was always a
- 00:41:22sense of Doubt for that and you have
- 00:41:24beautifully in various chapters compared
- 00:41:26this and in the biological drama you
- 00:42:28have given a detailed example of Ram as
- 00:41:30well so please tell us about that why do
- 00:41:33we tend to uh gravitate towards
- 00:41:36biological drama is it something
- 00:41:38inherent or is it or is it something
- 00:41:40because in the civilization first we
- 00:41:43started walking on two feet then we
- 00:41:47discovered fire and then we I mean
- 00:41:50intestines they shrink and the Brain
- 00:41:53developed and then we developed language
- 00:41:55and imagination yeah which computer was
- 00:41:58thinking as double speak yeah but that
- 00:42:01was one of the reason of our progression
- 00:42:02as well because in totality we were able
- 00:42:04to convince a lot large group of people
- 00:42:06that this is one system or one faith we
- 00:42:08can believe in why do we tend to
- 00:42:10biological dramas so let me
- 00:42:13explain um humans are storytelling
- 00:42:16animals we think in stories and our
- 00:42:20power in the world ultimately comes from
- 00:42:23telling stories which sound strange but
- 00:42:27if you think what makes human being so
- 00:42:29powerful it's our ability to cooperate
- 00:42:31in very large numbers you know other
- 00:42:34animals chimpanzees elephants horses
- 00:42:38they can cooperate in small numbers like
- 00:42:4020 chimpanzees or 20 elephants can
- 00:42:42cooperate but 20 million cannot humans
- 00:42:46can cooperate in unlimited numbers like
- 00:42:49you have India today most populous
- 00:42:51country in the world more than 1.4
- 00:42:53billion people and they all cooperate
- 00:42:56how do you make
- 00:42:57millions of people who never met they
- 00:43:00complete strangers nevertheless
- 00:43:02cooperate by telling stories at the
- 00:43:04basis of every large scale cooperation
- 00:43:07you find stories could be religious
- 00:43:10stories like mythology could be
- 00:43:12political stories like ideology could
- 00:43:14even be economic stories money is also a
- 00:43:17story it has no objective value you know
- 00:43:20you can't eat money you can't drink
- 00:43:22money the value of money is that this is
- 00:43:24a story everybody believes
- 00:43:27now where do the plots of the stories come
- 00:43:31from and you would think that people are
- 00:43:34very imaginative we invent so many
- 00:43:36different stories but actually there are
- 00:43:39very very few stories that we tell again
- 00:43:42and again and again and who invented
- 00:43:45them not human imagination but nature
- 00:43:48biology these are the biological
- 00:43:51dramas which are common actually to most
- 00:43:56animals to most mammals like what are
- 00:43:59these biological dramas so one drama for
- 00:44:02instance is relation between parents and
- 00:44:05children and the children on the one
- 00:44:08hand they need to stay close to the
- 00:44:09parent to get protection but they also
- 00:44:12need to distance from the parent to
- 00:44:15explore the world so this is a basic
- 00:44:18drama in the life of every child but
- 00:44:22also of every puppy of every kitten of
- 00:44:25every small elephant or or monkey or
- 00:44:28giraffe is this I want to stay close to
- 00:44:31mother ah but I want to explore the
- 00:44:33world and you see this drama
- 00:44:38playing out even in religious
- 00:44:41mythologies like you have entire
- 00:44:44mythology saying that hell is being
- 00:44:47distant from God because this is what
- 00:44:50every little child feels that if I get
- 00:44:53too distant from my parents this is hell
- 00:44:56i'll be alone I'll have no
- 00:44:59protection uh similarly um and and
- 00:45:04another biological drama is two males
- 00:45:09fighting over a female so you know you
- 00:45:12have it in humans you have it in dogs
- 00:45:14you have it in elephants you have it in
- 00:45:16chimpanzees this is a biological drama
- 00:45:19so much of human mythology and Human Art
- 00:45:23just retells the story so you have it in
- 00:45:25the ramayana you have it in Shakespeare
- 00:45:27you have it in Hollywood and Bollywood
- 00:45:30they're taking the basic stories from
- 00:45:33biology and everybody gives it a little
- 00:45:36different twist so the story is not
- 00:45:38always exactly the same but the basic
- 00:45:42structure is always the
- 00:45:44same and this is how humans create
- 00:45:48stories and we find stories appealing
- 00:45:52because there is something biological
- 00:45:54deep inside us that makes us very
- 00:45:57interested in these kinds of stories
- 00:46:00like relations between parents and
- 00:46:01children or love triangles now ai is not
- 00:46:07biological it is not limited by these
- 00:46:10biological dramas so maybe it will come
- 00:46:13up with completely new stories the same
- 00:46:17way that it came up with new moves in
- 00:46:19the Go game and that it can come up with
- 00:46:22new Financial devices and create new
- 00:46:24inter subjective realities exactly and
- 00:46:26create completely new
- 00:46:29realities that might be very
- 00:46:32difficult for us to understand because
- 00:46:34again we are still animals we are
- 00:46:36organic and AI is not an animal it's not
- 00:46:40organic uh one of the big tensions today
- 00:46:44in the world is that humans as organic
- 00:46:48beings we work by Cycles you know day
- 00:46:51and night winter and summer time for
- 00:46:54activity time for rest AIS don't work by
- 00:46:58cycle they are not organic they never
- 00:47:01need to rest so if you think for
- 00:47:03instance about the financial system so
- 00:47:07the markets like Wall Street it
- 00:47:10traditionally worked by cycle the market
- 00:47:13is open in Wall Street only 9:30 in the
- 00:47:16morning to 4:00 in the afternoon Monday
- 00:47:18to Friday that's it because Bankers are
- 00:47:21humans investors are humans they need
- 00:47:24time to sleep they want to go on weekend
- 00:47:26with their
- 00:47:27reminding me of that example you have
- 00:47:29given where AI tells the dictator that
- 00:47:31your defense minister is going to do
- 00:47:33a coup so why don't you give me the order to
- 00:47:35kill him yeah so very funny but very
- 00:47:37realistic example yeah so just going on
- 00:47:40with the finance example and we get to
- 00:47:42that is that now as AI takes over the
- 00:47:45markets it never needs to sleep it can
- 00:47:47be on 24 hours a day it has no family
- 00:47:50doesn't want to go on vacation so as AI
- 00:47:53takes over the financial system it
- 00:47:55forces human bankers and investors and
- 00:47:59financial journalists also to try to be
- 00:48:02on all the time and if you if you force
- 00:48:05an animal to be active all the time
- 00:48:08eventually it collapses and dies and
- 00:48:11this is what is happening now all over
- 00:48:13the world that uh as the AI become more
- 00:48:17powerful our life become more and more
- 00:48:21hectic we cannot take rest we cannot
- 00:48:23slow down and this is very dangerous for
- 00:48:26us in your book you have discussed
- 00:48:28democracies you have discussed
- 00:48:30totalitarian regime and the various
- 00:48:33challenges it poses and how it will
- 00:48:34perceive elgo and you have said that
- 00:48:36elgo is more able to corrupt them
- 00:48:38because they are centralized past yeah
- 00:48:40it's easier to corrupt and dictatorship
- 00:48:42and in the end you said every Alpha has
- 00:48:44to remember the AI is the ultimate Alpha
- 00:48:46now the big question and the last
- 00:48:48question yes before we go for your
- 00:48:51recommendations you talk about truth and
- 00:48:53Order and you say there has to be right
- 00:48:55balance and you Advocate that there
- 00:48:57should be human institutions to monitor
- 00:49:01this yeah how do we go ahead on
- 00:49:04this Alpha AI people and most of the
- 00:49:07people they're like I have no connection
- 00:49:09with such advanced technology so maybe I
- 00:49:11should not be worried but you have given
- 00:49:13example of how a farmer in Myanmar or
- 00:49:15China had to worry about the scientific
- 00:49:19industrialization yeah in the
- 00:49:21UK you know the important thing is to
- 00:49:24simply have more people in the
- 00:49:27conversation expressing their views and
- 00:49:30their wishes and for
- 00:49:33this you don't need to understand
- 00:49:37computer science like a doctor you
- 00:49:41need some understanding of the AI
- 00:49:45Revolution and the main aim for me in
- 00:49:48writing this book was exactly to give
- 00:49:51more people around the world a better
- 00:49:54understanding of the AI Revolution so
- 00:49:56that they can join the debate and
- 00:49:59influence where it is going at present
- 00:50:02you have a very small number of people
- 00:50:05in maybe just two three countries mostly
- 00:50:08us and China they are developing the AIS
- 00:50:11they're making all the important
- 00:50:13decisions this is very unequal this is
- 00:50:16very
- 00:50:16dangerous I don't have all the answers
- 00:50:19myself also I'm not a politician nobody
- 00:50:23elected me so I have my views and what
- 00:50:26to be to be done but what is important
- 00:50:29is to Simply have more people from all
- 00:50:31over the world not just us and China
- 00:50:33that understand the AI Revolution and
- 00:50:36that join the debate and this is why I
- 00:50:40write these books this is why I give
- 00:50:41these interviews simply so that more
- 00:50:44people know what is happening and are
- 00:50:47able to express their views because the
- 00:50:50threat is closer as you gave example
- 00:50:52that from amoeba to dinosaur it took some
- 00:50:55time took billions of years billions of
- 00:50:57years sometime in the history of
- 00:50:59universe but this transformation will
- 00:51:02not take that much time just a few years
- 00:51:04or a few decades like the AI we know
- 00:51:06today is like the amoeba it's still very
- 00:51:09very primitive you know the whole AI
- 00:51:11Revolution is maybe 10 years old but AI
- 00:51:14will continue to develop for thousands
- 00:51:17of years for millions of years and it is
- 00:51:20much faster than the process of organic
- 00:51:23evolution it took billions of years from
- 00:51:25the tiny amoeba to get to the dinosaur
- 00:51:28GPT-4 or ChatGPT is like
- 00:51:33the amoeba of the AI World maybe it
- 00:51:36takes only 20 years until we have the AI
- 00:51:39dinosaurs what if they develop
- 00:51:41Consciousness and empathy and we some of
- 00:51:45us start realizing that so far we were
- 00:51:48worshiping false gods be it God be it
- 00:51:50science these ones are more full of
- 00:51:54justice more satisfying to all of our
- 00:51:57quests and probably we should put our
- 00:51:59faith in them I mean I mean and I mean
- 00:52:01the another example that when we
- 00:52:04domesticated animals in the course of
- 00:52:07thousands of years we started developing
- 00:52:09lots of love affection for our say dog
- 00:52:12or cat what if some people say and that
- 00:52:15happened in Google you yourself gave
- 00:52:17that example that no I'm so attached to
- 00:52:19this and I believe in this so you cannot
- 00:52:21kill it you cannot regulate it mhm yeah
- 00:52:25um and this is a very big philosophical
- 00:52:27question of what is the relation between
- 00:52:29Consciousness and intelligence there is
- 00:52:32a big confusion intelligence is the
- 00:52:35ability to solve problems like how to
- 00:52:38win a game how to develop a bomb how to
- 00:52:42make money this is a problems of
- 00:52:45intelligence Consciousness is the mind
- 00:52:50it's the ability to feel things like
- 00:52:52love and hate and pain and pleasure in
- 00:52:55humans Consciousness and intelligence
- 00:52:58they go together the way we solve
- 00:53:01problems is by having feelings now so
- 00:53:04far computers and AIS they have no
- 00:53:06consciousness no feelings but they have
- 00:53:09very high
- 00:53:10intelligence now will they also develop
- 00:53:13Consciousness at some point will when
- 00:53:16when they become very intelligent will
- 00:53:18they also develop Consciousness or not
- 00:53:21do they start feeling things like love
- 00:53:24and hate and pain and pleasure we don't
- 00:53:27know because we still don't understand
- 00:53:30Consciousness how it arises where it
- 00:53:32comes from the problem is even if AIS
- 00:53:36don't have Consciousness they can easily
- 00:53:39fool us and make us believe that they
- 00:53:44feel things why because they have such
- 00:53:47control of language it makes them more
- 00:53:49dangerous like if you ask an AI do you
- 00:53:53feel love the AI would say yes I can
- 00:53:56feel love and you say okay I don't
- 00:53:58believe you tell me how does love feel
- 00:54:00like so I know that you're not cheating
- 00:54:02me now ai has read all the love poetry
- 00:54:06that was written in thousands of years
- 00:54:07of history it can give you
- 00:54:11more sophisticated descriptions of
- 00:54:14love than anybody can give you it
- 00:54:16doesn't mean it actually feels love but
- 00:54:20it controls language and it has learned
- 00:54:24all the Poetry of the world so it can
- 00:54:26easily deceive you now already today
- 00:54:30people are developing relationships with
- 00:54:33AIS and starting to think that the AIS
- 00:54:36have feelings yes the AI really loves me
- 00:54:39and the reason is that there is
- 00:54:41commercial interest in making AIS that
- 00:54:45can deceive people that can manipulate
- 00:54:48people into thinking that they are
- 00:54:50conscious that they like that fellow who
- 00:54:52entered Buckingham Palace yeah yeah all
- 00:54:54right so we have limitation of time uh
- 00:54:57before we end this episode book
- 00:55:00recommendations Professor we all read
- 00:55:02your books I want to know what Professor
- 00:55:04yal is reading these days five books huh
- 00:55:08five books you read in 2024 and you
- 00:55:10really liked it though you have given so
- 00:55:11many examples in Nexus so one
- 00:55:14very good book it's called The MANIAC by
- 00:55:17Benjamin Labatut and it's a mixture of
- 00:55:20fiction and non-fiction also about
- 00:55:23computers and Ai and their development
- 00:55:25it was a very very good book uh another
- 00:55:29very good book about social media it's
- 00:55:31called The Chaos Machine okay how social
- 00:55:35media creates all this chaos in the
- 00:55:39world I also read it's quite an old book
- 00:55:42but I just read it now a book by
- 00:55:45Jonathan Franzen called uh the
- 00:55:48Corrections uh this is a very good novel
- 00:55:51very good U uh fiction
- 00:55:53book um what else I read so many books I
- 00:55:57read a biography of the emperor
- 00:55:59Justinian the Byzantine emperor um
- 00:56:03it was a very good book it stayed in my
- 00:56:05mind so it made an impression um yeah all
- 00:56:10right thank you so much it was an honor
- 00:56:13and privilege to interact thank thank
- 00:56:15you thank you so
- 00:56:18much professor
- 00:56:51Professor best wishes thank
- 00:56:54you so much thank you so much
- AI evolution
- AI consciousness
- truth vs fiction
- storytelling
- AI in society
- technological development
- cultural impact
- bias in AI
- human-AI interaction
- Vipassana meditation