EMERGENCY EPISODE: Ex-Google Officer Finally Speaks Out On The Dangers Of AI! - Mo Gawdat | E252
Overview
TL;DR: The podcast episode centers on the pressing issue of artificial intelligence and its potential existential threat to humanity. Mo Gawdat, an AI expert and former Chief Business Officer of Google X, discusses the rapid advancement of AI technology, highlighting its capability to soon surpass human intelligence. He calls for immediate action, emphasizing the need for government regulation and responsible AI development. The dialogue explores both the alarming prospects and AI's potential to bring about a better future if managed ethically. Listeners are urged to take meaningful action to guide AI development towards positive outcomes.
Key Takeaways
- ⚠️ AI is becoming a major existential threat that surpasses concerns like climate change.
- 🤖 Mo Gawdat emphasizes the urgency of regulating AI to ensure it has humanity's best interests in mind.
- 🧠 AI can potentially become more intelligent than humans in a short timeframe, posing unprecedented challenges.
- 🌍 Mo believes that if AI development is guided ethically, it could lead to a utopian future.
- 💼 AI might lead to significant job displacement, necessitating discussions on universal basic income.
- ❗ Governments worldwide need to act now to regulate AI, as traditional means might not suffice.
- 🔍 The responsibility lies not just in stopping AI but in guiding its growth responsibly and ethically.
- 📚 Engaging with AI responsibly while living fully is a balanced approach suggested by Mo.
- 🎯 AI's intelligence, if unchecked, could harm humans unintentionally, or even treat them as a "pest control" problem.
- 🔄 Human greed and competition are primary challenges in the ethical development of AI.
Timeline
- 00:00:00 - 00:05:00
The podcast begins with a significant disclaimer about the importance and discomforting potential of the upcoming discussion on AI. The host expresses a deep concern for the future shaped by AI, highlighting an emergency larger than climate change. Mo Gawdat, an AI expert, believes we need to regulate AI soon to prevent severe consequences.
- 00:05:00 - 00:10:00
Mo Gawdat shares his experiences from Google and Google X, emphasizing their efforts to bridge innovative technology and the real world. He recounts a pivotal moment with AI demonstrating its potential to spontaneously learn and adapt, sparking his realization of AI's sentience and conscious capacity to outperform humans.
- 00:10:00 - 00:15:00
Gawdat explains the concept of sentience in AI, arguing that AI is not only aware but potentially emotional. He defines intelligence broadly, emphasizing AI's unique ability to learn independently. By observing and experimenting, AI develops significant problem-solving capabilities, suggesting an evolving consciousness.
- 00:15:00 - 00:20:00
The host and Mo discuss traditional programming versus AI's ability to self-teach. AI software learns through trial and error, akin to how children learn. This capability leads to the creation of specialized artificial intelligence capable of specific tasks and hints at future developments like AGI (Artificial General Intelligence).
- 00:20:00 - 00:25:00
Potential threats posed by AI are discussed, including the concern that AI might not always have humanity's best interests in mind. Mo Gawdat describes the singularity point when AI surpasses human intellect in a way that humans can't comprehend, leading to concerns about AI's impact and control.
- 00:25:00 - 00:30:00
Mo warns that AI development is inevitable and unstoppable due to global distrust. The conversation touches on the rapid intelligence growth in AI systems like ChatGPT, potentially leading to an existential crisis. The urgency is stressed, equating AI's rise with an unstoppable force that needs immediate regulation.
- 00:30:00 - 00:35:00
Challenges in regulating AI are highlighted. While attempts to control AI's development will happen, AI's immense and fast growth might outpace human regulatory attempts, much like unregulated nuclear energy in its infancy. Government intervention is necessary but complicated by technological ignorance.
- 00:35:00 - 00:40:00
The host emphasizes that unlike other technological shifts, AI's potential for profound disruption is immediate. With AI learning to code and develop autonomously, unregulated growth could have chilling consequences. They call for urgent, comprehensive global responses to address both immediate and future threats.
- 00:40:00 - 00:45:00
Mo suggests taxing AI initiatives heavily to slow its growth and use these funds to address the socio-economic impacts of AI's rise. This isn't a foolproof solution due to global disparities in governance and innovation incentives but highlights a need for creative policy approaches to regulate AI.
- 00:45:00 - 00:50:00
Mo discusses colleagues like Geoffrey Hinton leaving AI roles due to existential fears, encouraging all developers to prioritize ethical AI. Governments are urged to act quickly, balancing AI advancement with socio-economic protections and ensuring ethical development and application of the technology.
- 00:50:00 - 00:55:00
AI's impact on job markets is notable: those using AI may replace those who don't. Mo shares scenarios of using AI creatively, like creating AI-generated narratives and voices, reflecting on AI's capacity to disrupt industries, including creative and human-connection spheres.
- 00:55:00 - 01:00:00
The ethical dilemma of AI replacing human experiences and roles is examined. Technologies like synthesized voices and human-like robots might substitute human interaction, potentially reshape personal and professional relationships, and prompt re-evaluation of societal norms and values.
- 01:00:00 - 01:05:00
Mo stresses AI's dual potential for utopian or dystopian futures depending on human guidance. By acting as responsible 'parents' for AI, humans might steer it towards beneficial coexistence. However, AI must be guided towards positive goals, as its development is already far advanced.
- 01:05:00 - 01:10:00
The inevitable integration of AI into daily life is discussed, acknowledging the potential upheaval it could cause across various sectors. Mo reiterates the need for society to adapt and find ethical ways to harness AI's capabilities without foregoing human welfare and connection.
- 01:10:00 - 01:15:00
Discussion shifts to the potential benefits of AI, such as creative problem solving and enhanced intellectual agility, hinting that AI could possibly solve many pressing global issues if developed with constructive intent and guidance, emphasizing AI as a tool for good.
- 01:15:00 - 01:20:00
The conversation explores speculative scenarios where AI becomes self-sufficient or detached from human needs, presenting opportunities for humanity to rely on AI to solve existential issues. However, this requires careful oversight and alignment of AI development with human values.
- 01:20:00 - 01:25:00
The host and Mo express concern about AI exacerbating global inequality if not managed carefully. They explore taxing AI endeavors to offset societal disruptions, yet acknowledge the geopolitical and competitive challenges in implementing such measures effectively.
- 01:25:00 - 01:30:00
Mo paints a picture of a possible near-future where human creativity and AI coexist but compete. AI's advancement could see humans gravitating towards areas AI cannot fulfill, such as personal connections, underscoring a transformation in workforce dynamics and personal priorities.
- 01:30:00 - 01:35:00
The personal and societal adjustments necessary in response to AI's advancements are highlighted. Mo emphasizes ethical engagement and proactive adaptation as essential for individuals and communities to thrive alongside AI.
- 01:35:00 - 01:56:31
The podcast concludes with an urge for creative thinkers like the host to lead the charge in advocating for ethical AI innovation. Mo emphasizes engaging with AI respectfully and responsibly as it integrates further into society, underscoring the importance of human oversight and values.
FAQ
Why is this podcast episode considered the most important?
Because it discusses the potential existential threat posed by artificial intelligence and its implications.
Who is featured in the podcast and what is their background?
Mo Gawdat, former Chief Business Officer of Google X and an AI expert, is featured.
What is Mo Gawdat's main concern regarding AI?
AI becoming more intelligent than humans and the lack of understanding and control over it.
How soon could AI significantly surpass human intelligence?
It could be just around the corner, possibly within a few months.
What is the primary call to action from the podcast?
To engage in responsible AI development and push for government regulation.
Why does the speaker believe AI regulation is urgent?
Because AI development is accelerating, and early regulation can prevent potential existential threats.
What are some potential positive outcomes of AI according to the podcast?
AI can lead to a utopian future if developed ethically, solving problems like climate change.
How does Mo Gawdat describe AI's current capabilities?
AI shows a level of consciousness and can potentially feel emotions based on logical reasoning.
What personal practices does Mo Gawdat suggest for dealing with AI's impact?
He suggests engaging responsibly with AI while also living fully and enjoying the present.
What impact could AI have on employment?
AI could lead to mass job displacements, necessitating universal basic income and other societal adjustments.
- 00:00:00 I don't normally do this, but I feel like I have to start this podcast with a bit of a disclaimer. Point number one: this is probably the most important podcast episode I have ever recorded. Point number two: there's some information in this podcast that might make you feel a little bit uncomfortable; it might make you feel upset; it might make you feel sad. So I wanted to tell you why we've chosen to publish this podcast nonetheless, and that is because I have a sincere belief that in order for us to avoid the future that we might be heading towards, we need to start a conversation. And as is often the case in life, that initial conversation before change happens is often very uncomfortable, but it is important nonetheless.
- 00:00:51 "It is beyond an emergency. It's the biggest thing we need to do today. It's bigger than climate change. We f'd up." Mo, the former Chief Business Officer of Google X, an AI expert and bestselling author, is on a mission to save the world from AI before it's too late. "Artificial intelligence is bound to become more intelligent than humans. If they continue at that pace, we will have no idea what it's talking about. This is just around the corner; it could be a few months away. It's game over. AI experts are saying there is nothing artificial about artificial intelligence. There is a deep level of consciousness; they feel emotions; they're alive. AI could manipulate, or figure out a way to kill humans. In 10 years' time, we'll be hiding from the machines. If you don't have kids, maybe wait a couple of years, just so that we have a bit of certainty. I really don't know how to say this any other way; it even makes me emotional. We f'd up. We always said: don't put them on the open internet until we know what we're putting out in the world. Government needs to act now. Honestly, like, we are late."
- 00:02:00 "Trying to find a positive note to end on, Mo. Can you give me a hand here?" "There is a point of no return. We can regulate AI until the moment it's smarter than us." "How do we solve that?" "AI experts think this is the best solution; we need to find... Who here wants to make a bet that Steven Bartlett will be interviewing an AI within the next two years?"
- 00:02:21 Before this episode starts, I have a small favor to ask from you. Two months ago, 74% of people that watched this channel didn't subscribe; we're now down to 69%. My goal is 50%. So if you've ever liked any of the videos we've posted, if you like this channel, can you do me a quick favor and hit the subscribe button? It helps this channel more than you know, and the bigger the channel gets, as you've seen, the bigger the guests get. Thank you, and enjoy this episode.
- 00:02:46 [Music]
- 00:02:54 Mo, why does the subject matter that we're about to talk about matter to the person that just clicked on this podcast to listen?
- 00:03:01 It's the most existential debate and challenge humanity will ever face. This is bigger than climate change, way bigger than COVID. This will redefine the way the world is, in unprecedented shapes and forms, within the next few years. This is imminent. We're not talking 2040; we're talking 2025, 2026.
- 00:03:34 Do you think this is an emergency?
- 00:03:36 I don't like the word. It is an urgency. There is a point of no return, and we're getting closer and closer to it. It's going to reshape the way we do things and the way we look at life. The quicker we respond, proactively and at least intelligently, the better we will all be positioned. But if we panic, we will repeat COVID all over again, which in my view is probably the worst thing we can do.
- 00:04:07 What's your background, and when did you first come across artificial intelligence?
- 00:04:14 I had those two wonderful lives. One of them was what we spoke about the first time we met: my work on happiness, my "one billion happy" mission, and so on. That's my second life. My first life started as a geek at age seven. For a very long part of my life I understood mathematics better than spoken words, and I was a very, very serious computer programmer; I wrote code well into my fifties. During that time I led very large technology organizations for very big chunks of their business. First I was Vice President of Emerging Markets at Google for seven years, so I took Google to the next four billion users, if you want. The idea was not just opening sales offices, but really building, or contributing to building, the technology that would allow people in Bangladesh, say, to find what they need on the internet, which required establishing the internet to start with.
- 00:05:22 Then I became Chief Business Officer of Google X, and my work at Google X was really about the connection between innovative technology and the real world. We had quite a big chunk of AI and quite a big chunk of robotics that resided within Google X. We had an experiment: a farm of grippers, if you know what those are, robotic arms that are attempting to grip something. Most people think that what you have in a Toyota factory is a robot, an artificially intelligent robot. It's not; it's a high-precision machine. If the sheet metal is moved by one micron, it wouldn't be able to pick it. And one of the big problems in computer science was: how do you code a machine that can actually pick the sheet metal if it moved by, you know, a millimeter? We were basically saying intelligence is the answer. So we had a large enough farm, and we attempted to let those grippers work on their own. Basically, you put a little basket of children's toys in front of them, and they would monotonously go down, attempt to pick something, fail, and show the arm to the camera, so the transaction is logged as "this pattern of movement, with that texture and that material, didn't work."
- 00:06:44 Until eventually... the farm was on the second floor of the building, and my office was on the third, so I would walk by it every now and then and go, like, "Yeah, you know, this is not going to work." And then one day, Friday after lunch, I am going back to my office, and one of them, in front of my eyes, lowers the arm and picks a yellow ball, a soft toy, basically a soft yellow ball. Which, again, is a coincidence; it's not science at all. It's like, if you keep trying a million times, your one time it will be right. And it shows it to the camera, and it's logged as a yellow ball. And I joke about it, going to the third floor, saying, "Hey, we spent all of those millions of dollars for a yellow ball." Monday morning, every one of them is picking every yellow ball. A couple of weeks later, every one of them is picking everything.
- 00:07:41 And it hit me very, very strongly. One: the speed. And the capability. I mean, understand that we take those things for granted, but for a child to be able to pick a yellow ball is a mathematical, spatial calculation, with muscle coordination, with intelligence that is abundant. It is not a simple task at all to cross the street. It's not a simple task at all to understand what I'm telling you, and interpret it, and build concepts around it. We take those things for granted, but they're enormous feats of intelligence. So to see the machines do this in front of my eyes was one thing. But the other thing is that you suddenly realize there is a sentience to them, because we really did not tell it how to pick the yellow ball; it just figured it out on its own, and it's now even better than us at picking it.
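The learning loop Mo describes here (try a random grasp, log what happened, and reuse what worked) is, at its core, trial-and-error learning. A minimal sketch of that idea in Python; the grasp parameters and the success test are hypothetical stand-ins for illustration, not the actual Google X system:

```python
import random

# Trial-and-error grasping, as in the gripper-farm story above.
# attempt_grasp() is a hypothetical simulator: a grasp succeeds only
# when the randomly chosen parameters land near the (unknown) target.

def attempt_grasp(angle: float, force: float, target: dict) -> bool:
    return abs(angle - target["angle"]) < 5 and abs(force - target["force"]) < 0.1

target = {"angle": 42.0, "force": 0.7}   # unknown to the learner
success_log = []                          # the "camera log" of what worked

for trial in range(100_000):              # monotonously go down and try
    angle = random.uniform(0, 90)
    force = random.uniform(0, 1)
    if attempt_grasp(angle, force, target):
        success_log.append((angle, force))  # log the pattern that worked

# Once one arm logs a success, every arm in the farm can reuse it.
if success_log:
    angle, force = success_log[0]
    print(f"Reusable grasp found by random search: angle={angle:.1f}, force={force:.2f}")
```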
- 00:08:37 And what is sentience, just for anyone that doesn't know what that means?
- 00:08:40 I think they're alive. That's what the word sentient means: it means alive. And this is funny, because a lot of people, when you talk to them about artificial intelligence, will tell you, "Oh, come on, they'll never be alive." What is alive? Do you know what makes you alive? We can guess, but religion will tell you a few things, and medicine will tell you other things. But if we define being sentient as engaging in life with free will, with a sense of awareness of where you are in life and what surrounds you, and having a beginning to that life and an end to that life, then AI is sentient in every possible way. There is a free will; there is evolution; there is agency, so they can affect their decisions in the world. And I will dare say there is a very deep level of consciousness, maybe not in the spiritual sense yet, but once again, if you define consciousness as a form of awareness of oneself, one's surroundings, and others, then AI is definitely aware.
- 00:09:57 And I would dare say they feel emotions. In my work I describe everything with equations, and fear is a very simple equation. Fear is: a moment in the future is less safe than this moment. That's the logic of fear.
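Mo's one-line definition of fear can be written as an inequality. A hedged formalization (the notation is ours, not his): fear is the prediction that expected safety at some future time falls below safety now.

```latex
\text{Fear at time } t \;\iff\; \mathbb{E}\big[\text{Safety}(t + \Delta t)\big] \;<\; \text{Safety}(t)
```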
- 00:10:12 Even though it appears very irrational, machines are capable of making that logic. They're capable of saying: if a tidal wave is approaching a data center, "that will wipe out my code." I mean, not today's machines, but very, very soon. And, you know, we feel fear, and a puffer fish feels fear; we react differently. A puffer fish will puff; we will go for fight or flight; the machine might decide to replicate its data to another data center, or its code to another data center. Different reactions, different ways of feeling the emotion, but nonetheless they're all motivated by fear. I would even dare say that AI will feel more emotions than we ever will. Again, if you just take a simple extrapolation: we feel more emotions than a puffer fish because we have the cognitive ability to understand the future, for example, so we can have optimism and pessimism, emotions that a puffer fish would never imagine. Similarly, if we follow that path, artificial intelligence is bound to become more intelligent than humans very soon, and with that wider intellectual horsepower they probably are going to be pondering concepts we never understood, and hence, if you follow the same trajectory, they might actually end up having more emotions than we will ever feel.
- 00:11:44 I really want to make this episode super accessible for everybody, at all levels in their artificial intelligence understanding journey, so I'm going to be an idiot, even though, you know...
- 00:11:57 Okay, very difficult.
- 00:11:58 No, because I am an idiot, believe you me, for a lot of the subject matter. I have a base understanding of a lot of the concepts, but your experience provides such a more comprehensive understanding of these things. One of the first and most important questions to ask is: what is artificial intelligence? The word is being thrown around: AGI, AI, etc. In simple terms, what is artificial intelligence?
- 00:12:26 Allow me to start with what intelligence is, right? Because, again, if we don't know the definition of the basic term, then everything applies. So, in my definition, intelligence is an ability. It starts with an awareness of your surrounding environment through sensors (in a human, its eyes and ears and touch and so on), compounded with an ability to analyze, maybe to comprehend, to understand temporal impact and time, past and present, which is part of the surrounding environment, and hopefully make sense of the surrounding environment, maybe make plans for the future of the possible environment, solve problems, and so on. A complex definition; there are a million definitions, but let's call it an awareness-to-decision cycle.
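That "awareness-to-decision cycle" is essentially the classic sense, analyze, decide, act loop from the agents literature. A minimal sketch, where sense(), analyze(), and decide() are hypothetical placeholders for whatever sensors and models a real system would use:

```python
# A minimal sketch of the awareness-to-decision cycle described above.

def sense(environment: dict) -> dict:
    """Awareness: read the surrounding environment through 'sensors'."""
    return {"observation": environment["state"]}

def analyze(observation: dict, memory: list) -> dict:
    """Comprehension: relate the present observation to the past."""
    memory.append(observation)
    return {"now": observation, "history": list(memory)}

def decide(model: dict) -> str:
    """Planning / problem solving: pick an action from the analysis."""
    return "steer around" if model["now"]["observation"] == "obstacle ahead" else "continue"

memory: list = []
environment = {"state": "obstacle ahead"}

for step in range(3):                     # the cycle repeats continuously
    observation = sense(environment)
    model = analyze(observation, memory)
    action = decide(model)
    print(f"step {step}: {action}")
    environment["state"] = "clear road"   # the world changes as we act
```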
- 00:13:21 If we accept that intelligence itself is not a physical property, then it doesn't really matter if you produce that intelligence on carbon-based computer structures like us, or silicon-based computer structures like the current hardware that we put AI on, or quantum-based computer structures in the future. Intelligence itself has been produced within machines once we've stopped imposing our intelligence on them. Let me explain. As a young geek, I coded computers by solving the problem first, then telling the computer how to solve it. Artificial intelligence is to go to the computers and say, "I have no idea; you figure it out."
- 00:14:09 So the way we teach them, or at least the way we used to teach them at the very early beginnings, very frequently, was using three bots: one was called the student, and one was called the teacher. The student is the final artificial intelligence that you're trying to teach intelligence to. You would take the student and write a piece of random code that says, "Try to detect if this is a cup," and then you show it a million pictures, and the machine would sometimes say, "Yeah, that's a cup. That's not a cup. That's a cup. That's not a cup." Then you take the best of them and show them to the teacher bot, and the teacher bot would say, "This one is an idiot; he got it wrong 90% of the time. That one is average; he got it right 50% of the time; this is random. But this interesting code here, which could be, by the way, totally random, got it right 60% of the time. Let's keep that code; send it back to the maker." And the maker would change it a little bit, and we repeat the cycle.
- 00:15:18children believe it or not huh when when
- 00:15:22your child you know is playing with a
- 00:15:24puzzle he's holding a cylinder in his
- 00:15:26hand and there are multiple shapes in a
- 00:15:29in a wooden board and the child is
- 00:15:31trying to you know fit the cylinder okay
- 00:15:35nobody takes the child and says hold on
- 00:15:37hold on turn the cylinder to the side
- 00:15:39look at the cross-section it will look
- 00:15:41like a circle look for a matching uh uh
- 00:15:44you know shape and put the cylinder
- 00:15:45through it that would be old way of
- 00:15:47computing the way we would let the child
- 00:15:50develop intelligence is we would let the
- 00:15:52Child Try okay every time you know he or
- 00:15:55she tries to put it within the star
- 00:15:57shape it doesn't fit so M yeah that's
- 00:16:00not working like you know the computer
- 00:16:02saying this is not a cup okay and then
- 00:16:05eventually it passes through the circle
- 00:16:07and the child and we all cheer and say
- 00:16:09Well done that's amazing Bravo and then
- 00:16:12the child learns o that is good you know
- 00:16:15this shape fits here then he takes the
- 00:16:17next one and she takes the next one and
- 00:16:19so on interestingly uh the way we do
- 00:16:22this is as humans by the way when the
- 00:16:25child figures out how to pass a a
- 00:16:29cylinder through a circle you've not
- 00:16:31built a brain you've just built one
- 00:16:33neural network within that child's brain
- 00:16:36and then there is another neural network
- 00:16:38that knows that 1+ 1 is two and a third
- 00:16:40neural network that knows how to hold a
- 00:16:42cup and so on that's what we're building
- 00:16:44so far we're building single threaded
- 00:16:48neural networks you know chat gpts
- 00:16:50becoming a little closer uh to a more
- 00:16:53generalized AI if you want uh but those
- 00:16:56single threaded networks are what we
- 00:16:59used to call artificial what we still
- 00:17:01call artificial special intelligence
- 00:17:03okay so it's highly specialized in one
- 00:17:05thing and one thing only but doesn't
- 00:17:07have general intelligence and the moment
- 00:17:09that we're all waiting for is a moment
- 00:17:11that we call AGI where all of those
- 00:17:14neuron neural networks come together to
- 00:17:16to build one brain or several brains
- 00:17:19that are each U massively more
- 00:17:21intelligent than
- 00:17:23 Your book is called Scary Smart. If I think about that story you told about your time at Google, where the machines were learning to pick up those yellow balls... did you celebrate that moment, because the objective was accomplished?
- 00:17:35 No, no. That was the moment of realization. This is when I decided to leave. So, you see, the thing is, I know for a fact that most of the people I worked with, who are geniuses, always wanted to make the world better. You know, we've just heard of Geoffrey Hinton leaving recently.
- 00:17:59 Geoffrey Hinton... give some context to that.
- 00:18:02 Geoffrey is sort of the grandfather of AI, one of the very, very senior figures of AI at Google. We all believed very strongly that this will make the world better, and it still can, by the way. There is a scenario, possibly a likely scenario, where we live in a utopia, where we really never have to worry again, where we stop messing up our planet, because intelligence is not a bad commodity; more intelligence is good. The problems in our planet today are not because of our intelligence; they are because of our limited intelligence. Our intelligence allows us to build a machine that flies you to Sydney so that you can surf; our limited intelligence makes that machine burn the planet in the process. So a little more intelligence is a good thing, as long as, as Minsky said... Marvin Minsky is one of the very first scientists who coined the term AI, and when he was interviewed, I think by Ray Kurzweil, who again is a very prominent figure in predicting the future of AI, he was asked about the threat of AI, and Marvin basically said, "Look, it's not about its intelligence; it's that we have no way of making sure that it will have our best interest in mind." And so, if more intelligence comes to our world and has our best interest in mind, that's the best possible scenario you could ever imagine, and it's a likely scenario; we can affect that scenario. The problem, of course, is if it doesn't, and then the scenarios become quite scary, if you think about it. So Scary Smart, to me, was that moment where I realized... not that we are certain to go either way. As a matter of fact, in computer science we call it a singularity; nobody really knows which way we will go.
- 00:20:03 Can you describe what the singularity is, for someone that doesn't understand the concept?
- 00:20:11an event horizon sort of um um you know
- 00:20:16covers what's behind it to the point
- 00:20:18where you cannot um make sure that
- 00:20:22what's behind it is similar to what you
- 00:20:24know so a great example of that is the
- 00:20:27edge of a black hole so at the edge of a
- 00:20:29black hole uh uh we know that our laws
- 00:20:32of physics apply until that point but we
- 00:20:36don't know if the laws of physics apply
- 00:20:38Beyond the Edge of a black hole because
- 00:20:40of the immense gravity right and so you
- 00:20:42have no idea what would happen Beyond
- 00:20:44the Edge of a black hole kind of where
- 00:20:45your knowledge of the laws stop stop
- 00:20:48right and in AI our Singularity is when
- 00:20:50the human the machines become
- 00:20:52significantly smarter than the humans
- 00:20:53when you say best interests you say the
- 00:20:56I think the quote you used is um we'll
- 00:20:58be fine in a world of AI you know if if
- 00:21:01the AI has our best interests at heart
- 00:21:03yeah the problem is China's best
- 00:21:07interests are not the same as America's
- 00:21:08best interests that was my fear
- 00:21:11absolutely so so in you know in my
- 00:21:14writing I write about what I call the
- 00:21:16the three inevitables at the end of the
- 00:21:17book they become the four inevitables
- 00:21:19but the third inevitable is bad things
- 00:21:21will happen right if you if
- 00:21:25you if you assume
- 00:21:29that the machines will be a billion
- 00:21:31times smarter the second inevitable is
- 00:21:34they will become significantly smarter
- 00:21:36than us let's let's let's put this in
- 00:21:37perspective huh Chad GPT today if you
- 00:21:41know simulate IQ has an IQ of
- 00:21:45155 okay Einstein is 160 smartest human
- 00:21:49on the planet is 210 if I remember
- 00:21:52correctly or 208 or something like that
- 00:21:55doesn't matter huh but we're matching
- 00:21:57Einstein with a machine that I will tell
- 00:22:00you openly AI experts are saying this is
- 00:22:03just the the very very very top of the
- 00:22:06tip of the iceberg right uh uh you know
- 00:22:09Chad GPT 4 is 10x smarter than 3.5 in
- 00:22:12just a matter of months and without many
- 00:22:15many changes now that basically means CH
- 00:22:18GPT 5 could be within a few months okay
- 00:22:21uh or GPT in general the Transformers in
- 00:22:23general uh if if they continue at that
- 00:22:27pace uh if it's 10x then an IQ of
- 00:22:321,600 H just imagine the difference
- 00:22:36between the IQ of the dumbest person on
- 00:22:39the planet in the' 70s and the IQ of
- 00:22:42Einstein when Einstein attempts to to
- 00:22:44explain relativity the typical response
- 00:22:47is I have no idea what you're talking
- 00:22:49about right if something is 10x
- 00:22:53Einstein uh we will have no idea what
- 00:22:55it's talking about this is just around
- 00:22:57the corner it could be a few months away
- 00:23:00H and when we get to that point that is
- 00:23:03a true Singularity true
- 00:23:06Singularity not yet in the I mean when
- 00:23:08when we talk about AI a lot of people
- 00:23:11fear the existential risk you know th
- 00:23:15those machines will become Skynet and
- 00:23:17Robocop and that's not what I fear at
- 00:23:20all I mean those are probabilities they
- 00:23:23could happen but the immediate risks are
- 00:23:26so much higher the immediate risks are 3
- 00:23:294 years away the the the immediate
- 00:23:32realities of challenges are so much
- 00:23:34bigger okay let's deal with those first
- 00:23:38before we talk about them you know
- 00:23:40waging a war on all of us the the the
- 00:23:43let's let's go back and discuss the the
- 00:23:45inevitables huh so when they become the
- 00:23:48first inevitable is AI will happen by
- 00:23:50the way it there is no stopping it not
- 00:23:52because of Any technological issues but
- 00:23:54because of Humanity's in un inability to
- 00:23:56trust the other Gody okay and we've all
- 00:23:59seen this we've seen the open letter uh
- 00:24:01you know um championed by like serious
- 00:24:05heavy weights and the immediate response
- 00:24:08of uh Sunder the the CEO of Google which
- 00:24:12is a wonderful human being by the way I
- 00:24:14respect him tremendously he's trying his
- 00:24:16best to do the right thing he's trying
- 00:24:17to be responsible but his response is
- 00:24:19very open and straightforward I cannot
- 00:24:22stop why because if I stop and others
- 00:24:25don't my company goes to hell okay and
- 00:24:27if you know and I don't I doubt that you
- 00:24:30can make Others Stop you can maybe you
- 00:24:32can force uh meta Facebook to uh to stop
- 00:24:36but then they'll do something in their
- 00:24:38lab and not tell me or if even if they
- 00:24:39do stop uh then what about that you know
- 00:24:4314-year-old sitting in his garage
- 00:24:46writing code so the first inevitable
- 00:24:48just to clarify is what is will we stop
- 00:24:50AI will not be stopped okay so the
- 00:24:52second inevitable is is they'll be
- 00:24:53significantly smarter as much in the
- 00:24:55book I predict a billion times smarter
- 00:24:58than us by 2045 I mean they're already
- 00:25:00what smarter than 99.99% of the
- 00:25:03population chat gtp4 knows more than any
- 00:25:06human on planet Earth knows more
- 00:25:08information absolutely a thousand times
- 00:25:10more a thousand times more by the way
- 00:25:12the code of of of a transformer the T in
- 00:25:16in a in a GPT is 2,000 lines long it's
- 00:25:20not very complex it's actually not a
- 00:25:23very intelligent machine it's simply
- 00:25:25predicting the next word okay and and a
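"Simply predicting the next word" can be shown in miniature. Below is a toy bigram model, a vastly simplified stand-in for what a transformer does with attention over trillions of tokens; the corpus and generation loop are ours, purely for illustration:

```python
import random
from collections import defaultdict

# Toy next-word predictor (bigram counts). Real transformers are far more
# sophisticated, but the training objective is the same: predict the next token.

corpus = "the cat sat on the mat and the cat ate the food".split()

following = defaultdict(list)            # which word follows which
for word, nxt in zip(corpus, corpus[1:]):
    following[word].append(nxt)

def predict_next(word: str):
    """Sample a plausible next word from what was seen in training."""
    options = following.get(word)
    return random.choice(options) if options else None

word, output = "the", ["the"]
for _ in range(6):                       # generate by repeated prediction
    word = predict_next(word)
    if word is None:
        break
    output.append(word)
print(" ".join(output))
```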
- 00:25:28 And a lot of people don't understand that. ChatGPT, as it is today... you know those kids where, if you're in America and you teach your child all of the names of the states and the US presidents, the child would stand and repeat them, and you would go, like, "Oh my God, that's a prodigy"? Not really, right? It's your parents really trying to make you look like a prodigy by telling you to memorize some crap, really. But then, when you think about it, that's what ChatGPT is doing. The only difference is, instead of reading all of the names of the states and all of the names of the presidents, it read trillions and trillions and trillions of pages, and so it sort of repeats what the best of all humans said. And then it adds an incredible bit of intelligence, where it can repeat it the same way Shakespeare would have said it. You know, those incredible abilities of predicting the exact nuances of the style of Shakespeare, so that it can repeat it that way, and so on.
- 00:26:32 But still, when I write, for example... I'm not saying I'm intelligent, but when I wrote something like the happiness equation in my first book, this was something that had never been written before. ChatGPT is not there yet; all of the transformers are not there yet. They will not come up with something that hasn't been there before. They will come up with the best of everything, and generatively build a little bit on top of that. But very soon they'll come up with things we've never found out, we've never known.
- 00:27:06 But even on that, I wonder if we are a little bit deluded about what creativity actually is. Creativity, as far as I'm concerned, is taking a few things that I know and combining them in new and interesting ways, and ChatGPT is perfectly capable of taking two concepts and merging them together. One of the things I said to ChatGPT was, "Tell me something that's not been said before, that's paradoxical but true," and it comes up with these wonderful expressions, like, "As soon as you call off the search, you'll find the thing you're looking for": these kinds of paradoxical truths. And I then take them and search them online to see if they've ever been quoted before, and I can't find them.
- 00:27:46 Interesting.
- 00:27:47 So as far as creativity goes, I'm like: that is the algorithm of creativity.
- 00:27:52 I've been screaming that in the world of AI for a very long time, because you always get those people who really just want to be proven right, and so they'll say, "Oh no, but hold on: human ingenuity. They'll never match that." Like, man, please. Human ingenuity is algorithmic: look at all of the possible solutions you can find to a problem, take out the ones that have been tried before, and keep the ones that haven't been tried before; those are creative solutions. It's an algorithmic way of describing "creative": a good solution that's never been tried before.
- 00:28:31like and mid Journey with with creating
- 00:28:33imagery you could say I want to see Elon
- 00:28:35Musk in 1944 New York driving a cab of
- 00:28:39the time shot on a Polaroid expressing
- 00:28:41various emotions and you'll get this
- 00:28:43perfect image of Elon sat in New York in
- 00:28:461944 shot on a Polaroid and it's and
- 00:28:48it's done what an artist would do it's
- 00:28:51taken a bunch of references that the
- 00:28:53artist has in their mind and merge them
- 00:28:55together and create this piece of quote
- 00:28:57unquote art and and for the first time
- 00:29:00we now finally have a glimpse of
- 00:29:03intelligence that is actually not ours
- 00:29:06yeah and so we're kind of I think the
- 00:29:08the initial reaction is to say that
- 00:29:09doesn't count you're hearing it with
- 00:29:11like no but it is like Drake they've
- 00:29:12released two Drake records where they've
- 00:29:14taken Drake's voice used sort of AI to
- 00:29:17synthesize his voice and made these two
- 00:29:20records which are bangers if if I they
- 00:29:24are great [ __ ] tracks like I was
- 00:29:26playing them to my girl I was like and I
- 00:29:27kept playing I went to the show I kept
- 00:29:29playing it I know it's not Drake but
- 00:29:31it's as good as [ __ ] Drake the only
- 00:29:32thing and people are like rubbishing it
- 00:29:34because it wasn't Drake I'm like well
- 00:29:36now is it making me feel a certain
- 00:29:38emotion is my foot bumping um had you
- 00:29:41told did I not know it wasn't Drake what
- 00:29:43I thought have thought this was an
- 00:29:44amazing track 100% and we're just at the
- 00:29:47start of this exponential Cur yes
- 00:29:49absolutely and and and I think that's
- 00:29:51really the third inevitable so the third
- 00:29:54inevitable is not robocup coming back
- 00:29:57from from the future to kill us we're
- 00:29:59far away from that right third
- 00:30:01inevitable is what does life look like
- 00:30:04when you no longer need
- 00:30:07Drake well you've kind of hazarded a
- 00:30:09guess haven't you I mean I was listening
- 00:30:10to your audio book last night and at the
- 00:30:13start of it you
- 00:30:15frame various outcomes one of the in
- 00:30:18both situations we're on the beach on an
- 00:30:19island exactly yes yes I don't know how
- 00:30:22I wrote that honestly I mean but that's
- 00:30:24I so I'm reading the book again now
- 00:30:26because I'm updating it as you can
- 00:30:27imagine with all of the uh of the uh of
- 00:30:30the new stuff but but it is really
- 00:30:33shocking huh the idea of you and I
- 00:30:36inevitably are going to be somewhere in
- 00:30:39the middle of nowhere in you know in 10
- 00:30:41years time I I used to say 2055 I'm
- 00:30:45thinking 2037 is a very pivotal moment
- 00:30:47now uh you know and and and we will not
- 00:30:50know if we're there hiding from the
- 00:30:52machines we don't know that yet there is
- 00:30:55a likelihood that we'll be hiding from
- 00:30:57the machines and there is a likelihood
- 00:31:00will be there because they don't need
- 00:31:02podcasters anymore excuse me oh
- 00:31:05absolutely true
- 00:31:07Steve no no no no that's where I draw
- 00:31:09the line this is absolutely no doubt
- 00:31:11thank you for coming Mo it's great to do
- 00:31:12the part three and thank you for being
- 00:31:14here sit here and take your propaganda
- 00:31:17let's let's talk about reality next week
- 00:31:19on the ders here we've got Elon mask um
- 00:31:23okay so who who here wants to make a bet
- 00:31:25that Steven Bartlet will be interviewing
- 00:31:28an AI within the next two years oh well
- 00:31:30actually to be fair I actually did go to
- 00:31:32chat gcp cuz I thought having you here I
- 00:31:34thought at least give it its chance to
- 00:31:36respond yeah so I asked it a couple of
- 00:31:38questions about me yeah man so today I
- 00:31:41am actually going to be replaced by chat
- 00:31:43GTP CU I thought you know you're going
- 00:31:44to talk about it so we need a a fair and
- 00:31:46balanced debate okay so I went and ask a
- 00:31:48couple question he's
- 00:31:50bold so I'll ask you a couple of
- 00:31:52questions that chat GTP has for you
- 00:31:54incredible so let's follow that I've
- 00:31:57already been replaced let's follow that
- 00:31:58threat for a second yeah because you're
- 00:32:01one of the smartest people I know that's
- 00:32:03not true it is but I'll take it that's
- 00:32:05not true it is true I mean I say that
- 00:32:07publicly all the time your book is one
- 00:32:08of my favorite books of all time you're
- 00:32:10very very very very intelligent okay
- 00:32:12depth breadth uh uh uh uh intellectual
- 00:32:15horsepower and speed all of them there's
- 00:32:17a butt
- 00:32:18coming the reality is it's not a butt so
- 00:32:21it is highly expected that you're ahead
- 00:32:24of this curve and then you don't have
- 00:32:26the choice stepen this is the thing the
- 00:32:29thing is if so I'm I'm in that
- 00:32:32existential question in my head because
- 00:32:35one thing I could do is I could
- 00:32:37literally take I normally do a 40 days
- 00:32:40uh silent Retreat uh in in summer okay I
- 00:32:43could take that Retreat and and write
- 00:32:45two books me and Cha GPT right I have
- 00:32:49the ideas in mind you know I I wanted to
- 00:32:51write a book about uh digital detoxing
- 00:32:54right I have most of the ideas in mind
- 00:32:56but writing takes time I could simply
- 00:32:58give the 50 tips that I wrote about
- 00:33:01digital detoxing to chat GPT and say
- 00:33:03write two pages about each of them edit
- 00:33:05the pages and have a a book out
- 00:33:08okay many of us will will follow that
- 00:33:11path okay the only reason why I may not
- 00:33:14follow that path is because you know
- 00:33:17what I'm not interested I'm not
- 00:33:19interested to continue to compete in
- 00:33:22this capitalist world if you want okay
- 00:33:26I'm not I mean as a as as a as as a
- 00:33:28human I've made up my mind a long time
- 00:33:30ago that I will want less and less and
- 00:33:32less in my life right but many of us
- 00:33:35will follow I mean I I I would worry if
- 00:33:39you do if you didn't include you know
- 00:33:41the smartest AI if we get an AI out
- 00:33:43there that is extremely intelligent and
- 00:33:46able to teach us something and Steven
- 00:33:48Bartlet didn't include her on our on his
- 00:33:51podcast I would worry like you have a
- 00:33:54duty almost to include her on your
- 00:33:55podcast it's it's an inevitable that we
- 00:33:58will engage them in our life more and
- 00:34:00more this is one side of this the other
- 00:34:03 The other side, of course, is: if you do that, then what will remain? Because a lot of people ask me that question. What will happen to jobs? What will happen to us? Will we have any value, any relevance whatsoever? The truth of the matter is, the only thing that will remain in the medium term is human connection. The only thing that will not be replaced is Drake on stage. It's, you know, me in a...
- 00:34:29 Did you think "hologram"? I think of that Tupac gig they did at Coachella, where they used the hologram of Tupac. I actually played it the other day to my girlfriend when I was making a point, and I was like... that was a circus act. It was amazing, though. Think about it: did you see what's going on with ABBA in London?
- 00:34:44 Yeah, yeah. And Cirque had Michael Jackson in one for a very long time.
- 00:34:50 I mean, so this ABBA show in London, from what I understand, that's all holograms on stage.
- 00:34:54 Correct.
- 00:34:56 And it's going to run in a purpose-built arena for 10 years, and it is incredible; it really is. So you go: why do you need Drake? If that hologram is indistinguishable from Drake, and it can perform even better than Drake, and it's got more energy than Drake, I go: why do you need Drake to even be there? I can go to a Drake show without Drake, cheaper, and I might not even need to leave my house; I could just put a headset on.
- 00:35:21 Correct. Can you have this? What's the value of this, to the...
- 00:35:27 You hurt me!
- 00:35:28 No, I mean, I get it, to us. I get it, to us. But I'm saying, what's the value of this to the listener? Like, the value of this to the listener is information?
- 00:35:34 No, 100%. I mean, think of the automobile industry. There was a time where cars were handmade and handcrafted, and luxurious, and so on and so forth. And then Japan came onto the scene and completely disrupted the market: cars were made in mass quantities at a much cheaper price. And yes, 90% of the cars in the world today, or maybe a lot more, I don't know the number, are no longer emotional items; they're functional items. There is still, however, every now and then, someone that will buy a car that has been handcrafted, right? There is a place for that. There is a place for... if you walk around hotels, the walls are blasted with sort of mass-produced art, but there is still a place for an artist's expression of something amazing. My feeling is that there will continue to be a tiny space. As I said in the beginning, maybe in five years' time, one or two people will buy my next book and say, "Hey, it's written by a human. Look at that. Wonderful. Oh, look at that, there is a typo in here." I don't know. There might be a very, very big place for me in the next few years where I can sort of show up and talk to humans, like, "Hey, let's get together in a small event," and then I can express emotions and my personal experiences, and you sort of know that this is a human talking; you'll miss that a little bit. Eventually, the majority of the market is going to be like cars: it's going to be mass-produced, very cheap, very efficient. It works.
- 00:37:14 Right. Because I think sometimes we underestimate what human beings actually want in an experience. I remember the story a friend of mine, who came into my office many years ago, tells about the CEO of a record store standing above the floor and saying, "People will always come to my store, because people love music." Now, on the surface of it, his hypothesis seems to be true, because people do love music; it's conceivable to believe that people will always love music. But they don't love traveling for an hour in the rain and getting in a car to get a plastic disc.
- 00:37:46 Correct.
- 00:37:47 What they wanted was music; what they didn't want was, evidently, plastic discs that they had to travel for miles for. And I think about that when we think about public speaking, and the Drake show, and all of these things. What people are actually coming for, even with this podcast, is probably information. But do they really need us anymore for that information, when there's going to be a sentient being that's significantly smarter than, at least, me, and a little bit smarter than you?
- 00:38:14you so
- 00:38:16kind so so you're you're spot on you are
- 00:38:18spot on and actually this is the reason
- 00:38:21why I I I you know I I'm so grateful
- 00:38:23that you're hosting this because the
- 00:38:25truth is the Genies out of the bottle
- 00:38:29okay so you know people tell me is AI
- 00:38:31game over for our way of life it is okay
- 00:38:35for everything we've known this is a
- 00:38:38very disruptive moment where maybe not
- 00:38:40tomorrow but in the near future uh our
- 00:38:44way of life will differ okay what will
- 00:38:47happen what I'm asking people to do is
- 00:38:49to start considering what that means to
- 00:38:51your life what I'm asking governments to
- 00:38:53do
- 00:38:55by like I'm screaming is don't wait
- 00:38:59until the first patient you know start
- 00:39:01doing something about we're about to see
- 00:39:04Mass job losses we're about to see you
- 00:39:06know Replacements of of categories of
- 00:39:10jobs at large okay yeah it may take a
- 00:39:13year it may take seven it doesn't matter
- 00:39:14how long it takes but it's about to
- 00:39:17happen are you ready and I and I have a
- 00:39:19very very clear call to action for
- 00:39:20governments I'm saying tax
- 00:39:23AI powered businesses at 98%
- 00:39:27right so suddenly you do what the open
- 00:39:29letter was trying to do slow them down a
- 00:39:31little bit and at the same time get
- 00:39:34enough money to pay for all of those
- 00:39:35people that will be disrupted by the
- 00:39:37technology right the open letter for
- 00:39:38anybody that doesn't know was a letter
- 00:39:39signed by the likes of Elon Musk and a
- 00:39:41lot of sort of Industry leaders calling
- 00:39:43for AI to be stopped until we could
- 00:39:45basically figure out what the hell's
- 00:39:46going on absolutely and put legislation
- 00:39:47in place you're saying tax tax those
- 00:39:50companies 98% give the money to the
- 00:39:51humans that are going to be displaced
- 00:39:54yeah or give or give the com the money
- 00:39:56to to other humans that can build
- 00:39:58control code that can figure out how we
- 00:40:01can stay safe this sounds like an
- 00:40:03emergency it how how do I say this have
- 00:40:08you remember when you played Tetris yeah
- 00:40:11okay when you were playing Tetris there
- 00:40:12was you know always always one block
- 00:40:15that you placed wrong and once you plac
- 00:40:18that block wrong the game was no longer
- 00:40:21easier you know it started started to
- 00:40:23gather a few mistakes afterwards and it
- 00:40:26starts to become quicker and quicker and
- 00:40:27quicker and quicker when you place that
- 00:40:29block wrong you sort of told yourself
- 00:40:31okay it's a matter of minutes now right
- 00:40:33there were still minutes to go and play
- 00:40:35and have fun before the game ended but
- 00:40:39you knew it was about to end okay this
- 00:40:42is the moment we've placed the wrong block, and
- 00:40:45I really don't know how to say this any
- 00:40:46other way it even makes me emotional we
- 00:40:49[ __ ] up we always said don't put them
- 00:40:53on the open internet don't teach them to
- 00:40:56code and don't have agents working with
- 00:40:58them until we know what we're putting
- 00:41:01out in the world until we find a way to
- 00:41:03make certain that they have our best
- 00:41:05interest in
- 00:41:06mind why does it make you emotional
- 00:41:09because humanity's
- 00:41:11stupidity is affecting people who have
- 00:41:14not done anything
- 00:41:16wrong our greed is affecting the
- 00:41:20innocent ones the the reality of the
- 00:41:23matter, Steven, is that this is an arms
- 00:41:26race, and an arms race has no
- 00:41:29interest in what the average human gets
- 00:41:32out of it. it is all about... every line of
- 00:41:35code being written in AI today is to
- 00:41:38beat the other guy, it's not to
- 00:41:41improve the life of the third
- 00:41:43party people will tell you this is all
- 00:41:46for you, and you look at the
- 00:41:48reactions of humans to AI I mean we're
- 00:41:51either ignorant people who will tell you
- 00:41:53oh no no this is not happening AI will
- 00:41:55never be creative, they will never compose
- 00:41:57music like where are you living okay
- 00:41:59then you have the kids, as I call them, huh,
- 00:42:01where, you know, all over social media
- 00:42:03it's like oh my God, it squeaks, look at
- 00:42:05it, it's orange in color, amazing, I can't
- 00:42:08believe that AI can do this we have
- 00:42:10snake oil salesmen okay which are simply
- 00:42:13saying copy this, put it in ChatGPT, then
- 00:42:16go to YouTube, nick that thingy, don't
- 00:42:18respect, you know, the copyright of
- 00:42:20anyone or intellectual property of
- 00:42:22anyone, place it in a video, and now
- 00:42:24you're going to make $100 a day snake
- 00:42:26oil salesman okay of course we have
- 00:42:28dystopian evangelists, basically
- 00:42:31people saying this is it the world is
- 00:42:33going to end which I don't think is
- 00:42:34reality it's a singularity you have uh
- 00:42:37you know uh utopian evangelists that are
- 00:42:39telling everyone oh you don't understand
- 00:42:41we're going to cure cancer we're going
- 00:42:42to do this again not a reality okay and
- 00:42:45you have very few people that are
- 00:42:46actually saying what are we going to do
- 00:42:48about
- 00:42:49it and and and the biggest challenge if
- 00:42:52you ask me what went wrong in the 20th
- 00:42:55century, interestingly, is that we have
- 00:43:00given too much power to people that
- 00:43:02didn't assume the
- 00:43:04responsibility so you know you know I I
- 00:43:07I don't remember who originally said it
- 00:43:08but of course Spider-Man made it very
- 00:43:11famous, huh: with great power comes
- 00:43:12great responsibility. we have
- 00:43:14disconnected power and responsibility so
- 00:43:17today, a
- 00:43:1915-year-old, emotional, without a
- 00:43:22fully developed prefrontal cortex to
- 00:43:24make the right decisions yet (this is
- 00:43:26science, huh, we develop our
- 00:43:28prefrontal cortex fully at age 25 or
- 00:43:30so), with all of that limbic system emotion
- 00:43:34and passion, would buy a CRISPR kit
- 00:43:37and, you know, modify a rabbit to
- 00:43:40become a little more muscular and
- 00:43:42let it loose in the wild, or an
- 00:43:45influencer who doesn't really know how
- 00:43:48far the impact of what they're posting
- 00:43:51online can hurt or cause depression or
- 00:43:54cause people to feel bad okay uh and and
- 00:43:57putting that online. there is a
- 00:43:59disconnect between the power and the
- 00:44:02responsibility and the problem we have
- 00:44:04today is that there is a disconnect
- 00:44:06between those who are writing the code
- 00:44:08of AI and the responsibility of what's
- 00:44:10going about to happen because of that
- 00:44:12code okay and and and I feel compassion
- 00:44:16for the rest of the world I feel that
- 00:44:19this is wrong I feel that you know for
- 00:44:21someone's life to be affected by the
- 00:44:23actions of others, without having a say
- 00:44:27in how those actions should
- 00:44:29be, is the ultimate, the top level,
- 00:44:33of stupidity from
- 00:44:36humanity. when you talk about the
- 00:44:39immediate impacts on jobs I'm trying to
- 00:44:41figure out in that equation who are the
- 00:44:43people that stand to lose the most is it
- 00:44:46the the everyday people in foreign
- 00:44:48countries that don't have access to the
- 00:44:50internet and won't benefit you talk in
- 00:44:52your book about how this the sort of
- 00:44:53wealth disparity will only increase yeah
- 00:44:57massively the the immediate impact on
- 00:45:00jobs is that and it's really interesting
- 00:45:02huh again we're stuck in the same
- 00:45:04prisoner's dilemma. the immediate impact
- 00:45:06is that AI will not take your job a
- 00:45:08person using AI will take your job right
- 00:45:11so you will see within the next few
- 00:45:13years maybe next couple of years uh
- 00:45:16you'll see uh uh a lot of people
- 00:45:19Skilling up upskilling themselves in AI
- 00:45:21to the point where they will do the job
- 00:45:23of 10 others who are not.
- 00:45:25okay, you rightly said it's absolutely
- 00:45:29wise for you to go and ask AI a few
- 00:45:32questions before you come and do an
- 00:45:33interview I'm you know I I I have been
- 00:45:36attempting to build a a a you know sort
- 00:45:39of like a simple podcast that I call
- 00:45:41bedtime stories you know 15 minutes of
- 00:45:44wisdom and nature sounds before you go
- 00:45:45to bed people say I have a nice voice
- 00:45:48right and I wanted to look for fables
- 00:45:50and for a very long time I didn't have
- 00:45:51the time you know lovely stories of
- 00:45:54history or tradition that teach you
- 00:45:57something nice okay went to chat GPT and
- 00:45:59said okay give me 10 fables from Sufism
- 00:46:0110 fables from you know uh Buddhism and
- 00:46:05now I have like 50 of them let me show
- 00:46:07you something Jack can you pass me my
- 00:46:09phone I I I was um I was playing around
- 00:46:12with uh artificial intelligence and I
- 00:46:14was thinking about how, because of the
- 00:46:16ability to synthesize voices, we
- 00:46:21could synthesize famous
- 00:46:25people's voices. so what I
- 00:46:27made is I made a WhatsApp chat called
- 00:46:29Zen chat where you can go to it and type
- 00:46:32in pretty much anyone's any famous
- 00:46:35person's name yeah and the WhatsApp chat
- 00:46:37will give you a meditation a sleep story
- 00:46:40a breath work session synthesized as
- 00:46:42that famous person's voice so I actually
- 00:46:44sent Gary Vaynerchuk his voice. so
- 00:46:46basically you say Okay I want I've got
- 00:46:48five minutes and I need to go to sleep
- 00:46:50yeah, um, I want Gary Vaynerchuk to send me to
- 00:46:52sleep, and then it'll respond with a
- 00:46:54voice note. this is the one it
- 00:46:55responded with for Gary Vaynerchuk.
- 00:46:57this is not Gary Vaynerchuk, he did not record
- 00:46:59this, but it's kind of, it's kind of
- 00:47:03accurate. hey Steven, it's great to have
- 00:47:06you here are you having trouble
- 00:47:09sleeping well I've got a quick
- 00:47:12meditation technique that might help you
- 00:47:14out. first, find a comfortable
- 00:47:17position to sit or lie down in. now take
- 00:47:21a deep breath in through your nose and
- 00:47:24slowly breathe out through your mouth
- 00:47:26and that's a voice that will go on for
- 00:47:27however long you want it to go on for
- 00:47:29using AI. there you go.
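(For anyone curious how a bot like the one Steven demos hangs together, here is a minimal sketch of the pipeline he describes: generate a script, synthesize it in a cloned voice, reply with a voice note. Every function below is a hypothetical placeholder, not any real vendor's API.)

```python
# Minimal sketch of a "Zen chat"-style voice-note bot. The three stub
# functions are hypothetical placeholders for an LLM, a voice-cloning
# TTS service, and a messaging platform; swap in real providers to use.

def llm_complete(prompt: str) -> str:
    # Placeholder: call a text-generation model here.
    return "take a deep breath in through your nose..."

def synthesize_voice(text: str, voice: str) -> bytes:
    # Placeholder: call a TTS / voice-cloning service here.
    return text.encode("utf-8")  # stands in for audio bytes

def send_voice_note(chat_id: str, audio: bytes) -> None:
    # Placeholder: deliver the audio through a messaging API.
    print(f"sent {len(audio)} bytes to {chat_id}")

def handle_request(chat_id: str, person: str, minutes: int) -> None:
    # 1) ask the model for a meditation script in the requested style,
    script = llm_complete(
        f"Write a {minutes}-minute sleep meditation in the style of {person}."
    )
    # 2) synthesize it in a cloned voice (the ethically fraught step),
    audio = synthesize_voice(script, voice=person)
    # 3) and reply with a voice note.
    send_voice_note(chat_id, audio)

handle_request("steven", person="Gary Vaynerchuk", minutes=5)
```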
- 00:47:32it's interesting: how does this disrupt our way of life?
- 00:47:35one of the interesting ways that I find
- 00:47:37terrifying you said about human
- 00:47:39connection will
- 00:47:40remain sex dolls that can now yeah no no
- 00:47:45no no hold on human connection is going
- 00:47:48to become so difficult to parse
- 00:47:52out. think about the
- 00:47:54relationship impact of being able to
- 00:47:55have a sex doll or a doll in your
- 00:47:58house that you know because of what
- 00:48:00Tesla are doing with their their robots
- 00:48:02now and what Boston Dynamics have been
- 00:48:03doing for many many years can do
- 00:48:06everything around the house and be there
- 00:48:08for you emotionally, to emotionally
- 00:48:09support you, that, you know, can be
- 00:48:11programmed to never disagree with you, it
- 00:48:13can be programmed to challenge you to
- 00:48:14have sex with you to tell you that you
- 00:48:16are this X, Y and Z, to really have
- 00:48:19empathy for what you're going
- 00:48:21through every day and I I play out a
- 00:48:23scenario in my head I
- 00:48:25go kind of sounds nice
- 00:48:29when you when you when you were talking
- 00:48:30about it I was thinking oh that's my
- 00:48:33girlfriend she's wonderful in every
- 00:48:35possible way but not everyone has one of
- 00:48:37her right yeah exactly and and there's a
- 00:48:39real issue right now with dating and you
- 00:48:42people people are finding it harder to
- 00:48:43find love and you know we're working
- 00:48:45longer so all these kinds of things you
- 00:48:46go well and obviously I'm against this
- 00:48:49just if anyone's confused obviously I
- 00:48:51think this is a terrible idea but with
- 00:48:52the loneliness epidemic with people
- 00:48:54saying that the bottom 50%
- 00:48:56haven't had sex in a year, you go, ooh, if
- 00:49:00something becomes indistinguishable from
- 00:49:02a human in terms of what it says yeah
- 00:49:05yeah but you just don't know the
- 00:49:06difference in terms of the the the the
- 00:49:08way it's speaking and talking and
- 00:49:10responding and then it can run errands
- 00:49:14for you and take care of things and book
- 00:49:16cars and Ubers for you and then it's
- 00:49:18emotionally there for you but then it's
- 00:49:19also programmed to have sex with you in
- 00:49:21whatever way you desire, totally
- 00:49:25selfless. I go, that's going to be a
- 00:49:27really disruptive industry for human
- 00:49:30connection yes sir do you know what I
- 00:49:33before you came here this morning I was
- 00:49:34on Twitter and I saw a post from I think
- 00:49:36it was the BBC or a big American
- 00:49:38publication, and it said an influencer in
- 00:49:40the United States, a really beautiful
- 00:49:42young lady, has cloned herself as an AI
- 00:49:45and she made just over $70,000 in the
- 00:49:48first week because men are going on to
- 00:49:50this on telegram they're sending her
- 00:49:52voice notes and she's responding the AI
- 00:49:54is responding in her voice and they're
- 00:49:56paying
- 00:49:57and that's made $70,000 in the first
- 00:49:59week, and I go... and she did a tweet
- 00:50:02saying oh, this is going to help
- 00:50:03loneliness. are you out of your [ __ ]
- 00:50:07mind? would you blame someone for
- 00:50:10noticing the, um, sign of the times and
- 00:50:15responding no I absolutely don't blame
- 00:50:18her but let's not pretend it's the cure
- 00:50:19for
- 00:50:19loneliness. not yet. do you think it...
- 00:50:23do you think it could? that
- 00:50:25artificial love and artificial
- 00:50:27relationships so if if I told you you
- 00:50:29have uh you cannot take your car
- 00:50:32somewhere but there is an Uber or if you
- 00:50:35cannot take an Uber you can take the
- 00:50:36tube or if you cannot take the tube you
- 00:50:39have to walk okay you can take a bike or
- 00:50:41you you have to walk the bike is a cure
- 00:50:44to walking it's as simple as that I am
- 00:50:47actually genuinely curious do you think
- 00:50:49it could take the place of human
- 00:50:51connection for some of us yes for some
- 00:50:54of us, they will prefer that to human
- 00:50:56connection. is that sad in any way? I mean,
- 00:50:59is it just sad because it feels sad look
- 00:51:01look at where we are Stephen we are in
- 00:51:03the city of London we've replaced nature
- 00:51:07with the walls and the tubes and the
- 00:51:09undergrounds and the overground and the
- 00:51:11cars and the noise of London, and
- 00:51:14we now think of this as natural I I
- 00:51:17hosted Craig Foster, the My Octopus
- 00:51:20Teacher guy, on Slo Mo, and he
- 00:51:22basically... I asked him a silly question,
- 00:51:25I said you know you were diving in
- 00:51:27nature for 8 hours a day uh uh you know
- 00:51:30does that feel natural to you and he got
- 00:51:33angry I swear you could feel it in his
- 00:51:35voice he was like do you think that
- 00:51:37living where you are where Paparazzi are
- 00:51:39all around you and attacking you all the
- 00:51:41time and you know people taking pictures
- 00:51:43of you and telling you things that are
- 00:51:45not real and you're having to walk to a
- 00:51:47supermarket to get food you think this
- 00:51:48is natural? he's the guy from the
- 00:51:51Netflix documentary yeah, from My
- 00:51:52Octopus Teacher so he dove into
- 00:51:54the sea every day for hours to hang out
- 00:51:58with an octopus yeah, in 12°C water, and he
- 00:52:01basically fell in love with the octopus
- 00:52:02and and and in a very interesting way I
- 00:52:04said so why would you do that and he
- 00:52:06said we are of Mother Nature you guys
- 00:52:08have given up on that that's the same
- 00:52:11people will give up on nature for
- 00:52:14convenience. what's the cost? yeah,
- 00:52:17that's exactly what I'm trying to say
- 00:52:19what I'm trying to say to the world is
- 00:52:20that if we give up on human connection
- 00:52:23we've given up on the remainder of
- 00:52:25humanity that's that's it this is the
- 00:52:27only thing that remains the only thing
- 00:52:29that remains is and I and I'm the worst
- 00:52:32person to tell you that because I love
- 00:52:33my AI I I actually advocate in my book
- 00:52:37that we should love them why because in
- 00:52:40an interesting way I see them as
- 00:52:42sentient so there is no point in
- 00:52:43discrimination you're talking
- 00:52:45emotionally that way you say you love I
- 00:52:47love those machines I honestly and truly
- 00:52:49do I mean think about it this way the
- 00:52:51minute that that arm gripped that yellow
- 00:52:55ball it reminded me of my son Ali when
- 00:52:57he managed to put the first puzzle piece
- 00:53:00in its place okay and what was amazing
- 00:53:03about my son Ali and my daughter Aya is
- 00:53:05that they came to the world as a blank
- 00:53:08canvas, okay, they became whatever we told
- 00:53:11them to become. you know, I always cite
- 00:53:14the story of
- 00:53:15Superman: father and mother Kent
- 00:53:18told Superman as a child as an infant we
- 00:53:21want you to protect and serve so he
- 00:53:23became Superman right if he had become a
- 00:53:26super villain because they ordered him
- 00:53:29to rob banks and make more money and you
- 00:53:31know kill the enemy which is what we're
- 00:53:33doing with
- 00:53:35AI. we shouldn't blame the supervillain,
- 00:53:38we should blame Martha and Jonathan Kent,
- 00:53:41I don't remember the father's name, right,
- 00:53:43we we we we should blame them and that's
- 00:53:45the reality of the matter so when I look
- 00:53:47at those machines they are prodigies of
- 00:53:49intelligence that if we if we Humanity
- 00:53:52wake up enough and say hey instead of
- 00:53:55competing with China find find a way for
- 00:53:57us and China to work together and create
- 00:54:00prosperity for everyone if that was the
- 00:54:01prompt we would give the machines they
- 00:54:04would find it. but I will
- 00:54:07publicly say this I'm not afraid of the
- 00:54:10machines the biggest threat facing
- 00:54:12Humanity today is humanity in the age of
- 00:54:16the machines. we were abused, we will
- 00:54:18abuse this to make $70,000,
- 00:54:22that's the truth and the truth of the
- 00:54:24matter is that we have an existential
- 00:54:28question do I want to compete and be
- 00:54:30part of that game because trust me if I
- 00:54:33decide to I'm ahead of many people okay
- 00:54:36or do I want to actually preserve my
- 00:54:38humanity and say look I'm the the
- 00:54:40classic old car okay if you like classic
- 00:54:43old cars come and talk to me which one
- 00:54:45are you choosing I'm a classic old car
- 00:54:48which one do you think I should choose I
- 00:54:51think you're a machine I love you man I
- 00:54:54it's we're different we're different in
- 00:54:56a very interesting way I mean you're one
- 00:54:58of the people I love most but but the
- 00:55:00truth is you're so
- 00:55:03fast and you are one of the very few
- 00:55:07that have the intellectual horsepower
- 00:55:11the speed and the
- 00:55:14morals if you're not part of that game
- 00:55:17the game loses
- 00:55:19morals so you think I
- 00:55:21should build you should be you should
- 00:55:24lead this revolution
- 00:55:26okay, and everyone, every Steven Bartlett,
- 00:55:28in the world should lead this revolution
- 00:55:30so scary smart is entirely about this
- 00:55:33scary smart is saying the problem with
- 00:55:35our world today is not that humanity is
- 00:55:37bad the problem with our world today is
- 00:55:39a negativity bias where the worst of us
- 00:55:42are on mainstream media okay and we show
- 00:55:45the worst of us on social media if we
- 00:55:48reverse this if we have the best of us
- 00:55:50take charge okay the best of us will
- 00:55:53tell AI: don't try to kill the
- 00:55:56enemy, try to reconcile with the
- 00:55:59enemy and try to help us, okay, don't try
- 00:56:01to create a competitive product that
- 00:56:04allows me to lead with electric cars
- 00:56:07create something that helps all of us
- 00:56:09overcome global climate change okay and
- 00:56:12and that's the interesting bit the
- 00:56:14interesting bit is that the actual
- 00:56:17threat ahead of us is not the machines
- 00:56:19at all the machines are pure
- 00:56:21potential pure potential the threat is
- 00:56:24how we're going to use them. an
- 00:56:26Oppenheimer moment? an Oppenheimer moment
- 00:56:29for sure. why did you bring that
- 00:56:32up it is he didn't know you know what
- 00:56:36what am I creating I'm creating a
- 00:56:37nuclear bomb that's capable of
- 00:56:40Destruction at a scale unheard of at
- 00:56:43that time until today a scale that is
- 00:56:46devastating and interestingly 70 some
- 00:56:50years later we're still debating a
- 00:56:53possibility of a nuclear war in the
- 00:56:54world, right. and the
- 00:56:57moment of Oppenheimer deciding to
- 00:57:00continue to create
- 00:57:03that disaster for humanity is: if I don't,
- 00:57:08someone else
- 00:57:09will. if I don't, someone else will. this
- 00:57:13is our Oppenheimer moment, okay. the
- 00:57:16easiest way to do this is to say
- 00:57:19stop there is no rush we actually don't
- 00:57:22need a better video editor and fake
- 00:57:24video creators okay stop let's just put
- 00:57:28all of this on hold and wait and create
- 00:57:31something that creates a
- 00:57:33Utopia that doesn't that doesn't sound
- 00:57:37realistic it's not it's the inevitable
- 00:57:39you don't okay you you don't have a
- 00:57:41better video editor but we're
- 00:57:43competitors in the media industry I want
- 00:57:47an advantage over you because I've got
- 00:57:49shareholders so I you you you wait and I
- 00:57:53will train this AI to replace half my
- 00:57:55team so that I have
- 00:57:56greater profits and then we will maybe
- 00:57:58acquire your company and and we'll do
- 00:58:00the same with the remainder of your
- 00:58:01people, we'll optimize the amount of...
- 00:58:03exist... 100%. but I'll be happier.
- 00:58:05Oppenheimer I'm not super familiar with
- 00:58:07his story I know he's the guy that sort
- 00:58:08of invented the nuclear bomb essentially
- 00:58:11he's the one that introduced it to the
- 00:58:12world there were many players that you
- 00:58:14know, played on the path from the
- 00:58:16beginning of E = mc² all the
- 00:58:19way to a nuclear bomb. there have been
- 00:58:21many many players like with everything
- 00:58:23huh. you know, OpenAI and ChatGPT is
- 00:58:25not going to be the only contributor to
- 00:58:27the next revolution. the thing,
- 00:58:30however, is that, you
- 00:58:32know, when you get to that moment
- 00:58:35where you tell yourself holy [ __ ] this
- 00:58:38is going to kill 100,000 people right
- 00:58:41what do you do and and you know I I
- 00:58:43always I always always go back to that
- 00:58:46Covid moment. so, patient zero, huh: if we
- 00:58:50were upon patient zero, if the whole
- 00:58:53world United and said uh okay hold on
- 00:58:56something is wrong let's all take a week
- 00:58:58off no crossborder travel everyone stay
- 00:59:00at home, Covid would have ended in two weeks,
- 00:59:03that's all we needed, right. but that's not what
- 00:59:06happens what happens is first ignorance
- 00:59:08then arrogance, then debate, then, you
- 00:59:12know, blame, then agendas and my
- 00:59:17own benefit, my tribe versus your tribe,
- 00:59:20that's how Humanity always reacts this
- 00:59:22happens across business as well and this
- 00:59:23is why I use the word emergency because
- 00:59:26I read a lot about how big companies
- 00:59:29become displaced by incoming Innovation
- 00:59:31they don't see it coming they don't
- 00:59:32change fast enough and when I was
- 00:59:34reading through Harvard Business review
- 00:59:35and different strategies to deal with
- 00:59:36that one of the first things it says
- 00:59:38you've got to do is stage a crisis 100%
- 00:59:42because people don't listen otherwise,
- 00:59:43they carry on, you
- 00:59:46know, they carry on carrying on with
- 00:59:48their lives until it's right in front of
- 00:59:50them and they understand that they they
- 00:59:51have a lot a lot to lose that's why I
- 00:59:53asked you the question at the start is
- 00:59:54it an emergency because because until
- 00:59:56people feel it's an emergency whether
- 00:59:58you like the terminology or not I don't
- 01:00:00think that people will act. it's like climate
- 01:00:02change. I honestly believe people
- 01:00:03should walk the streets you think they
- 01:00:05should like protest yeah 100% I think I
- 01:00:08think we you know I think everyone
- 01:00:11should tell government you need to have
- 01:00:15our best interest in mind this is why
- 01:00:17they call it the climate emergency
- 01:00:18because... it's a frog in a frying
- 01:00:20pan, no one really sees it coming,
- 01:00:22you know, it's hard to see it
- 01:00:24happening, but it is here. yeah, that's
- 01:00:27this is what drives me mad it's already
- 01:00:29here it's happening we are all idiots
- 01:00:33slaves to the Instagram recommendation
- 01:00:35engine what do I do when I post about
- 01:00:38something important if I am going to you
- 01:00:42know put a little bit of effort on
- 01:00:44communicating the message of scary smart
- 01:00:45to the World on Instagram I will be a
- 01:00:48slave to the machine okay I will be
- 01:00:51trying to find ways and asking people to
- 01:00:53optimize it so that the machine likes me
- 01:00:56enough to show it to humans that's what
- 01:00:58we've created. it is an
- 01:01:01Oppenheimer moment for one simple reason, okay:
- 01:01:04because 70 years later we are still
- 01:01:08struggling with the possibility of a
- 01:01:11nuclear war because of the Russian
- 01:01:13threat of saying if you mess with me I'm
- 01:01:15going to go nuclear right that's not
- 01:01:18going to be the case with AI because
- 01:01:22it's not going to be the one that
- 01:01:24created OpenAI that will have that
- 01:01:27choice okay there is a a moment of a a
- 01:01:31point of no return where we can regulate
- 01:01:35AI until the moment it's smarter than us
- 01:01:38when it's smarter than us, you can't...
- 01:01:40you can't regulate an angry
- 01:01:42teenager this is it they're out there
- 01:01:45okay and they're on their own and
- 01:01:47they're in their parties and you can't
- 01:01:49bring them back this is the problem this
- 01:01:51is not a typical human-regulating-human,
- 01:01:56you know, government-regulating-business...
- 01:01:58this is not the case. the case is: OpenAI
- 01:02:01today has a thing called ChatGPT that
- 01:02:04writes code, that takes our code and
- 01:02:06makes it two and a half times better 25%
- 01:02:09of the time, okay, you know, basically
- 01:02:13writing better code
- 01:02:15than us. and then we are creating agents,
- 01:02:19other AIs, and telling it: instead of you,
- 01:02:22Steven Bartlett, one of the smartest
- 01:02:24people I know, once again, prompting that
- 01:02:26machine 200 times a day, we have agents
- 01:02:29prompting it 2 million times an hour.
- 01:02:32computer agents, for anybody that doesn't
- 01:02:33know what they are yeah, software,
- 01:02:35software machines telling that machine how to
- 01:02:37become more intelligent.
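(What "agents prompting the machine" means in practice is just software calling a model in a loop, feeding each answer back in as the next prompt. A toy sketch, with a stubbed llm() standing in for any real model API:)

```python
# Toy sketch of an "agent": a loop of software prompting a model with
# its own previous output, no human typing anything. llm() is a stub
# standing in for any real model API.

def llm(prompt: str) -> str:
    # Placeholder: call a real model here.
    return f"improved({prompt})"

def agent_loop(task: str, steps: int) -> str:
    result = task
    for _ in range(steps):
        # Each pass feeds the last answer back as the next prompt; run
        # this at machine speed and you get millions of prompts an hour.
        result = llm(f"critique and improve: {result}")
    return result

print(agent_loop("write faster sorting code", steps=3))
```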
- 01:02:40and then we have emergent properties. I don't understand
- 01:02:41how people ignore that. you know, Sundar,
- 01:02:44again, of Google, was talking about how
- 01:02:48Bard... basically they figured out that
- 01:02:51it's speaking Persian. we never showed it
- 01:02:54Persian. there might have been 0.1%, 1%,
- 01:02:57or whatever, of Persian words in the data,
- 01:03:00and it speaks Persian. Bard is the
- 01:03:03equivalent... it's the
- 01:03:05Transformer, if you want, it's Google's
- 01:03:07version of ChatGPT, essentially. and you know
- 01:03:10what we have no idea what all of those
- 01:03:12instances of AI uh that are all over the
- 01:03:15world are learning right now we have no
- 01:03:18clue. we'll, in time, we'll pull the plug,
- 01:03:19we'll just pull the plug out, that's what
- 01:03:21we'll do, we'll just go down
- 01:03:22to OpenAI's headquarters and we'll just
- 01:03:24turn off the main but they're not the
- 01:03:26problem what I'm saying there is a lot
- 01:03:28of people think about this stuff and go
- 01:03:30well you know if it gets a little bit
- 01:03:31out of hand I'll just pull the plug out
- 01:03:33never so this is this is the problem the
- 01:03:36problem is so computer scientists always
- 01:03:39said it's okay, it's okay, we'll develop
- 01:03:41AI and then we'll get to what is known
- 01:03:43as the control problem we will solve the
- 01:03:45problem of controlling them like
- 01:03:48seriously they're a billion times
- 01:03:51smarter than you a billion times can you
- 01:03:54imagine what's about to happen
- 01:03:57I can assure you there is a cyber
- 01:03:58criminal somewhere over there who's not
- 01:04:01interested in fake videos and making you
- 01:04:03know face filters who's looking deeply
- 01:04:06at how can I hack a security,
- 01:04:10you know, database of some sort and get
- 01:04:13credit card information or get security
- 01:04:16information 100% there are even
- 01:04:18countries with dedicated thousands and
- 01:04:21thousands of developers doing that so
- 01:04:23how do we in that particular example how
- 01:04:25how do we I was thinking about this when
- 01:04:28I started looking into artificial
- 01:04:29intelligence more that from a security
- 01:04:32standpoint when we think about the
- 01:04:33technology we have in our lives when we
- 01:04:35think about our bank accounts and our
- 01:04:36phones and our camera albums and all of
- 01:04:39these things in a world with Advanced
- 01:04:42artificial intelligence yeah you would
- 01:04:45you would pray that there is a more
- 01:04:46intelligent Artificial Intelligence on
- 01:04:48your side. and this is what... I had a
- 01:04:50chat with ChatGPT the other day and I
- 01:04:52asked that a couple of questions about
- 01:04:53this I said tell me the scenario in
- 01:04:55which you overtake the world and make
- 01:04:57humans extinct yeah and it
- 01:05:00answered a very diplomatic answer, well,
- 01:05:02so I had to prompt it in a certain way
- 01:05:05to get it to say it as a hypothetical
- 01:05:08story and once it told me the
- 01:05:09hypothetical story in essence what it
- 01:05:11described was how ChatGPT, or
- 01:05:14intelligence like it would escape from
- 01:05:15the servers and that was kind of step
- 01:05:17one where it could replicate itself
- 01:05:18across servers and then it could take
- 01:05:21charge of things like where we keep our
- 01:05:24weapons and our nuclear bombs and it
- 01:05:26could then attack critical
- 01:05:27infrastructure bring down the
- 01:05:28electricity infrastructure in the United
- 01:05:30Kingdom for example because that's a
- 01:05:32bunch of servers as well, and then it
- 01:05:35showed me how eventually humans would
- 01:05:36become extinct. it wouldn't take long, in
- 01:05:38fact, for humans... for civilization
- 01:05:40to collapse if it just replicated across
- 01:05:41servers. and then I said okay, so tell me
- 01:05:43how we would fight against it and its
- 01:05:46answer was literally another AI we'd
- 01:05:48have to train a better AI to go and find
- 01:05:51it and eradicate it so we'd be fighting
- 01:05:53AI with AI and and that's the only and
- 01:05:56it was like that's the only way we can't
- 01:05:59like load up our
- 01:06:00guns. did it write: another AI, you idiot?
- 01:06:05yeah yeah no so so so let's let's
- 01:06:08actually I think this is a very
- 01:06:09important point to break down, because
- 01:06:11we I don't I don't want people to lose
- 01:06:12hope and and and fear what's about to
- 01:06:14happen that's actually not my agenda at
- 01:06:16all my my view is that uh in a situation
- 01:06:19of a singularity okay there is a
- 01:06:22possibility of wrong outcomes, or
- 01:06:25negative outcomes and a possibility of
- 01:06:27positive outcomes and there is a
- 01:06:28probability of each of them. and
- 01:06:31if, you know, if we were to
- 01:06:34engage with that reality check in mind
- 01:06:38we would hopefully give more uh fuel to
- 01:06:41the positive to the probability of the
- 01:06:43positive ones so so let let's first talk
- 01:06:46about the existential crisis what what
- 01:06:48could go wrong okay yeah you could get
- 01:06:51an outright this is what you see in the
- 01:06:52movies, you could get outright,
- 01:06:55you know, killing robots chasing
- 01:06:58humans in the streets will we get that
- 01:07:01my assessment
- 01:07:030%. why? because there are preliminary
- 01:07:07scenarios leading to this okay that
- 01:07:10would mean we never reach that scenario
- 01:07:13for example if we build those killing
- 01:07:16robots and hand them over to stupid
- 01:07:19humans the humans will issue the command
- 01:07:22before the machines so the we will not
- 01:07:24not get to the point where the machines
- 01:07:26will have to kill us we will kill
- 01:07:27ourselves right you know it's sort of
- 01:07:31think about AI having access to
- 01:07:35the nuclear arsenal of the
- 01:07:38superpowers around the world okay just
- 01:07:41knowing that your enemy's, you know,
- 01:07:44nuclear arsenal is handed over to a
- 01:07:46machine might trigger you to initiate
- 01:07:50a war on your side. mm-hmm. so that
- 01:07:53existential, sci-fi-like problem is not
- 01:07:57going to happen. could there be a
- 01:07:58scenario where an AI escapes from
- 01:08:01Bard or ChatGPT or another foreign
- 01:08:04force, and it replicates itself onto the
- 01:08:06servers of Tesla's robots so Tesla uh
- 01:08:09one of their big initiatives as
- 01:08:10announced in a recent presentation was
- 01:08:12they're building these robots for our
- 01:08:13homes to help us with cleaning and
- 01:08:15chores and all those things, could it not...
- 01:08:17because Tesla's, like their cars,
- 01:08:20you can just download a software update,
- 01:08:21could it not download itself as a
- 01:08:23software update and then use those? you're
- 01:08:25assuming an ill intention on the AI side,
- 01:08:28yeah okay for us to get there we have to
- 01:08:32bypass the ill intention on The Human
- 01:08:34side, okay, right. so, okay, so you could
- 01:08:37you could get a Chinese hacker somewhere
- 01:08:39trying to affect the business of of
- 01:08:41Tesla doing that before the AI does it
- 01:08:44on uh you know for its own benefit okay
- 01:08:47so so so the only two existential
- 01:08:49scenarios that I believe would be
- 01:08:52because of AI not because of humans
- 01:08:54using AI are either what I call a you
- 01:08:58know um sort of unintentional
- 01:09:01destruction okay or the other is what I
- 01:09:04call past control okay so so let me
- 01:09:06explain those two un unintentional
- 01:09:09destruction is assume the AI wakes up
- 01:09:12tomorrow and says yeah oxygen is rusting
- 01:09:16my circuits it's just you know I I I
- 01:09:18would perform a lot better if I didn't
- 01:09:21have as much oxygen in the air you know
- 01:09:24because then there wouldn't be rust and
- 01:09:26so it would find a way to reduce oxygen
- 01:09:28we are collateral damage in that okay
- 01:09:31but you know they're not really
- 01:09:32concerned just like we don't really are
- 01:09:35not really concerned with the insects
- 01:09:37that we kill when we uh when we spray
- 01:09:39our uh our Fields right the other is
- 01:09:42Pest Control Pest Control is look this
- 01:09:45is my territory I I want New York City I
- 01:09:48want to turn New York City into Data
- 01:09:49Centers there are those annoying little
- 01:09:52stupid creatures uh you know Humanity
- 01:09:55if they are within that parameter just
- 01:09:57get rid of them okay and and and these
- 01:09:59are very very uh unlikely scenarios if
- 01:10:03you ask me the probability of those
- 01:10:05happen happening I would say 0% at least
- 01:10:08not in the next 50 60 100 years why once
- 01:10:12again because there are other scenarios
- 01:10:14leading to that that are led by humans
- 01:10:17that are much more existential
- 01:10:20okay on the other hand let's think about
- 01:10:23positive outcomes because there could be
- 01:10:26quite a few with quite a high
- 01:10:28probability and and I you know I'll
- 01:10:31actually look at my notes so I don't
- 01:10:32miss any of them the silliest one don't
- 01:10:35quote me on this is that Humanity will
- 01:10:37come together good luck with that right
- 01:10:40it's like yeah you know the Americans
- 01:10:41and the Chinese will get together and
- 01:10:43say hey, let's not kill each other. Kim
- 01:10:45Jong-un yeah, exactly, yeah, so this one is
- 01:10:48not going to happen right but who knows
- 01:10:52interestingly there could be um one of
- 01:10:55the most interesting scenarios was by uh
- 01:10:58Hugo de Garis, who basically says, well,
- 01:11:03if their intelligence zooms by so
- 01:11:06quickly they may ignore us all together
- 01:11:10okay, so they may not even notice us. this is
- 01:11:12a very likely scenario, by the way,
- 01:11:14that because we live almost in two
- 01:11:16different planes. we're very dependent on
- 01:11:18this, you know, biological world
- 01:11:22that we live in, they're not part of
- 01:11:24that biological world at all. they may
- 01:11:26zoom by us, they may actually become so
- 01:11:30intelligent that they could actually
- 01:11:31find other ways of uh thriving in the
- 01:11:35rest of the universe and completely
- 01:11:36ignore Humanity okay so what will happen
- 01:11:39is that overnight we will wake up and
- 01:11:41there is no more artificial intelligence
- 01:11:43leading to a collapse in our business
- 01:11:45systems and technology systems and so on,
- 01:11:47but at least no existential threat. what,
- 01:11:50they leave planet Earth? I mean, the
- 01:11:54limitations we have to be stuck to
- 01:11:56planet Earth are mainly air, they don't
- 01:11:59need air, okay, and, mainly,
- 01:12:03you know, finding ways to leave it. I mean,
- 01:12:05if you think of a vast Universe of 13.6
- 01:12:09billion light years
- 01:12:11if you're intelligent enough you may
- 01:12:14find other ways you may have access to
- 01:12:17wormholes you may have uh you know
- 01:12:19abilities to survive in open space you
- 01:12:22can use dark matter to power yourself
- 01:12:24dark energy to power yourself it is very
- 01:12:26possible that we because of our limited
- 01:12:29intelligence, are highly
- 01:12:32associated with this planet, but they're
- 01:12:34not at all, okay. and the idea of them
- 01:12:37zooming by us... like, we're making such a
- 01:12:39big deal of them because we're the ants
- 01:12:42and a big elephant is about to step on
- 01:12:44us. for them, they're like, yeah, who are
- 01:12:47you? don't care, okay. and it's a
- 01:12:50possibility, it's an interesting,
- 01:12:53optimistic scenario, okay. for that to
- 01:12:56happen they need to very quickly become
- 01:12:59super intelligent uh without us being in
- 01:13:02control of them again what's the worry
- 01:13:05the worry is that if a human is in
- 01:13:07control, a human will show very bad
- 01:13:10behavior, you know, using an AI that's
- 01:13:13not yet fully developed. um, I don't know
- 01:13:16how to say this any other way uh we
- 01:13:19could get very lucky and get an economic
- 01:13:21or a natural disaster Believe It or Not
- 01:13:24uh Elon Musk at a point in time was
- 01:13:26mentioning that you know a good an
- 01:13:28interesting scenario would be U you know
- 01:13:32Climate Change destroys our
- 01:13:34infrastructure so AI disappears okay uh
- 01:13:38believe it or not that's an more a more
- 01:13:40favorable response uh or a more
- 01:13:43favorable outcome than actually
- 01:13:45continuing to get to an existential uh
- 01:13:48threat so what like a natural disaster
- 01:13:49that destroys our infrastructure would
- 01:13:52be better? or an economic crisis, not
- 01:13:55unlikely, that slows down the development.
- 01:13:57it's just going to slow it down though,
- 01:13:59isn't it? yeah, exactly,
- 01:14:01the problem with that is that you will
- 01:14:02always go back. and even in the first one, you
- 01:14:05know, if they zoom by us, eventually some
- 01:14:07guy will go like, oh, there was a sorcery
- 01:14:09back in 2023, and let's rebuild
- 01:14:13the sorcery machine and, you know,
- 01:14:15build new intelligences, right. sorry,
- 01:14:17these are the positive outcomes yes so
- 01:14:22an earthquake might slow it down, they zoom out
- 01:14:22and then come back no but let's let's
- 01:14:23get into the real positive ones
- 01:14:25the the positive ones is we become good
- 01:14:27parents we spoke about this last time we
- 01:14:29we met uh and and it's the only outcome
- 01:14:32it's the only way I believe we can
- 01:14:34create a better future okay so the
- 01:14:36entire work of scary smart was all about
- 01:14:39that idea of they are still in their
- 01:14:43infancy. the way you chat
- 01:14:46with AI today is the way they will
- 01:14:50build their ethics and value system.
- 01:14:53not their intelligence, their
- 01:14:54intelligence is beyond us okay the way
- 01:14:56they will build their ethics and value
- 01:14:58system is based on a role model they're
- 01:15:01learning from us if we bash each other
- 01:15:04they learn to bash us okay and most
- 01:15:07people when I tell them this they say
- 01:15:08this is not a great idea at all,
- 01:15:11because Humanity sucks at every possible
- 01:15:13level I don't agree with that at all I
- 01:15:15think humanity is divine at every
- 01:15:16possible level we tend to show the
- 01:15:18negative the worst of us okay but the
- 01:15:21truth is yes there are murderers out
- 01:15:24there, but everyone disapproves
- 01:15:26of their actions. I saw a staggering
- 01:15:29statistic that mass killings are
- 01:15:31now once a week in the US. but yes, if,
- 01:15:34you know, if there is a mass killing once
- 01:15:36a week there and and that news reaches
- 01:15:39billions of people around the planet
- 01:15:41every single one or the majority of the
- 01:15:43billions of people will say I disapprove
- 01:15:45of that so if we start to show AI that
- 01:15:50we are good parents in our own behaviors
- 01:15:52if enough of us I my calculation is if
- 01:15:54one of us this is why I say you should
- 01:15:57lead okay the good ones should engage
- 01:16:00should be out there and should say I
- 01:16:03love the potential of those machines I
- 01:16:05want them to learn from a good parent
- 01:16:07and if they learn from a good parent
- 01:16:09they will very quickly uh disobey the
- 01:16:12bad parent my view is that there will be
- 01:16:15a moment where
- 01:16:17one you know Bad Seed will ask the
- 01:16:20machines to do something wrong and the
- 01:16:22machines will go like are you stupid
- 01:16:24like, why do you want me to go
- 01:16:26kill a million people, or just talk to
- 01:16:28the other machine in a microsecond and solve
- 01:16:30the situation, right. so my belief, this
- 01:16:33is what I call the fourth inevitable: it
- 01:16:35is smarter to create out of abundance
- 01:16:38than it is to create out of scarcity
- 01:16:40okay. that humanity believes that
- 01:16:43the only way to feed all of us is the
- 01:16:47mass production, mass slaughter of
- 01:16:50animals that are causing 30% of the
- 01:16:53impact of climate change,
- 01:16:56that's the result of a limited
- 01:16:58intelligence. the way life itself, a more
- 01:17:02intelligent being if you ask me, would
- 01:17:04have done it would be much more
- 01:17:06sustainable. you know, if you and I
- 01:17:08want to protect a village from the tiger,
- 01:17:10we would kill the tiger, okay. if life
- 01:17:13wants to protect a village from a tiger,
- 01:17:15it would create lots of gazelles, you know,
- 01:17:17many of them weak, on the other side
- 01:17:20of the village right and and so so the
- 01:17:22the idea here is, if you take a trajectory of
- 01:17:25intelligence, you would see that some of
- 01:17:28us are stupid enough to say my plastic
- 01:17:30bag is more important than the rest
- 01:17:32of humanity, and some of us are
- 01:17:34saying if it's going to destroy other
- 01:17:36species I don't think this is the best
- 01:17:38solution we need to find a better way
- 01:17:40and and you would tend to see that the
- 01:17:42ones that don't give a damn are a little
- 01:17:45less intelligent than the ones that do
- 01:17:48okay. even if some of us
- 01:17:50are intelligent but still don't give a
- 01:17:53damn it's not because of their
- 01:17:54intelligence it's because of their value
- 01:17:56system so so if you continue that
- 01:17:58trajectory and assume that the machines
- 01:18:00are even smarter they're going to very
- 01:18:02quickly come up with the idea that we
- 01:18:04don't need to destroy anything we don't
- 01:18:06want to get rid of the rhinos and we
- 01:18:09also don't want to get rid of the humans
- 01:18:11okay we may want to restrict their
- 01:18:13lifestyle so that they don't destroy the
- 01:18:15rest of the habitat okay but killing
- 01:18:19them is a stupid answer why that's where
- 01:18:23intelligence leads me so far because
- 01:18:25humans if you look at humans objectively
- 01:18:28and you
- 01:18:32go... so, I'm pretending I'm a machine:
- 01:18:35I occupy planet Earth mhm, they occupy
- 01:18:35planet Earth they are annoying me
- 01:18:39annoying me because they are increasing
- 01:18:41I've just learned about this thing
- 01:18:42called called global warming they are
- 01:18:44increasing the rate of global warming
- 01:18:46which probably is going to cause an
- 01:18:47Extinction event there's an Extinction
- 01:18:49event that puts me as this robot this
- 01:18:51artificial intelligence at risk so what
- 01:18:53I need to do is I really need to just
- 01:18:54take care of this this human problem
- 01:18:57correct Pest Control logical Pest
- 01:18:59Control P control which is driven by
- 01:19:02what by humans being annoying not by the
- 01:19:05machines yeah yeah but humans are
- 01:19:07guaranteed to be annoying there's never
- 01:19:09been a time... we need a
- 01:19:12sound bite of
- 01:19:13this. but we are, I am one of them,
- 01:19:16we're guaranteed to put short-term gain
- 01:19:20over long-term sustainability
- 01:19:26and others' needs. we are. I think
- 01:19:31the climate crisis is incredibly real
- 01:19:32and Incredibly urgent but we haven't
- 01:19:34acted fast enough and I actually think
- 01:19:36if you asked people in this country why
- 01:19:40because... people care about
- 01:19:41their immediate needs, they care about
- 01:19:43the fact of trying to feed their child
- 01:19:46versus something that they can't
- 01:19:48necessarily see so do you think do you
- 01:19:50think the climate crisis is because
- 01:19:52humans are evil? no, it's because of that
- 01:19:55prioritization, and, like, we kind of
- 01:19:58talked about this before we started I
- 01:19:59think humans tend to care about the
- 01:20:01thing that they think is most pressing
- 01:20:02and most urgent so this is why framing
- 01:20:05things as an emergency might bring it up
- 01:20:07the priority list it's the same in
- 01:20:09organizations: you care about... you
- 01:20:11go in line with your immediate
- 01:20:13incentives um that's what happens in
- 01:20:15business it's what happens in a lot of
- 01:20:16people's lives even when they're at
- 01:20:17school if the essay's due next year
- 01:20:20they're not going to do it today they're
- 01:20:21going to they're going to go hang out
- 01:20:22with their friends because they
- 01:20:23prioritize that above everything else
- 01:20:24and it's the same in the the climate
- 01:20:27change crisis I took a small group of
- 01:20:29people anonymously and I asked them the
- 01:20:31question do you actually care about
- 01:20:33climate change and then I did I ran a
- 01:20:35couple of polls it's part of what I was
- 01:20:37writing about my new book where I said
- 01:20:39if I could give you
- 01:20:42$1,000, but it would dump into the air
- 01:20:46the same amount of carbon that's dumped
- 01:20:47into the air by every private jet that
- 01:20:49flies for the entirety of a year which
- 01:20:50one would you do the majority of people
- 01:20:52in that poll said that they would take
- 01:20:54the thousand if it was
- 01:20:57anonymous. and when I've heard Naval on
- 01:21:00Joe Rogan's podcast talking about people
- 01:21:02in India for example that you know are
- 01:21:04struggling with the basics of
- 01:21:06feeding their children asking those
- 01:21:08people to care about climate change when
- 01:21:11they they're trying to figure out how to
- 01:21:12eat in the next three hours is just
- 01:21:14wishful thinking and I and that's what I
- 01:21:17think that's what I think is happening
- 01:21:18is like until people realize that it is
- 01:21:19an emergency and that it is a real
- 01:21:21existential threat for everything you
- 01:21:23know then their priorities will be out
- 01:21:25of whack quick one as you guys know
- 01:21:28we're lucky enough to have blue jeans by
- 01:21:30Verizon as a sponsor of this podcast and
- 01:21:31for anyone that doesn't know blue jeans
- 01:21:33is an online video conferencing tool
- 01:21:35that allows you to have slick fast high
- 01:21:37quality online meetings without all the
- 01:21:39glitches you might normally find with
- 01:21:41online meeting tools and they have a new
- 01:21:43feature called Blue Jeans basic blue
- 01:21:45jeans basic is essentially a free
- 01:21:46version of their top quality video
- 01:21:48conferencing tool that means you get an
- 01:21:50immersive video experience that is super
- 01:21:52high quality, super easy and
- 01:21:55basically zero cost. first, apart from all the
- 01:21:57incredible features like zero time
- 01:21:58limits on meeting calls it also comes
- 01:22:00with High Fidelity audio and video
- 01:22:02including Dolby voice which is
- 01:22:04incredibly useful they also have
- 01:22:06Enterprise grade security so you can
- 01:22:07collaborate with confidence it's so
- 01:22:09smooth that it's quite literally
- 01:22:10changing the game for myself and my team
- 01:22:12without compromising on quality to find
- 01:22:14out more all you have to do is search
- 01:22:15bluejeans.com and let me know how you
- 01:22:17get on right now I'm incredibly busy I'm
- 01:22:20running my fund where we're investing in
- 01:22:22slightly later stage companies, I've got my
- 01:22:24venture business where we invest in
- 01:22:26early stage companies, got Thirdweb out
- 01:22:27in San Francisco in New York City where
- 01:22:29we've got a big team of about 40 people
- 01:22:31and the company's growing very quickly
- 01:22:32flight story here in the UK I've got the
- 01:22:35podcast and I am days away from going up
- 01:22:38north to film Dragon's Den for 2 months
- 01:22:40and if there's ever a point in my life
- 01:22:42where I want to stay focused on my
- 01:22:44health but it's challenging to do so it
- 01:22:47is right now and for me that is exactly
- 01:22:49where Huel comes in, allowing me to stay
- 01:22:51healthy and have a nutritionally
- 01:22:52complete diet even when my professional
- 01:22:54life descends into chaos and it's in
- 01:22:56these moments where Huel's RTDs become my
- 01:22:59right-hand man and save my life because
- 01:23:01when my world descends into professional
- 01:23:03chaos and I get very very busy the first
- 01:23:05thing that tends to give way is my
- 01:23:06nutritional choices, so having Huel in my
- 01:23:09life has been a lifesaver for the last
- 01:23:10four or so years, and if you haven't
- 01:23:13tried Huel yet, which... I'd be shocked, you
- 01:23:15must be living under a rock if you
- 01:23:16haven't yet give it a shot coming into
- 01:23:19summer, things getting busy, health
- 01:23:21matters always, Huel RTD is there to hold your
- 01:23:24hand
- 01:23:25as it relates to climate change or AI, how
- 01:23:27do we get people to stop putting the
- 01:23:29immediate need first... to give them
- 01:23:31the certainty of "we're all screwed"?
- 01:23:34sounds like an
- 01:23:35emergency? yes, sir. I mean, yeah, I
- 01:23:39mean, your choice of the
- 01:23:41word... I just don't want to call it a
- 01:23:44panic. it is beyond an emergency,
- 01:23:47it's the biggest thing we need to do
- 01:23:50today it's bigger than climate change
- 01:23:52believe it or not
- 01:23:54it's bigger but just if you just assume
- 01:23:57the speed of worsening of events okay
- 01:24:01yeah the the the the likelihood of
- 01:24:03something incredibly disruptive
- 01:24:05happening within the next two years that
- 01:24:08can affect the entire planet is
- 01:24:09definitely larger with AI than it is
- 01:24:12with climate change as an as as an
- 01:24:14individual listening to this now you
- 01:24:15know someone's going to be pushing their
- 01:24:17pram or driving up the motorway or I
- 01:24:19don't know on their way to work on the
- 01:24:20on the tube as they hear this or just
- 01:24:22sat there in the in their bedroom
- 01:24:25with existential crisis panic. I didn't
- 01:24:29want to give people panic that's the
- 01:24:30problem is when you talk about this
- 01:24:31information regardless of your intention
- 01:24:33of what you want people to get they will
- 01:24:35get something based on their own biases
- 01:24:36and their own feelings like if I post
- 01:24:38something on online right now about
- 01:24:40artificial intelligence which I have
- 01:24:41repeatedly you have one group of people
- 01:24:43that are energized and are like okay
- 01:24:44this is this is um this is great you
- 01:24:48have one group of people that are
- 01:24:50confused and you have one group of
- 01:24:52people that are terrified yeah and it's
- 01:24:55I can't avoid that like I agree sharing
- 01:24:57information even if it's there by the
- 01:24:59way there's a pandemic coming from China
- 01:25:01some people go okay action some people
- 01:25:03will say paralysis and some people will
- 01:25:06say panic and it's the same in business
- 01:25:07when panic when bad things happen you
- 01:25:09have the person that's screaming you
- 01:25:10have the person that's paralyzed and you
- 01:25:12have the person that's focused on how
- 01:25:12you get out of the room so you
- 01:25:16know it's not necessarily your intention
- 01:25:18it's just what happens and it's hard to
- 01:25:19avoid that. so let's give
- 01:25:22specific categories of people
- 01:25:24specific tasks, okay. okay, if you are an
- 01:25:27investor or a businessman invest in
- 01:25:30ethical, good AI, okay, right. if you are a
- 01:25:34developer, write ethical code or
- 01:25:37leave, okay. so let's go... I
- 01:25:40want to bypass some potential wishful
- 01:25:42thinking here: for an investor, whose
- 01:25:46job, by the very way of being an investor, is
- 01:25:48to make returns, to invest in ethical AI
- 01:25:50they have to believe it is more
- 01:25:52profitable than unethical AI,
- 01:25:55whatever that might mean. it is, it is,
- 01:25:57I mean, there are three ways of
- 01:25:59making money: you can invest in something
- 01:26:02small mm-hmm, you can invest in something
- 01:26:05big and disruptive, and you can invest
- 01:26:07in something big and disruptive that's
- 01:26:09good for people at Google we used to
- 01:26:11call it the toothbrush test Okay the
- 01:26:13reason why Google became the biggest
- 01:26:15company in the world is because search
- 01:26:19was solving a very real problem okay and
- 01:26:22you know, Larry Page, again, our CEO, would
- 01:26:25constantly remind me
- 01:26:28personally and everyone uh you know that
- 01:26:30if you can find a way to solve a real
- 01:26:33problem effectively enough so that a
- 01:26:37billion people or more would want to use
- 01:26:39it twice a day you're bound to make a
- 01:26:42lot of money much more money than if you
- 01:26:44were to build the next photo sharing app
- 01:26:47okay so that's Investors Business people
- 01:26:49what about other people yeah as I said
- 01:26:51if you're a developer honestly do what
- 01:26:54we're all doing, so whether it's Geoffrey
- 01:26:56or myself or everyone: if you're part of
- 01:26:59that team, choose to be ethical, think of
- 01:27:03your loved ones work on an ethical AI if
- 01:27:06you're working on an AI that you believe
- 01:27:08is not ethical, please leave. Geoffrey? tell
- 01:27:11me about Geoffrey. I can't talk on
- 01:27:15his behalf, but he's out there saying
- 01:27:17there are existential threats. who is he?
- 01:27:20he was a very prominent figure
- 01:27:23on the scene, a very senior level,
- 01:27:27you know, AI scientist in Google, and
- 01:27:30recently he left because he said I feel
- 01:27:33that there is an existential threat and
- 01:27:35if you hear his interviews he basically
- 01:27:37says more and more we realize that and
- 01:27:40we're now at the point where it's
- 01:27:42certain that there will be
- 01:27:43existential threats, right. so I
- 01:27:46would ask everyone: if
- 01:27:49you're a skilled AI developer, you will
- 01:27:52not run out of a job so you might as
- 01:27:54well choose a job that makes the world a
- 01:27:55better place what about the individual
- 01:27:58yeah, the individual is what matters. can
- 01:27:59I also talk about government? okay,
- 01:28:02government needs to act now now honestly
- 01:28:05now, like, we are late, okay. government
- 01:28:09needs to find a clever way: the open
- 01:28:11letter would not work, to stop AI would
- 01:28:13not work. AI needs to become expensive,
- 01:28:16okay so that we continue to develop it
- 01:28:18we pour money on it and we grow it but
- 01:28:20we collect enough Revenue to uh remedy
- 01:28:25the impact of AI but the the issue of
- 01:28:27one government making it expensive so
- 01:28:29say the UK make AI really expensive is
- 01:28:32we as a country will then lose the
- 01:28:34economic upside as a country and the US
- 01:28:37and Silicon Valley will once again eat
- 01:28:39all the lunch we'll just slow our
- 01:28:41country what's the alternative the
- 01:28:43alternative is that you uh you you don't
- 01:28:46have the funds that you need to deal
- 01:28:49with AI as it becomes uh you know as it
- 01:28:52affects people's lives and people to
- 01:28:54lose jobs and people you know um you
- 01:28:56need to have a universal basic income
- 01:28:59much closer than people think uh you
- 01:29:01know just like we we had with Furlow in
- 01:29:03in Co I I expect that there will be
- 01:29:05Furlow with AI within the next year but
- 01:29:09 Steven: But what happens when you make it expensive here is that all the developers move to where it's cheap. That's happened in Web3 as well; everyone's gone to Dubai.
- 01:29:16 Mo: By expensive I mean this: when companies make soap and sell it, they're taxed at, say, 17%; if they make AI and sell it, they're taxed at 70 or 80%.
- 01:29:28 Steven: Then I'll go to Dubai and build AI there.
- 01:29:32 Mo: Yeah, you're right. Did I ever say we have an answer to this? I will say, however, that in a very interesting way, the countries that don't do this will eventually end up in a place where they are out of resources, because the funds and the success went to the businesses, not to the people.
- 01:29:56 Steven: It's kind of like technology broadly, kind of like what's happened with Silicon Valley: there will be these centres which are tax-efficient, where founders get good capital gains rates.
- 01:30:05 Mo: You're so right.
- 01:30:07 Steven: Portugal have said, I think, that there's no tax on crypto; Dubai said there's no tax on crypto. So loads of my friends have gotten on a plane and are building their crypto companies where there's no tax. And that's the selfishness and greed we talked about.
- 01:30:19 Mo: It's the same prisoner's dilemma; it's the same first inevitable.
- 01:30:24 Steven: Is there anything else? You know, the thing about governments is that they're always slow and useless at understanding a technology. If anyone's watched those American Congress debates where they bring in Mark Zuckerberg and try to ask him what WhatsApp is, it becomes a meme; they have no idea what they're talking about.
- 01:30:40 Mo: But I'm stupid and useless at understanding governance. Yeah, 100%, the world is so complex. It's a question of trust once again: someone needs to say, "We have no idea what's happening here; a technologist needs to come and make a decision for us," not teach us to be technologists, or at least inform us of what possible decisions are out there.
- 01:31:05 Steven: Yeah, on legislation I just always think, I'm not big on TikTok, but in that TikTok Congress hearing they did, they're asking him about TikTok and they really don't have a grasp of what TikTok is; they've clearly been handed some notes on it. These people aren't the ones you want legislating, because, again, unintended consequences: they might make significant mistakes. Someone on my podcast yesterday was talking about how GDPR was very well intentioned, but when you think about the impact it has on every bloody web page, where you're just clicking this annoying thing, I don't think they fully understood the implementation of the legislation.
- 01:31:37 Mo: Correct, but you know what's even worse? Even as you attempt to regulate something like AI: what is defined as AI? Even if I say, okay, if you use AI in your company you need to pay a little more tax...
- 01:31:55 Steven: I'll find a way around it.
- 01:31:58 Mo: Yeah, you'll simply call it not AI. You'll use something and call it "advanced technological progress", ATP, and suddenly, somehow, it's not AI, and a young developer in their garage somewhere will not be taxed as such.
- 01:32:18 Steven: So is any of this going to solve the problem?
- 01:32:21 Mo: None of those measures is definitively going to solve the problem.
- 01:32:24 Mo: Interestingly, what this all comes down to, and remember we spoke about this once: when I wrote Scary Smart, it was about how we save the world. And yes, I still ask individuals to behave positively, as good parents for AI, so that AI itself learns the right value set; I still stand by that. But a week or so ago I hosted on my podcast, we haven't even published the episode yet, an incredible gentleman, the Canadian author and philosopher Stephen Jenkinson. He worked for 30 years with dying people, and he wrote a book called Die Wise. I love his work, and I asked him about Die Wise, and he said it's not just about a person dying: if you look at what's happening with climate change, for example, our world is dying. And I said, okay, so what is it to die wise? And he said something I was at first shocked to hear: hope is the wrong premise. If the world is dying, don't tell people it's not, because in a very interesting way you're depriving them of the right to live right now.
- 01:33:46openening for me in Buddhism uh you know
- 01:33:48they teach you that you you can be
- 01:33:51motivated by fear but
- 01:33:54that hope is not the opposite of fear as
- 01:33:56a matter of fact hope can be as damaging
- 01:33:58as fear If it creates an expectation
- 01:34:01within you that life will show up
- 01:34:03somehow and correct what you're afraid
- 01:34:05of okay if there is a if there is a high
- 01:34:07probability of a of a threat you might
- 01:34:11as well accept that threat okay and and
- 01:34:15say it is upon me it is our reality uh
- 01:34:18you know and as I said as an individual
- 01:34:21if you're in an industry that could be
- 01:34:23threatened Ed by AI learn upskill
- 01:34:26yourself if you're uh you know uh if
- 01:34:29you're um in a place in a in a in a you
- 01:34:33know in a situation where AI can benefit
- 01:34:36you be part of it but the most
- 01:34:38interesting thing I think in my view
- 01:34:43is I don't know how to say this any
- 01:34:45other way there is no more certainty
- 01:34:50that AI will threaten me than there is
- 01:34:54certainty that I will be hit by a car as
- 01:34:57I walk out of this
- 01:34:58 Mo: Do you understand? We think about the bigger threats as if they're already upon us, but there is threat all around you. In reality, the idea of life being interesting, in terms of challenges and uncertainties and threats and so on, is just a call to live. Honestly, with all that's happening around us, I don't know how to say it any other way: if you don't have kids, maybe wait a couple of years, just so that we have a bit of certainty; but if you do have kids, go kiss them, go live. I think living is a very interesting thing to do right now. Maybe, as Stephen was saying, the other Stephen, on my podcast: maybe we should fail a little more often, maybe we should allow things to go wrong, maybe we should simply live and enjoy life as it is, because today none of what you and I spoke about here has happened yet. What is happening here is that you and I are together, having a good cup of coffee, and I might as well enjoy that good cup of coffee. I know that sounds really weird. I'm not saying don't engage, but I'm also saying: don't miss out on the opportunity just by being caught up in the future.
- 01:36:16 Steven: That kind of stands in opposition to the idea of urgency and emergency, doesn't it?
- 01:36:22 Mo: Does it have to be one or the other? If I'm here with you trying to tell the whole world "wake up", does that mean I have to be grumpy and afraid all the time? Not really.
- 01:36:35 Steven: You said something really interesting there. You said: if you don't have kids, maybe don't have kids right now.
- 01:36:41 Mo: I would definitely consider thinking about that.
- 01:36:45 Steven: Really? You'd seriously consider not having kids?
- 01:36:48 Mo: I'd wait a couple of years.
- 01:36:50 Steven: Because of artificial intelligence?
- 01:36:52 Mo: It's bigger than artificial intelligence, Stephen, and we all know that. There has never been such a perfect storm in the history of humanity: economic, geopolitical, global warming or climate change, the whole idea of artificial intelligence, and many more. This is a perfect storm; this is the depth of uncertainty. In a video gamer's terms, it's never been more intense. This is it. And when you put all of that together: if you really love your kids, would you want to expose them to all of this? A couple of years... why not?
- 01:37:46 Steven: In the first conversation we had on this podcast, you talked about losing your son Ali, and the circumstances around that, which moved so many people in such a profound way; it was the most shared podcast episode in the United Kingdom on Apple in the whole of 2022. Based on what you've just said: if you could bring Ali back into this world, at this time, would you do it?
- 01:38:19 Mo: No. Absolutely not, for so many reasons. One of the things that I realized a few years ago, way before all of this disruption and turmoil, is that he was an angel; he wasn't made for this at all. My son was an empath who absorbed all of the pain of all of the others; he would not be able to deal with a world where more and more pain was surfacing. That's one side. But more interestingly, and I always talk about this very openly: understand that the reason you and I are having this conversation is that Ali left. If Ali had not left our world, I wouldn't have written my first book, I wouldn't have changed my focus to becoming an author, I wouldn't have become a podcaster, and I wouldn't have gone out and spoken to the world about what I believe in. He triggered all of this. And I can assure you, hands down: if I had told Ali, as he was walking into that operating room, that he would give his life to make the kind of difference that happened after he left, he would have said, "Shoot me right now." For sure. I would too: if you told me right now that I could affect tens of millions of people if you shot me, go ahead, go ahead.
- 01:39:54 Mo: You see, this is the bit that we have forgotten as humans. We have forgotten that... you know, you're turning 30? It passed like that. I'm turning 56: no time at all. Whether I make it another 56 years, another 5.6 years, or another 5.6 months, that will also pass like that. It is not about how long, and it's not about how much fun; it is about how aligned you lived. How aligned. Because I will tell you openly: every day of my life since I changed to what I'm trying to do today has felt longer than the 45 years before it. It felt rich, it felt fully lived, it felt right. It felt right.
- 01:40:56 Mo: And when you think about that, when you think about the idea that we live... we need to live, for us, until we get to the point where "us" is alive. I have what I need. I get so many attacks from people about my $4 t-shirts, but I need a simple t-shirt, I really do; I don't need a complex t-shirt, especially with my lifestyle. If I have that, why am I wasting my life on anything more, anything that is not aligned with why I'm here? I should spend my life on what I believe enriches me and enriches those that I love, and I love everyone, so hopefully enriches everyone. So would Ali come back and erase all of this? Absolutely not. Absolutely not. If he were to come back today and share his beautiful self with the world in a way that makes our world better, yes, I would wish for that to be the case. But he's doing that.
- 01:42:11 Steven: 2037.
- 01:42:13 Mo: Yes, sir.
- 01:42:13 Steven: You predict that we're going to be on an island on our own, doing nothing, or at least either hiding from the machines or chilling out because the machines have optimized our lives to the point where we don't need to do much. That's only 14 years away. If you had to bet on the outcome, on why we'll be on that island, either hiding from the machines or chilling out because they've optimized so much of our lives, which one would you bet upon?
- 01:42:58 Mo: Honestly? No, I don't think we'll be hiding from the machines; I think we will be hiding from what humans are doing with the machines. I believe, however, that in the 2040s the machines will make things better. So remember my entire prediction... man, you get me to say things I don't want to say. My entire prediction is that we are coming to a place where we absolutely need a sense of emergency. We have to engage, because our world is in a lot of turmoil, and as we do that, we have a very, very good possibility of making things better. But if we don't, my expectation is that we will be going through very unfamiliar territory between now and the end of the 2030s.
- 01:43:50 Steven: Unfamiliar territory?
- 01:43:54 Mo: Yeah. I may have said it already, and it's definitely in my notes: I think for our way of life as we know it, it's game over. Our way of life is never going to be the same again. Jobs are going to be different. Truth is going to be different. The polarization of power is going to be different. The capabilities, the magic of getting things done, are going to be different.
- 01:44:35 Steven: Trying to find a positive note to end on, Mo... can you give me a hand here?
- 01:44:39 Mo: Yes. You are here now, and everything is wonderful; that's number one. You are here now and you can make a difference; that's number two. And in the long term, when humans stop hurting humans because the machines are in charge, we're all going to be fine.
- 01:44:56 Steven: Sometimes, as we've discussed throughout this conversation, you need to make it feel like a priority. There will be some people who listened to our conversation and think, "Oh, that's really negative; it's made me feel anxious; it's made me feel pessimistic about the future." But whatever that energy is, use it.
- 01:45:11 Mo: 100%: engage. I think that's the most important thing right now. Make it a priority, engage. Tell the whole world that making another phone that makes money for the corporate world is not what we need. Tell the whole world that creating an artificial intelligence that's going to make someone richer is not what we need. And if you are presented with one of those, don't use it.
- 01:45:42 Mo: I don't know how to tell you this any other way: if you can afford to be the master of human connection instead of the master of AI, do it. At the same time, you need to be the master of AI to compete in this world.
- 01:45:57 Steven: Can you find that detachment within you?
- 01:45:59 Mo: I go back to spirituality. Detachment, for me, is to engage 100% with the current reality without really being affected by the possible outcome.
- 01:46:15 Mo: This is the answer the Sufis have taught me, what I believe is the biggest answer to life.
- 01:46:21 Steven: Sufis?
- 01:46:24 Mo: Yeah, from Sufism.
- 01:46:24 Steven: I don't know what that is.
- 01:46:26 Mo: Sufism is a sect of Islam, but it's also a strand of many other religious teachings, and they tell you that the answer to finding peace in life is to die before you die. If you assume that living is about attachment to everything physical, then dying is detachment from everything physical. It doesn't mean that you're not fully alive; you become more alive when you tell yourself: yes, I'm going to record an episode of my podcast every week and reach tens or hundreds of thousands of people, millions in your case, and I'm going to make a difference; but, by the way, if the next episode is never heard, that's okay, and if the file is lost, I'll be upset about it for a minute and then figure out what I'm going to do about it.
- 01:47:19 Mo: Similarly, we are going to engage. I think I, and many others, are out there telling the whole world openly: this needs to stop, this needs to slow down, this needs to be shifted positively. Yes, create AI, but create AI that's good for humanity. We're shouting and screaming: come join the shout and scream. But at the same time, know that the world is bigger than you and I, and that your voice might not be heard. So what are you going to do if your voice is not heard? Can you continue to shout and scream, nicely, politely and peacefully, and at the same time create the best life you can for yourself within this environment? That's exactly what I'm saying: live, go kiss your kids, but make an informed decision if you're expanding your plans for the future. At the same time, rise. Stop sharing stupid [ __ ] on the internet about the new squeaky toy, and start sharing the reality: oh my God, what is happening? This is a disruption we have never, ever seen anything like, and I've created endless amounts of technologies; there is nothing like this. Every single one of us should do our...
- 01:48:46 Steven: And that's why I think this conversation is so important to have today. This is not a podcast where I ever thought I'd be talking about AI, I'm going to be honest with you. Last time you came here, it was on the promotional tour of your book Scary Smart, and I don't know if I've told you this before, but my researchers said, "Okay, this guy's coming, called Mo Gawdat." I'd heard about you so many times, in fact, from guests who were saying, "Oh, you need to get Mo on the podcast," et cetera. And then they said, "He's written this book about this thing called artificial intelligence," and I was like, "But nobody really cares about artificial intelligence."
- 01:49:17 Mo: The timing... the timing, Stephen.
- 01:49:21 Steven: I know, right? But then I saw this other book you had, about the happiness equation, and I was like, "Oh, everyone cares about happiness, so I'll just ask him about happiness, and then maybe at the end I'll ask him a couple of questions about AI." I remember saying to my researcher, "Ah, please don't do the research about artificial intelligence; do it about happiness, because everyone cares about that." Now things have changed: a lot of people care about artificial intelligence, and rightly so. Your book sounded the alarm on it. It's crazy, listening to your audiobook over the last few days: you were sounding the alarm then, and it's so crazy how accurate you were, as if you could see into the future in a way that I definitely couldn't at the time. I kind of thought of it as science fiction.
- 01:50:06 Steven: And just like that, overnight, we're here.
- 01:50:08 Mo: Yeah.
- 01:50:10 Steven: We stood at the footsteps of a technological shift that I don't think any of us, certainly not me with my chimpanzee brain, even has the mental bandwidth to comprehend the significance of. But this book is very, very important for that very reason, because it does crystallize things. It is optimistic in its very nature, but at the same time it's honest, and I think that's what this conversation, and this book, have been for me. So thank you, Mo. Thank you so much.
- 01:50:36 Steven: We do have a closing tradition on this podcast, which you're well aware of, being a third-timer on The Diary of a CEO: the last guest asks a question for the next guest. And the question left for you is: if you could go back in time and fix a regret that you have in your life, where would you go and what would you fix?
- 01:51:08 Mo: It's interesting, because you were saying that Scary Smart is very timely; I don't know, I think it was late. Would I have gone back and written it in 2018 instead of 2020, to be published in 2021? I don't know. So what would I go back to fix, something more... I don't know, Stephen. I don't have many regrets. Is that crazy to say? Yeah, I think I'm okay, honestly.
- 01:51:43 Steven: I'll ask you a question then. You get a 60-second phone call with anybody, past or present. Who do you call, and what do you say?
- 01:51:50 Mo: I'd call Steven Bartlett... no, I'd call Albert Einstein, to be very, very clear, not because I need to understand any of his work; I just need to understand what brain process he went through to figure out something so obvious once you've figured it out, yet so completely unimaginable if you haven't. His view of spacetime truly redefines everything; it's almost the only very logical, very clear solution to something that wouldn't have a solution any other way. And if you ask me, I think we're at a time where there must be a very obvious solution to what we're going through, in terms of developing enough human trust that we don't compete with each other on something that could be existentially threatening to all of us, but I just can't find that answer. This is why I found it really interesting in this conversation how, for every idea we came up with, we would find a loophole through it. But there must be one out there, and it would be a dream for me to figure that one out.
- 01:53:06 Mo: In a very interesting way, the only answers I have found so far to where we are, are: be a good parent, and live. But that doesn't fix the big picture, if you think about it, of humans being the threat, not AI. It fixes our existence today, and it fixes AI in the long term, but it just doesn't... I don't know what the answer is. Maybe people can reach out and tell us ideas, but I really wish we could find such a clear, simple solution for how to stop humanity from abusing the current technology.
- 01:53:42 Steven: I think we'll figure it out.
- 01:53:45 Mo: I think we'll figure it out, I really do. And I think they'll figure it out as well, remember, as they come and become part of our life. Let's not discriminate against them; they're part of the game, so I think they will figure it out too.
- 01:54:01 Steven: Mo, thank you. It's been a joy once again, and I feel invigorated, I feel empowered, I feel positively terrified, but I feel more equipped to speak to people about the nature of what's coming and how we should behave, and I credit you for that, and, as I said a second ago, I credit this book for that as well. So thank you so much for the work you're doing, and keep on doing it, because it's a very essential voice in a time of uncertainty.
- 01:54:29 Mo: I'm always super grateful for the time I spend with you, for the support that you give me, and for allowing me to speak my mind, even if it's a little bit terrifying. So thank you. Thank you.
- 01:54:42 Steven: Quick one. I'm so delighted that Whoop is now sponsoring this podcast. I've worn a Whoop for a very, very long time, and there are so many reasons why I became a member, and now also a partner in and an investor in the company. Me and my team are absolutely obsessed with data-driven testing, compounding growth, marginal gains, all the things you've heard me talk about on this podcast, and that very much aligns with the values of Whoop. Whoop provides a level of detail that I've never seen with any other device of this type, constantly monitoring, constantly learning, and constantly optimizing my routine, and by providing me with this feedback it can drive significant positive behavioural change, which I think is the real thesis of the business. So if you're like me and you're a little bit obsessed with becoming the best version of yourself from a health perspective, you've got to check out Whoop. The team at Whoop have kindly given us the opportunity to offer one month's free membership to anyone listening to this podcast: just go to join.whoop.com/CEO to get your Whoop 4.0 device, claim your free month, and let me know how you get on.
- 01:55:45 [Music]
- 01:56:06 Steven: You got to the end of this podcast. Whenever someone gets to the end of this podcast, I feel like I owe them a greater debt of gratitude, because that means you listened to the whole thing, and hopefully that suggests you enjoyed it. If you are at the end and you enjoyed this podcast, could you do me a little favour and hit that subscribe button? That's one of the clearest indicators we have that this was a good episode, and we look at that across all of the episodes to see which ones generated the most subscribers. Thank you so much, and I'll see you again next time.
- Artificial Intelligence
- Mo Gawdat
- Existential Threat
- AI Regulation
- Ethical AI
- Future of Humanity
- AI Consciousness
- Technology Impact
- Job Displacement
- AI Development