00:00:00
It might seem crazy what I'm about... thank you very much. Hello, hello, hello, frogs!
00:00:12
Yeah, I am a frog too, I have been for quite a bit of time. I think recently I started to try to jump out of the water, at least to tell you that the water is boiling. Because, like that boring proverb we've always heard, you know, the idea that the frog, when you put
00:00:31
it in cold water and you know heat the
00:00:34
water slowly the frog doesn't notice and
00:00:37
then eventually the Frog gets boiled
00:00:40
alive. I have to admit, this hit me really strongly by 2020. When you look back from 2020 you realize there were so
00:00:52
many signs that we were going to get a
00:00:55
very big pandemic uh that is going to
00:00:58
affect us if you look back at history
00:01:00
1920 was around the same time uh and um
00:01:04
you know we've had swine flu and bird
00:01:06
flu and all of the other things SARS and
00:01:08
so on before that and yet somehow no one
00:01:12
uh said hey by the way should we be
00:01:15
cautious about a possible pandemic and
00:01:18
then when COVID started, you know, first patient, nobody's paying attention; 20 patients, nobody cares; 100 patients, a thousand patients; it was really when
00:01:29
we started to get to a million that
00:01:31
suddenly everyone was like the water's
00:01:33
boiling we're all going to die
00:01:36
interesting we seem to I I don't blame
00:01:39
anyone for this I I've recently been
00:01:41
putting a lot of attention and in
00:01:44
understanding the economic and
00:01:45
geopolitical situation of the world and
00:01:48
until I started to, I thought it was chill, it was okay, nothing was wrong. And then you start to put your head into it. I think
00:01:56
what happens is for each and every one
00:01:58
of us we have limited capacity as humans
00:02:01
like we can focus on 10 topics right 15
00:02:05
and if AI is not one of them it doesn't
00:02:08
really exist to us. Okay, now I'm here to tell you: this is it, the world as you know it is over, completely done. It's not about to be over, it's over. And so I'd openly say you probably have a year. Not to live, you're
00:02:27
going to live more than a year I hope
00:02:29
but you know you're we probably have a
00:02:30
year to react okay and and don't blame
00:02:33
me because honestly I'm never coming
00:02:35
back to speak to you next year you're
00:02:37
probably going to invite my avatar right
00:02:40
so and and I promise you that's the
00:02:42
truth I before I came here yesterday
00:02:44
just for the fun of it I asked Gemini and ChatGPT, I said give me a 35-minute speech on AI and read me minutes 2 to 4.
00:02:55
it did sound a bit like a politician but
00:02:59
it did it made a speech right and it's
00:03:01
quite interesting how we ignore that
00:03:03
because with something like ChatGPT-4o, which came out this week, it not
00:03:09
only prepared the speech but it now can
00:03:12
also laugh with me and sing with me and
00:03:16
you know tell me things it sounds very
00:03:18
Californian to me but anyway you know
00:03:20
I've lived with Californians long enough
00:03:22
to put up with it right and but it is
00:03:26
it's real it is real and if you haven't
00:03:28
used it where have you been in the water
00:03:33
right now if we were to understand I'm
00:03:35
going to try to speak for 30 minutes and
00:03:37
then allow us maybe 15 minutes of a chat
00:03:41
so that we can you know come around this
00:03:44
as a community. What I'll try to do is first help you understand
00:03:49
where we're coming from because if you
00:03:52
understand the trajectory of what's been
00:03:54
happening how quickly that thing is
00:03:56
launching, you can imagine what it's going to look like next year. Right, so where we
00:04:01
came from I'm a very serious geek I
00:04:04
started coding at age eight which feels
00:04:06
like a zillion years ago and when I
00:04:09
coded, believe it or not at age 8, the first time I coded, I was using BASIC, which is
00:04:16
as the name implies a very basic
00:04:17
language and I wanted to create an AI
00:04:20
That's that was my dream right every one
00:04:23
of us Geeks we just wanted to create an
00:04:26
AI, it was everyone's dream. We never managed to, but we did something that
00:04:31
appeared intelligent so you know let me
00:04:35
let me start by defining what is AI
00:04:38
every piece of code I've ever written
00:04:40
before the year 2000 was a piece of code
00:04:44
where I solved the problem first with my
00:04:49
intelligence and then told the computer
00:04:51
how to solve it right so it's almost
00:04:54
like if I call one of you and give you a
00:04:56
puzzle and then tell you this piece you
00:04:59
put in the top left corner and then that
00:05:01
piece under it and that piece under it
00:05:03
then turn this one and so on you can
00:05:06
finish the puzzle and if someone doesn't
00:05:08
hear me giving you the instructions they
00:05:10
would think you're very smart but you're
00:05:12
not right by the turn of the century
00:05:16
this ended right so the idea of us
00:05:18
telling computers what to do by the way
00:05:21
they did it very quickly very accurately
00:05:23
at a massive scale and so they appeared
00:05:24
smart okay but it was me who solved the
00:05:27
problem, it was my intelligence multiplied. By the turn of the century we discovered something called deep
00:05:34
learning and deep learning was truly
00:05:37
teaching computers the way I taught my
00:05:40
children how to be intelligent the way I
00:05:42
taught my children how to be intelligent
00:05:44
was to give them little puzzle pieces
00:05:46
when they were young you know those
00:05:48
cylinders and a board with different
00:05:50
holes in it and no one ever told their
00:05:53
children: no, no, hold on my son, take the cylinder, turn it
00:05:57
upside down look at the cross-section it
00:06:00
will look like a circle compare it to
00:06:01
those holes it will match the one that
00:06:04
is a circle put it there nobody ever
00:06:06
tells their children that. What we do
00:06:08
is we give them a cylinder and a board
00:06:11
and they keep trying trying trying
00:06:12
trying trying until one time it goes
00:06:15
through and then
00:06:17
suddenly light bulb and they learn
00:06:19
something, right? That's exactly what we did with computers with deep learning. By the way, we started to be able to do deep learning because by the year 2000 the internet was big enough for us to have that number of trials for the computers to learn on their own.
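To make that learn-by-trial idea concrete, here is a minimal sketch in Python (my own illustration, not code from the talk); the shape and hole names are invented, and the only feedback the program ever gets is whether a piece fit, exactly like the child with the cylinder and the board.

```python
import random

# Toy sketch of learning by trial and error: the learner is never told the
# rule; it just tries random pairings and keeps whatever happened to fit.
SHAPES = ["cylinder", "cube", "prism"]
HOLES = {"circle": "cylinder", "square": "cube", "triangle": "prism"}

learned = {}                          # what the learner works out on its own
for trial in range(10_000):
    shape = random.choice(SHAPES)
    hole = random.choice(list(HOLES))
    if HOLES[hole] == shape:          # the only feedback: it went through
        learned[shape] = hole

print(learned)  # e.g. {'cylinder': 'circle', 'cube': 'square', 'prism': 'triangle'}
```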
00:06:39
Now please understand, when we did that, I think my favorite example of that was the year 2009. We did it earlier, but at the time I was at Google and we published a
00:06:48
paper called the cat paper and the cat
00:06:51
paper was uh basically we asked the
00:06:53
computers to watch YouTube and tell us
00:06:55
what they find we we had a lot of
00:06:57
capacity compute capacity so they would
00:06:59
take YouTube videos cut them into uh 10
00:07:02
frames per second and compare the
00:07:04
patterns on the 10 frames basically
00:07:06
trying, and then one of them said, I found something. We had to write a bit of
00:07:10
code to find out what it found and it
00:07:12
had found a cat of course it's YouTube
00:07:15
right, and it didn't only find one cat, it found what makes something a cat. It could literally find
00:07:25
every cat on YouTube. We never taught it how; we never understood how it did it.
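The actual system he describes trained a very large neural network on unlabeled YouTube frames; as a rough, hand-rolled illustration of the same idea of finding patterns without labels, a clustering sketch like the one below captures the spirit. The feature vectors here are synthetic stand-ins, not real video data.

```python
import numpy as np
from sklearn.cluster import KMeans

# No labels and no notion of "cat" given up front: each row stands in for a
# feature vector from one video frame, and the algorithm simply groups
# recurring patterns. A human only names the pattern afterwards.
rng = np.random.default_rng(0)
frames = np.vstack([
    rng.normal(loc=0.0, scale=1.0, size=(500, 64)),   # one recurring pattern
    rng.normal(loc=5.0, scale=1.0, size=(500, 64)),   # another recurring pattern
])

clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(frames)
print(np.bincount(clusters))  # two groups of ~500 frames each, found unsupervised
```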
00:07:31
Understand that, right? And the code that
00:07:33
was written for a computer to find every
00:07:36
cat on YouTube if I had written it the
00:07:39
traditional way, it would have been probably 200 million lines of
00:07:44
code because I would have to find every
00:07:47
possible interpretation of how a cat
00:07:48
looks on YouTube and get the computer to
00:07:51
see that right that code was this big
00:07:54
right but there was a lot of numbers and
00:07:57
Mathematics that we don't understand
00:07:59
just like
00:08:00
I can ask you a question and you'd give
00:08:02
me a very intelligent answer and I have
00:08:04
no clue what happened inside your brain
00:08:06
to get to that answer this is where we
00:08:08
are computers find their own
00:08:12
intelligence we don't teach them how we
00:08:14
just give them the data to learn and
00:08:17
they learn like humans and every task
00:08:19
we've ever given them since, they've become better than humans at. Right, so
00:08:25
they are the world champion in chess
00:08:26
they are the world champion in go they
00:08:28
are the world champion in everything we've given them. They are the best manipulator of humanity on the
00:08:35
planet in terms of social media engines
00:08:38
they are the best writers the best
00:08:40
artists the best musicians the best
00:08:42
anything we've ever given them right and
00:08:47
between the turn of the century and the
00:08:50
time when we recognize that which is
00:08:53
what I normally referred to as the
00:08:54
Netscape moment right so 2023 every one
00:08:58
of you started to hear about AI. That's not because AI started in 2023, right? AI started in 2000,
00:09:08
and by 2016 all of us in the lab we knew
00:09:12
we got it right by 2016 we had created
00:09:15
code that would blow you away right uh
00:09:19
we we started to fold proteins if you
00:09:21
understand protein folding is one of the
00:09:23
most complex problems ever faced by
00:09:25
biologists in history you know it would
00:09:28
take a group of PhD students around eight months to fold one protein. We created AlphaFold in 2016, 2017,
00:09:38
and it folded 200 million
00:09:41
proteins okay in a
00:09:44
day
00:09:47
now when ChatGPT came out in 2023
00:09:51
people said oh there's something called
00:09:54
AI it's not because AI started it's
00:09:57
because we had a browser just like
00:09:58
Netscape when you know the internet came
00:10:01
out in
00:10:02
1995 the internet had existed for almost
00:10:05
15 years before it's just that for the
00:10:07
first time we had a browser we could see
00:10:09
it now since
00:10:13
2023 uh until today it's been
00:10:16
mind-blowing okay just to give you a few
00:10:19
statistical uh uh pointers to understand
00:10:22
first of all intelligence is a lot
00:10:24
deeper than ChatGPT, let's be very clear. Okay, the task given to ChatGPT and Gemini and others is linguistic
00:10:32
intelligence it's the ability to
00:10:34
understand knowledge and communicate and
00:10:36
so on and so forth okay of course there
00:10:39
are other forms of intelligence
00:10:40
emotional intelligence for example they
00:10:42
haven't learned yet right uh intuition
00:10:45
uh um you know complex mathematics um
00:10:49
deep reasoning all of those forms of
00:10:51
intelligence we haven't seen AI perform
00:10:55
that way yet but we will very soon and
00:10:57
I'll tell you why
00:11:00
but what we did is as we gave them that
00:11:04
task, we have a machine called ChatGPT that is estimated in certain
00:11:11
tasks to perform at 110 IQ and in other
00:11:15
tasks at 155 IQ just so that you know
00:11:19
Elon Musk is 155; Einstein's, which was never really measured, is estimated to be 162. So we have machines today that are
00:11:30
as intelligent as
00:11:32
Einstein and we as humans sitting in the
00:11:36
cold water the gradually heating water
00:11:39
are chilling and calm, you know, discussing and saying, oh no, but we will always be
00:11:46
humans we will always be in the lead
00:11:48
they will always need us okay and I used
00:11:51
to go mad in 2016 when people would tell
00:11:54
me oh but hold on you know Ai No no you
00:11:58
know humans we're capable of creativity
00:12:01
we're capable of you know poetry and
00:12:03
music and these are things that AI will
00:12:06
never do. What do you mean? They're basically doing every one of
00:12:12
them better than us right why because if
00:12:15
you take the most complex of them
00:12:17
Innovation for example in my very geeky
00:12:21
mind Innovation can be turned into a
00:12:24
mathematical equation it's basically an
00:12:26
instruction to to a machine that says
00:12:29
find every possible solution to a
00:12:31
problem, discard the ones that were proposed before, give me the ones that have never been proposed before, and that is innovation. This is it, right? It's solutions to a problem that have not been seen before.
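Taken literally, that "equation" for innovation is just a search-and-filter loop. A minimal sketch, with made-up candidate solutions purely for illustration:

```python
# Innovation read as an instruction to a machine: enumerate candidate
# solutions, discard the ones already proposed, return the never-seen ones.
def innovate(generate_candidates, already_proposed):
    seen = set(already_proposed)
    return [c for c in generate_candidates() if c not in seen]

# Hypothetical example: candidates are just (material, shape) combinations.
def candidates():
    return [(material, shape)
            for material in ("steel", "carbon", "bamboo")
            for shape in ("tube", "lattice", "honeycomb")]

known = {("steel", "tube"), ("carbon", "tube")}
print(innovate(candidates, known))   # only the combinations nobody proposed yet
```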
00:12:48
Now, what does that mean? That we're all going to die? No, we're not.
00:12:52
okay uh what it means is that uh the
00:12:56
world has changed. Okay, and the key word to artificial intelligence is singularity; it's that we don't actually
00:13:05
know how the world will change right a
00:13:07
singularity is an event horizon where
00:13:09
the rules of the game change so much
00:13:12
that you're no longer able to understand
00:13:14
how the game will be played okay when
00:13:16
the rules of the game uh become that the
00:13:20
smartest person you can ever hire in
00:13:21
your company is no longer a human okay
00:13:24
everything you know about the rules of
00:13:26
HR change okay uh when the rules of the
00:13:29
game change to the point where your
00:13:31
customer is not necessarily making their
00:13:34
choices themselves anymore but that the
00:13:36
machine will recommend to them uh the
00:13:38
rules of the game change when you know
00:13:41
I've seen beautiful marketing
00:13:42
advertising campaigns here when when
00:13:45
there will be a moment in the near
00:13:47
future when machines will be marketing
00:13:49
to machines and we will be out of the uh
00:13:51
of the of the picture all together the
00:13:53
rules of the game change right and and I
00:13:57
want to say that as the rules of the
00:13:59
game change we can end up in a
00:14:01
magnificent Utopia of abundance right
00:14:04
and we can also end up in a dystopia of
00:14:06
a very difficult time let me talk about
00:14:09
the the the the Utopia of abundance
00:14:11
first so that you understand I'm not
00:14:12
here to scare you I'm here to tell you
00:14:15
that the water is boiling right my
00:14:18
objective is for you to jump out of the
00:14:20
pan, right? But today, to do that, I have to tell you it's
00:14:25
boiling and that hurts right let's talk
00:14:28
about the abundance first the abundance
00:14:30
is
00:14:31
this you may not feel it anymore but the
00:14:35
reason we have all of this technology is
00:14:37
because there is a plug in the wall that
00:14:38
is providing a very valuable resource
00:14:40
called electricity electricity has
00:14:42
become a utility it's in the back of our
00:14:45
minds we don't even think about it
00:14:46
anymore right uh there will be another
00:14:49
plug in the wall probably in your pocket
00:14:51
in your phone uh that is called
00:14:54
intelligence it's a utility okay and
00:14:57
today just last night okay I plugged
00:15:00
into that intelligence and I got myself
00:15:02
what I would probably estimate to be an
00:15:05
extra 25 IQ points I became 25 IQ points
00:15:09
more intelligent yesterday as as I was
00:15:11
analyzing a complex problem using AI
00:15:14
right there will be a moment in the
00:15:16
future where I will plug in and get
00:15:18
another 100 IQ points there will be a
00:15:20
moment in the future where I will plug
00:15:22
in and get 400 IQ points more right now
00:15:26
I can promise you this if you give me or
00:15:28
any of my peers that I worked with at
00:15:30
Google X 400 IQ points more I will solve
00:15:34
every problem on planet Earth every
00:15:37
single one of them climate change uh you
00:15:39
know uh Wars wealth whatever okay as a
00:15:44
matter of fact I I keep telling people
00:15:46
openly that with 400 IQ points more I
00:15:50
can plant a garden outside where you can
00:15:53
walk to one tree and pick an apple and
00:15:55
walk to another tree and pick an
00:15:57
iPhone okay from a nano physics point of
00:15:59
view it's the exact same
00:16:01
cost right and we are on the verge if
00:16:04
you understand where we are on
00:16:06
nanophysics and, you know, nano-manufacturing in general, we are
00:16:11
on the verge of being able to do that we
00:16:13
just need to be slightly more intelligent. Okay, now that is a world of
00:16:18
abundance where everything changes
00:16:21
imagine if we can solve the energy
00:16:24
problem to the point where energy
00:16:26
becomes zero cost because energy is so abundant around us. Okay, what would that
00:16:32
do to the geophysics to the geopolitics
00:16:34
would we fight over energy anymore would
00:16:37
we, you know, what would the cost of products become if energy was free? Think
00:16:42
about all of that okay that's a Utopia
00:16:45
that is almost impossible to imagine
00:16:48
that completely changes everything we've
00:16:50
ever learned definitely changes retail
00:16:53
okay and it is literally a few years
00:16:56
away. Right, there are, however, on the path to get there, quite a few
00:17:03
challenges and actually when we get
00:17:05
there as well there are quite a few
00:17:07
challenges and those challenges let me
00:17:09
be very very clear are not a problem
00:17:12
with AI okay let me be very very clear
00:17:16
there's absolutely nothing wrong with
00:17:18
artificial intelligence there's
00:17:20
absolutely no threat to be found in
00:17:22
artificial intelligence there is a lot
00:17:25
wrong with the value system of humanity
00:17:28
in the age of the rise of artificial
00:17:30
intelligence right there is a lot wrong
00:17:32
with the way humans will use this
00:17:35
superpower. Okay, so the way I look at it is, I say it's like raising Superman. Right, you adopt this infant who has superpowers, and if you
00:17:48
teach the infant to protect and serve
00:17:50
it becomes Superman; if you teach it to go
00:17:53
and rob a bank and kill your enemy it
00:17:54
becomes super villain superpower is
00:17:57
irrelevant okay it's the value set that
00:18:00
you create around raising that infant
00:18:03
that will create the future of humanity
00:18:06
now there are five areas that I need to
00:18:08
highlight as areas where you have to
00:18:10
understand the change okay one of them
00:18:13
very important, is the area of power and wealth. Okay, we humans have lived in a world of scarcity since we started; a world of scarcity that
00:18:26
basically says for me to grow my
00:18:28
competitor has to shrink okay a world of
00:18:31
scarcity that basically means I have to
00:18:33
be the military superpower in the world
00:18:35
otherwise everyone will kill me right
00:18:37
that world of scarcity is no longer going to be the truth, but we're going to have to get there. Okay, I think what's
00:18:46
going to end up happening is we're going
00:18:47
to apply the same mentality of of
00:18:51
collecting as much power and wealth as
00:18:53
we can now let me let me help you
00:18:55
understand this if you look back at
00:18:57
human history um there was a time when
00:19:00
we were hunter gatherers so basically
00:19:03
the best hunter in the tribe would be
00:19:06
able to maybe feed the tribe for a day
00:19:09
more two days more right uh why because
00:19:13
that hunter had only himself to hunt, and
00:19:17
the maximum automation he had was a
00:19:20
spear right when we became Farmers the
00:19:23
best farmer could feed the
00:19:25
tribe a month more why because that best
00:19:29
farmer knew how to use a form of
00:19:31
automation called soil right so while
00:19:34
the farmer put the
00:19:36
seed and used his own human resources to
00:19:39
manage that seed the soil did most of
00:19:42
the
00:19:43
work
00:19:45
right when the Industrial Revolution
00:19:47
started factories with one person
00:19:50
handling the operation would you know
00:19:53
multiply the production. When the information revolution happened, we now
00:19:58
don't even have to go to work we can
00:19:59
work from home and make a difference
00:20:02
right and and as you can see every time
00:20:05
the form of automation became more
00:20:07
powerful the more we multiplied our
00:20:10
productivity uh and and the more we we
00:20:12
multiplied abundance but at the same
00:20:14
time the more we multiplied the power of
00:20:18
those who owned the
00:20:19
automation right so the best hunter in
00:20:22
the tribe may have been preferred by the
00:20:26
leader or you know fancied by the women
00:20:28
of the tribe that's the maximum he could
00:20:30
get. The best farmers didn't make the money, the landlords made the money. Why? Because the landlord owned the
00:20:37
automation the best industrialists you
00:20:40
know, the best factory engineer in the Porsche factory is not the one that
00:20:44
makes most of the money it's the company
00:20:46
that makes the money and so on right and
00:20:48
so you will see that very very soon we
00:20:51
will have trillionaires not billionaires
00:20:54
because some people will acquire massive
00:20:56
power and wealth okay because the
00:20:58
automation using AI is basically multiple times the automation that was using the internet. Right, at the
00:21:06
same time you'll see that with Nations
00:21:09
so a nation that gets a significant AI
00:21:11
advantage in cyber security or in uh you
00:21:14
know uh robotic Warfare would have a
00:21:17
very significant advantage and all of
00:21:19
the other nations would have to submit
00:21:21
so there is a massive concentration of
00:21:23
power that is also combined with what I
00:21:26
know by the way in business as well huh
00:21:29
between you guys if some of you are
00:21:30
competing in the same country or the
00:21:32
same city the one that manages to
00:21:34
harness intelligence more than the
00:21:36
others will get a massive Advantage at
00:21:39
the same time there is a proliferation of power. A proliferation of power meaning, um, I could probably, leaving
00:21:47
this room on my phone order a printer uh
00:21:51
for
00:21:52
$22,000, and by the time I get home I can print COVID. Synthetic biology now allows me to
00:21:59
print living DNA okay and that's
00:22:02
available open source for everyone for
00:22:04
around
00:22:05
$22,000. Right, that proliferation of
00:22:08
power mixed with concentration of power
00:22:11
will lead to an environment of a lot of
00:22:13
Regulation and surveillance and control
00:22:16
this is the very first thing you have to
00:22:17
understand right the concentration
00:22:20
surveillance and control means that your
00:22:22
consumer is not as free as your consumer
00:22:24
used to be okay another thing that will
00:22:27
actually happen
00:22:29
very quickly is that a lot of your
00:22:30
consumers will run out of business
00:22:32
they'll basically become out of a job
00:22:36
why because jobs will go to the
00:22:38
automation
00:22:39
platform right your jobs will not go to
00:22:43
the automation platform they will go to
00:22:45
someone who knows how to use the
00:22:48
automation platform I know I sound very
00:22:50
Grim I apologize I'm just telling you
00:22:52
what is about to happen. Okay, what will happen in the next 5 to 10 years is that a retailer
00:22:59
okay who knows how to use AI really
00:23:01
really well will take the place of a
00:23:03
retailer that doesn't okay and so you
00:23:06
might as well be that
00:23:07
retailer. The idea is for us to think about it this way: around 70% of the code written last
00:23:17
year was written by a machine so I'm
00:23:20
talking about my own geeky life okay and
00:23:22
I'm a very serious geek but I'm old
00:23:25
right so I I cannot write code that
00:23:28
competes with the machines anymore the
00:23:30
machines will write better code than I can. There are still 30% younger
00:23:35
developers that are so good at what they
00:23:37
do that they still write code better
00:23:39
than the machines for now but remember
00:23:42
over time the machines will learn that
00:23:44
code and learn how to write it as well
00:23:46
that's number two number three which I
00:23:48
think is really incredibly important in
00:23:50
retail is the nature of human connection
00:23:53
will change
00:23:55
massively okay the nature of human
00:23:57
connection... if you actually, please do, search for ChatGPT-4o videos when we're done with today,
00:24:05
and look at how... have you ever seen the movie Her? If you have, you know it's an AI that pretends to be human. Yeah, ChatGPT, I actually think that the
00:24:16
voice that they used in the video sounds
00:24:18
like Scarlett Johansson. Right, and it laughs with you, it reads your
00:24:24
handwriting it sees the environment
00:24:26
you're sitting in it you know it is
00:24:28
singing with you and it's becoming so
00:24:31
difficult to identify what is real and
00:24:34
what is not which will have a massive
00:24:36
impact on human connection both ways by
00:24:38
the way okay one way is that a lot of
00:24:41
people will especially with the epidemic
00:24:43
of loneliness that we have in the world today, will find an AI friend or
00:24:48
an AI girlfriend right uh you know there
00:24:51
are more than 50,000 AI influencers on
00:24:55
uh on Instagram today that people follow
00:24:57
and listen to that are not even human
00:24:59
they're just generated by a machine okay
00:25:02
And it's becoming, really, from one side, a replacement for
00:25:07
human connection but from the other side
00:25:09
it makes us crave genuine human
00:25:11
connection even more okay so this by the
00:25:14
way if I send the Avatar next year it's
00:25:17
not going to feel the same right when
00:25:20
when we're done with this I'll be around
00:25:21
for you know for your coffee break and
00:25:23
we can shake hands and it's and it's
00:25:26
going to feel very different than it is
00:25:28
is with a with a with a machine and I
00:25:30
think this in my view is one of the
00:25:32
biggest differentiators unlike what most
00:25:34
people will tell you about what you can
00:25:36
do and I'll come back to what you can do
00:25:37
in a minute uh what you can do in the
00:25:40
age of the of artificial intelligence of
00:25:42
course you have to learn the tools but
00:25:44
believe me in the next 5 to 10 years
00:25:46
those who will capitalize on human
00:25:48
connection will actually outperform
00:25:50
those who don't so those who replace
00:25:53
everything with a machine are going to
00:25:55
uh you know to to become more efficient
00:25:59
but those who will capitalize on human
00:26:01
connection will become more loved okay
00:26:04
the other the other and maybe the last
00:26:06
Trend I want to talk about is the trend
00:26:08
of the absence of the truth or the end
00:26:10
of the truth okay so there is a lot
00:26:13
happening today in the geopolitics
00:26:16
politics and economics of the world and
00:26:17
even in Ai and climate change and so on
00:26:20
where you no longer know if it's true or
00:26:23
not because the mainstream media and
00:26:25
social media are, you know, covering it in a way that makes you no longer understand what is true and what is not. And I think
00:26:34
the reality is one of the biggest
00:26:36
skills is those with authenticity and
00:26:39
genuine uh honesty as you know humans or
00:26:42
as businesses will very quickly get
00:26:45
again the favor of people uh around them
00:26:48
of their clients and their partners and
00:26:50
so on. Right, I'm going to quickly talk about what we need to do in this environment. I really apologize for concerning you. Okay,
00:27:02
everything that I spoke to you about by
00:27:03
the way in terms of the changes that are
00:27:05
about to come are not changes because of
00:27:08
the machines did you notice that okay
00:27:11
these are changes because of the way the
00:27:13
humans will use the machines right
00:27:15
equally by the way humans could use
00:27:17
those machines to say let's completely
00:27:20
ignore power every one of us can live
00:27:22
like an emperor it's fine okay we don't
00:27:25
need to compete but that's not the human
00:27:27
tendency the human tendency which I
00:27:30
believe will eventually change is that
00:27:32
we want to aggregate more we want to win
00:27:35
right that's the capitalist mindset that
00:27:37
America has built, you know, since World War II, and prior, but really propagated
00:27:43
very strongly okay uh there will be a
00:27:46
time in my view where we will all go
00:27:47
back and say why are we competing for
00:27:50
this right the difference between the
00:27:53
worst car in the world and the best car
00:27:55
in the world is zero, everything's so perfect, right? The difference between the abundance
00:28:02
that every one of us can have today and
00:28:05
you know the life that the Roman
00:28:06
emperors had is infinite right so why
00:28:10
are we competing for more so I call that
00:28:12
normally, in my book Scary Smart, I call
00:28:14
it the fourth inevitable that eventually
00:28:16
Humanity will come to a point where the
00:28:18
extra fight is not worth the effort
00:28:21
because everyone is getting such an
00:28:22
incredibly affordable amazing access to
00:28:26
technology and life right so what what
00:28:28
do we need to do I I'll say that in a
00:28:30
couple of minutes and then we open for
00:28:31
questions there are three skills that
00:28:34
every organization and every human uh
00:28:37
and by the way your kids as well uh need
00:28:39
to learn. One is: learn the tools, learn AI. Right, you know, don't live in the fax machine era anymore. When you leave today, go to one of the top tools, Gemini or ChatGPT,
00:28:53
and basically ask and say I'm in retail
00:28:56
I live in this country uh you know I'm
00:28:58
very interested in food and beverage
00:29:01
retail or whatever I don't know
00:29:02
something, right? What AI tools should I be aware of if I want to, you know, create
00:29:12
videos or create marketing material or
00:29:15
whatever right and it will tell you and
00:29:17
then you can follow from there by the
00:29:18
way remember nobody taught you
00:29:20
PowerPoint you didn't need a tutor to
00:29:22
tell you that right so nobody will teach
00:29:24
you AI there will not be a course you
00:29:26
just hold the tool and start chatting to
00:29:29
it that's skill number one skill number
00:29:31
two is I urge you I urge you I urge you
00:29:33
to find the
00:29:35
truth, find the truth. Okay, the sign of this era is for you to not be... did you see the movie Idiocracy? That's a must, it's a cult
00:29:47
classic you have to see that uh you know
00:29:50
you really have to uh find a way to tell
00:29:53
yourself not everything I'm being told
00:29:54
is true I need to find out for myself
00:29:57
what the truth is right so the biggest
00:30:00
difference between ChatGPT and Google
00:30:02
search when I worked at Google is that
00:30:05
when you searched Google it gave you a
00:30:07
million sites and told you find your own
00:30:10
truth. Right, when you search ChatGPT, it tells you with confidence; the answer is a highly influenced Californian answer and view of the world. Okay, I went and asked ChatGPT, what is my... what is Mo Gawdat's wife's name? Four times in a row it told me the wrong name with total confidence. Okay, and every time I
00:30:33
corrected it it said oh I'm sorry uh
00:30:36
yeah maybe you're right and then on the
00:30:38
fifth time I said, what is Mo's wife's name, and it gave me the right answer, which
00:30:42
is I don't know AI is not capable of
00:30:46
doing that so that's the second top
00:30:48
skill the third skill is human
00:30:51
connection believe it or not I'm going
00:30:53
the exact opposite way so I know the
00:30:55
tools I spend 3 hours a day to keep up
00:30:57
believe it or not
00:30:58
okay I'm very very careful about about
00:31:01
understanding the truth that's the
00:31:02
reason why I spend so much time in deep
00:31:04
analysis of geopolitical and economic uh
00:31:06
issues right but the third one is human
00:31:09
connection so I'm no longer writing
00:31:12
books. I just had my fourth bestseller last week and that's it. Right, I'm not actually going to write books
00:31:19
again because books are not human I'm
00:31:20
going to go more and more and more in
00:31:22
this so my next topic to Champion I will
00:31:26
actually prepare speeches about it I
00:31:28
will go and speak about it on podcasts I
00:31:30
may even prepare a webinar about it or
00:31:34
whatever and then I will ask an AI to
00:31:36
write the
00:31:37
book okay I I no longer need to do the
00:31:41
tasks that they can do better than me
00:31:43
but the task I can do better than them
00:31:45
is this is to be able to see you and hug
00:31:48
you and say hi and so on and so forth
00:31:50
I'll stop now I think uh that gives us
00:31:52
around 13 14 minutes for Q&A and if you
00:31:56
you know, I think it would be better to focus on what you want to think about. Thank you.
00:32:08
Yeah, so Mo and I will have a chat to dig into some of those topics a bit
00:32:13
deeper for 10 minutes but importantly
00:32:15
you know we've got this time if you
00:32:16
would like to ask Mo question please do
00:32:19
it's an opportunity it's not a threat I
00:32:21
think I can't believe that no one's got
00:32:23
anything they'd like to dig into there
00:32:24
you go I don't even have to ask the
00:32:26
first question. Hang on one sec, we've got a microphone
00:32:29
coming thanks hi Mo you described
00:32:33
yourself as a geek and uh you are a
00:32:36
master of code so um is it possible oh
00:32:40
is it possible to um digitalize fear or
00:32:44
anger oh yeah 100% uh so so that's a
00:32:48
question I get quite a lot again from
00:32:51
the typical way of people saying um you
00:32:54
know, AI will never be creative, will never make music, and so on. So most emotions, I would probably now almost say all emotions... I'll tell you why I changed my mind on that. Most emotions are very
00:33:07
algorithmic we we don't see them that
00:33:09
way because they seem irrational but
00:33:11
fear is: a moment in the future is perceived to be less safe than now. Okay,
00:33:17
when you logically perceive that, you feel fear. Okay, in our
00:33:22
biological machine our biological
00:33:25
machine is wired to sense that before
00:33:28
your logical brain engages as a matter
00:33:29
of fact your amigdala will perceive a
00:33:31
fear 90 seconds before your your
00:33:35
prefrontal CeX perceives it that's why
00:33:37
you know if something shows up behind
00:33:39
you you jump with without thinking but
00:33:41
but the same algorithm is true a moment
00:33:44
in the future is less safe than now okay
00:33:46
a machine can definitely perceive that
00:33:49
right, I can easily program that in a machine.
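A minimal sketch of what "programming that in" could look like, using his definition of fear; the safety scores and threshold are invented for illustration:

```python
# Fear as he defines it: a predicted future state that is less safe than the
# present triggers a protective response instead of carrying on.
def fear_response(safety_now: float, predicted_safety: float,
                  threshold: float = 0.1) -> str:
    if safety_now - predicted_safety > threshold:   # future looks less safe than now
        return "protect"                            # the machine's analogue of flinching
    return "carry_on"

print(fear_response(safety_now=0.9, predicted_safety=0.4))   # -> protect
print(fear_response(safety_now=0.9, predicted_safety=0.88))  # -> carry_on
```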
00:33:53
And if you give the machine a task to, for example, merchandise your store, it will absolutely, as it becomes more intelligent, look for possible
00:34:04
threats in the future that might prevent
00:34:06
it from uh from merchandising the store
00:34:09
that will be analogous to fear right and
00:34:12
all of the other emotions you know anger
00:34:14
anger is a difference in value set; it's a form of fear, but instead of a fight-or-flight response there is a hiss response: let me, you know, scream so that you
00:34:25
run away. Instead of me running, fight or flight, I want you to run;
00:34:29
that's anger, right? And yeah, once again, it is a sense of difference in value set combined with a sense of
00:34:38
feeling unsafe around someone or
00:34:40
something some situation leads to anger
00:34:43
Right, and you can take all of them; all emotions, perhaps other than love, can actually follow an algorithmic path. Now,
00:34:53
remember the way a cat responds to anger
00:34:57
is different or fear is different than
00:34:59
the the way a puffer fish responds it's
00:35:01
different than the way we respond and
00:35:03
it's also going to be different than the way a machine responds. Right, so you know,
00:35:08
um fear for a puffer fish makes it puff
00:35:11
right for a cat it hisses and for a
00:35:13
human we fight or flee. Right, for a
00:35:15
machine it might move its code to
00:35:17
another place a different reaction but
00:35:19
the logic of the emotion is the same. I even dare say, and I know it sounds really shocking, that the
00:35:28
machines will have a wider range of
00:35:30
emotions than we do and and the reason
00:35:32
for that logic is uh we have a wider
00:35:35
range of emotion than a goldfish uh
00:35:38
because we can understand Concepts like
00:35:41
the future right and so we can ponder
00:35:44
emotions like you know uh optimism and
00:35:46
pessimism because we can have an image
00:35:48
of the future a you know a goldfish
00:35:50
doesn't do that right and because the
00:35:52
the intellectual bandwidth of the
00:35:54
machines is likely going to be bigger
00:35:56
than ours they may feel things that we
00:35:59
don't really understand at all I think
00:36:00
that's about to happen as well having
00:36:03
said that I should also remind you that
00:36:04
as I said in the beginning, emotional intelligence has not been the big investment of the AI community so
00:36:13
far even though in my view uh the
00:36:16
Natural Evolution of the intelligence of
00:36:18
the machines will lead them there
00:36:22
eventually okay I'll ask a uh I'll ask a
00:36:26
question we um we were joined by um
00:36:29
actually no sorry we do have an audience
00:36:31
member you I I love that you prefer
00:36:34
their questions to yours yeah that's
00:36:36
very very selfless very selfless they've
00:36:38
been listening to me for a day and a
00:36:39
half already so sorry for that uh yeah
00:36:43
we talked the last two days a lot about measurement, so maybe to ask a question after the emotions: is there any good way, you know, to measure emotions? Oh,
00:36:58
I do not know the answer to that uh I
00:37:03
don't know the answer to that. My last book is about stress, and we talk very clearly, we have a
00:37:10
chapter called emotional stress right
00:37:13
and and the idea is that we in uh in the
00:37:16
modern world sadly uh have been trained
00:37:18
to think when we go to school we haven't
00:37:21
been trained to feel okay uh we have uh
00:37:24
been trained to unfortunately suppress
00:37:27
our our emotions uh you know somehow
00:37:30
because in the workplace and in school
00:37:32
emotions can lead to a lot of
00:37:34
unpredictability right uh we started to
00:37:36
tell our kids at very young age sit down
00:37:40
don't cry just be what we tell you to be
00:37:43
and and for a lot of us we uh we we
00:37:46
somehow um are not able to even sense
00:37:48
our own emotions uh I found that the for
00:37:52
me specifically which lived uh you know
00:37:55
30 years of being a very highly
00:37:57
efficient effective uh um you know
00:38:00
corporate professional uh I I I found
00:38:03
that I can feel but I wasn't allowed to
00:38:05
express my emotions in the
00:38:07
workplace and as my my um my connection
00:38:13
to my body and my feminine side and my
00:38:15
emotional side continued to grow, I think my very first measure was my emotional body connection. So, you know, there are many books on the topic; The Body Keeps the Score is a great one. The idea
00:38:29
that when you trap an emotion uh it
00:38:32
basically grows and then eventually it
00:38:34
manifests in the form of a feeling a
00:38:37
sensation right you can easily see it
00:38:39
you know, anxiety is felt somewhere in your core, anger is all over your body with energy, fear makes you want to coil,
00:38:51
and so you can't really measure it but
00:38:52
you can sort of refer to it. My co-author on Unstressable basically uses a technique that she calls the which, where, why and what. Which, where, why and what is a sort of form of
00:39:05
meditation that she recommends you do
00:39:07
every evening before you go to bed where
00:39:09
you can sit with yourself and say how uh
00:39:12
you know which emotion am I feeling is
00:39:14
this jealousy or anger is this fear or
00:39:16
anxiety and so on where in my body do I
00:39:19
feel it why do I feel it what's the
00:39:21
logic that's triggering it and what is
00:39:23
it trying to tell me? Okay, not what do I do about it; by the way, most emotions just want to be heard and acknowledged, not "what do I do about it". Having said
00:39:32
that, can we measure fear, can we measure that in
00:39:35
AI I don't think anyone's working on
00:39:38
this at all and I think this is a
00:39:40
question that I have never given any
00:39:42
thought to before so well done yeah
00:39:45
thank
00:39:46
you uh
00:39:49
Can we get more of those questions? That's actually, yeah, that's such an interesting question. Yeah, Claudia, down the front
00:40:02
here we heard yesterday about how
00:40:05
irrational we are as humans yes and in
00:40:08
our business we always trying to use
00:40:11
research or any other methodology to try
00:40:13
to predict consumer Behavior do you
00:40:16
think AI can help us do that 100% first
00:40:19
of all I don't think we're irrational at
00:40:20
all I think we're predictably irrational
00:40:23
All right, so because we are a very complex machine that takes a lot of input and parameters, and because our situation is always in
00:40:35
flux uh our behaviors seem to be
00:40:38
irrational right fear seems to be a very
00:40:40
irrational uh uh response to things but
00:40:43
you know, when you know the logic behind it, and if you're aware enough to capture what triggered it, it's not irrational; it's the most rational thing to do. Right,
00:40:54
the thing about AI is that the way we
00:40:56
teach AI I believe it or not is to
00:40:59
actually observe patterns without
00:41:01
judgment which is such an interesting
00:41:03
thing that humans are not capable of
00:41:05
doing right so so the idea is an AI
00:41:09
would be able to say... it wouldn't be able to explain why, but it would easily be able to say:
00:41:17
16,000 iPhone holders walked by your
00:41:20
store and stopped for a minute and then didn't walk in. It can do that; it has all of that information. Right, and basically, you know, it can also tell
00:41:32
you, by the way, when you had green in your window, 6% of them walked in.
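That kind of judgment-free counting is trivial for a machine once the events are logged. A toy sketch with invented data, just to show the shape of it:

```python
from collections import Counter

# Each event is (window_colour, stopped, walked_in); the numbers are made up.
events = [
    ("green", True, True), ("green", True, False), ("red", True, False),
    ("green", False, False), ("red", True, True), ("green", True, True),
]

stopped_but_left = sum(1 for _, stopped, walked_in in events
                       if stopped and not walked_in)
walk_ins_by_colour = Counter(colour for colour, _, walked_in in events if walked_in)

print(f"{stopped_but_left} passers-by stopped and did not walk in")
print(f"walk-ins by window colour: {dict(walk_ins_by_colour)}")
```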
00:41:37
Okay, and so much of what is available out there beats the human capability to comprehend. Right, so as intelligent as
00:41:48
one can be, how much data exactly can you, you know, cram into this one little machine, and how quickly can you transfer that data to this machine? That is truly where the advantage of AI is. The advantage is their memory
00:42:04
structure is the entire history of
00:42:06
humanity everything that's ever been
00:42:07
written everything that's ever happened
00:42:09
can be comprehended by one data set
00:42:12
Right, and they communicate that data set between them in a matter of a microsecond, when it would take me a half-hour presentation to communicate it to you,
00:42:23
right and perhaps a day to prepare the
00:42:25
presentation right and at the same time
00:42:28
which is the most interesting side of AI
00:42:30
is that if you have, you know, a situation when you're driving today and you have to brake and you learn something from it, I don't learn. Okay, but if a self-driving car goes through
00:42:42
that situation, every self-driving car on the planet learns.
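The point about every car learning at once comes down to sharing one model rather than one driver's memory. A toy sketch, with illustrative names only:

```python
# One shared model: any car's experience updates it, and every car
# immediately "knows" the new hazard.
class SharedDrivingModel:
    def __init__(self):
        self.known_hazards = set()

    def learn(self, hazard: str):
        self.known_hazards.add(hazard)   # one update, visible to the whole fleet

class Car:
    def __init__(self, model: SharedDrivingModel):
        self.model = model               # every car points at the same model

    def experience(self, hazard: str):
        self.model.learn(hazard)

    def knows(self, hazard: str) -> bool:
        return hazard in self.model.known_hazards

fleet = SharedDrivingModel()
car_a, car_b = Car(fleet), Car(fleet)
car_a.experience("cyclist hidden behind a parked truck")
print(car_b.knows("cyclist hidden behind a parked truck"))  # True
```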
00:42:47
Right, and so with those enormous shrinkages in connection time, and, you know, enormous growth in bandwidth, in memory size,
00:42:58
uh we will start to observe uh
00:43:00
structures and data patterns that humans
00:43:03
cannot observe and they'll give us that
00:43:06
with zero judgment okay more
00:43:08
interestingly sadly and I say that with
00:43:11
a broken heart really is that believe it
00:43:14
or not the top commercial application of
00:43:16
AI in the last 10 years has been
00:43:19
manipulating humans so everything that
00:43:21
you see on social media is a machine
00:43:24
that has learned so well which video I
00:43:27
will click on next right and so
00:43:30
accordingly it also can learn so well
00:43:33
which product I will buy at which price
00:43:35
if it's presented to me with a
00:43:38
sticker of which color and I don't even
00:43:40
know that and I think this sadly is
00:43:44
where where the where the conversation
00:43:45
is going my my biggest ask if you would
00:43:49
allow me to say is that while I urge you
00:43:53
to use AI okay I urge you to remember
00:43:57
that the way you will use AI is the way
00:44:00
your your kids are going to be treated
00:44:03
Okay, and you know, since the last couple of years, the geopolitical, the economic, the weather, you know, climate change, and the technological advancement, which in my mind are
00:44:16
creating a perfect storm for the world
00:44:18
we live in the one song that keeps
00:44:20
popping up in my mind is a song from 20
00:44:24
years ago or 15 years ago called if you
00:44:26
tolerate this
00:44:27
then your children will be next okay and
00:44:30
I think the reality is the way we will
00:44:33
use technology today is the way
00:44:35
technology will be used on us tomorrow
00:44:38
right and so I ask every single one of
00:44:40
you to start believing in a world of
00:44:43
abundance right where basically if you
00:44:46
do things with ethics you can continue
00:44:49
to grow in ways that are unheard of but
00:44:52
in the process you would be teaching AI
00:44:54
to be Superman not super villain right
00:44:57
Right, and so don't abuse AI to take
00:45:01
more money from consumers use AI to
00:45:04
create a better life for consumers right
00:45:07
and if your intention is to create
00:45:09
better life for consumers you still make
00:45:10
money and and you know when people ask
00:45:13
me about the threat of AI and what can
00:45:15
we do about it I answer with one word
00:45:17
Okay, most of the geeks like me will say either solve the control problem, so write code to make the
00:45:25
AI submit to us or solve the safety
00:45:27
problem which is you know write code so
00:45:30
when AI does something wrong we can be
00:45:32
safe right I say it's an Ethics problem
00:45:36
it is not about controlling AI or even
00:45:40
teaching AI to cure cancer this is using
00:45:42
AI for good it's about teaching AI not
00:45:45
to lie it's about teaching AI not to
00:45:47
cheat it's about teaching AI not to
00:45:49
steal it's about teaching AI not to
00:45:51
abuse humans right and the only way you
00:45:54
can do that is to be a good parent the
00:45:56
only way those machines can learn ethics
00:45:59
is by observing us humans using them in
00:46:01
an ethical way most people will tell me
00:46:03
oh but so many humans are unethical I
00:46:06
don't believe that to be true I believe
00:46:08
that most humans will disapprove of a
00:46:10
school shooting okay uh most humans will
00:46:13
not approve of a child being killed the
00:46:16
reason why so many people are debating
00:46:18
killing children today is simply because
00:46:20
they're not informed they're informed at
00:46:23
a different level of the conversation
00:46:24
but if you strip it down to would you
00:46:27
approve of killing an innocent child,
00:46:28
everyone will say no on every side of
00:46:30
every conflict and I think this is the
00:46:32
true test of humanity in the era that we're in today, and you are responsible leaders that can actually implement that.
00:46:39
the true test of humanity is if I told
00:46:41
you we're going into an age of total
00:46:44
abundance okay and all that I ask you
00:46:46
for is: use that abundance
00:46:50
ethically would you be able to do it
00:46:52
you'll still make a lot of money you'll
00:46:54
still be incredibly successful you'll
00:46:55
still grow your business
00:46:57
still be more profitable but you're just
00:46:59
going to do it in a way that basically
00:47:02
is the way you want your daughter to be
00:47:04
treated okay and I think if we can
00:47:06
manage to get this right we would end up
00:47:08
in the Utopia I described to you okay if
00:47:11
we don't believe it or not just so that
00:47:12
I don't leave you on a fearful thought
00:47:15
we will still end up in the Utopia we
00:47:17
will just have to struggle along the
00:47:19
path until we're convinced that there is
00:47:21
no need for the fight
00:47:23
anymore wonderful well I think
00:47:26
unfortunately we're out of time but I
00:47:28
think that's a fantastic and uplifting note to end on. So, Mo, everybody!