00:00:12
Chris Anderson: Hello.
Welcome to this TED Dialogues.
00:00:16
It's the first of a series
that's going to be done
00:00:20
in response to the current
political upheaval.
00:00:24
I don't know about you;
00:00:25
I've become quite concerned about
the growing divisiveness in this country
00:00:28
and in the world.
00:00:30
No one's listening to each other. Right?
00:00:33
They aren't.
00:00:34
I mean, it feels like we need
a different kind of conversation,
00:00:38
one that's based on -- I don't know,
on reason, listening, on understanding,
00:00:44
on a broader context.
00:00:46
That's at least what we're going to try
in these TED Dialogues,
00:00:49
starting today.
00:00:50
And we couldn't have anyone with us
00:00:53
I'd be more excited to have kick this off.
00:00:56
This is a mind right here that thinks
pretty much like no one else
00:01:00
on the planet, I would hasten to say.
00:01:02
I'm serious.
00:01:03
(Yuval Noah Harari laughs)
00:01:04
I'm serious.
00:01:05
He synthesizes history
with underlying ideas
00:01:10
in a way that kind of takes
your breath away.
00:01:12
So, some of you will know
this book, "Sapiens."
00:01:16
Has anyone here read "Sapiens"?
00:01:18
(Applause)
00:01:19
I mean, I could not put it down.
00:01:22
The way that he tells the story of mankind
00:01:26
through big ideas that really make you
think differently --
00:01:30
it's kind of amazing.
00:01:31
And here's the follow-up,
00:01:33
which I think is being published
in the US next week.
00:01:36
YNH: Yeah, next week.
00:01:37
CA: "Homo Deus."
00:01:38
Now, this is the history
of the next hundred years.
00:01:42
I've had a chance to read it.
00:01:44
It's extremely dramatic,
00:01:46
and I daresay, for some people,
quite alarming.
00:01:51
It's a must-read.
00:01:52
And honestly, we couldn't have
someone better to help
00:01:58
make sense of what on Earth
is happening in the world right now.
00:02:02
So a warm welcome, please,
to Yuval Noah Harari.
00:02:06
(Applause)
00:02:14
It's great to be joined by our friends
on Facebook and around the Web.
00:02:18
Hello, Facebook.
00:02:20
And all of you, as I start
asking questions of Yuval,
00:02:24
come up with your own questions,
00:02:25
and not necessarily about
the political scandal du jour,
00:02:28
but about the broader understanding
of: Where are we heading?
00:02:34
You ready? OK, we're going to go.
00:02:36
So here we are, Yuval:
00:02:37
New York City, 2017,
there's a new president in power,
00:02:41
and shock waves rippling around the world.
00:02:44
What on Earth is happening?
00:02:46
YNH: I think the basic thing that happened
00:02:49
is that we have lost our story.
00:02:51
Humans think in stories,
00:02:54
and we try to make sense of the world
by telling stories.
00:02:58
And for the last few decades,
00:02:59
we had a very simple
and very attractive story
00:03:02
about what's happening in the world.
00:03:04
And the story said that,
oh, what's happening is
00:03:07
that the economy is being globalized,
00:03:10
politics is being liberalized,
00:03:12
and the combination of the two
will create paradise on Earth,
00:03:16
and we just need to keep on
globalizing the economy
00:03:19
and liberalizing the political system,
00:03:21
and everything will be wonderful.
00:03:23
And 2016 is the moment
00:03:25
when a very large segment,
even of the Western world,
00:03:29
stopped believing in this story.
00:03:32
For good or bad reasons --
it doesn't matter.
00:03:34
People stopped believing in the story,
00:03:36
and when you don't have a story,
you don't understand what's happening.
00:03:41
CA: Part of you believes that that story
was actually a very effective story.
00:03:45
It worked.
00:03:46
YNH: To some extent, yes.
00:03:47
According to some measurements,
00:03:49
we are now in the best time ever
00:03:52
for humankind.
00:03:53
Today, for the first time in history,
00:03:56
more people die from eating too much
than from eating too little,
00:04:00
which is an amazing achievement.
00:04:02
(Laughter)
00:04:05
Also for the first time in history,
00:04:07
more people die from old age
than from infectious diseases,
00:04:11
and violence is also down.
00:04:13
For the first time in history,
00:04:15
more people commit suicide
than are killed by crime and terrorism
00:04:20
and war put together.
00:04:22
Statistically, you are
your own worst enemy.
00:04:26
At least, of all the people in the world,
00:04:28
you are most likely
to be killed by yourself --
00:04:32
(Laughter)
00:04:33
which is, again,
very good news, compared --
00:04:36
(Laughter)
00:04:38
compared to the level of violence
that we saw in previous eras.
00:04:42
CA: But this process
of connecting the world
00:04:44
ended up with a large group of people
kind of feeling left out,
00:04:48
and they've reacted.
00:04:50
And so we have this bombshell
00:04:52
that's sort of ripping
through the whole system.
00:04:54
I mean, what do you make
of what's happened?
00:04:57
It feels like the old way
that people thought of politics,
00:05:01
the left-right divide,
has been blown up and replaced.
00:05:04
How should we think of this?
00:05:05
YNH: Yeah, the old 20th-century
political model of left versus right
00:05:10
is now largely irrelevant,
00:05:11
and the real divide today
is between global and national,
00:05:16
global or local.
00:05:18
And you see it again all over the world
00:05:21
that this is now the main struggle.
00:05:23
We probably need completely
new political models
00:05:26
and completely new ways
of thinking about politics.
00:05:32
In essence, what you can say
is that we now have global ecology,
00:05:37
we have a global economy
but we have national politics,
00:05:41
and this doesn't work together.
00:05:43
This makes the political
system ineffective,
00:05:45
because it has no control
over the forces that shape our life.
00:05:49
And you have basically two solutions
to this imbalance:
00:05:52
either de-globalize the economy
and turn it back into a national economy,
00:05:57
or globalize the political system.
00:06:00
CA: So some, I guess
many liberals out there
00:06:05
view Trump and his government
as kind of irredeemably bad,
00:06:11
just awful in every way.
00:06:14
Do you see any underlying narrative
or political philosophy in there
00:06:21
that is at least worth understanding?
00:06:22
How would you articulate that philosophy?
00:06:24
Is it just the philosophy of nationalism?
00:06:28
YNH: I think the underlying
feeling or idea
00:06:33
is that the political system --
something is broken there.
00:06:38
It doesn't empower
the ordinary person anymore.
00:06:41
It doesn't care so much
about the ordinary person anymore,
00:06:45
and I think this diagnosis
of the political disease is correct.
00:06:50
With regard to the answers,
I am far less certain.
00:06:53
I think what we are seeing
is the immediate human reaction:
00:06:57
if something doesn't work, let's go back.
00:06:59
And you see it all over the world,
00:07:01
that almost nobody
in the political system today
00:07:05
has any future-oriented vision
of where humankind is going.
00:07:09
Almost everywhere,
you see retrograde vision:
00:07:13
"Let's make America great again,"
00:07:15
like it was great -- I don't know --
in the '50s, in the '80s, sometime,
00:07:18
let's go back there.
00:07:19
And you go to Russia
a hundred years after Lenin,
00:07:24
Putin's vision for the future
00:07:26
is basically, ah, let's go back
to the Tsarist empire.
00:07:29
And in Israel, where I come from,
00:07:32
the hottest political vision
of the present is:
00:07:35
"Let's build the temple again."
00:07:37
So let's go back 2,000 years.
00:07:40
So people are thinking
that sometime in the past we lost it,
00:07:45
and it's like
you've lost your way in the city,
00:07:48
and you say OK, let's go back
to the point where I felt secure
00:07:51
and start again.
00:07:53
I don't think this can work,
00:07:54
but for a lot of people,
this is their gut instinct.
00:07:57
CA: But why couldn't it work?
00:07:59
"America First" is a very
appealing slogan in many ways.
00:08:03
Patriotism is, in many ways,
a very noble thing.
00:08:07
It's played a role
in promoting cooperation
00:08:09
among large numbers of people.
00:08:11
Why couldn't you have a world
organized in countries,
00:08:15
all of which put themselves first?
00:08:19
YNH: For many centuries,
even thousands of years,
00:08:22
patriotism worked quite well.
00:08:25
Of course, it led to wars and so forth,
00:08:27
but we shouldn't focus
too much on the bad.
00:08:30
There are also many,
many positive things about patriotism,
00:08:33
and the ability to have
a large number of people
00:08:37
care about each other,
00:08:39
sympathize with one another,
00:08:40
and come together for collective action.
00:08:43
If you go back to the first nations,
00:08:46
so, thousands of years ago,
00:08:48
the people who lived along
the Yellow River in China --
00:08:51
it was many, many different tribes
00:08:54
and they all depended on the river
for survival and for prosperity,
00:08:58
but all of them also suffered
from periodical floods
00:09:03
and periodical droughts.
00:09:04
And no tribe could really do
anything about it,
00:09:07
because each of them controlled
just a tiny section of the river.
00:09:11
And then in a long
and complicated process,
00:09:14
the tribes coalesced together
to form the Chinese nation,
00:09:18
which controlled the entire Yellow River
00:09:21
and had the ability to bring
hundreds of thousands of people together
00:09:26
to build dams and canals
and regulate the river
00:09:31
and prevent the worst floods and droughts
00:09:34
and raise the level
of prosperity for everybody.
00:09:37
And this worked in many places
around the world.
00:09:40
But in the 21st century,
00:09:43
technology is changing all that
in a fundamental way.
00:09:46
We are now living -- all people
in the world --
00:09:49
are living alongside the same cyber river,
00:09:53
and no single nation can regulate
this river by itself.
00:09:58
We are all living together
on a single planet,
00:10:02
which is threatened by our own actions.
00:10:05
And if you don't have some kind
of global cooperation,
00:10:09
nationalism is just not on the right level
to tackle the problems,
00:10:14
whether it's climate change
or whether it's technological disruption.
00:10:19
CA: So it was a beautiful idea
00:10:21
in a world where most of the action,
most of the issues,
00:10:25
took place on a national scale,
00:10:28
but your argument is that the issues
that matter most today
00:10:30
no longer take place on a national scale
but on a global scale.
00:10:34
YNH: Exactly. All the major problems
of the world today
00:10:37
are global in essence,
00:10:40
and they cannot be solved
00:10:41
except through some kind
of global cooperation.
00:10:45
It's not just climate change,
00:10:47
which is, like, the most obvious
example people give.
00:10:50
I think more in terms
of technological disruption.
00:10:54
If you think about, for example,
artificial intelligence,
00:10:57
over the next 20, 30 years
00:11:00
pushing hundreds of millions of people
out of the job market --
00:11:04
this is a problem on a global level.
00:11:06
It will disrupt the economy
of all the countries.
00:11:09
And similarly, if you think
about, say, bioengineering
00:11:13
and people being afraid of conducting,
00:11:16
I don't know, genetic engineering
research in humans,
00:11:19
it won't help if just
a single country, let's say the US,
00:11:24
outlaws all genetic experiments in humans,
00:11:27
but China or North Korea
continues to do it.
00:11:31
So the US cannot solve it by itself,
00:11:34
and very quickly, the pressure on the US
to do the same will be immense
00:11:39
because we are talking about
high-risk, high-gain technologies.
00:11:44
If somebody else is doing it,
I can't allow myself to remain behind.
00:11:48
The only way to have regulations,
effective regulations,
00:11:54
on things like genetic engineering,
00:11:56
is to have global regulations.
00:11:58
If you just have national regulations,
nobody wants to stay behind.
00:12:03
CA: So this is really interesting.
00:12:05
It seems to me that this may be one key
00:12:07
to provoking at least
a constructive conversation
00:12:11
between the different sides here,
00:12:12
because I think everyone can agree
that the start point
00:12:16
of a lot of the anger
that's propelled us to where we are
00:12:18
is because of the legitimate
concerns about job loss.
00:12:21
Work is gone, a traditional
way of life has gone,
00:12:25
and it's no wonder
that people are furious about that.
00:12:28
And in general, they have blamed
globalism, global elites,
00:12:33
for doing this to them
without asking their permission,
00:12:35
and that seems like
a legitimate complaint.
00:12:38
But what I hear you saying
is that -- so a key question is:
00:12:41
What is the real cause of job loss,
both now and going forward?
00:12:46
To the extent that it's about globalism,
00:12:49
then the right response,
yes, is to shut down borders
00:12:53
and keep people out
and change trade agreements and so forth.
00:12:57
But you're saying, I think,
00:12:59
that actually the bigger cause of job loss
is not going to be that at all.
00:13:04
It's going to originate
in technological questions,
00:13:07
and we have no chance of solving that
00:13:09
unless we operate as a connected world.
00:13:11
YNH: Yeah, I think that,
00:13:13
I don't know about the present,
but looking to the future,
00:13:16
it's not the Mexicans or Chinese
who will take the jobs
00:13:19
from the people in Pennsylvania,
00:13:21
it's the robots and algorithms.
00:13:23
So unless you plan to build a big wall
on the border of California --
00:13:27
(Laughter)
00:13:28
the wall on the border with Mexico
is going to be very ineffective.
00:13:32
And I was struck, when I watched
the debates before the election,
00:13:38
that Trump
did not even attempt to frighten people
00:13:44
by saying the robots will take your jobs.
00:13:47
Now even if it's not true,
it doesn't matter.
00:13:49
It could have been an extremely
effective way of frightening people --
00:13:52
(Laughter)
00:13:53
and galvanizing people:
00:13:55
"The robots will take your jobs!"
00:13:56
And nobody used that line.
00:13:58
And it made me afraid,
00:14:00
because it meant
that no matter what happens
00:14:04
in universities and laboratories --
00:14:07
and there, there is already
an intense debate about it --
00:14:09
in the mainstream political system
and among the general public,
00:14:13
people are just unaware
00:14:16
that there could be an immense
technological disruption --
00:14:20
not in 200 years,
but in 10, 20, 30 years --
00:14:24
and we have to do something about it now,
00:14:27
partly because most of what we teach
children today in school or in college
00:14:33
is going to be completely irrelevant
to the job market of 2040, 2050.
00:14:39
So it's not something we'll need
to think about in 2040.
00:14:43
We need to think today
what to teach the young people.
00:14:46
CA: Yeah, no, absolutely.
00:14:50
You've often written about
moments in history
00:14:54
where humankind has ...
entered a new era, unintentionally.
00:15:01
Decisions have been made,
technologies have been developed,
00:15:04
and suddenly the world has changed,
00:15:06
possibly in a way
that's worse for everyone.
00:15:09
So one of the examples
you give in "Sapiens"
00:15:11
is just the whole agricultural revolution,
00:15:13
which, for the actual person
tilling the fields,
00:15:17
meant picking up a 12-hour
backbreaking workday
00:15:20
instead of six hours in the jungle
and a much more interesting lifestyle.
00:15:26
(Laughter)
00:15:27
So are we at another possible
phase change here,
00:15:30
where we kind of sleepwalk into a future
that none of us actually wants?
00:15:35
YNH: Yes, very much so.
00:15:38
During the agricultural revolution,
00:15:40
what happened is that immense
technological and economic revolution
00:15:44
empowered the human collective,
00:15:47
but when you look at actual
individual lives,
00:15:50
the life of a tiny elite
became much better,
00:15:54
and the lives of the majority of people
became considerably worse.
00:15:58
And this can happen again
in the 21st century.
00:16:01
No doubt the new technologies
will empower the human collective.
00:16:06
But we may end up again
00:16:08
with a tiny elite reaping
all the benefits, taking all the fruits,
00:16:13
and the masses of the population
finding themselves worse off
00:16:17
than they were before,
00:16:18
certainly much worse than this tiny elite.
00:16:22
CA: And those elites
might not even be human elites.
00:16:25
They might be cyborgs or --
00:16:26
YNH: Yeah, they could be
enhanced super humans.
00:16:29
They could be cyborgs.
00:16:30
They could be completely
nonorganic elites.
00:16:32
They could even be
non-conscious algorithms.
00:16:35
What we see now in the world
is authority shifting away
00:16:40
from humans to algorithms.
00:16:42
More and more decisions --
about personal lives,
00:16:46
about economic matters,
about political matters --
00:16:48
are actually being taken by algorithms.
00:16:51
If you ask the bank for a loan,
00:16:54
chances are your fate is decided
by an algorithm, not by a human being.
00:16:58
And the general impression
is that maybe Homo sapiens just lost it.
00:17:04
The world is so complicated,
there is so much data,
00:17:09
things are changing so fast,
00:17:12
that this thing that evolved
on the African savanna
00:17:15
tens of thousands of years ago --
00:17:17
to cope with a particular environment,
00:17:20
a particular volume
of information and data --
00:17:24
it just can't handle the realities
of the 21st century,
00:17:28
and the only thing
that may be able to handle it
00:17:31
is big-data algorithms.
00:17:33
So no wonder more and more authority
is shifting from us to the algorithms.
00:17:40
CA: So we're in New York City
for the first of a series of TED Dialogues
00:17:44
with Yuval Harari,
00:17:46
and there's a Facebook Live
audience out there.
00:17:50
We're excited to have you with us.
00:17:52
We'll start coming
to some of your questions
00:17:54
and questions of people in the room
00:17:56
in just a few minutes,
00:17:57
so have those coming.
00:17:59
Yuval, if you're going
to make the argument
00:18:03
that we need to get past nationalism
because of the coming technological ...
00:18:11
danger, in a way,
00:18:12
presented by so much of what's happening,
00:18:14
we've got to have
a global conversation about this.
00:18:17
Trouble is, it's hard to get people
really believing that, I don't know,
00:18:20
AI really is an imminent
threat, and so forth.
00:18:22
The things that people,
some people at least,
00:18:25
care about much more immediately, perhaps,
00:18:27
is climate change,
00:18:29
perhaps other issues like refugees,
nuclear weapons, and so forth.
00:18:34
Would you argue that, where
we are right now,
00:18:39
somehow those issues
need to be dialed up?
00:18:42
You've talked about climate change,
00:18:45
but Trump has said
he doesn't believe in that.
00:18:48
So in a way, your most powerful argument
00:18:51
is one you can't actually use to make this case.
00:18:54
YNH: Yeah, I think with climate change,
00:18:56
at first sight, it's quite surprising
00:18:59
that there is a very close correlation
00:19:02
between nationalism and climate change.
00:19:05
I mean, almost always, the people
who deny climate change are nationalists.
00:19:10
And at first sight, you think: Why?
00:19:12
What's the connection?
00:19:13
Why don't you have socialists
denying climate change?
00:19:16
But then, when you think
about it, it's obvious --
00:19:18
because nationalism has no solution
to climate change.
00:19:22
If you want to be a nationalist
in the 21st century,
00:19:25
you have to deny the problem.
00:19:27
If you accept the reality of the problem,
then you must accept that, yes,
00:19:32
there is still room in the world
for patriotism,
00:19:35
there is still room in the world
for having special loyalties
00:19:39
and obligations towards your own people,
towards your own country.
00:19:43
I don't think anybody is really
thinking of abolishing that.
00:19:47
But in order to confront climate change,
00:19:50
we need additional loyalties
and commitments
00:19:55
to a level beyond the nation.
00:19:57
And that should not be impossible,
00:19:59
because people can have
several layers of loyalty.
00:20:03
You can be loyal to your family
00:20:05
and to your community
00:20:07
and to your nation,
00:20:08
so why can't you also be loyal
to humankind as a whole?
00:20:12
Of course, there are occasions
when it becomes difficult,
00:20:15
what to put first,
00:20:17
but, you know, life is difficult.
00:20:19
Handle it.
00:20:20
(Laughter)
00:20:23
CA: OK, so I would love to get
some questions from the audience here.
00:20:27
We've got a microphone here.
00:20:29
Speak into it, and Facebook,
get them coming, too.
00:20:32
Howard Morgan: One of the things that has
clearly made a huge difference
00:20:36
in this country and other countries
00:20:38
is the income distribution inequality,
00:20:40
the dramatic change
in income distribution in the US
00:20:44
from what it was 50 years ago,
00:20:46
and around the world.
00:20:47
Is there anything we can do
to affect that?
00:20:50
Because that gets at a lot
of the underlying causes.
00:20:56
YNH: So far I haven't heard a very
good idea about what to do about it,
00:21:01
again, partly because most ideas
remain on the national level,
00:21:05
and the problem is global.
00:21:06
I mean, one idea that we hear
quite a lot about now
00:21:09
is universal basic income.
00:21:11
But this is a problem.
00:21:13
I mean, I think it's a good start,
00:21:14
but it's a problematic idea because
it's not clear what "universal" is
00:21:18
and it's not clear what "basic" is.
00:21:20
Most people when they speak
about universal basic income,
00:21:23
they actually mean national basic income.
00:21:26
But the problem is global.
00:21:28
Let's say that you have AI and 3D printers
taking away millions of jobs
00:21:33
in Bangladesh,
00:21:35
from all the people who make
my shirts and my shoes.
00:21:38
So what's going to happen?
00:21:39
The US government will levy taxes
on Google and Apple in California,
00:21:46
and use that to pay basic income
to unemployed Bangladeshis?
00:21:50
If you believe that,
you can just as well believe
00:21:53
that Santa Claus will come
and solve the problem.
00:21:57
So unless we have really universal
and not national basic income,
00:22:02
the deep problems
are not going to go away.
00:22:05
And also it's not clear what basic is,
00:22:08
because what are basic human needs?
00:22:10
A thousand years ago,
just food and shelter were enough.
00:22:13
But today, people will say
education is a basic human need,
00:22:17
it should be part of the package.
00:22:19
But how much? Six years?
Twelve years? PhD?
00:22:22
Similarly, with health care,
00:22:24
let's say that in 20, 30, 40 years,
00:22:27
you'll have expensive treatments
that can extend human life
00:22:31
to 120, I don't know.
00:22:33
Will this be part of the basket
of basic income or not?
00:22:38
It's a very difficult problem,
00:22:39
because in a world where people
lose their ability to be employed,
00:22:46
the only thing they are going to get
is this basic income.
00:22:49
So what's part of it is a very,
very difficult ethical question.
00:22:54
CA: There's a bunch of questions
on how the world affords it as well,
00:22:58
who pays.
00:22:59
There's a question here
from Facebook from Lisa Larson:
00:23:02
"How does nationalism in the US now
00:23:04
compare to that between
World War I and World War II
00:23:08
in the last century?"
00:23:09
YNH: Well, the good news is that, with regard
to the dangers of nationalism,
00:23:14
we are in a much better position
than a century ago.
00:23:18
A century ago, 1917,
00:23:20
Europeans were killing
each other by the millions.
00:23:23
In 2016, with Brexit,
as far as I remember,
00:23:28
a single person lost their life,
an MP who was murdered by some extremist.
00:23:33
Just a single person.
00:23:35
I mean, if Brexit was about
British independence,
00:23:37
this is the most peaceful
war of independence in human history.
00:23:42
And let's say that Scotland
will now choose to leave the UK
00:23:48
after Brexit.
00:23:50
So in the 18th century,
00:23:52
if Scotland wanted -- and the Scots
wanted several times --
00:23:55
to break out of the control of London,
00:23:59
the reaction of the government
in London was to send an army up north
00:24:03
to burn down Edinburgh
and massacre the highland tribes.
00:24:07
My guess is that if, in 2018,
the Scots vote for independence,
00:24:12
the London government
will not send an army up north
00:24:16
to burn down Edinburgh.
00:24:17
Very few people are now willing
to kill or be killed
00:24:22
for Scottish or for British independence.
00:24:24
So for all the talk
of the rise of nationalism
00:24:30
and going back to the 1930s,
00:24:32
to the 19th century, in the West at least,
00:24:36
the power of national sentiments
today is far, far smaller
00:24:42
than it was a century ago.
00:24:44
CA: Although some people now,
you hear publicly worrying
00:24:48
about whether that might be shifting,
00:24:50
that there could actually be
outbreaks of violence in the US
00:24:54
depending on how things turn out.
00:24:56
Should we be worried about that,
00:24:58
or do you really think
things have shifted?
00:25:00
YNH: No, we should be worried.
00:25:01
We should be aware of two things.
00:25:03
First of all, don't be hysterical.
00:25:05
We are not back
in the First World War yet.
00:25:08
But on the other hand,
don't be complacent.
00:25:11
We got from 1917 to 2017,
00:25:16
not by some divine miracle,
00:25:19
but simply by human decisions,
00:25:21
and if we now start making
the wrong decisions,
00:25:23
we could be back
in an analogous situation to 1917
00:25:28
in a few years.
00:25:29
One of the things I know as a historian
00:25:32
is that you should never
underestimate human stupidity.
00:25:36
(Laughter)
00:25:38
It's one of the most powerful
forces in history,
00:25:42
human stupidity and human violence.
00:25:44
Humans do such crazy things
for no obvious reason,
00:25:48
but again, at the same time,
00:25:50
another very powerful force
in human history is human wisdom.
00:25:53
We have both.
00:25:55
CA: We have with us here
moral psychologist Jonathan Haidt,
00:25:57
who I think has a question.
00:26:00
Jonathan Haidt: Thanks, Yuval.
00:26:02
So you seem to be a fan
of global governance,
00:26:04
but when you look at the map of the world
from Transparency International,
00:26:08
which rates the level of corruption
of political institutions,
00:26:11
it's a vast sea of red with little bits
of yellow here and there
00:26:14
for those with good institutions.
00:26:16
So if we were to have
some kind of global governance,
00:26:18
what makes you think it would end up
being more like Denmark
00:26:21
rather than more like Russia or Honduras,
00:26:23
and aren't there alternatives,
00:26:25
such as we did with CFCs?
00:26:27
There are ways to solve global problems
with national governments.
00:26:30
What would world government
actually look like,
00:26:32
and why do you think it would work?
00:26:34
YNH: Well, I don't know
what it would look like.
00:26:38
Nobody has a model for that yet.
00:26:41
The main reason we need it
00:26:44
is because many of these issues
are lose-lose situations.
00:26:48
When you have
a win-win situation like trade,
00:26:51
both sides can benefit
from a trade agreement,
00:26:54
then this is something you can work out.
00:26:56
Even without some kind of global government,
00:26:58
national governments each
have an interest in doing it.
00:27:01
But when you have a lose-lose situation
like with climate change,
00:27:05
it's much more difficult
00:27:07
without some overarching
authority, real authority.
00:27:12
Now, how to get there
and what would it look like,
00:27:15
I don't know.
00:27:16
And certainly there is no obvious reason
00:27:20
to think that it would look like Denmark,
00:27:22
or that it would be a democracy.
00:27:24
Most likely it wouldn't.
00:27:26
We don't have workable democratic models
00:27:32
for a global government.
00:27:34
So maybe it would look more
like ancient China
00:27:38
than like modern Denmark.
00:27:39
But still, given the dangers
that we are facing,
00:27:45
I think the imperative of having
some kind of real ability
00:27:50
to force through difficult decisions
on the global level
00:27:54
is more important
than almost anything else.
00:27:59
CA: There's a question from Facebook here,
00:28:01
and then we'll get the mic to Andrew.
00:28:03
So, Kat Hebron on Facebook,
00:28:05
calling in from Vail:
00:28:07
"How would developed nations manage
the millions of climate migrants?"
00:28:12
YNH: I don't know.
00:28:14
CA: That's your answer, Kat. (Laughter)
00:28:16
YNH: And I don't think
that they know either.
00:28:18
They'll just deny the problem, maybe.
00:28:20
CA: But immigration, generally,
is another example of a problem
00:28:23
that's very hard to solve
on a nation-by-nation basis.
00:28:26
One nation can shut its doors,
00:28:27
but maybe that stores up
problems for the future.
00:28:30
YNH: Yes, I mean --
it's another very good case,
00:28:34
especially because it's so much easier
00:28:36
to migrate today
00:28:38
than it was in the Middle Ages
or in ancient times.
00:28:42
CA: Yuval, there's a belief
among many technologists, certainly,
00:28:46
that political concerns
are kind of overblown,
00:28:48
that actually, political leaders
don't have that much influence
00:28:52
in the world,
00:28:53
that the real determination of humanity
at this point is by science,
00:28:57
by invention, by companies,
00:28:59
by many things
other than political leaders,
00:29:03
and it's actually very hard
for leaders to do much,
00:29:06
so we're actually worrying
about nothing here.
00:29:09
YNH: Well, first, it should be emphasized
00:29:12
that it's true that political leaders'
ability to do good is very limited,
00:29:17
but their ability to do harm is unlimited.
00:29:20
There is a basic imbalance here.
00:29:22
You can still press the button
and blow everybody up.
00:29:26
You have that kind of ability.
00:29:27
But if you want, for example,
to reduce inequality,
00:29:31
that's very, very difficult.
00:29:33
But to start a war,
00:29:34
you can still do so very easily.
00:29:36
So there is a built-in imbalance
in the political system today
00:29:40
which is very frustrating,
00:29:42
where you cannot do a lot of good
but you can still do a lot of harm.
00:29:46
And this makes the political system
still a very big concern.
00:29:51
CA: So as you look at
what's happening today,
00:29:53
and putting your historian's hat on,
00:29:55
do you look back in history at moments
when things were going just fine
00:29:59
and an individual leader really took
the world or their country backwards?
00:30:05
YNH: There are quite a few examples,
00:30:07
but I should emphasize,
it's never an individual leader.
00:30:10
I mean, somebody put him there,
00:30:12
and somebody allowed him
to continue to be there.
00:30:15
So it's never really just the fault
of a single individual.
00:30:19
There are a lot of people
behind every such individual.
00:30:24
CA: Can we have the microphone
here, please, to Andrew?
00:30:30
Andrew Solomon: You've talked a lot
about the global versus the national,
00:30:34
but increasingly, it seems to me,
00:30:36
the world situation
is in the hands of identity groups.
00:30:38
We look at people within the United States
00:30:41
who have been recruited by ISIS.
00:30:42
We look at these other groups
which have formed
00:30:45
which go outside of national bounds
00:30:47
but still represent
significant authorities.
00:30:49
How are they to be integrated
into the system,
00:30:51
and how is a diverse set of identities
to be made coherent
00:30:55
under either national
or global leadership?
00:30:59
YNH: Well, the problem
of such diverse identities
00:31:02
is a problem for nationalism as well.
00:31:05
Nationalism believes
in a single, monolithic identity,
00:31:09
and exclusive, or at least
more extreme, versions of nationalism
00:31:13
believe in an exclusive loyalty
to a single identity.
00:31:17
And therefore, nationalism has had
a lot of problems
00:31:20
with people wanting to divide
their identities
00:31:23
between various groups.
00:31:25
So it's not just a problem, say,
for a global vision.
00:31:30
And I think, again, history shows
00:31:34
that you shouldn't necessarily
think in such exclusive terms.
00:31:40
If you think that there is just
a single identity for a person,
00:31:43
"I am just X, that's it, I can't be
several things, I can be just that,"
00:31:48
that's the start of the problem.
00:31:51
You have religions, you have nations
00:31:53
that sometimes demand exclusive loyalty,
00:31:57
but it's not the only option.
00:31:58
There are many religions and many nations
00:32:01
that enable you to have
diverse identities at the same time.
00:32:05
CA: But is one explanation
of what's happened in the last year
00:32:09
that a group of people have got
fed up with, if you like,
00:32:14
the liberal elites,
for want of a better term,
00:32:17
obsessing over many, many different
identities and them feeling,
00:32:22
"But what about my identity?
I am being completely ignored here.
00:32:26
And by the way, I thought
I was the majority"?
00:32:29
And that that's actually
sparked a lot of the anger.
00:32:32
YNH: Yeah. Identity is always problematic,
00:32:35
because identity is always based
on fictional stories
00:32:40
that sooner or later collide with reality.
00:32:43
Almost all identities,
00:32:45
I mean, beyond the level
of the basic community
00:32:48
of a few dozen people,
00:32:50
are based on a fictional story.
00:32:52
They are not the truth.
00:32:53
They are not the reality.
00:32:55
It's just a story that people invent
and tell one another
00:32:58
and start believing.
00:32:59
And therefore all identities
are extremely unstable.
00:33:05
They are not a biological reality.
00:33:07
Sometimes nationalists, for example,
00:33:09
think that the nation
is a biological entity.
00:33:12
It's the combination
of soil and blood
00:33:16
that creates the nation.
00:33:18
But this is just a fictional story.
00:33:21
CA: Soil and blood
kind of makes a gooey mess.
00:33:23
(Laughter)
00:33:25
YNH: It does, and it also
messes with your mind
00:33:28
when you think too much that you are
a combination of soil and blood.
00:33:33
If you look from a biological perspective,
00:33:36
obviously none of the nations
that exist today
00:33:39
existed 5,000 years ago.
00:33:42
Homo sapiens is a social animal,
that's for sure.
00:33:45
But for millions of years,
00:33:48
Homo sapiens and our hominid ancestors
lived in small communities
00:33:53
of a few dozen individuals.
00:33:55
Everybody knew everybody else.
00:33:57
Whereas modern nations
are imagined communities,
00:34:01
in the sense that I don't even know
all these people.
00:34:04
I come from a relatively
small nation, Israel,
00:34:07
and of eight million Israelis,
00:34:09
I never met most of them.
00:34:11
I will never meet most of them.
00:34:13
They basically exist here.
00:34:16
CA: But in terms of this identity,
00:34:18
this group who feel left out
and perhaps have work taken away,
00:34:24
I mean, in "Homo Deus,"
00:34:26
you actually speak of this group
in one sense expanding,
00:34:29
that so many people
may have their jobs taken away
00:34:33
by technology in some way
that we could end up with
00:34:37
a really large -- I think you call it
a "useless class" --
00:34:41
a class where traditionally,
00:34:43
as viewed by the economy,
these people have no use.
00:34:45
YNH: Yes.
00:34:47
CA: How likely a possibility is that?
00:34:50
Is that something
we should be terrified about?
00:34:52
And can we address it in any way?
00:34:55
YNH: We should think about it
very carefully.
00:34:57
I mean, nobody really knows
what the job market will look like
00:35:00
in 2040, 2050.
00:35:02
There is a chance
many new jobs will appear,
00:35:05
but it's not certain.
00:35:07
And even if new jobs do appear,
00:35:09
it won't necessarily be easy
00:35:11
for a 50-year-old truck driver
00:35:14
made unemployed by self-driving vehicles
00:35:17
to reinvent himself or herself
00:35:21
as a designer of virtual worlds.
00:35:25
Previously, if you look at the trajectory
of the industrial revolution,
00:35:30
when machines replaced humans
in one type of work,
00:35:34
the solution usually came
from low-skill work
00:35:38
in new lines of business.
00:35:41
So you didn't need any more
agricultural workers,
00:35:44
so people moved to working
in low-skill industrial jobs,
00:35:50
and when this was taken away
by more and more machines,
00:35:53
people moved to low-skill service jobs.
00:35:56
Now, when people say there will
be new jobs in the future,
00:35:59
that humans can do better than AI,
00:36:02
that humans can do better than robots,
00:36:04
they usually think about high-skill jobs,
00:36:06
like software engineers
designing virtual worlds.
00:36:10
Now, I don't see how
an unemployed cashier from Wal-Mart
00:36:16
reinvents herself or himself at 50
as a designer of virtual worlds,
00:36:20
and certainly I don't see
00:36:22
how the millions of unemployed
Bangladeshi textile workers
00:36:25
will be able to do that.
00:36:27
I mean, if they are going to do it,
00:36:29
we need to start teaching
the Bangladeshis today
00:36:32
how to be software designers,
00:36:34
and we are not doing it.
00:36:35
So what will they do in 20 years?
00:36:38
CA: So it feels like you're really
highlighting a question
00:36:42
that's really been bugging me
the last few months more and more.
00:36:46
It's almost a hard question
to ask in public,
00:36:49
but if any mind has some wisdom
to offer in it, maybe it's yours,
00:36:52
so I'm going to ask you:
00:36:54
What are humans for?
00:36:57
YNH: As far as we know, for nothing.
00:36:59
(Laughter)
00:37:00
I mean, there is no great cosmic drama,
some great cosmic plan,
00:37:06
that we have a role to play in.
00:37:09
And we just need to discover
what our role is
00:37:12
and then play it to the best
of our ability.
00:37:15
This has been the story of all religions
and ideologies and so forth,
00:37:20
but as a scientist, the best I can say
is this is not true.
00:37:23
There is no universal drama
with a role in it for Homo sapiens.
00:37:29
So --
00:37:30
CA: I'm going to push back on you
just for a minute,
00:37:33
just from your own book,
00:37:34
because in "Homo Deus,"
00:37:35
you give really one of the most coherent
and understandable accounts
00:37:40
about sentience, about consciousness,
00:37:43
and that unique sort of human skill.
00:37:46
You point out that it's different
from intelligence,
00:37:48
the intelligence
that we're building in machines,
00:37:51
and that there's actually a lot
of mystery around it.
00:37:54
How can you be sure there's no purpose
00:37:58
when we don't even understand
what this sentience thing is?
00:38:02
I mean, in your own thinking,
isn't there a chance
00:38:04
that what humans are for
is to be the universe's sentient things,
00:38:09
to be the centers of joy and love
and happiness and hope?
00:38:12
And maybe we can build machines
that actually help amplify that,
00:38:15
even if they're not going to become
sentient themselves?
00:38:18
Is that crazy?
00:38:19
I kind of found myself hoping that,
reading your book.
00:38:23
YNH: Well, I certainly think that the most
interesting question today in science
00:38:26
is the question
of consciousness and the mind.
00:38:29
We are getting better and better
in understanding the brain
00:38:32
and intelligence,
00:38:34
but we are not getting much better
00:38:36
in understanding the mind
and consciousness.
00:38:39
People often confuse intelligence
and consciousness,
00:38:42
especially in places like Silicon Valley,
00:38:44
which is understandable,
because in humans, they go together.
00:38:48
I mean, intelligence basically
is the ability to solve problems.
00:38:52
Consciousness is the ability
to feel things,
00:38:54
to feel joy and sadness
and boredom and pain and so forth.
00:39:00
In Homo sapiens and all other mammals
as well -- it's not unique to humans --
00:39:04
in all mammals and birds
and some other animals,
00:39:06
intelligence and consciousness
go together.
00:39:09
We often solve problems by feeling things.
00:39:13
So we tend to confuse them.
00:39:14
But they are different things.
00:39:16
What's happening today
in places like Silicon Valley
00:39:19
is that we are creating
artificial intelligence
00:39:22
but not artificial consciousness.
00:39:24
There has been an amazing development
in computer intelligence
00:39:28
over the last 50 years,
00:39:29
and exactly zero development
in computer consciousness,
00:39:33
and there is no indication that computers
are going to become conscious
00:39:37
anytime soon.
00:39:40
So first of all, if there is
some cosmic role for consciousness,
00:39:45
it's not unique to Homo sapiens.
00:39:47
Cows are conscious, pigs are conscious,
00:39:50
chimpanzees are conscious,
chickens are conscious,
00:39:53
so if we go that way, first of all,
we need to broaden our horizons
00:39:57
and remember very clearly we are not
the only sentient beings on Earth,
00:40:01
and when it comes to sentience --
00:40:03
when it comes to intelligence,
there is good reason to think
00:40:06
we are the most intelligent
of the whole bunch.
00:40:10
But when it comes to sentience,
00:40:12
to say that humans are more
sentient than whales,
00:40:16
or more sentient than baboons
or more sentient than cats,
00:40:20
I see no evidence for that.
00:40:22
So first step is, you go
in that direction, expand.
00:40:26
And then the second question
of what is it for,
00:40:30
I would reverse it
00:40:31
and I would say that I don't think
sentience is for anything.
00:40:36
I think we don't need
to find our role in the universe.
00:40:40
The really important thing
is to liberate ourselves from suffering.
00:40:46
What characterizes sentient beings
00:40:49
in contrast to robots, to stones,
00:40:52
to whatever,
00:40:53
is that sentient beings
can suffer,
00:40:57
and what they should focus on
00:40:59
is not finding their place
in some mysterious cosmic drama.
00:41:03
They should focus on understanding
what suffering is,
00:41:07
what causes it and how
to be liberated from it.
00:41:11
CA: I know this is a big issue for you,
and that was very eloquent.
00:41:14
We're going to have a blizzard
of questions from the audience here,
00:41:18
and maybe from Facebook as well,
00:41:20
and maybe some comments as well.
00:41:21
So let's go quick.
00:41:23
There's one right here.
00:41:26
Keep your hands held up
at the back if you want the mic,
00:41:29
and we'll get it back to you.
00:41:31
Question: In your work, you talk a lot
about the fictional stories
00:41:34
that we accept as truth,
00:41:35
and that we live our lives by.
00:41:37
As an individual, knowing that,
00:41:39
how does it impact the stories
that you choose to live your life,
00:41:43
and do you confuse them
with the truth, like all of us?
00:41:48
YNH: I try not to.
00:41:49
I mean, for me, maybe the most
important question,
00:41:52
both as a scientist and as a person,
00:41:54
is how to tell the difference
between fiction and reality,
00:41:58
because reality is there.
00:42:01
I'm not saying that everything is fiction.
00:42:03
It's just very difficult for human beings
to tell the difference
00:42:06
between fiction and reality,
00:42:07
and it has become more and more difficult
as history progressed,
00:42:12
because the fictions
that we have created --
00:42:15
nations and gods and money
and corporations --
00:42:18
they now control the world.
00:42:20
So just to even think,
00:42:21
"Oh, this is just all fictional entities
that we've created,"
00:42:24
is very difficult.
00:42:25
But reality is there.
00:42:28
For me the best ...
00:42:30
There are several tests
00:42:33
to tell the difference
between fiction and reality.
00:42:35
The simplest one, the best one
that I can say in short,
00:42:39
is the test of suffering.
00:42:40
If it can suffer, it's real.
00:42:43
If it can't suffer, it's not real.
00:42:44
A nation cannot suffer.
00:42:46
That's very, very clear.
00:42:47
Even when a nation loses a war
00:42:49
and we say, "Germany suffered a defeat
in the First World War,"
00:42:53
it's a metaphor.
00:42:55
Germany cannot suffer.
Germany has no mind.
00:42:57
Germany has no consciousness.
00:42:59
Germans can suffer, yes,
but Germany cannot.
00:43:02
Similarly, when a bank goes bust,
00:43:05
the bank cannot suffer.
00:43:07
When the dollar loses its value,
the dollar doesn't suffer.
00:43:11
People can suffer. Animals can suffer.
00:43:13
This is real.
00:43:14
So I would start, if you
really want to see reality,
00:43:19
I would go through the door of suffering.
00:43:21
If you can really understand
what suffering is,
00:43:24
this will give you also the key
00:43:26
to understand what reality is.
00:43:28
CA: There's a Facebook question
here that connects to this,
00:43:31
from someone around the world
in a language that I cannot read.
00:43:34
YNH: Oh, it's Hebrew.
CA: Hebrew. There you go.
00:43:36
(Laughter)
00:43:37
Can you read the name?
00:43:38
YNH: Or Lauterbach Goren.
00:43:40
CA: Well, thank you for writing in.
00:43:42
The question is: "Is the post-truth era
really a brand-new era,
00:43:47
or just another climax or moment
in a never-ending trend?"
00:43:52
YNH: Personally, I don't connect
with this idea of post-truth.
00:43:55
My basic reaction as a historian is:
00:43:58
If this is the era of post-truth,
when the hell was the era of truth?
00:44:02
CA: Right.
00:44:03
(Laughter)
00:44:05
YNH: Was it the 1980s, the 1950s,
the Middle Ages?
00:44:09
I mean, we have always lived
in an era, in a way, of post-truth.
00:44:14
CA: But I'd push back on that,
00:44:17
because I think what people
are talking about
00:44:19
is that there was a world
where you had fewer journalistic outlets,
00:44:26
where there were traditions,
where things were fact-checked.
00:44:30
It was incorporated into the charter
of those organizations
00:44:34
that the truth mattered.
00:44:36
So if you believe in a reality,
00:44:38
then what you write is information.
00:44:40
There was a belief that that information
should connect to reality in a real way,
00:44:44
and if you wrote a headline,
it was a serious, earnest attempt
00:44:47
to reflect something
that had actually happened.
00:44:49
And people didn't always get it right.
00:44:51
But I think the concern now is you've got
00:44:53
a technological system
that's incredibly powerful
00:44:55
that, for a while at least,
massively amplified anything
00:45:00
with no attention paid to whether
it connected to reality,
00:45:02
only to whether it connected
to clicks and attention,
00:45:06
and that that was arguably toxic.
00:45:07
That's a reasonable concern, isn't it?
00:45:10
YNH: Yeah, it is. I mean,
the technology changes,
00:45:12
and it's now easier to disseminate
both truth and fiction and falsehood.
00:45:17
It goes both ways.
00:45:19
It's also much easier, though, to spread
the truth than it was ever before.
00:45:24
But I don't think there
is anything essentially new
00:45:28
about this disseminating
fictions and errors.
00:45:32
There is nothing that -- I don't know --
Joseph Goebbels didn't know
00:45:36
about this whole idea of fake
news and post-truth.
00:45:42
He famously said that if you repeat
a lie often enough,
00:45:46
people will think it's the truth,
00:45:48
and the bigger the lie, the better,
00:45:50
because people won't even think
that something so big can be a lie.
00:45:56
I think that fake news
has been with us for thousands of years.
00:46:02
Just think of the Bible.
00:46:04
(Laughter)
00:46:05
CA: But there is a concern
00:46:06
that the fake news is associated
with tyrannical regimes,
00:46:10
and when you see a rise in fake news,
00:46:13
that is a canary in the coal mine
warning that dark times may be coming.
00:46:19
YNH: Yeah. I mean, the intentional use
of fake news is a disturbing sign.
00:46:27
But I'm not saying that it's not bad,
I'm just saying that it's not new.
00:46:32
CA: There's a lot of interest
on Facebook on this question
00:46:35
about global governance
versus nationalism.
00:46:41
Question here from Phil Dennis:
00:46:42
"How do we get people, governments,
to relinquish power?
00:46:46
Is that -- is that --
actually, the text is so big
00:46:50
I can't read the full question.
00:46:51
But is that a necessity?
00:46:53
Is it going to take war to get there?
00:46:55
Sorry, Phil -- I mangled your question,
but I blame the text right here.
00:46:59
YNH: One option
that some people talk about
00:47:01
is that only a catastrophe
can shake humankind
00:47:06
and open the path to a real system
of global governance,
00:47:11
and they say that we can't do it
before the catastrophe,
00:47:15
but we need to start
laying the foundations
00:47:18
so that when the disaster strikes,
00:47:21
we can react quickly.
00:47:23
But people will just not have
the motivation to do such a thing
00:47:27
before the disaster strikes.
00:47:29
Another thing that I would emphasize
00:47:31
is that anybody who is really
interested in global governance
00:47:36
should always make it very, very clear
00:47:39
that it doesn't replace or abolish
local identities and communities,
00:47:46
that it should come both as --
00:47:49
It should be part of a single package.
00:47:52
CA: I want to hear more on this,
00:47:56
because the very words "global governance"
00:47:59
are almost the epitome of evil
in the mindset of a lot of people
00:48:03
on the alt-right right now.
00:48:05
It just seems scary, remote, distant,
and it has let them down,
00:48:08
and so globalists,
global governance -- no, go away!
00:48:12
And many view the election
as the ultimate poke in the eye
00:48:16
to anyone who believes in that.
00:48:17
So how do we change the narrative
00:48:21
so that it doesn't seem
so scary and remote?
00:48:24
Build more on this idea
of it being compatible
00:48:26
with local identity, local communities.
00:48:29
YNH: Well, I think again we should start
00:48:32
really with the biological realities
00:48:35
of Homo sapiens.
00:48:37
And biology tells us two things
about Homo sapiens
00:48:41
which are very relevant to this issue:
00:48:43
first of all, that we are
completely dependent
00:48:46
on the ecological system around us,
00:48:49
and that today we are talking
about a global system.
00:48:52
You cannot escape that.
00:48:54
And at the same time, biology tells us
about Homo sapiens
00:48:57
that we are social animals,
00:49:00
but that we are social
on a very, very local level.
00:49:04
It's just a simple fact of humanity
00:49:08
that we cannot have intimate familiarity
00:49:13
with more than about 150 individuals.
00:49:17
The size of the natural group,
00:49:21
the natural community of Homo sapiens,
00:49:24
is not more than 150 individuals,
00:49:27
and everything beyond that is really
based on all kinds of imaginary stories
00:49:34
and large-scale institutions,
00:49:36
and I think that we can find a way,
00:49:40
again, based on a biological
understanding of our species,
00:49:45
to weave the two together
00:49:47
and to understand that today
in the 21st century,
00:49:50
we need both the global level
and the local community.
00:49:56
And I would go even further than that
00:49:58
and say that it starts
with the body itself.
00:50:02
The feelings that people today have
of alienation and loneliness
00:50:06
and not finding their place in the world,
00:50:09
I would think that the chief problem
is not global capitalism.
00:50:16
The chief problem is that over
the last hundred years,
00:50:19
people have been becoming disembodied,
00:50:22
have been distancing themselves
from their body.
00:50:26
As a hunter-gatherer or even as a peasant,
00:50:28
to survive, you need to be
constantly in touch
00:50:33
with your body and with your senses,
00:50:35
every moment.
00:50:36
If you go to the forest
to look for mushrooms
00:50:38
and you don't pay attention
to what you hear,
00:50:41
to what you smell, to what you taste,
00:50:43
you're dead.
00:50:44
So you must be very connected.
00:50:46
In the last hundred years,
people are losing their ability
00:50:51
to be in touch with their body
and their senses,
00:50:53
to hear, to smell, to feel.
00:50:56
More and more attention goes to screens,
00:50:59
to what is happening elsewhere,
00:51:00
some other time.
00:51:02
This, I think, is the deep reason
00:51:04
for the feelings of alienation
and loneliness and so forth,
00:51:08
and therefore part of the solution
00:51:11
is not to bring back
some mass nationalism,
00:51:15
but also to reconnect with our own bodies,
00:51:19
and if you are back
in touch with your body,
00:51:22
you will feel much more at home
in the world also.
00:51:25
CA: Well, depending on how things go,
we may all be back in the forest soon.
00:51:29
We're going to have
one more question in the room
00:51:32
and one more on Facebook.
00:51:33
Ama Adi-Dako: Hello. I'm from Ghana,
West Africa, and my question is:
00:51:36
I'm wondering how you present
and justify the idea of global governance
00:51:41
to countries that have been
historically disenfranchised
00:51:44
by the effects of globalization,
00:51:46
and also, if we're talking about
global governance,
00:51:49
it sounds to me like it will definitely
come from a very Westernized idea
00:51:53
of what the "global"
is supposed to look like.
00:51:55
So how do we present and justify
that idea of global
00:51:58
versus wholly nationalist
00:52:01
to people in countries like Ghana
and Nigeria and Togo
00:52:04
and other countries like that?
00:52:07
YNH: I would start by saying
that history is extremely unfair,
00:52:14
and that we should realize that.
00:52:18
Many of the countries that suffered most
00:52:21
from the last 200 years of globalization
00:52:26
and imperialism and industrialization
00:52:28
are exactly the countries
which are also most likely to suffer most
00:52:33
from the next wave.
00:52:36
And we should be very,
very clear about that.
00:52:41
If we don't have a global governance,
00:52:44
and if we suffer from climate change,
00:52:47
from technological disruptions,
00:52:49
the worst suffering will not be in the US.
00:52:53
The worst suffering will be in Ghana,
will be in Sudan, will be in Syria,
00:52:58
will be in Bangladesh,
will be in those places.
00:53:01
So I think those countries
have an even greater incentive
00:53:07
to do something about
the next wave of disruption,
00:53:12
whether it's ecological
or whether it's technological.
00:53:14
Again, if you think about
technological disruption,
00:53:17
so if AI and 3D printers and robots
will take the jobs
00:53:22
from billions of people,
00:53:24
I worry far less about the Swedes
00:53:27
than about the people in Ghana
or in Bangladesh.
00:53:31
And therefore,
because history is so unfair
00:53:36
and the results of a calamity
00:53:41
will not be shared equally
between everybody,
00:53:43
as usual, the rich
will be able to get away
00:53:47
from the worst consequences
of climate change
00:53:51
in a way that the poor
will not be able to.
00:53:55
CA: And here's a great question
from Cameron Taylor on Facebook:
00:53:58
"At the end of 'Sapiens,'
00:54:00
you said we should be asking the question,
00:54:02
'What do we want to want?'
00:54:05
Well, what do you think
we should want to want?"
00:54:08
YNH: I think we should want
to want to know the truth,
00:54:11
to understand reality.
00:54:15
Mostly what we want is to change reality,
00:54:20
to fit it to our own desires,
to our own wishes,
00:54:23
and I think we should first
want to understand it.
00:54:27
If you look at the long-term
trajectory of history,
00:54:31
what you see is that
for thousands of years
00:54:34
we humans have been gaining
control of the world outside us
00:54:37
and trying to shape it
to fit our own desires.
00:54:41
And we've gained control
of the other animals,
00:54:44
of the rivers, of the forests,
00:54:45
and reshaped them completely,
00:54:49
causing ecological destruction
00:54:52
without making ourselves satisfied.
00:54:55
So the next step
is we turn our gaze inwards,
00:54:59
and we say OK, getting control
of the world outside us
00:55:04
did not really make us satisfied.
00:55:06
Let's now try to gain control
of the world inside us.
00:55:08
This is the really big project
00:55:11
of science and technology
and industry in the 21st century --
00:55:15
to try and gain control
of the world inside us,
00:55:19
to learn how to engineer and produce
bodies and brains and minds.
00:55:23
These are likely to be the main
products of the 21st century economy.
00:55:28
When people think about the future,
very often they think in terms of,
00:55:32
"Oh, I want to gain control
of my body and of my brain."
00:55:36
And I think that's very dangerous.
00:55:39
If we've learned anything
from our previous history,
00:55:42
it's that yes, we gain
the power to manipulate,
00:55:46
but because we didn't really
understand the complexity
00:55:49
of the ecological system,
00:55:51
we are now facing an ecological meltdown.
00:55:54
And if we now try to reengineer
the world inside us
00:56:00
without really understanding it,
00:56:02
especially without understanding
the complexity of our mental system,
00:56:06
we might cause a kind of internal
ecological disaster,
00:56:11
and we'll face a kind of mental
meltdown inside us.
00:56:16
CA: Putting all the pieces
together here --
00:56:18
the current politics,
the coming technology,
00:56:21
concerns like the one
you've just outlined --
00:56:23
I mean, it seems like you yourself
are in quite a bleak place
00:56:26
when you think about the future.
00:56:28
You're pretty worried about it.
00:56:29
Is that right?
00:56:31
And if there was one cause for hope,
how would you state that?
00:56:37
YNH: I focus on the most
dangerous possibilities
00:56:41
partly because this is like
my job or responsibility
00:56:44
as a historian or social critic.
00:56:46
I mean, the industry focuses mainly
on the positive sides,
00:56:51
so it's the job of historians
and philosophers and sociologists
00:56:54
to highlight the more dangerous potential
of all these new technologies.
00:56:59
I don't think any of that is inevitable.
00:57:01
Technology is never deterministic.
00:57:04
You can use the same technology
00:57:06
to create very different
kinds of societies.
00:57:09
If you look at the 20th century,
00:57:11
so, the technologies
of the Industrial Revolution,
00:57:14
the trains and electricity and all that
00:57:17
could be used to create
a communist dictatorship
00:57:20
or a fascist regime
or a liberal democracy.
00:57:23
The trains did not tell you
what to do with them.
00:57:26
Similarly, now, artificial intelligence
and bioengineering and all of that --
00:57:30
they don't predetermine a single outcome.
00:57:34
Humanity can rise up to the challenge,
00:57:37
and the best example we have
00:57:39
of humanity rising up
to the challenge of a new technology
00:57:43
is nuclear weapons.
00:57:45
In the late 1940s, '50s,
00:57:48
many people were convinced
00:57:50
that sooner or later the Cold War
would end in a nuclear catastrophe,
00:57:54
destroying human civilization.
00:57:56
And this did not happen.
00:57:57
In fact, nuclear weapons prompted
humans all over the world
00:58:04
to change the way that they manage
international politics
00:58:09
to reduce violence.
00:58:11
And many countries basically took out war
00:58:14
from their political toolkit.
00:58:16
They no longer tried to pursue
their interests with warfare.
00:58:21
Not all countries have done so,
but many countries have.
00:58:24
And this is maybe
the most important reason
00:58:28
why international violence
has declined dramatically since 1945,
00:58:34
and today, as I said,
more people commit suicide
00:58:38
than are killed in war.
00:58:40
So this, I think, gives us a good example
00:58:45
that even with the most frightening technology,
00:58:49
humans can rise up to the challenge
00:58:51
and actually some good can come out of it.
00:58:54
The problem is, we have very little
margin for error.
00:58:59
If we don't get it right,
00:59:01
we might not have
a second option to try again.
00:59:06
CA: That's a very powerful note,
00:59:07
on which I think we should draw
this to a conclusion.
00:59:10
Before I wrap up, I just want to say
one thing to people here
00:59:13
and to the global TED community
watching online, anyone watching online:
00:59:19
help us with these dialogues.
00:59:22
If you believe, like we do,
00:59:24
that we need to find
a different kind of conversation,
00:59:27
now more than ever, help us do it.
00:59:30
Reach out to other people,
00:59:33
try and have conversations
with people you disagree with,
00:59:35
understand them,
00:59:37
pull the pieces together,
00:59:38
and help us figure out how to take
these conversations forward
00:59:42
so we can make a real contribution
00:59:44
to what's happening
in the world right now.
00:59:47
I think everyone feels more alive,
00:59:50
more concerned, more engaged
00:59:53
with the politics of the moment.
00:59:55
The stakes do seem quite high,
00:59:58
so help us respond to it
in a wise, wise way.
01:00:02
Yuval Harari, thank you.
01:00:04
(Applause)