00:00:01
I guess I just want to start by being a
00:00:16
little bit inclusive and asking who here
00:00:18
in the audience is not a human who's an
00:00:21
alien at a human pride event tonight
00:00:23
there's a couple down the front thank
00:00:25
you who feels like an alien in today's
00:00:28
technological world on occasion ok a
00:00:31
little bit more so I want to start with
00:00:35
this slide because this is what we're
00:00:39
talking about we're talking about this
00:00:41
idea of how we as human beings create
00:00:45
technology but how technology creates us
00:00:48
in return
00:00:50
so that's the kind of key message that I
00:00:52
want to hammer on as I'm honored to be
00:00:55
the opening speaker for TEDx Carouge and
00:00:58
in particular I want you to kind of
00:01:00
think about the different ways in which you
00:01:03
are creating technology and how your
00:01:04
identities have changed because for
00:01:06
those of you who are aliens in the
00:01:08
audience if you visited a thousand years
00:01:11
ago the human beings you would have
00:01:13
encountered would have been completely
00:01:14
different
00:01:15
to those who you're going to meet
00:01:16
tonight and if you come back in ten
00:01:18
years we'll be completely different again
00:01:19
and the process of how that happens
00:01:22
and what control we have about this
00:01:23
process is I think one of the essential
00:01:26
questions of our age today so this is
00:01:29
the main question that I'd like to pose
00:01:31
how can we ensure not that we stay human
00:01:33
human beings are always in flux but that
00:01:37
we can retain and deepen our sense of
00:01:39
what it means to be human over time and
00:01:41
there's a really interesting question
00:01:43
here about why proudly human right now
00:01:46
if this is a process that's been
00:01:48
happening over millennia what are we
00:01:51
worried about
00:01:52
so tell me if you've seen this video
00:01:55
around a robot doing a back flip Boston
00:01:58
Dynamics latest robot it's called the
00:02:01
Atlas it's absolutely phenomenal when I
00:02:04
watch these kinds of videos I have three
00:02:08
or four really
00:02:09
contrasting emotions going on at each
00:02:12
point in time one of them is
00:02:13
embarrassment that I can't
00:02:14
do a backflip another one is absolute
00:02:17
amazement at the ingenuity of the lab in
00:02:20
Boston that manages to combine biology
00:02:23
and physics with materials
00:02:25
with artificial intelligence to do this
00:02:27
a third is fear fear that someday I
00:02:30
could be out there protesting and rather
00:02:33
than dehumanized people with riot shields
00:02:37
these could be on the other side in a
00:02:38
security or military context and the
00:02:41
fourth concern or fourth
00:02:43
emotion I have is a curious sense of
00:02:46
connection particularly when you watch
00:02:48
the robot fall this empathy with again
00:02:50
the fallibility of this completely
00:02:52
created thing and I guess I want you
00:02:55
to keep in mind this idea of when we
00:02:57
look at technology we're looking at
00:02:59
ourselves in different ways and this is
00:03:01
happening more and more every day so
00:03:02
what do we want to see and how can we
00:03:04
understand it if we ask ourselves what
00:03:07
it means to be human
00:03:08
this is not a new question right not
00:03:10
only can you find many TED videos on
00:03:12
this topic but philosophers have been
00:03:15
asking this question for thousands of
00:03:17
years human beings have been asking and
00:03:19
one philosophical answer to this
00:03:22
question is humans are humans because we
00:03:24
can ask the question we can reflect on
00:03:26
ourselves we have self-awareness in
00:03:28
different ways another one is that
00:03:30
humans are humans because we have
00:03:31
culture because we are positioned in
00:03:34
different ways because we are bodily
00:03:36
beings we have a certain biology or
00:03:39
because we create institutions and we
00:03:41
engage in the world in a very particular
00:03:43
way but this is probably one of the
00:03:47
closest things we can think about when
00:03:48
we say what it means to be human
00:03:50
when archaeologists look in the fossil
00:03:53
record for when a human existed they
00:03:56
look for this and by the way this idea
00:04:00
of humans as technological beings as
00:04:02
being created this is something that we
00:04:05
in the work that I do every day
00:04:07
talk about constantly and the ideas here
00:04:11
of framing the technological human have
00:04:14
really been inspired by someone who's in
00:04:15
the audience tonight Tom Philbeck but
00:04:18
also many colleagues who are starting to
00:04:20
bring to bear in the public
00:04:22
consciousness ideas that have been in
00:04:24
social science for many years but we're
00:04:26
only just realizing how important they
00:04:28
are to our daily lives there are three ways
00:04:31
in which technology really affects us at
00:04:33
least let me just give you these first
00:04:35
three the first one will be familiar to
00:04:37
any of you who have ever heard the
00:04:39
phrase to a person with a hammer
00:04:41
everything looks like a nail the objects
00:04:44
we create focus our attention they give
00:04:48
us power and we use that power as
00:04:50
Langdon Winner said artifacts have
00:04:53
politics a hammer may not look super
00:04:56
political to you replace it with a gun
00:04:59
or an atom bomb and it immediately
00:05:01
becomes incredibly political tools and
00:05:05
Technology affects us because we use it
00:05:08
to shape the world around us look at
00:05:09
this space we're in or think of the
00:05:10
Gothic cathedral to people entering a
00:05:13
Gothic cathedral a thousand years ago or
00:05:14
800 years ago they could not speak the
00:05:17
language of the institution they were
00:05:18
entering but they were nevertheless
00:05:21
immediately informed of what it means to be
00:05:23
a human in that space and modern
00:05:25
technology today means we literally see
00:05:28
through technology I see through
00:05:30
technology I have contact lenses in
00:05:32
today which means my experience of the
00:05:34
world is dramatically different than it
00:05:35
would be otherwise so I'm a cyborg in
00:05:38
that respect but now we have the
00:05:40
opportunity to envelop ourselves in
00:05:42
different ways and the most annoying
00:05:44
cyborgs today are these people okay the
00:05:46
people who put their hands up and their
00:05:48
mobile phones at concerts and record the
00:05:51
concert so even if you're tall like me
00:05:53
and you feel like you have a natural
00:05:54
advantage in watching people it's now
00:05:56
removed by the fact that everyone's
00:05:58
looking through the screen don't do that
00:06:00
please don't be that kind of cyborg it's
00:06:02
not the best type of cyborg but this is
00:06:04
happening more and more we're literally
00:06:05
viewing the world through our devices
00:06:08
why is this a problem
00:06:10
let me give you three fears that we have
00:06:13
today number one are we becoming
00:06:15
redundant after millennia of using and
00:06:19
creating tools and we had a turning
00:06:21
point where we no longer needed in some
00:06:24
sense or is it actually more about
00:06:27
division are we scared of being
00:06:30
separated from one another through
00:06:31
technology or third are we being
00:06:34
squashed flattened out are we being
00:06:36
distributed and pulled too much in a way
00:06:39
that means that we can't be
00:06:41
the full rich people that we want to be
00:06:43
or that we think we should be so let me
00:06:46
just quickly delve into each of those
00:06:47
the question of jobs the future of jobs
00:06:50
I won't go into the data except to say
00:06:52
I don't believe most of it because it's
00:06:55
up to us to decide today
00:06:57
how we want humans and machines to
00:06:59
relate but the scary figures are out
00:07:00
there from my friend Mike Osborne 47
00:07:03
percent of US jobs at risk and
00:07:06
from Bruegel up to 60 percent of
00:07:07
European jobs are at risk
00:07:09
let's just focus on the fact that in all
00:07:12
of history
00:07:13
this has always been happening and it's
00:07:14
actually not the jobs that disappear
00:07:16
it's the tasks within occupations that
00:07:19
change dramatically and the second thing
00:07:22
to say here is there's great evidence
00:07:24
from Australian studies recently that
00:07:27
automation normally takes away
00:07:30
the work you don't want to do and
00:07:32
Australian workers in the last 15 years
00:07:35
alone have gained more than two hours a
00:07:37
week a substantial amount of time in
00:07:39
interpersonal work in creative work in
00:07:42
information synthesis work which is all
00:07:45
highly correlated with increasing job
00:07:47
satisfaction so the question is what
00:07:50
kind of stories can we tell each other
00:07:51
to keep that going as opposed to a
00:07:55
CEO announcing they're laying off 20
00:07:57
percent of their workforce in advance of
00:07:59
automation just because they're worried
00:08:01
and we need to get the story straight
00:08:02
because just in May this year these are
00:08:06
three different takes on the same set of
00:08:08
data robots are going to take all our
00:08:10
jobs actually robots are not going to
00:08:13
take all of our jobs or my favorite The
00:08:15
Wall Street Journal robots aren't taking
00:08:17
enough of our jobs
00:08:19
the second big issue here is really
00:08:23
around this question of division it's
00:08:25
this idea that somehow the world is not
00:08:29
only becoming more unequal but the
00:08:31
technology is driving it this is Tuca
00:08:34
Vieira a fantastic Brazilian photographer
00:08:35
a famous photo of his of a slum in
00:08:39
São Paulo called Paraisópolis it really
00:08:42
illustrates this idea that the built
00:08:44
environment is already dividing us but
00:08:46
what about the point when we talk about
00:08:48
emerging technologies that we have 4.1
00:08:52
billion people around the world that
00:08:53
don't have access to the internet
00:08:55
yet 2.4 billion without access to water
00:08:58
and sanitation 1.2 billion without
00:09:00
access to energy to electricity and
00:09:02
almost 600 million smallholder farmers
00:09:05
who haven't even gone through the first
00:09:06
Industrial Revolution the greatest
00:09:10
social injustice of any technological
00:09:12
revolution is those who are left out so
00:09:15
let's keep that in mind as we move
00:09:17
forward in understanding what the
00:09:19
systems are and where we want them to take us
00:09:21
and this galaxy image also in honor of
00:09:25
Tom in the audience is to show that
00:09:26
we're being flattened and if you're
00:09:28
interested in this topic look at Sherry
00:09:30
Turkle's videos on TED or the books that
00:09:32
she's written but there are three big
00:09:35
concerns who's in control of our
00:09:37
attention today when we work through
00:09:39
digital networks when we see the world
00:09:41
through the devices we carry what
00:09:43
happens when we lose the ability to be
00:09:45
bored or to have conversation and if we
00:09:47
don't understand each other how can we
00:09:51
be like Confucius and the philosophers
00:09:54
throughout history truly reflective and
00:09:57
understanding ourselves as human beings
00:09:59
what might we lose all of this is not
00:10:04
about individual technologies it's not
00:10:07
going to be solved by saying platform
00:10:11
designer Y you should redesign your
00:10:14
front page to look like this or robotics
00:10:16
designer X you should build a robot that
00:10:18
does Y and Z because all of this is part of
00:10:22
a broader system and when we zoom out
00:10:25
and think about the relationship of
00:10:27
humans and technologies we have to pay
00:10:29
attention to these things who and how
00:10:32
are we educating around technology and
00:10:34
more broadly what are the incentives for
00:10:37
investment in different types of
00:10:38
technologies what are our priorities
00:10:40
what conversations do we want to
00:10:43
have should we be having so taking this
00:10:45
systems view does it mean that we all
00:10:48
sit here and say oh gosh this is a
00:10:50
really hard problem if we want to change
00:10:52
our relationship to technology to be
00:10:54
more inclusive we need to change
00:10:55
everything from tax to social
00:10:58
relationships it's true but it's also
00:11:02
incredibly empowering
00:11:03
because if we are really now as I
00:11:05
believe on the cusp of an entirely new set
00:11:08
of
00:11:08
amazing empowering technologies the
00:11:12
question is what do we want to make what
00:11:14
does this next system look like and
00:11:15
what's our role so let me finish by
00:11:19
giving you four roles or opportunities
00:11:22
that you here in the city of Carouge in
00:11:26
the canton of Geneva
00:11:27
in your organizations can grasp to
00:11:30
start to focus more on this topic and
00:11:32
make it very conscious but hopefully
00:11:34
also to take us all to a far better and
00:11:36
more inclusive space number one be
00:11:39
political
00:11:39
if technologies are political you cannot
00:11:43
afford to not be political okay it
00:11:46
doesn't mean you have to go left or
00:11:48
right it means you have to engage with
00:11:51
the fact that we are being influenced by
00:11:54
the things that are created and if we
00:11:56
don't have power over those decisions at
00:11:59
the end of the day we are entirely at
00:12:00
the mercy of those designing and
00:12:03
investing in those systems so get
00:12:05
political like the revolutionaries here
00:12:07
in 1847 in Geneva and have the
00:12:11
conversations ask the questions what do
00:12:13
we really want and how do we influence
00:12:15
to get there the second question is at
00:12:18
the other end of the spectrum be human
00:12:21
be anciently human stop as I am tempted
00:12:25
to do taking photos of your kids
00:12:26
and be with your kids think about the
00:12:29
fact that if you put a mobile phone on a
00:12:32
table between the two of you that
00:12:34
changes a conversation it changes your
00:12:36
memory of a conversation and your sense
00:12:38
of connection with them be as human as you can and
00:12:41
find the points in time where technology
00:12:43
can bring that more to you the third
00:12:47
option
00:12:47
our third opportunity is to empower
00:12:50
yourself and this is a World Economic
00:12:52
Forum Young Global Leader called Jeremy
00:12:54
Howard jeremy has an online course in
00:12:57
deep learning where if you have about a
00:13:00
year's worth of coding experience in
00:13:02
some of the very accessible coding
00:13:04
languages like Python you can apply the
00:13:07
latest deep learning techniques
00:13:09
available open source today at
00:13:13
literally a world-class level so if you
00:13:15
are privileged enough to sit in this
00:13:17
room have access to the Internet know
00:13:19
about TED videos know about the way that
00:13:22
you can engage
00:13:22
in online learning take this
00:13:25
opportunity to actually say right now
00:13:27
there's still the opportunity for me to be
00:13:30
on the frontier myself that's something
00:13:33
that I would urge all of us to do and
00:13:34
I'm a lawyer currently going through
00:13:36
this deep learning course and it is mind
00:13:38
blowing the final thing is if we live at
00:13:42
a point in time where the systems of
00:13:45
technology are changing so rapidly the
00:13:47
rules are being written we have an
00:13:50
absolute responsibility to engage in
00:13:53
empowering others our revolutionary is
00:13:56
not a revolutionary just for themselves
00:13:58
or for their family they're a
00:14:00
revolutionary for a broader sense of
00:14:03
community and an ideal for future
00:14:05
generations and so think about the
00:14:08
way that here in Geneva connecting with
00:14:10
others in Carouge connecting across the
00:14:13
international organizations the business
00:14:15
community civil society all the
00:14:17
different parts of the innovation
00:14:18
ecosystem here that make Geneva so
00:14:20
unique that means that you can then
00:14:23
connect with people all over the world
00:14:25
communities who are in a far less
00:14:28
privileged position than us to bring the
00:14:30
same sense of empowerment and the same
00:14:33
sense of opportunities over time and I
00:14:36
want to finish then with a quote from
00:14:39
the Smithsonian Institution's Human Origins
00:14:42
exhibit so starting in about
00:14:45
2011 the Smithsonian had this idea of
00:14:48
looking back in time and asking the
00:14:50
question what is human through the
00:14:53
artifacts that we've created over
00:14:55
time and more than 10,000 people put forward
00:14:59
ideas or answers to the question what
00:15:01
does it mean to be human I love this one
00:15:04
this is just a random one from the first
00:15:05
page of about you know a thousand pages
00:15:07
but it says to love to share to express
00:15:11
ourselves to be curious and to wonder
00:15:13
the exact question what does it mean to
00:15:16
be human that's what makes us human so
00:15:20
if you're an alien today or you feel
00:15:23
like an alien this is what to look out
00:15:26
for with one another and also this is
00:15:30
what I really hope that we can do
00:15:31
together here in Carouge as part of
00:15:34
TEDx and TED
00:15:36
and thank you so much for giving me the
00:15:38
chance to raise these issues raise
00:15:41
these questions I look forward to
00:15:43
being proudly human with all of you
00:15:45
thank you