00:00:04
[Music]
00:00:28
This is me when I'm feeling nothing.
00:00:32
No joy,
no sorrow, no pain, just
00:00:36
nothing.
Maybe that's just what
00:00:40
scrolling on your phone is. But
00:00:43
according to my therapist, the problem
00:00:46
is that the numbness is spreading. Okay.
00:00:48
Yeah. I've been working with patients
00:00:49
for about 20 years now, and formerly
00:00:53
they came into the session room
00:00:55
complaining about strong emotions, that
00:00:58
they have this anxiety or the depression
00:01:00
or feeling abandoned and feeling alone.
00:01:04
But now they have
00:01:06
the opposite. They don't feel anything
00:01:08
anymore. They are, what do you call it, some
00:01:11
kind of flatliner.
00:01:16
What is happening?
00:01:18
It's not just me. So many people sense
00:01:20
that something is messing with their
00:01:22
emotions.
00:01:26
I want to find out why.
00:01:28
[Music]
00:01:30
My job is to make prime-time science and
00:01:32
tech documentaries for public
00:01:34
television. Can you can you guys like
00:01:36
maybe wear these? Is that like a thing?
00:01:38
Is that like a thing?
For years, I've
00:01:40
covered tech stories all over the
00:01:42
world.
00:01:43
[Music]
00:01:45
So, I decide as I travel the world
00:01:48
reporting on stories about the internet,
00:01:50
I'll seek my own answers.
00:01:54
I'll film how cutting-edge technology
00:01:56
hacks our emotions. Every single one of
00:02:00
them.
00:02:04
[Music]
00:02:08
The thalamus is not consciously
00:02:10
controlled.
Speak response in the brain.
00:02:13
kiss my
Who are the people that change
00:02:16
the way we feel
and I'll smash you
and
00:02:20
how can I make myself feel again?
00:02:36
[Music]
00:02:40
At the dawn of the internet, trolls were
00:02:42
some of the first to recognize the power of
00:02:44
manipulating emotions. While other early
00:02:47
users of the web saw it as a way of
00:02:50
exchanging simple information,
00:02:56
the trolls recognized just how powerful
00:02:58
pushing emotional buttons could be.
00:03:03
Andrew is a prime example of a
00:03:06
present-day troll with an almost
00:03:07
scientific approach to pissing people
00:03:10
off. All
00:03:14
right.
00:03:18
So, I can say something like, um, "LOL,
00:03:21
kid." Just throwing in the word "kid" will
00:03:25
get their attention.
00:03:27
Throw in the word "baby doll."
00:03:29
"Maybe someday
00:03:33
you can cook breakfast for me
00:03:38
in the kitchen. You can do this thing
00:03:40
where you
00:03:42
belong."
00:03:50
He shows me around the adult video shop
00:03:53
in Orlando, Florida, where he works as
00:03:55
manager.
00:03:56
[Music]
00:04:12
Andrew sees himself as something of an
00:04:14
expert in channeling anger. To see what
00:04:17
this could do to someone, I meet with
00:04:19
Kim.
00:04:21
She had been personally trolled by
00:04:23
Andrew for more than a year. Um, I met
00:04:27
him online uh in a Facebook chat room
00:04:32
and he was trolling everyone and playing
00:04:34
tricks on everyone and just trying to
00:04:37
get a negative reaction out of the
00:04:39
group.
Do you sometimes feel like your
00:04:42
emotions are just being totally messed
00:04:45
with? Yeah, it's just it's so
00:04:48
frustrating the way that he just has to
00:04:50
engage with people by pissing them off,
00:04:54
offending them.
Hey, I was thinking
00:04:56
let's do another women's march. The last
00:04:58
one was so effective. What do you say,
00:05:00
ladies? How about we go out tomorrow and
00:05:03
march or are you too busy drinking wine?
00:05:07
Trying to get attention in this way,
00:05:09
almost drawing power from annoying other
00:05:11
people, and kind of keeping himself in
00:05:14
your thoughts, almost gives him power.
00:05:17
Almost dominating.
00:05:20
[Music]
00:05:24
This guy did not get enough attention as
00:05:26
a child.
00:05:34
Andrew is an OG internet troll. He sees
00:05:37
himself as an avant-garde artist working
00:05:40
pro bono. But at the same time, other
00:05:43
parties see the potential capital gains
00:05:45
of messing with people's emotions.
00:05:50
In China, I see this become the basis of
00:05:52
entire industries.
00:06:09
I enter a large factory complex. The
00:06:12
hundreds of people working here are the
00:06:14
opposite of trolls.
00:06:16
While trolls master anger, these workers
00:06:19
craft a very different product.
00:06:28
Chuni Entertainment is a live streaming
00:06:30
talent agency. From a complex with
00:06:32
countless fake bedrooms, the company's
00:06:35
entirely female talent live stream their
00:06:38
lives 24/7.
00:06:44
[Music]
00:06:50
[Music]
00:07:27
Do you think we can move this a little
00:07:28
bit, no? Is that okay?
Yeah.
Jean is
00:07:32
the biggest star at the Love Factory.
00:07:35
No makeup. Makeup.
Mhm.
00:07:39
She's a master at making people far and
00:07:41
wide develop strong feelings for her. I
00:07:44
want to know how she does it. So, I ask
00:07:46
her manager, Emma.
00:08:00
Wow.
00:08:05
[Music]
00:08:16
[Music]
00:08:53
Okay.
00:09:05
[Music]
00:09:47
[Applause]
00:09:47
[Music]
00:10:00
[Music]
00:10:11
[Music]
00:10:21
[Music]
00:10:28
[Music]
00:10:46
[Music]
00:11:08
She said,
00:11:16
"Sorry, my fault."
00:11:19
[Music]
00:11:21
Maybe it isn't surprising that there's a
00:11:23
huge demand to feel loved. But actually,
00:11:26
it seems there's a market for every
00:11:28
emotion conceivable,
00:11:30
even the ones we try to avoid.
00:11:34
It is true that forces around the world
00:11:37
are messing with our emotions, including
00:11:40
some of the most brilliant minds of our
00:11:41
time.
00:11:48
I'm at the Massachusetts Institute of
00:11:50
Technology
00:12:00
to see how the world of computer science
00:12:02
approaches emotion. I meet with Dr.
00:12:05
Rosalind Picard.
Can you uh can you show
00:12:09
me anger now?
That's not allowed.
00:12:14
She's the founder of the field of
00:12:16
affective computing, a branch of
00:12:18
computer science that researches how to
00:12:20
influence human emotion.
00:12:22
[Music]
00:12:24
She's showing me a sentograph, a
00:12:26
machine that measures your emotions
00:12:29
based on the way your fingers touch a
00:12:31
button.
00:12:34
It's the very first machine ever made
00:12:36
for measuring someone's emotions.
Grief.
00:12:41
And the first one of these
00:12:43
represents
00:12:45
anger
invented by the scientist Manfred
00:12:48
Clynes.
Number one,
the sentograph
00:12:50
assumed that the body channeled emotions
00:12:52
in a universal and measurable way.
00:12:56
Wow.
00:12:58
It was one of Rosalind's inspirations to
00:13:01
start an entire field of computer
00:13:03
science. Originally, when I defined
00:13:06
affective computing, I defined it as
00:13:08
computing that relates to, arises from
00:13:10
or deliberately influences emotion and
00:13:14
more broadly, affective phenomena.
00:13:17
And when you are crafting the computer
00:13:20
interaction so that you are mindful of
00:13:24
the person's affective state, what it's
00:13:26
likely to be and what you wish it were,
00:13:29
uh, then it's an example of affective
00:13:31
computing trying to directly influence
00:13:34
those emotions with the computing.
00:13:37
What started with the sentograph has
00:13:39
now evolved into endless new
00:13:41
technologies.
Then your pupil would get a
00:13:43
little bit larger.
Okay.
00:13:46
Billions of dollars are being invested
00:13:48
in the companies that make them.
And
00:13:50
these seven core emotions are joy.
00:13:53
They're building a future where paying
00:13:55
customers can see what people are
00:13:57
feeling
at all times.
This is the
00:14:01
future. Everybody is going to read your
00:14:03
emotions.
00:14:05
Do you think that now that people can
00:14:07
measure emotions with such precision
00:14:10
that more people will mess with our
00:14:12
emotions?
00:14:14
People did this before the technology
00:14:16
too, right? They just watch what
00:14:18
provokes people. Ooh, I like, you know,
00:14:20
if I pull the girl's hair, does that
00:14:21
provoke her? No. If I kick her in the
00:14:23
shin, that provokes her. I'll keep doing
00:14:24
that. Right? There are people who just
00:14:26
like to get reactions.
00:14:28
Um, now you can get them at scale.
00:14:41
I can't help but think that affective
00:14:43
computing companies have a curious way
00:14:45
of looking at our emotions.
00:14:48
One that traces back half a century.
00:14:51
I look into the very image of modern
00:14:53
mind control. And maybe it's not so
00:14:56
scary.
00:14:58
I ask myself: maybe there's part of us
00:15:01
that wants to be controlled.
00:15:05
Sometimes you meet someone who gives you
00:15:07
a new sense of clarity and perspective.
00:15:11
I, the one and only Techdomme, the mistress
00:15:16
of high-tech domination.
00:15:19
Mistress Harley holds the trademark for
00:15:21
the term tech domination along with
00:15:24
countless other iterations of the term.
00:15:27
I'll take care of this right away.
Yes,
00:15:29
you will.
I go to Beverly Hills to meet
00:15:32
her.
Good. And grab that
along with her
00:15:34
manservant, Dick Jones.
00:15:37
[Music]
00:15:40
From her home, she shows me the
00:15:41
surveillance footage of the thousands of
00:15:43
men she controls over the internet.
00:15:47
She says that you don't need whips and
00:15:49
chains to truly dominate a human being.
00:15:52
Through using technology to control
00:15:54
their emotions, she can dominate people
00:15:57
on a deeper level than was ever possible
00:15:59
before.
00:16:01
Starting with her tool of preference.
00:16:08
I am Mistress Harley's slave. I'm a
00:16:11
complete loser. Totally pathetic and
00:16:13
weak. I do anything Mistress Harley
00:16:15
tells me to do. Um, if she tells me what
00:16:18
to eat, if she tells me to make
00:16:19
confessional video, she controls what I
00:16:21
wear, so she doesn't let me wear clothes
00:16:23
when I'm home. Um, she makes me wear
00:16:25
this collar. Um, she controls my
00:16:28
privacy, so she can make me make
00:16:30
confessional videos. She has cameras in
00:16:32
my home. She has I have one here in my
00:16:34
room. I have another one in my bathroom.
00:16:36
I'm getting one for my living room. So,
00:16:37
I have no privacy at all. Uh, the only
00:16:39
freedom I have in my life is stuff that
00:16:41
she hasn't decided to dictate to me. Um,
00:16:43
otherwise, I just do whatever she tells
00:16:45
me to do.
00:16:49
If you think about the physiological
00:16:52
sensations of shame or humiliation,
00:16:56
they're very similar to the
00:16:58
physiological sensations of arousal. To
00:17:00
make somebody feel shame, you need to
00:17:02
get to know them. Not everybody is
00:17:04
ashamed of the same things. Some people
00:17:07
are ashamed of being called a [ __ ]
00:17:09
Other people are proudly [ __ ] So the
00:17:12
more you know about someone, the easier
00:17:15
it is to figure out what they are
00:17:17
ashamed of. And gathering data on
00:17:21
someone will give you great insight into
00:17:23
that. Google knows how to sell you
00:17:25
peanuts. I know how to make you feel
00:17:27
bad. You're bad and you should feel bad.
00:17:36
[Music]
00:17:52
[Music]
00:18:02
That's fascinating. So, uh, a lot of
00:18:05
similarity, but then a couple of
00:18:07
interesting individual differences. So
00:18:09
Ali had the, the 4, the
00:18:11
corrugator muscle for the eyebrow, plus a
00:18:14
little bit of the upper eyelid raiser, a very
00:18:17
strong 23, uh, which is that tightening of
00:18:20
the lip, and the buccinator muscle.
00:18:23
Facial action coding turns emotions and
00:18:25
feelings into a simple set of numbers
00:18:28
all discernible through scanning a face.
00:18:32
[Music]
00:18:37
It's the cornerstone of tech that reads
00:18:39
and manipulates our emotions. Joy, love,
00:18:43
sorrow,
00:18:45
just a number: 20.
That's the same for
00:18:48
everyone in all places and all times. I
00:18:51
think it's heated because people feel
00:18:55
that emotions
00:18:57
are
00:18:59
sacred qualities of their identity or
00:19:02
what we used to call the soul. And there's
00:19:06
this sense that
00:19:08
I am the only one who can be the arbiter
00:19:10
of the meaning of my emotions. There is
00:19:12
this feeling like my passions are just
00:19:16
uniquely who I am. And that's
00:19:19
sacred. Um, and it turns out that's
00:19:21
wrong.
00:19:24
A machine can only understand our inner
00:19:26
world by reducing it to numbers.
00:19:30
But I can't help but think it can't be
00:19:32
right. If our emotions are just numbers,
00:19:36
then the right formula could make us
00:19:38
feel anything.
00:19:41
And we're going to try another pigeon
00:19:43
now. And uh I will try to pick out some
00:19:46
particular pattern of behavior and uh
00:19:49
make it a more frequent part of the
00:19:52
repertoire.
My thoughts return to B.F.
00:19:55
Skinner. After years of experimenting on
00:19:57
pigeons, he discovered something.
00:20:00
Behavior can be molded pretty much any
00:20:02
way you want.
00:20:03
[Music]
00:20:08
You've talked about the need for a
00:20:09
technology of behavior.
Yes. Well, we
00:20:13
certainly do need one. All the great
00:20:15
problems today need a behavioral
00:20:18
solution. How are we going to get people
00:20:20
to stop breeding so much, to cut down on
00:20:22
the consumption of goods that are
00:20:24
running out? We're running out of supplies
00:20:25
and so on. Stop polluting the
00:20:27
environment, stop beating each other up
00:20:29
personally or uh internationally. And
00:20:33
so, um, these are all behavioral
00:20:34
problems and they have to be solved by
00:20:36
something like a behavioral technology.
00:20:38
It seems to me
he became obsessed with
00:20:40
creating a machine that could build a
00:20:42
better world by controlling everyone's
00:20:45
behavior.
A play pen has been selected
00:20:47
for our experimental environment because
00:20:49
it is one with which the child is well
00:20:51
acquainted. The unfamiliar additions
00:20:54
include a food dispenser loaded with
00:20:56
snacks.
00:20:58
The Skinner box was meant to
00:20:59
revolutionize the raising of children.
00:21:03
By keeping a child confined to this
00:21:05
space, they could be connected at all
00:21:07
times to a powerful technology capable
00:21:10
of molding their behavior.
It is best to
00:21:13
separately establish the function of the
00:21:15
light and sound as stimuli which
00:21:17
reliably signal an upcoming
00:21:19
reinforcement.
00:21:21
What if children could always be
00:21:23
connected to a machine, a device that
00:21:25
could use light and sound to shape the
00:21:28
way they act, think, and feel? No
00:21:31
recurring group of activities
00:21:32
immediately preceding the action of the
00:21:34
light and sound is singled out for
00:21:36
reinforcement in order to avoid the
00:21:39
building in of an unwanted form of
00:21:40
behavior.
00:21:42
Skinner dreamed of a day when this
00:21:44
technology would take over the world.
00:21:53
[Applause]
00:21:58
Maybe Russia wasn't quite what B.F.
00:22:00
Skinner was dreaming of.
00:22:07
[Music]
00:22:14
[Music]
00:22:37
Ivan is the fake name of a social
00:22:39
media entrepreneur in St. Petersburg,
00:22:41
Russia.
00:22:43
[Music]
00:22:50
[Music]
00:23:00
Ivan's main business is the creation
00:23:02
of TikTok content funded by the Russian
00:23:04
government.
00:23:10
[Music]
00:23:14
He shows us some videos that represent
00:23:16
his work.
00:23:21
[Music]
00:23:53
[Music]
00:24:06
[Music]
00:24:13
[Music]
00:24:32
[Music]
00:24:42
[Music]
00:25:03
[Music]
00:25:13
[Music]
00:25:15
Good night. Good night.
00:25:17
[Music]
00:25:52
I feel like I've hit a dead end.
00:25:54
Everyone I've met seems to be perfectly
00:25:56
happy living their lives on the
00:25:58
internet.
00:26:00
Am I all by myself?
00:26:04
I think maybe I need to talk to someone
00:26:06
else.
So, near where I live in
00:26:08
Copenhagen, Denmark, I meet with
00:26:10
therapist Morton Fanger.
00:26:15
He's one of Denmark's leading experts on
00:26:17
how the internet affects our emotions.
00:26:20
He's well known for being passionate and
00:26:22
committed to raising awareness of the
00:26:24
issue.
00:26:27
Nobody attends to the real world anymore.
00:26:28
They prefer the telephone all the time.
00:26:31
They're just like zombies.
00:26:34
What should I do?
00:26:37
Every day, people come in and out of
00:26:39
Morton's office feeling exactly the same
00:26:41
way I do.
00:26:44
Maybe they're better at talking about
00:26:46
the problems than I am.
00:26:49
So, is it, is it okay for
00:26:51
us to film, or would you rather we, like,
00:26:53
go outside, or...
00:26:55
I'm okay with you staying there.
I can
00:26:57
stay here.
Yeah, sure.
Okay. Okay. And
00:26:59
then, all right.
Yeah.
00:27:02
You want to... Yeah.
00:27:23
Yeah.
00:27:37
[Music]
00:27:54
So, like, how does it happen? Like, why are
00:27:58
we feeling numb? So you have this
00:28:00
concentrate of the feeling of joy, of
00:28:03
anger, of anxiety. Everything is so
00:28:06
concentrated. The internet creates
00:28:08
feelings for people. So they just have
00:28:11
all the feelings served to them, and
00:28:13
they just have to pick them up and eat
00:28:15
them. And then after a while, they feel
00:28:18
empty again. They don't create the
00:28:20
feelings themselves. They don't know how
00:28:22
to do it. It's the first time in history
00:28:24
they have experts to create feelings. So
00:28:27
you have experts to create feelings on
00:28:29
the internet and that's the new stuff.
00:28:32
You're not accustomed to that.
00:28:36
[Music]
00:28:41
Do you have some sort of like personal
00:28:44
reason for your thoughts on the
00:28:46
internet? the the the job or challenge
00:28:49
for me as a father is every day is a
00:28:51
struggle to keep their attention
00:28:54
by me and getting them connected to me
00:28:58
instead of being connected to the
00:28:59
internet. I feel every day I lose every
00:29:03
day I lose ground to the internet.
00:29:06
So I feel defeated by the internet. They
00:29:09
have taken away my children from me.
00:29:13
They have abducted them or kidnapped
00:29:16
them.
00:29:22
I think I finally feel something.
00:29:26
I feel despair.
00:29:30
But then I realize
00:29:32
this feeling is shared by nearly
00:29:34
everyone I've met.
00:29:38
We're all exhausted.
00:29:40
But what about the people who exhaust
00:29:42
us?
00:29:44
Back in Florida, I ask Andrew what he
00:29:46
thinks the reason behind his trolling
00:29:48
is.
00:29:51
Do you think there's like something
00:29:52
behind the trolling?
00:29:57
I don't know. It's just maybe part of it
00:30:00
is
00:30:02
things that I don't want to confront
00:30:04
about myself or things that I hate about
00:30:05
myself. I don't know.
That does what?
00:30:10
Uh,
00:30:18
I don't know. It's hard to explain, man.
00:30:20
I mean, there's got to be a reason for
00:30:21
all the trolling, right? Like,
00:30:27
that's the thing, though. It's like I'm
00:30:28
not necessarily like super mean to
00:30:30
anyone.
00:30:40
I mean, I like getting attention. I
00:30:42
guess not as much anymore, don't care as much
00:30:45
anymore. But
00:30:48
This might be what messing with other
00:30:50
people's emotions does to a person. But
00:30:53
what I'm starting to see is an entire
00:30:56
world that runs on manipulating
00:30:58
emotions.
00:31:06
[Music]
00:31:30
[Music]
00:31:47
[Music]
00:32:20
[Music]
00:32:23
Yeah.
00:32:25
I didn't invite you,
00:32:29
Le.
00:32:31
Huh?
00:32:37
[Music]
00:32:46
[Music]
00:32:53
Morton, you've been treating internet
00:32:55
addiction for like 20 years. Is there
00:32:57
like a big thing that you've learned?
00:33:00
The biggest horror for me in my profession
00:33:02
is that when people come to me to have
00:33:05
my service, they actually prefer to have
00:33:10
the internet instead of being alive.
00:33:12
When they have to create the feelings
00:33:14
themselves, it's too difficult. It's too
00:33:16
hard for them. So they prefer to stay in
00:33:19
the artificial world with these
00:33:21
artificial feelings, making them feel
00:33:24
empty.
00:33:27
People want structure. They want
00:33:29
control. They want a power imbalance.
00:33:32
They don't want to be free. They want
00:33:35
someone to tell them what to do and how
00:33:37
to do it. And in some ways, there's a
00:33:38
liberation in submission.
00:33:41
I see that people prefer to feel
00:33:43
nothing, to be left without feelings,
00:33:46
because it's easier. They don't want to
00:33:48
be in the real world because it's too
00:33:50
flat. It's too ugly. It's too gray.
00:33:55
[Music]
00:33:57
It's a much better life on the internet.
00:34:20
I can remember a time when I felt things
00:34:23
stronger than I do now.
00:34:27
Sometimes when I'm scrolling away on my
00:34:29
bed, I ask myself,
00:34:33
is this just how it's going to be now?
00:34:44
Or is there a way out?
00:34:53
Back in Copenhagen, I'm trying to digest
00:34:56
what I've seen.
00:35:00
[Music]
00:35:10
I decide I want to make something nice
00:35:12
out of everything I've seen.
00:35:16
So, I invite Morton to my editing suite
00:35:18
to get some advice.
00:35:22
Okay. All right. You did all of this,
00:35:25
all these feelings, all these emotions.
00:35:28
I'm fascinated by this. Good movie.
00:35:32
Okay, over there. Propaganda. TikTok.
00:35:34
Oh yes. A huge phenomenon, TikTok. What
00:35:38
is it called in the Chinese version?
00:35:40
There's another version in Chinese.
00:35:42
Yeah.
00:35:44
Yeah.
00:35:45
[Music]
00:35:47
I don't know. I learned so much
00:35:50
during this. I mean, I
I went out to
00:35:53
find all these people and I guess all I
00:35:56
saw was, you know, that for every
00:35:59
emotion we have
Yeah. There's someone
00:36:02
really good on the internet at
00:36:05
manipulating it.
Yeah. Some kind of
00:36:08
wizard out there.
Yeah. There's
00:36:10
like a wizard for every emotion. Yes.
00:36:13
And completely changing the way we feel
00:36:15
it
and overloading us
and warping that
00:36:18
emotion.
Yeah.
I saw that
they
00:36:21
crap on our feelings, they control us
00:36:23
completely,
right?
Yeah. And you know,
00:36:27
you don't think it would be, like,
00:36:31
practically and academically sound for
00:36:35
me to, like, give this film a happy ending
00:36:38
in some, like, a hopeful ending in some
00:36:40
way?
00:36:46
Yeah, let me think about it, David. I
00:36:48
think that there are always two
00:36:50
opportunities or two ways to go. The one
00:36:54
way where you change people's
00:36:56
understanding of the world and the other
00:36:58
is where you tell them what to do. So
00:37:01
you can be a prophet, saying, "Say no to
00:37:03
the internet, because it will harm you."
00:37:06
The other way is that we have a much
00:37:08
better solution. You can have beloved
00:37:09
ones. You can go out and have feelings
00:37:12
right now. Just go out. It's so simple.
00:37:15
The model for a better
00:37:16
world is to be in real life. It's
00:37:18
already there. It's for free. You
00:37:20
just have to be with people you love.
00:37:22
Just do it.
00:37:39
[Music]
00:37:42
I return to Florida to help Andrew
00:37:45
pack up his house. He's getting ready to
00:37:48
drive to Vermont to move in with a
00:37:50
girlfriend he met on the internet.
00:38:00
[Music]
00:38:08
There comes a point in time where you wake up
00:38:10
next to no one so much that, like, it
00:38:14
becomes like the normal. It's
00:38:16
comfortable enough, but
00:38:21
[Music]
00:38:23
She, she actually makes me feel like I, I
00:38:26
deserve somebody so nice.
00:38:29
The curious thing is that this nice girl
00:38:32
was not just any nice girl. She was a
00:38:35
very specific nice girl.
This guy did
00:38:38
not get enough attention as a child.
00:38:44
[Music]
00:38:57
Hey,
00:39:11
I love you. I love you, too.
00:39:16
Do you see me as, like, a low-down dirty...
00:39:19
No. No. But I see your social media
00:39:23
presence and whatnot as...
A slimeball, cuz I
00:39:26
work at a video store?
Just sort of
00:39:28
irritating at points and it's just like,
00:39:30
come on, man. Just be yourself. Um
00:39:33
because I know that's not who you are.
00:39:37
[Music]
00:39:47
You did it. I'm proud of you.
00:39:51
Just think, if this is the hardest thing
00:39:52
that you've had to do in your life...
It's
00:39:54
definitely not.
Exactly. So, let's keep
00:39:57
going.
Well, usually the hard stuff has
00:39:59
a reason.
The reason for this, you'll
00:40:02
find out.
00:40:09
So close.
00:40:10
I just don't care about it.
Well, what
00:40:13
do you care about? That's something to
00:40:15
explore.
Can I please not do it? I'll do
00:40:17
anything else.
00:40:20
Feeling terrible is like recovering from
00:40:22
surgery. That's feeling terrible.
00:40:24
There's nothing you can do about it. You
00:40:25
can't.
There's something wrong with me.
00:40:26
I'm sick or ill or something, so I can't
00:40:28
do it.
00:40:30
You just, you just kind of, you let your mind
00:40:34
get the best of you. That's fine. "I
tell
00:40:36
you, I can't do this." You're, "I can't do
00:40:38
this." If you just said, "I can do this,"
00:40:39
And you just took your time and you
00:40:41
smiled through it, it would be more
00:40:43
enjoyable. It would move quicker.
And
00:40:45
I'm sorry if being like, "Come on, you
00:40:47
can do it. Come on, hurry up," is, like,
00:40:48
upsetting you, but I just don't know
00:40:50
how.
It's not... nothing is upsetting me
00:40:52
besides the fact that I just don't want
00:40:54
to do this anymore. I don't know.
What
00:40:56
if... will you do it for me?
I guess. Yeah.
00:40:59
So, come on. Let's go.
00:41:03
Just a couple
00:41:06
to put on some speed.
00:41:11
Follow my lead.
00:41:14
Oh, how I need
00:41:19
someone
00:41:21
to watch
00:41:25
over me.
00:41:34
[Music]
00:41:47
[Applause]
00:41:48
[Music]
00:42:02
Won't you tell him please to put on some
00:42:07
speed?
00:42:09
Follow my lead.
00:42:12
Oh, how I need
00:42:17
someone to watch
00:42:21
[Music]
00:42:22
over me.