00:00:01
Watch closely this video interview between a
hiring manager at Vidoc Security Lab and a
00:00:05
job candidate.
00:00:06
"Uh, can you take your hand and put it,
like,
00:00:09
in front of your face to,
like, cover it partially?"
00:00:11
"No, no, it's not a joke,
because I can see that you're using some kind
00:00:20
of software."
00:00:21
The applicant was hiding his true identity by
using a deepfake AI filter.
00:00:25
Something was really off.
00:00:26
I wanted proof of whether he was a real
person or not.
00:00:29
And the first thing that came to my mind was,
00:00:33
"Let's put your hand in front of your face,"
because I knew AI tools that change your face
00:00:39
like Snapchat filters,
they won't work when you partially occlude
00:00:43
the face. I asked him to do it and he knew
that the filter would break and he didn't do
00:00:48
it. At this point, I was like 100% sure that
he was trying to trick me.
00:00:52
"Okay. Thank you. Bye bye."
00:00:55
As AI technology rapidly evolves,
fake AI-generated job candidate profiles are
00:01:00
undermining the hiring process.
00:01:02
Research and advisory firm Gartner predicts
that by 2028,
00:01:05
1 in 4 job candidates worldwide will be fake.
00:01:08
Deepfake candidates are infiltrating the job
market at a crazy,
00:01:12
unprecedented rate.
00:01:14
They are real people seeking jobs,
but they take on a different persona when
00:01:19
they come on screen, and that is enabled by
deepfake technology.
00:01:23
It's very, very simple right now to deepfake
yourself.
00:01:26
All you need is a static image of you
and a few seconds of audio.
00:01:31
So we're seeing the rise of fake candidates
taking on jobs in organizations in the United
00:01:38
States, European Union and UK.
00:01:40
These candidates take on roles using
deepfake-powered technology and forged
00:01:46
identity documents. This is of huge national
security concern for us,
00:01:51
and we need to take it seriously.
00:01:53
Right now across multiple job postings,
because we are trying to understand the
00:01:58
problem more, we're finding that 16.8% of
applicants turn out to be fake.
00:02:06
What that means is 1 out of every 6
applicants that applies for a job in your
00:02:11
company is a fake candidate.
00:02:13
They either have a fake résumé or a fake
LinkedIn profile,
00:02:17
the places that they've worked in are fake,
or they're applying from North Korea,
00:02:22
or they're trying to misrepresent themselves
in some sort of way.
00:02:26
Here's how deepfake job seekers are
infiltrating the labor market.
00:02:32
The rise of fake job seekers,
powered by deepfake technology,
00:02:37
is directly correlated to the rise of remote
working,
00:02:41
which kicked off during the pandemic.
00:02:43
And at that time, we had no option but to
work remotely and therefore hire remotely.
00:02:48
So, on a positive side,
it opened up a whole different pool of
00:02:51
candidates, which was previously
unavailable to organizations.
00:02:56
Now, the negative side is that
people could take on a fake persona and take
00:03:01
on a job which would otherwise not have been
available to them, for different reasons:
00:03:06
either because they are from sanctioned
nations or because they need sponsorship and
00:03:10
a visa, or because they were in a different
time zone.
00:03:13
Although corporate America is gradually
returning to pre-pandemic work policies,
00:03:16
remote and hybrid job postings remain well
above pre-pandemic levels.
00:03:19
Companies continue to rely on virtual,
non-in-person interviews because
00:03:25
they can interview a lot of people that are
not immediately in the same location, it's
00:03:28
easier for scheduling a lot of times,
and it is probably true that it's lower cost
00:03:32
to have somebody talk virtually than to have
them come in in person.
00:03:37
But the shift of virtual interviews has also
opened the door to new risks.
00:03:40
Remote jobs unlocked the possibility of
tricking companies into hiring fake
00:03:45
candidates.
00:03:46
This is absolutely a pattern that is linked
to remote job trends,
00:03:50
and the first thing that these candidates ask
about is,
00:03:54
"Am I going to be remote?" Let's face it,
if you're a fraudulent candidate,
00:03:58
you can't show up to the office every day.
00:04:03
Approximately 17% of U.S.
00:04:05
hiring managers have encountered candidates
using deepfake technology during video
00:04:08
interviews, according to a survey from
software platform Resume Genius.
00:04:11
They surveyed 1,000 hiring managers across the
United States.
00:04:14
When you hire a fake worker,
they can gain access to the servers that
00:04:19
contain sensitive data,
and also the person will be able to steal
00:04:23
data from the company.
00:04:24
So, the person will probably be able to talk
with other people and try to exfiltrate data
00:04:30
from them.
00:04:31
Once this individual comes into the
organization,
00:04:34
they are then quite likely writing malicious
code or vulnerable code,
00:04:38
which means they are leaving the door open
for other types of fraud to happen to this
00:04:43
company.
00:04:44
Deepfake scams have already cost companies
millions of dollars worldwide and the threat
00:04:47
is only growing. AI-generated fraud,
including tactics like deepfakes,
00:04:51
could cost the U.S. financial sector up to
$40 billion by 2027,
00:04:55
up from $12.3 billion in 2023.
00:04:58
A lot of these deepfake candidates are just
looking for paychecks,
00:05:01
and sometimes they're looking for just one or
two paychecks. And so,
00:05:04
that's the most simple risk.
00:05:06
And then in other cases,
they're remaining on the books,
00:05:09
earning money over time and sending that
money back to whatever crime syndicate
00:05:14
that they're connected to.
00:05:17
Fraudulent job seekers can originate from
anywhere.
00:05:20
Fake candidates linked to North Korea have
drawn significant headlines in recent months.
00:05:24
In May 2024, the Justice Department alleged
that more than 300 U.S.
00:05:27
companies had unknowingly hired imposters
with ties to North Korea for remote IT roles,
00:05:32
sending at least $6.8 million in earnings to
the North Korean regime.
00:05:35
The workers allegedly used stolen American
identities to apply for remote jobs,
00:05:39
and employed virtual networks and other
techniques to conceal their true locations.
00:05:43
In the same year, cybersecurity company
KnowBe4 revealed that it also inadvertently
00:05:48
hired a North Korean IT worker using a stolen
U.S.
00:05:51
identity.
00:05:52
He had a really good résumé.
00:05:54
We also asked him to submit credentials so
that we could go through a background
00:05:57
verification. The initial phone interviews
went really well.
00:06:01
We went through a couple interviews,
sent him a laptop,
00:06:04
and within 10-15 minutes of him turning on
the laptop,
00:06:07
IT security reached out to them to find out,
"Hey,
00:06:09
what's going on? You're supposed to be a new
employee turning on a laptop,
00:06:13
and it looks like we're detecting some sort
of a password-stealing trojan trying to be
00:06:17
installed." That person made up a bunch of
weird excuses that didn't make any sense.
00:06:21
And so I think within 31 minutes,
we had already locked down that laptop so
00:06:25
they couldn't do anything. It was a pretty
quick response.
00:06:28
"The biggest improvements I want to do is,
like, optimizing the transaction."
00:06:31
Voice authentication startup Pindrop Security
recently caught a deepfake job candidate they
00:06:36
nicknamed Ivan X, who was located in Russia
near the North Korean border.
00:06:40
"So, I've been a first-tech engineer for 11
years."
00:06:43
In the case of Ivan, he was purporting to
be from the U.S.
00:06:47
and, in one case, from Europe.
00:06:49
In both of these cases,
he was not in the U.S.
00:06:51
or Ukraine. When we actually tracked where he
was from,
00:06:55
he was from the Khabarovsk region in Russia,
which is very close to North Korea,
00:07:01
but it's not yet North Korea.
00:07:02
We've identified other deepfake candidates
from North Korea itself,
00:07:06
but Ivan himself was from that Khabarovsk
region.
00:07:10
He wasn't from the U.S.
00:07:11
as he was purporting to be.
00:07:13
Pindrop says that of all the candidates it
sees, 1 in 343 is linked to North Korea.
00:07:18
Among those, 1 in 4 used deepfake technology
during a live interview.
00:07:22
So when we hire fake candidates
who are from sanctioned nations,
00:07:26
it becomes a national security concern
because once these individuals are in an
00:07:31
organization, they are taking that salary and
funding activities back in those nations.
00:07:37
And those activities can be illicit as well,
so, inadvertently,
00:07:41
we are funding illicit activities in
sanctioned nations.
00:07:44
The second issue is that when they poison the
algorithms,
00:07:48
when there is an adversarial AI attack within
the organization by this fake employee,
00:07:54
our algorithms create outputs that were not
intended,
00:07:58
and those can be malicious outputs.
00:08:01
They can be racially biased, they can be segregating
and causing a societal divide in our
00:08:06
countries, etc. So, without realizing it, we are
contributing to social disruption,
00:08:11
which is also a national security concern.
00:08:16
With the number of fake job candidates on the
rise,
00:08:18
concerns are growing about their impact on
remote hiring and corporate recruitment
00:08:22
practices.
00:08:23
If this trend continues and we
experience more and more fake candidates,
00:08:28
every company that is hiring remotely will
need to adjust their hiring processes,
00:08:34
and they will probably have to switch to
offline interviews, as it was before Covid.
00:08:41
And, only after the offline interviews,
they will be able to hire the person to work
00:08:46
for them remotely. It's expensive,
but it works.
00:08:50
We might also see, unfortunately,
biases in hiring because,
00:08:55
for fear of hiring a fake candidate,
we might see a preference for hiring
00:09:00
locally and therefore having all the
interviews in person.
00:09:04
The whole reason you need to worry about
deepfake job seekers is,
00:09:07
at the very least, they're making it harder
for real employees,
00:09:10
potential employees and candidates,
to get the job, or to get it as easily.
00:09:14
It can create all kinds of disruption,
just making the hiring process longer and
00:09:18
more expensive: you know,
potentially, you could even be applying for a
00:09:22
job and someone's not sure whether you're
real or not, and you don't even get that call
00:09:25
and you don't know why you didn't get the
call. It was all because perhaps they saw
00:09:29
something that made them think that maybe
you're a deepfake candidate, even when you
00:09:32
weren't.
00:09:33
The worry you have, of course,
is just the cost.
00:09:35
It's, "do employers engage in less hiring or
just spend a lot more time hiring and
00:09:39
therefore are able to make fewer hires? Is it
much more frustrating?
00:09:42
Or does it cost more time or resources for
candidates to prove they're real?" And I
00:09:47
think that's the biggest worry in some sense,
00:09:50
is that it slows things down so much that
people that would normally get jobs without
00:09:55
this issue, end up not getting them because
it throws sand in the gears of the process.