00:00:00
(Host)
The nine people
00:00:01
on stage are shaping global artificial intelligence policy.
00:00:05
Representatives from America, the European Union,
00:00:08
global tech giants.
00:00:10
A year ago,
00:00:11
the question was how to make AI safely work for people.
00:00:15
Now America's answer is clear —
00:00:17
that's optional.
00:00:19
(JD Vance)
The AI future is not going to be won
00:00:22
by handwringing about safety.
00:00:24
(Sarah Myers West)
I was in the room
00:00:25
during JD Vance’s speech,
00:00:27
and it did really feel like a shift
00:00:30
in the entire space around AI regulation.
00:00:35
(Host)
Right now, AI systems are deciding who gets medical care,
00:00:38
loans, job interviews,
00:00:40
often illegally.
00:00:42
And no one's watching.
00:00:43
Which is exactly what the tech industry wants.
00:00:46
Now, the most powerful companies
in the world are mounting
00:00:49
an unprecedented
00:00:50
lobbying push to make sure it stays that way.
00:00:53
(News Clip)
State lawmakers could soon be barred
00:00:55
from regulating artificial intelligence for the next decade.
00:00:58
(Brian Merchant)
I guess ‘shocked’ is the wrong word.
00:01:00
More ‘horrified’, right?
00:01:02
(Host)
And they're calling it a favor.
00:01:04
(Sundar Pichai)
The biggest risk could be missing out.
00:01:06
(Host)
But here's what the executives don't want you to know —
00:01:10
why the rush?
00:01:11
Why buy influence at every level of government?
00:01:14
Why risk dismantling legal protections
00:01:16
that took decades to build?
00:01:17
Because behind the promise of a golden age for humanity,
00:01:21
they're hiding something much more cynical.
00:01:27
About six years ago,
00:01:29
a health insurer worked with hospitals to use
00:01:30
AI to determine patient care.
00:01:33
The algorithm saw less money spent on Black
00:01:35
and poor patients, and concluded they needed less care.
00:01:39
This wasn't a lab experiment,
00:01:40
but something deployed across 200 million Americans.
00:01:44
We have laws that make this illegal.
00:01:46
In fact,
00:01:47
we have a whole bunch of rules
00:01:48
to protect people from all sorts of abuse.
00:01:51
Antitrust laws to keep powerful businesses in check,
00:01:54
protections for consumers
00:01:57
and children,
00:01:58
rules that keep your medical records private,
00:02:01
civil rights, labor protections,
00:02:04
equal opportunity,
00:02:06
intellectual property,
00:02:07
and resting on top of
00:02:08
this imperfect foundation is us, society.
00:02:12
But right now, companies are using
00:02:14
AI in ways that break these laws.
00:02:16
And most of the time, nobody
knows until the damage is done.
00:02:20
(Sarah)
Much of what we know is the product of,
00:02:22
you know,
00:02:23
whistleblowing, or investigative journalism
00:02:25
or independent research,
00:02:27
rather than companies doing
their due diligence and making
00:02:30
sure that, in the first place,
00:02:31
they're complying with the laws that we already have.
00:02:34
(Eric)
That's Sarah Myers West.
00:02:35
She previously advised the Federal Trade
00:02:37
Commission on artificial intelligence.
00:02:40
(Sarah)
The way that AI’s opacity works is that it
00:02:42
makes it harder for us to seek accountability.
00:02:46
(Eric)
Artificial intelligence executives tend
00:02:48
not to talk about this.
00:02:50
Instead, we hear stuff like this —
00:02:52
We can cure all disease.
00:02:53
We can travel to the stars.
00:02:55
We have like unlimited power.
00:02:57
(Eric)
To be clear,
00:02:57
there are some really promising applications of AI,
00:03:00
but the biggest ones are in the future.
00:03:03
Here's what we know is happening right now.
00:03:06
Amazon hoarding recordings
00:03:07
of your kids indefinitely.
00:03:10
Your biometric data being harvested and sold
00:03:12
to police departments.
00:03:14
Companies rejecting
00:03:15
job applicants based off of facial analysis.
00:03:18
Health insurers are accused of using
00:03:20
AI to mass reject medical claims.
00:03:23
Landlords may use it to collude around rent prices.
00:03:27
There's probably more.
00:03:29
We don't know, because there are no federal laws
00:03:31
that require companies to tell us what's actually going on.
00:03:36
So that's where the states stepped in.
00:03:39
(Brian Merchant)
Right now,
00:03:40
states like California, Colorado
00:03:44
and some others are considering bills
00:03:47
that sort of address AI.
00:03:50
(Eric)
Brian Merchant covers technology.
00:03:52
He's been watching Silicon Valley's
00:03:53
AI push from the beginning.
00:03:55
These bills? According to Brian, it's basic stuff.
00:03:59
Follow the law,
00:04:00
be transparent about AI use,
00:04:02
common sense
00:04:03
guardrails to ensure AI is subject to the same rules
00:04:06
as everything else.
00:04:07
Then in May, something changed.
00:04:09
Buried deep in Trump's
00:04:11
"big, beautiful" budget bill was a provision.
00:04:13
(News Clip)
Now, the tax bill bans states from regulating
00:04:16
AI for a decade.
00:04:18
(Brian)
I guess shocked is the wrong word.
00:04:20
More horrified, right?
00:04:21
It’s pretty clear that the end game, if they're successful,
00:04:25
is just no rules, no meaningful regulations around
00:04:29
AI at all.
00:04:31
(Eric)
After weeks of negotiations,
00:04:32
Republicans ultimately dropped the ban.
00:04:35
(Brian)
Just the fact that the Republicans and Silicon Valley
00:04:39
are willing to sort of team up to try to do this,
00:04:43
I think, is worrying enough,
00:04:46
and it's sort of a sign of what's to come.
00:04:51
(Eric)
How does an industry manage to exert
00:04:53
so much power that it could make it
00:04:54
illegal for state and local governments
00:04:57
to govern them at all?
00:04:58
For a decade?
00:05:00
Follow the money,
00:05:00
and that's really all you have to do.
00:05:02
(Eric)
That's Rumman Chowdhury,
00:05:04
former AI envoy for the State Department,
00:05:06
who used to run Twitter's ethical AI team.
00:05:09
Past tense, because Elon Musk fired her.
00:05:12
Since ChatGPT
00:05:13
triggered the current AI gold rush,
00:05:15
five major tech companies have poured over $100 million
00:05:19
into lobbying the federal government.
00:05:21
But there's more to it.
00:05:23
We talk a lot about lobbying — there are also other ways
00:05:26
in which companies influence
00:05:27
what happens in the policy space.
00:05:29
(Eric)
Major tech companies bankroll academic research,
00:05:32
steering the whole field.
00:05:34
(Rumman)
This was not necessarily maliciously done, right?
00:05:36
This comes from the fact
00:05:38
that these companies have so much more money
00:05:40
than academic institutions.
00:05:41
(Eric)
In some ways, they're richer than the government.
00:05:44
(Theodora Skeadas)
Foundations within this community have funded
00:05:47
interns or full time staff in Senate
00:05:51
and congressional House offices around the U.S.
00:05:55
as public interest technologists.
00:05:58
(Eric)
Theodora Skeadas worked with Rumman at Twitter.
00:06:01
She was also a victim of Musk.
00:06:03
But it goes a lot deeper than paying
00:06:05
for congressional staffers to advise politicians.
00:06:08
It's the actual bureaucracy itself.
00:06:11
(Theodora)
The U.S. AI Safety Institute,
00:06:14
housed in the National Institute of Standards
00:06:17
and Technology
00:06:18
under the Department
00:06:19
of Commerce, in the executive branch, increasingly
00:06:23
has absorbed more individuals from this ecosystem.
00:06:26
(Eric)
All of this influence —
00:06:27
the lobbying, the academic funding,
the staffing pipeline —
00:06:30
it's having exactly the effect they want.
00:06:35
I started this reporting focused on Texas.
00:06:38
Policy experts told me that Republicans
00:06:40
there were working on a fantastic AI bill.
00:06:43
We’re setting the tone, and it’s being
used as a model in other states,
00:06:46
to say, “Hey, this is data, it’s important, let’s protect it.”
00:06:50
(Eric)
I called, emailed.
00:06:52
None of the sponsors would talk to me about it.
00:06:54
Then a few weeks later, this happened.
00:06:57
(News Clip)
Representative Giovanni Capriglione
is celebrating the success
00:07:00
of his omnibus
00:07:01
artificial intelligence bill.
00:07:03
(Eric)
But the bill was completely rewritten.
00:07:05
Gutted.
00:07:05
A former FTC
00:07:06
lawyer told me that it gave the industry
00:07:08
exactly what they wanted.
00:07:10
Politicians are falling in line.
00:07:12
But the public? People want more
00:07:14
AI regulation, not less, by almost 3 to 1.
00:07:18
And when you've spent that much money
00:07:20
and people still think you're full of shit,
00:07:22
there's only one move left.
00:07:24
We’re in a race with China,
00:07:26
and my view is if there are gonna be killer robots,
00:07:28
I’d rather they be American killer robots than Chinese.
00:07:31
(Sarah)
There's for a long time
00:07:33
been this discourse that emerges at moments
00:07:37
where AI is very close to being more stringently regulated
00:07:41
around there being an arms
race between the U.S. and China.
00:07:44
Be afraid. Be afraid.
00:07:46
Pure evil exists, okay? It does exist.
00:07:50
Okay, and these people, they plan on dominating us
00:07:54
in AI, in chips.
00:07:57
(Eric)
Industry voices argue that
00:07:58
if we subject AI to all the usual rules,
00:08:01
it will slow down AI development
00:08:03
and cause the U.S. to lose the AI race to China.
00:08:07
There's just one potential problem with their argument.
00:08:10
(Rumman)
China actually has a very developed, responsible
00:08:13
AI framework.
00:08:14
(Theodora)
They are one of the best regulated
00:08:16
AI environments in the world.
00:08:18
I'm concerned that
00:08:21
this kind of competitive language
00:08:26
is a Trojan horse for deregulation,
00:08:30
even where regulation is warranted.
00:08:32
(Eric)
Even accepting the China threat at face value,
00:08:35
there's no reason to deregulate, because China isn't.
00:08:39
In that case, what's the ultimate goal of AI companies?
00:08:42
I asked everyone I spoke with.
00:08:44
Consolidation of power.
00:08:46
Maximize profits.
00:08:47
Power.
00:08:48
Profit.
00:08:49
(Eric)
And this is the dirty little secret of the industry.
00:08:54
(Sarah)
So the AI industry has actually been in some ways
00:08:57
in real financial trouble.
00:08:59
(Eric)
I'm standing here outside a Meta data center.
00:09:01
Investors have been pouring money
00:09:02
into AI infrastructure projects.
00:09:04
This one has cost about $700 million.
00:09:07
Overall, it's tough to put an exact figure
00:09:09
on the industry's investment
00:09:10
because all of this money is private.
00:09:12
But everyone more or less guesses
00:09:14
it's about $200 billion.
00:09:17
(Brian)
There is no clear path to profitability.
00:09:20
(Sarah)
The original idea was that
00:09:22
AI was going to be sold to businesses,
00:09:24
but that has failed to materialize
00:09:26
for all of the reasons that AI just isn't working
00:09:29
all that well in the first place.
00:09:32
(Brian)
These systems require the use of immense amounts
00:09:36
of resources, compute.
00:09:37
They assume that if they get enough
people to use a product
00:09:42
like ChatGPT, if it's popular enough, then there just has to be a way.
00:09:50
(Eric)
In February, a hedge fund did some back-of-the-envelope math.
00:09:53
If you strip out investments in data centers and training,
00:09:56
how much money was actually spent on buying
00:09:59
AI products and services?
00:10:00
About $16 billion, maybe less.
00:10:03
To actually turn a reasonable profit?
00:10:05
They need to raise that number to about $200 billion.
00:10:09
Industry revenue is growing,
00:10:11
but the money behind it is getting nervous.
00:10:14
Exact estimates vary, but the general thesis has been
00:10:17
backed by major venture capitalists and investment banks.
00:10:20
And there's one thing that potentially complicates
00:10:22
business cases for these systems —
00:10:24
our existing laws.
00:10:27
(Brian)
If there are any road bumps into how
00:10:29
they can do that,
00:10:30
then selling enterprise AI software suddenly becomes
00:10:33
less appealing, both to them and to the clients.
00:10:37
(Eric)
These road bumps are society's foundations.
00:10:40
Fair credit reporting violations —
00:10:43
when AI denies loans without proper disclosures.
00:10:46
Fraud statutes —
00:10:47
when AI systems hallucinate
00:10:49
financial results and deceive investors.
00:10:51
Equal employment violations —
00:10:53
when you weed out candidates from women's colleges.
00:10:55
Civil rights violations — when a biased data set doesn't
00:10:58
suggest medical care to poor or Black people.
00:11:03
And these protections are exactly
00:11:04
what deregulation is trying to undo.
00:11:07
(Sarah)
I don't see how that's good for anybody,
00:11:09
except for the few firms
00:11:10
that are making a ton of profit off of it.
00:11:14
(Eric)
But here's the thing about companies
00:11:16
that spend
00:11:17
hundreds of billions of dollars
00:11:18
on products with no clear path to profitability —
00:11:21
they try to create their own reality.
00:11:23
(Rumman)
So a lot of terms are being thrown around by CEOs
00:11:27
that sound incredibly impressive.
00:11:28
Artificial general intelligence is one.
00:11:30
(News Clip)
That’s a theoretical type of AI
00:11:32
that possesses human intelligence and can perform any intellectual task a person can.
00:11:38
We said from the very beginning we were going to go after AGI.
00:11:41
(Eric)
I'm sure you've heard it before.
00:11:43
How long before we’re literally dealing with a god?
00:11:45
It used to be that like AGI was this very binary moment.
00:11:49
OpenAI has had a definition
00:11:52
of AGI in its charter, and I quote from its charter,
00:11:56
“Highly autonomous systems that outperform humans
00:12:00
at most economically valuable work.”
00:12:05
(Eric)
Microsoft and OpenAI have an investment deal
00:12:07
that further defines AGI.
00:12:09
The contract language specifies
00:12:12
that they would consider a system AGI
00:12:16
once it had generated $100 billion in profits.
00:12:20
Let that sink in.
00:12:21
Artificial general intelligence isn't about curing cancer
00:12:24
or American dominance.
00:12:26
It's about whatever makes them $100 billion, and suddenly
00:12:29
everything clicks.
00:12:31
The state preemption bills, the lobbying,
00:12:33
the policy capture, the China scare tactics,
00:12:36
all of it designed to eliminate
00:12:37
any regulation that might slow their path to power
00:12:41
and profit.
00:12:42
So, just keep that in mind
00:12:45
whenever you hear of, you know, OpenAI
00:12:47
talking about how it's going to cure cancer
00:12:49
or build a technology that's going to benefit
00:12:53
all of humanity — this is the aim.
00:12:55
It's to build
00:12:57
the most potentially profitable, the most business-friendly,
00:13:01
the most worker-crushing technology in history.