00:00:00
Computers used to take up a whole room
00:00:02
and were only as powerful as a basic calculator!
00:00:05
Since then, computers have
gotten a lot more impressive.
00:00:09
And I’m not talking about the one
you’re watching this video on,
00:00:12
or even the ones powering groundbreaking
experiments at your local university.
00:00:16
Although those are very cool,
and they’re doing a great job.
00:00:18
I’m talking about the world’s
most powerful supercomputers.
00:00:23
These are massive machines doing
trillions of calculations every second
00:00:28
to process data about everything
from the tiniest cells
00:00:31
in our bodies to distant galaxies.
00:00:33
If your laptop is a minivan,
these things are like Ferraris.
00:00:37
And lucky for us, there’s a Top500
list that keeps track of them.
00:00:40
I’m not a supercomputer, so I
can’t tell you about all of them.
00:00:44
But I can tell you about the top 5.
00:00:47
Including what makes them so powerful,
00:00:50
where they get that power
from, and what on Earth …
00:00:53
or in space … they’re doing with it.
00:00:55
[intro music]
00:00:59
Let’s start with the fifth most
powerful computer in the world:
00:01:02
the HPC6 from the Italian
energy company Eni, in Rome.
00:01:06
This computer was just switched on in 2024,
00:01:10
and it has a whopping 477.9 petaflops of power!
00:01:16
No, that’s not the name
for how it lands in a pool.
00:01:18
In fact, keep this computer far
away from large bodies of water.
00:01:22
A petaflop is a unit to
measure a computer’s power,
00:01:25
which is basically how fast
it can carry out calculations.
00:01:27
For computers, just like sports cars, this
is the name of the game. Speed = power.
00:01:33
Flops is an acronym which stands for “floating point
operations per second.”
00:01:37
It still sounds like it might
have something to do with water …
00:01:39
or maybe beach sandals.
00:01:41
But anyways, here’s what it’s really about.
00:01:44
Computers work by breaking
down a task into basic math,
00:01:48
adding and multiplying
numbers, trillions of times.
00:01:51
Each addition or multiplication is a “flop.”
00:01:54
The “peta” part refers to the fact that these
computers do huge amounts of “flops” every second,
00:01:59
as in something like 10 to the 15th power!
00:02:01
For comparison, a standard desktop computer
generally manages billions of flops.
00:02:05
But the HPC6 does more than 400 quadrillion
additions or multiplications every second!
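To put those prefixes in perspective, here is a quick back-of-the-envelope sketch in Python. The roughly 100-gigaflop desktop figure is just an assumption for comparison, not a benchmark.

```python
# Back-of-the-envelope comparison (assumed numbers, not official benchmarks):
# a petaflop is 10**15 floating point operations per second.

PETA = 10**15

hpc6_flops = 477.9 * PETA        # HPC6's performance, from the Top500 list
desktop_flops = 100 * 10**9      # assumption: ~100 gigaflops for a typical desktop

# How many desktops' worth of math does HPC6 do each second?
print(f"HPC6 does ~{hpc6_flops / desktop_flops:,.0f}x the work of this assumed desktop")
# -> roughly 4,779,000x
```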
00:02:12
It needs that much power because
Eni is an energy company.
00:02:16
A lot of HPC6’s computing power
goes toward energy research
00:02:20
like developing better batteries.
00:02:22
As cool as that research is,
00:02:24
the most interesting thing about HPC6
00:02:26
is that it’s an example of how high-performance
computing can be done sustainably.
00:02:31
Supercomputers like HPC6 have a pretty
hefty energy impact for two main reasons:
00:02:37
first, they need energy to run,
00:02:39
and second, they need energy to power
the systems that cool them down.
00:02:43
But HPC6 is located at Eni’s Green
Data Center, which efficiently controls
00:02:48
the temperature of its computers
using direct liquid cooling.
00:02:52
That means they’re running a solution
00:02:54
that’s weirdly similar to antifreeze
close to the computer chip.
00:02:58
This antifreeze solution pulls
heat from the computer directly,
00:03:02
which is more efficient than letting
it just spread out into the air.
00:03:05
Then, the facility can even redirect some of the
00:03:07
heat from the computer to other parts of
the building when the weather is cold.
00:03:11
So even though it’s one of the most
powerful computers in the world,
00:03:15
HPC6 has an astonishingly low energy impact.
00:03:19
It uses a third or less of the power of
most of the other Top 5 supercomputers!
00:03:24
Speaking of, at number four on the list,
00:03:26
we have Eagle, with a power of 561.2 PFlops.
00:03:32
And the cool thing about this one is that
00:03:34
it’s not just used for specialized
tech and research applications.
00:03:38
It’s a supercomputer you can
access from your own couch.
00:03:42
That’s because Eagle is part of
Microsoft’s Azure cloud computing services,
00:03:47
which provide computing power to a
large number of individual users.
00:03:51
“Cloud” computers aren’t based in one location.
00:03:54
Instead, their servers, storage,
and software are in the cloud,
00:03:57
or accessed using the internet.
00:04:00
This is useful because cloud computers
00:04:02
can split up a lot of power into smaller pieces
00:04:05
that can be sent to customers as needed.
00:04:08
See, huge computers like
Eagle have tons of “cores”:
00:04:12
distinct computing units that can
be assigned to different tasks.
00:04:15
Eagle has more than 2 million of them
in data centers around the world,
00:04:20
and each core can do calculations on its own.
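For a toy picture of what splitting work across cores looks like, here is a minimal Python sketch using the standard multiprocessing module. It has nothing to do with Azure’s actual scheduling; it just divides one big job into chunks that separate cores crunch independently.

```python
# A toy sketch of the basic idea: split one job into chunks and
# let separate cores work on them independently.
from multiprocessing import Pool

def crunch(chunk):
    # stand-in for a real workload: sum the squares of a range of numbers
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    chunks = [range(i, i + 1_000_000) for i in range(0, 8_000_000, 1_000_000)]
    with Pool(processes=8) as pool:      # 8 cores here; Eagle has millions
        partial_sums = pool.map(crunch, chunks)
    print(sum(partial_sums))
```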
00:04:22
If a user needs more than one core, those cores
need to be able to communicate efficiently.
00:04:27
And since we’re in “computer time”, on the
scale of trillions of calculations per second,
00:04:32
this means fast.
00:04:33
Eagle does this by using a
networking method called InfiniBand
00:04:36
that connects servers and individual computers.
00:04:39
While many networks become slower
00:04:41
as they try to process more and more data
00:04:43
at the same time, InfiniBand sends
packets of data one at a time,
00:04:48
allowing data to move through the
network quickly and efficiently.
00:04:51
This communication makes a wide
variety of applications possible,
00:04:55
from training machine learning
models to setting up a huge database.
00:04:58
So Eagle’s power comes not just from
its ability to do calculations quickly
00:05:02
but from its ability to communicate
those calculations quickly with itself.
00:05:07
The next three computers on the list
take power to a whole new level.
00:05:10
They’re all “exascale” computers, meaning they can
perform on the order of 10^18 floating point operations per second.
00:05:16
And at 1,012 PFlops, the world’s
third most powerful computer
00:05:21
is Aurora at Argonne National
Lab in Lemont, Illinois.
00:05:25
This computer is nearly twice as powerful as Eagle.
00:05:28
One reason for its performance
is its standout GPU.
00:05:31
A computer’s GPU and CPU …
00:05:33
the Graphics Processing Unit
and Central Processing Unit …
00:05:37
act as the computer’s brain.
00:05:39
They’re the things doing all those
calculations and making the computer compute.
00:05:43
But while the CPU processes a wide
variety of tasks one after the other,
00:05:47
the GPU is more specialized.
00:05:50
It’s particularly good at doing a
lot of the same task in parallel,
00:05:54
or at the same time.
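Here is a loose analogy in Python with NumPy: the loop applies one formula to each number in turn, the way a CPU steps through tasks, while the vectorized line applies the same formula to every number at once, which is the style of work GPUs are built for. It is an illustration of the idea, not actual GPU code.

```python
# Same math done one-at-a-time (CPU-style) versus all-at-once (GPU-style).
import numpy as np

values = np.random.rand(1_000_000)

# One at a time: apply the formula to each value in sequence
slow = [3.0 * v + 1.0 for v in values]

# All at once: the identical formula applied to every value together
fast = 3.0 * values + 1.0
```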
00:05:55
And as it happens, this is useful
for a lot of scientific applications,
00:05:59
which are Aurora’s bread and butter.
00:06:02
Argonne and other labs use Aurora as
a resource for computational research,
00:06:06
which can leverage supercomputer
power in a few different ways.
00:06:09
First, Aurora can be used to analyze data.
00:06:12
We have tons of scientific data out there,
00:06:14
ranging from telescope images to genome sequences,
00:06:17
and it takes lots of computing
power to make that data usable
00:06:21
or get any information from it.
00:06:23
Second, Aurora can be used to generate data.
00:06:26
For example, simulating a system
00:06:28
we don’t have data for at the required scale,
00:06:31
like the constant tiny motions
of molecules in our body,
00:06:34
and how they interact with one another.
00:06:36
And third, Aurora can be used
to make predictions from data.
00:06:39
Training machine learning models
on existing data can find patterns
00:06:42
that let us use what we know to make
educated guesses about what we don’t.
00:06:46
All three of these applications
involve doing the same thing
00:06:50
over and over and over again
on slightly different inputs,
00:06:54
making them perfect tasks for a good GPU.
00:06:57
And since the hardware is up to the task,
00:06:59
researchers using Aurora are building tools
00:07:02
that make the best use of the
computer’s power and speed.
00:07:05
For example, one of these tools is a
machine learning model called MProt.
00:07:09
Scientists can use it to study
proteins, important molecules
00:07:12
that allow cells and organs to function.
00:07:14
Being able to change proteins or make new ones
00:07:16
can be helpful for things like
improving agriculture, treating disease,
00:07:21
and even breaking down plastic in the environment.
00:07:24
Of course, running machine learning models
00:07:25
can take a lot of time and
a lot of computing power.
00:07:29
Luckily, Aurora has plenty of the latter,
00:07:31
and MProt is designed to
run tasks at the same time.
00:07:34
This makes it faster and easier for
scientists to get the predictions they need.
00:07:39
We’re getting close to the
fastest computer on Earth.
00:07:41
But before I tell you about
the last two, a quick ad.
00:07:45
Thanks to Saily for supporting this SciShow video!
00:07:47
Saily is an eSIM app that helps you stay connected
00:07:50
with a local SIM card in over 200 places
around the world from a single installation.
00:07:56
That’s the power of eSIMs. Just download
an app, pick a plan, and install.
00:08:01
It’ll activate instantly when
you land in that cool new spot.
00:08:04
You can choose between a global or regional plan
00:08:07
depending on where your travels will take you.
00:08:10
And all Saily eSIM plans are
compatible with iOS and Android devices.
00:08:14
If it doesn’t work on your
phone, you get a full refund.
00:08:18
And even if it does, you have
a 30-day money-back guarantee.
00:08:21
And there’s chat support available 24/7
for help at any point along that process.
00:08:27
Even SciShow’s Staff Writer went on a weekend trip
00:08:29
that would have been a lot smoother with Saily.
00:08:32
She took a Spanish rideshare
from Madrid to Salamanca.
00:08:36
But at the end of the trip, she couldn’t find
the meetup spot for the car back to Madrid!
00:08:40
The driver said they were across town.
00:08:42
So she had to pull up a map
fast with no time to find wifi.
00:08:46
After spending more money than
she would have if she had Saily,
00:08:49
she made it home.
00:08:50
You don’t have to be like Emma.
00:08:51
You can download the Saily app
and use code SCI at checkout
00:08:55
for an exclusive 15% discount
on Saily eSIM data plans.
00:09:00
Aurora’s power is beaten
out by the number two computer.
00:09:03
Frontier, at Oak Ridge National Lab in Tennessee,
00:09:07
clocks in at 1,353 PFlops.
00:09:11
Like Aurora, it’s used to
accelerate scientific research.
00:09:15
But it beats Aurora in power
because a computer’s power
00:09:18
is about more than just the computer itself.
00:09:21
On top of the energy, space, and cooling
systems it takes to run a supercomputer,
00:09:26
its power also relies on software
that uses that hardware effectively.
00:09:30
An inefficient application slows the
computer’s performance way down.
00:09:34
To figure out the best software
implementations for Frontier,
00:09:38
Oak Ridge National Lab launched the Center
for Accelerated Application Readiness,
00:09:42
or CAAR, project.
00:09:44
This project studied a variety of
scientific software across different fields,
00:09:49
from biology to physics, to
optimize the use of Frontier.
00:09:52
One example is CHOLLA, a software package
00:09:55
for simulating the behavior
of gases inside galaxies.
00:09:59
This type of simulation can
help scientists understand
00:10:01
how different types of galaxies formed over time,
00:10:04
and predict the fates of stars
and supernovae within them.
00:10:08
CHOLLA is optimized for
parallel computing on GPUs,
00:10:11
which is already pretty good
for a scientific supercomputer.
00:10:15
But to really take advantage of
Frontier’s hardware and organization,
00:10:19
scientists had to make a few adjustments.
00:10:21
Originally, CHOLLA split its
calculations between the GPU and CPU.
00:10:25
This meant that most of the data
stayed on the CPU side of things
00:10:28
while the program ran,
00:10:29
and the program needed to spend extra
time sending data back and forth.
00:10:32
But with exascale computers like Frontier,
00:10:35
the GPUs are usually the star of the show,
00:10:37
and it’s more efficient to not
make them share the limelight.
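Here is a minimal sketch of that idea in Python, assuming the CuPy library and a compatible GPU are available; it is not CHOLLA’s actual code. The point is to copy the data to the GPU once, do all the repeated work there, and only bring the final answer back.

```python
# A minimal sketch of keeping data on the GPU to avoid costly transfers.
import numpy as np
import cupy as cp

gas_density = np.random.rand(256, 256, 256).astype(np.float32)

# Copy the data to GPU memory once...
gas_density_gpu = cp.asarray(gas_density)

# ...then do every step of the calculation there, instead of shuttling
# intermediate results between the CPU and GPU on every timestep.
for _ in range(100):
    gas_density_gpu = gas_density_gpu * 0.99 + 0.01  # stand-in for a real physics update

# Only the final result comes back to the CPU.
result = cp.asnumpy(gas_density_gpu)
```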
00:10:40
So in a new version of CHOLLA, researchers
00:10:43
moved data storage and multiple
calculations over to the GPU,
00:10:47
which was quite a task.
00:10:49
It involved some heavy
editing of the original code,
00:10:52
and writing a whole new software package!
00:10:54
But their hard work paid off,
because the new version of CHOLLA
00:10:57
could run on Frontier 8 times faster than before.
00:11:01
Thanks to the optimizations,
00:11:03
the code now works with Frontier’s
hardware to get lightning-fast performance,
00:11:07
helping scientists simulate galaxies
at a rate that’s out of this world.
00:11:11
And that brings us to the world’s
most powerful supercomputer:
00:11:15
the aptly named El Capitan,
with 1,742 PFlops of power.
00:11:21
El Capitan is also a supercomputer used
for scientific research at national labs.
00:11:26
But it edges out Frontier and Aurora
because it’s just built different.
00:11:30
See, its cores use an architecture
00:11:32
that combines the powers of
the CPU and the GPU into one –
00:11:36
the APU, or Accelerated Processing Unit.
00:11:40
Combining them makes it easier
to send calculations to each GPU,
00:11:44
and makes parallel computations a lot faster.
00:11:47
This boost in speed opens doors
for even heftier applications,
00:11:51
including one known as “Cognitive
simulation”, or “CogSim”.
00:11:54
This application involves three
parts: analyzing experimental data,
00:11:59
running simulations, and
training machine learning models.
00:12:02
That’s all the stuff Aurora can do.
00:12:04
The difference with El
Capitan is that it’s working
00:12:07
to make machine learning predictions more accurate
00:12:09
by doing all of those things together.
00:12:12
Scientists at El Capitan’s home
lab in Livermore, California,
00:12:17
use simulation, machine learning,
and experiment in a cycle.
00:12:20
Experimental data and simulated
data help train models,
00:12:24
which can be used to predict experimental
results and improve simulation.
00:12:27
Those predictions then go back
into running new simulations
00:12:30
and designing new experiments,
00:12:32
and the cycle continues.
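In very simplified Python, that loop might look something like this. Every function and number below is a stand-in, not the real CogSim workflow; it just shows the shape of the cycle.

```python
# A self-contained, toy version of the experiment -> simulation -> model loop.
import random

def run_experiment(setting):
    # stand-in for a real experiment: a noisy measurement of some quantity
    return setting * 2.0 + random.gauss(0, 0.1)

def run_simulation(setting, model_guess):
    # stand-in for a physics simulation guided by the current model
    return setting * model_guess

model_guess = 1.0   # the "machine learning model", reduced to a single number
setting = 0.5       # the "experimental design", reduced to a single knob

for _ in range(5):
    measured = run_experiment(setting)                     # experiment produces data
    simulated = run_simulation(setting, model_guess)       # simulation produces data
    model_guess += 0.5 * (measured - simulated) / setting  # "training": nudge the model toward the data
    setting = min(1.0, setting + 0.1)                      # predictions steer the next experiment

print(f"final model guess: {model_guess:.2f}")  # drifts toward the true value of 2.0
```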
00:12:33
With this iterative process, each new step
00:12:36
doesn’t just provide more insight
into the system being studied,
00:12:39
it also slowly improves the
methods we use to study it.
00:12:43
They call this process cognitive simulation
00:12:45
because it uses machine
learning to more systematically,
00:12:48
or “intelligently”, improve simulations.
00:12:50
And implementing a more efficient
way to blend together simulation
00:12:55
and experiment using machine learning
doesn’t just inform new experimental designs;
00:12:59
it can also automate them.
00:13:01
One group used CogSim during data collection on
how lasers interact with high-energy plasmas,
00:13:06
basically swimming pools of dense,
super hot electrons that resemble stars.
00:13:11
The idea was to help scientists
understand the extreme environments
00:13:15
that facilitate things like nuclear fusion.
00:13:17
Using CogSim models sped up experiments
00:13:20
by automatically adjusting the lasers
00:13:22
and iterating on each previous experiment,
00:13:24
which helped scientists build better
models out of their data in real time.
00:13:28
As you might expect, looping through
these steps is a long and slow process.
00:13:33
But El Capitan’s APU gives it a power boost
that lets scientists speed up the loop,
00:13:38
iterating through the pieces
of scientific discovery
00:13:41
faster than any previous computers could.
00:13:43
These five sports cars of the computing world
00:13:46
are making huge strides in
technology and scientific research,
00:13:50
allowing us to learn more things
more efficiently than ever before.
00:13:53
They still take up entire rooms,
00:13:56
but we’ve come a long way
from the basic calculator.
00:13:58
And the Top500 list is
constantly changing! Next year,
00:14:02
a new supercomputer might dethrone El Capitan.
00:14:05
The bigger they are, the harder they Pflop.
00:14:07
[outro]