00:00:00
[Applause]
Good morning and welcome to Build. Here we are in 2025, building out this open agentic web at scale. We're going from a few apps with vertically integrated stacks to more of a platform that enables this open, scalable agentic web.
00:00:25
More importantly, it's all about expanding that opportunity for developers across every layer. We have a bunch of new updates we're rolling out at Build, starting with Visual Studio. It is the most powerful IDE for .NET and C++, and we're making it even better: .NET 10 support, live preview at design time, improvements to Git tooling, a new debugger for cross-platform apps, and much, much more.
00:00:57
And when it comes to VS Code, just a couple of weeks ago we shipped our 100th release in the open. It included improved multi-window support and made it easier to view staged changes directly from within the editor.
00:01:16
And GitHub continues to be the home for developers. GitHub Enterprise has tremendous momentum in the enterprise, and we're doubling down for developers building any application. Trust, security, compliance, auditability, and data residency are even more critical today.
00:01:39
As GitHub Copilot has evolved inside VS Code, AI has become central to how we code. That's why we're open sourcing Copilot in VS Code. We're really excited about this. Starting today, we will integrate these AI-powered capabilities directly into the core of VS Code, bringing them into the same open-source repo that powers the world's most loved dev tool.
00:02:09
In fact, we're building app modernization right into agent mode. Copilot is now capable of upgrading frameworks, say from Java 8 to Java 21 or from .NET 6 to .NET 9, and of migrating any on-premises app to the cloud.
00:02:25
The next thing we're introducing is an autonomous agent for site reliability engineering, or SRE. The SRE agent automatically starts triaging, root-causing, and mitigating the issue, and then it logs the incident management report as a GitHub issue with all the repair items. From there, you can even assign those repair items to GitHub Copilot.
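The keynote doesn't show the agent's internals, but the hand-off it describes, logging the incident report as a GitHub issue, can be sketched against GitHub's public REST API. In this minimal Python sketch the repository name, token variable, and repair items are placeholders.

```python
import os

import requests

# Minimal sketch: log an incident-management report as a GitHub issue.
# The repo path and GITHUB_TOKEN are placeholders; repair items become a task list.
def file_incident_issue(repo: str, title: str, summary: str, repair_items: list[str]) -> int:
    body = summary + "\n\n### Repair items\n" + "\n".join(f"- [ ] {item}" for item in repair_items)
    resp = requests.post(
        f"https://api.github.com/repos/{repo}/issues",
        headers={
            "Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}",
            "Accept": "application/vnd.github+json",
        },
        json={"title": title, "body": body, "labels": ["incident"]},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["number"]

issue = file_incident_issue(
    "contoso/checkout-service",
    "Incident: elevated 5xx rate on checkout API",
    "Root cause: connection pool exhaustion after deploy 1.42.0; mitigated by rollback.",
    ["Raise pool size ceiling", "Add saturation alert", "Add regression test for pool reuse"],
)
print(f"Filed issue #{issue}")
```

Each unchecked repair item in an issue like that is then something you can hand off to the coding agent described next.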
00:02:50
Copilot is now a full coding agent built right into GitHub, taking it from being a pair programmer to a peer programmer. You can assign issues to Copilot, whether bug fixes, new features, or code maintenance, and it'll complete these tasks autonomously. And today, I'm super excited that it's now available to all of you.
00:03:17
I don't think we've had an update of this level since Teams launched. It really brings together chat, search, notebooks, create, and agents into one scaffolding that's intuitive; I always say this is the UI for AI. Chat, for example, is grounded both on web data and on your work data, and that's the game changer, especially with pages. Search works across all of your applications, whether it's Confluence or Google Drive or Jira or ServiceNow, not just M365 data.
00:03:56
With notebooks, I can now create heterogeneous collections of data. In fact, I can have chats, pages, documents, and emails all in one collection, and then get audio overviews or podcasts out of it. And I can use create to turn a PowerPoint into a new explainer video or to generate an image.
00:04:22
And when it comes to agents, we have a couple of special agents, like Researcher. It's been perhaps the biggest game changer for me, because it synthesizes across both web and enterprise sources, applying deep chain-of-thought reasoning to any topic or any project. Analyst goes from raw data across multiple source files: I can just upload a bunch of Excel files, and it will get the insights, do the forecasts, and do all the visualizations.
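To make concrete the kind of work Analyst automates from a single prompt, here is a minimal pandas sketch of the same flow done by hand: load a folder of Excel files, summarize them, fit a naive trend forecast, and plot. The file layout and the column names (month, revenue) are made up for illustration.

```python
from pathlib import Path

import matplotlib.pyplot as plt
import numpy as np
import pandas as pd

# Illustrative only: the Analyst agent does this kind of work from a prompt.
# The folder, file names, and columns ('month', 'revenue') are invented here.
frames = [pd.read_excel(p) for p in Path("uploads").glob("*.xlsx")]
df = pd.concat(frames, ignore_index=True)

monthly = df.groupby("month", as_index=False)["revenue"].sum().sort_values("month")
print(monthly.describe())  # quick insight: distribution of monthly revenue

# Naive linear-trend forecast for the next three periods.
x = np.arange(len(monthly))
slope, intercept = np.polyfit(x, monthly["revenue"], 1)
future_x = np.arange(len(monthly), len(monthly) + 3)
forecast = slope * future_x + intercept

plt.plot(x, monthly["revenue"], label="actual")
plt.plot(future_x, forecast, "--", label="forecast")
plt.legend()
plt.savefig("revenue_forecast.png")
```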
00:04:51
All of the agents you build can now show up in Teams and in Copilot, and you can ask questions, assign action items, or kick off a workflow just by at-mentioning an agent in a chat or a meeting.
00:05:05
And with the Teams AI library, building multiplayer agents is easier than ever. It now supports MCP, and with just one line of code you can even enable A2A. You can also add things like episodic or semantic memory by using Azure AI Search and a new retrieval system, which I'll talk about later.
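The Teams AI library's exact API isn't shown here; as a generic sketch of what "episodic memory behind a retrieval system" means, the snippet below stores past interactions as vectors and recalls the closest ones for a new query. The embed() function is a stand-in, not a real embedding service, and none of this is the library's actual surface.

```python
import numpy as np

# Generic sketch of episodic memory backed by a retrieval system.
# embed() is a placeholder for whatever embedding model/service you use;
# in the stack described here, that role is played by Azure AI Search plus a model.
def embed(text: str) -> np.ndarray:
    rng = np.random.default_rng(abs(hash(text)) % (2**32))  # stand-in, NOT a real embedding
    v = rng.standard_normal(128)
    return v / np.linalg.norm(v)

class EpisodicMemory:
    def __init__(self) -> None:
        self.episodes: list[tuple[str, np.ndarray]] = []

    def remember(self, text: str) -> None:
        self.episodes.append((text, embed(text)))

    def recall(self, query: str, k: int = 3) -> list[str]:
        q = embed(query)
        scored = sorted(self.episodes, key=lambda e: float(e[1] @ q), reverse=True)
        return [text for text, _ in scored[:k]]

memory = EpisodicMemory()
memory.remember("User prefers weekly status summaries on Fridays.")
memory.remember("Deployment of service X was rolled back on May 2.")
print(memory.recall("When should I send the status summary?"))
```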
00:05:27
And as a developer, you can now publish, and this is the biggest thing: you can build an agent, publish it to the agent store, and have it discovered and distributed across both Copilot and Teams, giving you access to hundreds of millions of users and unlocking that opportunity.
00:05:46
Today, we're introducing a new class of enterprise-grade agents you can build using models fine-tuned on your company's data, workflows, and style. We call it Copilot Tuning. Copilot can now learn your company's unique tone and language, and soon it'll go even further, understanding all of the company's specific expertise and knowledge. All you need to do is seed the training environment with a small set of references and kick off a training run. The customized model inherits the permissions of all the source content, and once integrated into the agent, it can deploy to authorized users.
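How that permission inheritance is enforced isn't spelled out in the keynote. One common pattern, sketched below purely as an assumption, is to carry an access-control list with every source document and filter retrieved context by the requesting user's groups before the tuned model ever sees it.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of permission-aware retrieval: documents carry the ACL of
# their source, and anything the requesting user can't read is filtered out
# before it reaches the model. This is an assumed pattern, not the product's API.
@dataclass
class SourceDoc:
    doc_id: str
    text: str
    allowed_groups: set[str] = field(default_factory=set)

def visible_context(docs: list[SourceDoc], user_groups: set[str]) -> list[SourceDoc]:
    return [d for d in docs if d.allowed_groups & user_groups]

docs = [
    SourceDoc("hr-001", "Compensation bands for 2025...", {"hr"}),
    SourceDoc("eng-042", "Deployment runbook for checkout...", {"engineering", "sre"}),
]
print([d.doc_id for d in visible_context(docs, {"engineering"})])  # -> ['eng-042']
```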
00:06:32
Our new model router will automatically choose the best OpenAI model for the job; no more manual model selection. Our approach today goes from having the apps or agents you build bind to only one model to having them become truly multi-model. That's why today we're thrilled to announce that Grok from xAI is coming to Azure.
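The router's actual selection policy isn't described, so here is only a toy sketch of the idea: pick the cheapest deployed model whose capability tier satisfies the request. The model names, tiers, and prices are placeholders.

```python
from dataclasses import dataclass

# Toy model router: pick the cheapest deployment whose capability tier is
# sufficient for the request. Names, tiers, and prices are placeholders,
# not the actual routing policy used by the service.
@dataclass
class Deployment:
    name: str
    tier: int            # higher = more capable
    price_per_1k: float  # arbitrary relative cost

DEPLOYMENTS = [
    Deployment("small-fast-model", tier=1, price_per_1k=0.1),
    Deployment("general-model", tier=2, price_per_1k=0.5),
    Deployment("reasoning-model", tier=3, price_per_1k=2.0),
]

def route(required_tier: int) -> Deployment:
    candidates = [d for d in DEPLOYMENTS if d.tier >= required_tier]
    return min(candidates, key=lambda d: d.price_per_1k)

print(route(required_tier=1).name)  # short summarization -> cheapest model
print(route(required_tier=3).name)  # multi-step reasoning -> most capable model
```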
00:07:01
When you have multiple models, what you need is a new capability in how you use them. Now you can provision throughput once on Foundry and use all of that provisioned throughput across multiple models, including Grok. That's just a game changer in terms of how you think about models and model provisioning.
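To illustrate what "provision once, spend across models" means operationally, here is a minimal sketch of a single capacity pool drawn on by several model deployments. The numbers and the accounting scheme are assumptions, not how Foundry actually meters provisioned throughput.

```python
import threading

# Illustrative only: one shared capacity pool drawn on by every model deployment,
# instead of a separate quota per model. The accounting scheme is an assumption.
class SharedThroughput:
    def __init__(self, tokens_per_minute: int) -> None:
        self.capacity = tokens_per_minute
        self.used = 0
        self.lock = threading.Lock()

    def try_spend(self, model: str, tokens: int) -> bool:
        with self.lock:
            if self.used + tokens > self.capacity:
                return False  # over the shared budget for this window
            self.used += tokens
            print(f"{model}: spent {tokens} tokens ({self.used}/{self.capacity})")
            return True

pool = SharedThroughput(tokens_per_minute=10_000)
pool.try_spend("gpt-style-model", 4_000)
pool.try_spend("grok-style-model", 5_000)
print(pool.try_spend("grok-style-model", 3_000))  # False: shared pool exhausted
```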
00:07:24
And the Foundry Agent Service lets you build declarative agents with just a few lines of code, right in the portal. For complex workflows it supports multi-agent orchestration, and I'm excited to share that the Agent Service is now generally available. We're making it straightforward, for example, to connect Foundry to your container app or your functions, and to deploy any open-source model into AKS, whether that's in the cloud or in hybrid mode with Arc.
00:07:54
And you can now take a model, fine-tune or post-train it in Foundry, and then drop it right into Copilot Studio, so that you can use that post-trained model to automate a workflow or build an agent.
00:08:07
The healthcare agent orchestrator that Stanford used is now available to everyone in Foundry. It's pretty awesome. We also have new observability features coming to Foundry to help you monitor and manage AI in production; you can track impact, quality, safety, and cost all in one place.
00:08:33
With Entra ID, agents now get their own identity, permissions, policies, and access controls. The agents you build in Foundry and Copilot Studio show up automatically in an agent directory in Entra, and we're also partnering with ServiceNow and Workday to bring automated provisioning and management to their agents via Entra.
00:08:54
And when it comes to data governance, Purview now integrates with Foundry, so when you write an agent, Purview automatically helps you ensure end-to-end data protection. That's another massive safety consideration. On the security side, Defender now integrates with Foundry too, which means your agents are protected by Defender just like an endpoint would be, from threats like wallet abuse or credential theft.
00:09:25
Now we want to bring the power of this app server and app-building capability to the edge and to clients as well, with Foundry Local, which we're announcing today. It includes a fast, high-performance runtime, models, agents as a service, and a CLI for local app development. And yes, it's fully supported on Windows and on the Mac.
00:09:54
We're also excited to announce Windows AI Foundry. Windows AI Foundry is what we in fact used ourselves internally to build features, and now we're extending this platform to support the full dev life cycle, not just on Copilot+ PCs but across CPUs, GPUs, and NPUs, and in the cloud, so you can build your application and have it run across all of that silicon. And Foundry Local is built into Windows AI Foundry, so you can tap into a rich catalog of pre-optimized open-source models that you can run locally on your device.
00:10:33
We're announcing native support for MCP in Windows. Windows will now include several built-in MCP servers, such as file system, settings, app actions, and windowing. And we're adding a native MCP registry that lets MCP-compatible clients discover secure MCP servers that have been vetted by us for security and performance, all while keeping you in control.
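As a rough sketch of what an MCP-compatible client does once it has found a server, here is a minimal example using the open-source MCP Python SDK's stdio client. The server command and the list_files tool are placeholders; a Windows built-in server would be discovered through the registry rather than launched by hand like this.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Rough sketch of an MCP client talking to a server over stdio.
# The server command and the tool name below are placeholders; a Windows
# built-in server would be discovered via the MCP registry instead.
async def main() -> None:
    params = StdioServerParameters(command="python", args=["example_mcp_server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("tools:", [t.name for t in tools.tools])
            # Call a hypothetical tool exposed by the example server.
            result = await session.call_tool("list_files", arguments={"path": "."})
            print(result)

asyncio.run(main())
```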
00:11:04
We first announced Bash on Ubuntu on Windows nearly 10 years ago; it subsequently became what we now call WSL. Today, we're making WSL fully open source.
00:11:21
And we're announcing NLWeb today; you should all go check out the code in the GitHub repo. It's a way for anyone who already has a website or an API to very easily make that website or API an agentic application.
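The protocol details live in the NLWeb repo; purely as a hypothetical illustration of the idea, here is a client asking a site a natural-language question and getting structured results back. The endpoint path, parameters, and response shape are assumptions, not NLWeb's actual spec.

```python
import requests

# Hypothetical illustration only: a client asking a site's natural-language
# endpoint a question and receiving structured results. The "/ask" path,
# parameters, and response shape are assumptions; see the GitHub repo for
# the actual protocol.
resp = requests.get(
    "https://example-store.com/ask",
    params={"query": "vegan hiking boots under $150"},
    timeout=30,
)
resp.raise_for_status()
for item in resp.json().get("results", []):
    print(item.get("name"), "-", item.get("url"))
```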
00:11:39
We're integrating Cosmos DB directly into Foundry, which means any agent can store and retrieve things like conversation history, and soon they'll also be able to use Cosmos for all of their RAG application needs. And we're taking it further with Azure Databricks, connecting your data in Genie spaces or in AI functions to Foundry.
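The Foundry integration wires this up for you; as a hedged sketch of the underlying pattern, here is conversation history stored and queried with the azure-cosmos Python SDK. The endpoint, key, database, and container names are placeholders.

```python
import os
import uuid

from azure.cosmos import CosmosClient

# Hedged sketch of the underlying pattern: conversation turns as items in a
# Cosmos DB container, keyed by session. Endpoint, key, and names are placeholders;
# the Foundry integration handles this for agents automatically.
client = CosmosClient(os.environ["COSMOS_ENDPOINT"], credential=os.environ["COSMOS_KEY"])
container = client.get_database_client("agent-memory").get_container_client("messages")

def remember_turn(session_id: str, role: str, content: str) -> None:
    container.upsert_item(
        {"id": str(uuid.uuid4()), "sessionId": session_id, "role": role, "content": content}
    )

def load_history(session_id: str) -> list[dict]:
    query = "SELECT c.role, c.content FROM c WHERE c.sessionId = @sid"
    return list(container.query_items(
        query=query,
        parameters=[{"name": "@sid", "value": session_id}],
        enable_cross_partition_query=True,
    ))

remember_turn("session-1", "user", "What changed in the last deployment?")
print(load_history("session-1"))
```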
00:12:04
The other very cool capability is that now, inside a PostgreSQL query, you can have LLM responses directly integrated.
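The exact SQL surface isn't shown in the keynote, so treat this as a purely hypothetical sketch of what an LLM response inside a query can look like: a generate() call applied per row. The azure_ai.generate function name, the extension, and the connection string are assumptions for illustration, not the actual Azure Database for PostgreSQL API.

```python
import psycopg

# Purely hypothetical sketch: calling an LLM from inside a SQL query.
# azure_ai.generate() is an ASSUMED function name used for illustration only;
# the real extension and function names may differ. Connection string is a placeholder.
with psycopg.connect("postgresql://user:pass@myserver.postgres.database.azure.com/reviews") as conn:
    with conn.cursor() as cur:
        cur.execute(
            """
            SELECT id,
                   azure_ai.generate('Summarize this review in one sentence: ' || body) AS summary
            FROM product_reviews
            LIMIT 5;
            """
        )
        for row in cur.fetchall():
            print(row)
```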
00:12:13
We're bringing Cosmos DB to Fabric too, because AI apps need more than just structured data; they need semi-structured data, whether that's text, images, or audio. With Cosmos in Fabric and your data instantly available alongside SQL, you can now unify your entire data estate and make it ready for AI.
00:12:35
And there's a lot more. In fact, we're even building our digital twin builder right into Fabric. Now you can very easily build digital twins with no code or low code; as you can see here, you can map the data from your physical assets and systems super fast.
00:12:53
We're also announcing shortcut transformations in OneLake. You can think of this as AI-driven ETL: you can apply pre-built, AI-powered transformations such as audio-to-text, sentiment analysis, or summarization as the data is coming in, all powered by Foundry, straight into Fabric.
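As a generic sketch of what AI-driven ETL means, the snippet below applies pre-built transformations (summarization and sentiment, with trivial stand-ins for Foundry-hosted models) to rows as they land, before they are written to the destination table. The row shape and transforms are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Callable

# Generic sketch of "AI-driven ETL": pre-built transformations applied to rows
# as they land, before they're written to the destination table. The transform
# functions here are trivial stand-ins for Foundry-hosted models.
@dataclass
class Row:
    review_id: str
    text: str
    summary: str = ""
    sentiment: str = ""

def summarize(text: str) -> str:   # stand-in for a summarization model call
    return text[:60] + ("..." if len(text) > 60 else "")

def sentiment(text: str) -> str:   # stand-in for a sentiment model call
    return "positive" if "great" in text.lower() else "neutral"

TRANSFORMS: list[Callable[[Row], None]] = [
    lambda r: setattr(r, "summary", summarize(r.text)),
    lambda r: setattr(r, "sentiment", sentiment(r.text)),
]

def ingest(rows: list[Row]) -> list[Row]:
    for row in rows:
        for transform in TRANSFORMS:
            transform(row)
    return rows

print(ingest([Row("r1", "Great battery life and a really great screen on this laptop.")]))
```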
00:13:18
So, in fact, the largest GB200-based supercomputer is going to be Azure, and we're very, very excited about scaling this and making it available to all of you as developers.
00:13:27
We're bringing together the entire stack I talked about today and applying it to science, to the scientific workflow and the scientific process. That's our ambition with Microsoft Discovery, which we're announcing today. It understands the nuanced knowledge of the scientific domain, both from the public domain and from your own data if you're, say, a biopharma company. Discovery is built on Foundry, bringing advanced agents that are highly specialized in R&D, not just for reasoning but for conducting research itself.
00:14:08
It's great to see how companies across every industry are already using Discovery to accelerate their R&D, and I can't wait to see it in the hands of more R&D labs everywhere and to see what they can do.
00:14:21
So that was a quick, comprehensive, whatever you want to call it, walk through the full stack and how we're creating new opportunity for you across the agentic web. We're really taking a systems approach, a platform approach, which is what you can expect from Microsoft, across every layer of the stack.