Originally aired on May 23 @ 12:00 PM - 12:30 PM EDT
In this episode, host João Tomé is joined by Cloudflare Product Managers Nevi Shah and Dina Kozlov to dive into the world of MCP (Model Context Protocol) and why it’s gaining traction fast.
We explore what MCP is, why it matters now, and how developers can easily spin up their own MCP servers on Cloudflare. The episode includes live demos, practical use cases, and a look at how companies like Stripe, PayPal, and Anthropic are getting involved.
We also cover the latest from the AI and agent ecosystem, including announcements from Microsoft Build, Google I/O, OpenAI’s new responses API, and Anthropic’s growing MCP support.
To wrap things up, Nevi and Dina answer a round of fast questions:
- What’s the biggest technical challenge with MCP today?
- Best surprise you’ve seen from a developer?
- Favorite new server or integration?
- And: Why should people try MCP right now?
We also highlight recent posts on the Cloudflare blog.
Three, two, one. And what's amazing is that when you're interacting with an MCP server, especially one that's, for example, built on Cloudflare that has memory, it can keep track of the conversation history.
It can understand you better.
So every time that you make a tool call, it gives you better and better results.
But the really cool thing is that, you know, MCP plus LLM, it can explain to you as a user, like, hey, Dina, you like to do this and that.
Do you want to be able to do all of these things?
And it's tailored to me, to my use case, to the projects that I've been building.
It's exciting. It's an exciting time. And I think when you start building with agents, when you start kind of understanding, like, the future and the possibilities, people's eyes really, really open.
You kind of realize, like, wow, this is the future.
Like, this is where we're headed. Hello, everyone, and welcome to This Week in NET.
It's the May 23rd, 2025 edition. I'm your host, João Tomé, based in Lisbon, Portugal.
Today, we're going again to talk about one of the hottest things happening in AI and developer platforms, MCP, or Model Context Protocol.
And with me, I have Nevi Shah and Dina Kozlov from our product team.
Thanks for joining. Thank you for having us. Where are you both based?
We're both usually based in New York, but I'm actually in San Francisco this week.
For those who don't know, can you give us, like, a very quick run-through of what is your current job at Cloudflare?
What do you do specifically? Dina?
So I'm a product manager on the developer platform team. These last few months, since MCP launched,
I've been working with our engineering team to figure out how we make it as easy as possible to build and deploy MCP servers.
But aside from that, I also get to work on our product that lets you scale Workers.
It's called Workers for Platforms.
It allows a platform, a website builder for example, to extend our developer platform to their end customers so they can customize their web pages, APIs, et cetera.
Nevi? Yep.
And I am also a product manager on the Workers team. I worked on Pages for some time, but now focus more on Workers.
In our broader developer platform, I work specifically on growth, trying to drive more developers to our platform.
And so this concept and topic of MCP has really been interesting to me as it relates to how developers are building applications and what tools they're using today.
We had a few episodes, actually, a couple of episodes.
One about Developer Week that was in early April.
We did a recap of Developer Week and we had an episode with some use cases of MCP.
MCP is really recent. For those who don't know, what can we say about what is MCP?
Why is it so relevant and so many people are talking about it specifically?
So I think the best way to understand what it is is maybe to show it, to show what life is like without it.
So I'm sure a lot of the viewers have used some kind of AI tool like maybe ChatGPT or Claude or Cursor to help with some kind of task.
It's where a lot of people are turning today. And so let's say that I'm a new business owner and I need to figure out how to generate a PayPal invoice.
So I'm going to ask Claude, hey, can you help me generate a $2 invoice for socks?
And so Claude is really smart. It has a lot of information and knowledge built into its large language model.
Okay, well, it's going to try to code because it always tries to go above and beyond.
It just helped me figure out how to do it. But essentially what it's going to do is it's going to give me instructions on how I can do this.
It's going to give me step-by-step. You can do this, you can do that.
Here's the information that you need. How can I help you format this information?
But it's still up to me as the user to go and complete each one of these tasks to go interact with the PayPal service and do what I need to do.
Now, imagine I could have my AI agent like Claude integrated with my PayPal account.
So let's say I add an integration here.
And to really integrate into my PayPal account, I need to connect it to my login.
I log in, it then brings me back to Claude.
And imagine it now knows about my PayPal account.
And so now if I ask it for help with generating an invoice, an invoice for $2 socks... I've just done this exercise enough times that I know what it needs.
But imagine I ask it for help with this.
And now instead of giving me instructions, what it can actually do is because it's integrated with PayPal, it can actually go and generate that invoice for me.
Once it gets all the information it needs, it essentially gets me to the end result as quickly as possible.
So think of it as without MCP, AI agents are really good at telling you how to complete tasks.
But it's still up to you as the user to complete them.
With MCP, you get to integrate the large language model LLM with your favorite services so that it can actually do things for you.
So sorry, I was gonna say it generated the invoice. And here it is, which is pretty amazing.
I did not really have to go and figure anything out or even go to the PayPal dashboard or interface to be able to do this.
And so MCP servers, basically, if you saw on Dina's screen, there was a little box that popped up that says, you know, do you want to use this tool?
So MCP servers basically expose a set of tools to the LLM.
And based on what the user prompts, the LLM can pick which tools it thinks are applicable to whatever the user needs.
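For readers who want to see what that looks like in code, here is a minimal sketch of the client side using the MCP TypeScript SDK: connect to a remote server over SSE, list the tools it exposes, and call one. The server URL and the generate_invoice tool are placeholders, not a real PayPal integration, and exact SDK import paths can vary by version.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

// Placeholder endpoint for a remote MCP server exposed over SSE.
const transport = new SSEClientTransport(new URL("https://mcp.example.com/sse"));
const client = new Client({ name: "demo-client", version: "1.0.0" });

await client.connect(transport);

// The server advertises its tools (name, description, input schema);
// the LLM uses these descriptions to decide which tool fits the prompt.
const { tools } = await client.listTools();
console.log(tools.map((t) => `${t.name}: ${t.description}`));

// The client, usually driven by the LLM, then calls the chosen tool.
const result = await client.callTool({
  name: "generate_invoice", // hypothetical tool name for illustration
  arguments: { amount: 2, description: "socks" },
});
console.log(result);
```

In a client like Claude, that callTool step is what the "do you want to use this tool?" box is gating.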
After you set it up like you did, it's already connected. It's already authenticated, right?
So you don't need to do it every time. What you did is like an integration from the get-go.
After that, you have the process already in place. So it could be automatic.
You could do it for you every time. Correct. And you don't always have to use one that requires authentication.
There's some open MCP servers, for example, for searching through documentation where you don't need to log in.
But the nice thing about being able to log in is that any interactions with an MCP server, you'll be able to see them within your dashboard or the interface that you use for that service.
And it's going to stay consistent and have information about your account.
What's also nice about it is that you can use Claude, for example, from your laptop or from your phone.
And because it is all done over the Internet, you can just log in from any of the devices and maintain that integration.
You can do many things more than usual, even with your phone, because it's already integrated in your chat box.
Exactly. Think about, I think this is a really cool example of just how powerful natural language can be, right?
Think about what this could mean if, for example, I don't know, Expedia had an MCP server, you could literally say, Hey, Expedia, book me a flight.
It calls all the tools and you have a flight booked and you're going to Cancun tomorrow.
Right. So if you think about what Dina showed, but apply it to any of the other day-to-day services that you use as a consumer, that developers use as a consumer, any sort of industry, right?
MCP is a very applicable set of tools. I've seen recently even news organizations using it for their own content, their own archive.
Like you can explore that news organization archive with much detail using these types of tools, which is kind of interesting.
Also developers or libraries, archives to have the agent explore in depth what is there.
It can also be interesting even without an authentication specifically.
Yeah. So Nevi has been helping build out all the Cloudflare MCP servers.
And what I love about it is we have an MCP server that allows you to build applications on our developer platform.
And you can just talk to the agent in natural language.
You can say, Hey, I want a website that does this, or I want a service that does X, Y, and Z.
And you don't even have to know how to code.
It can write all of that code for you and deploy it. And so really take you to that end state.
And it instantly makes not just services, but skills like this accessible to a much broader audience.
Things that before would have taken maybe hours, weeks of training.
You now no longer need to have that expertise to be able to do something.
Yeah. It's like a superpower. Yeah, it is. Think about like, we think about this at Cloudflare too, right?
Cloudflare is a technical company, right?
There's technical products involved. A lot of developer documentation we have, it takes sometimes like lots of training to understand all of the services we offer.
So just being able to go into Claude and say, like, hey, what can DNS provide me?
What are the benefits of using my CDN? What configurations can I put on my CDN to make it more performant, to make it more secure?
Those are things that consumers can do.
But I also think it's really interesting to think about some of the use cases for different companies, right?
So just taking Cloudflare as an example, if I was working on an account team and I wanted to kind of give recommendations to my customer, right?
Or if I wanted to check the usage of a certain customer that I had, that's something that is actually not enabled right now because of authentication permissions, but a use case that we could kind of think about and companies could start thinking about in the future.
So super interesting to think about how that kind of relates to internal operations.
So not just customer teams, but also support teams, product teams, right?
There's a lot of insight that we can get super easily without having to be experts in like Salesforce and, you know, all of these different products that our customers are using.
So super, super neat to think about it from that aspect. And on that note, actually, Nevi, you wrote a blog post on May 1st about 13 new MCP servers from Cloudflare that people can use.
There's one that I like. I like many, but one is close to my heart, which is Radar.
It's there as well, but there's many others for people to explore.
So I actually, I got involved in the MCP server projects and kind of like deploying Cloudflare MCP servers because I was really interested in the developer persona, right?
So I kind of, as Dina was saying, I work a lot on bringing developers to tools that they need to deploy Workers applications.
MCP seems like a really, really good use case for that.
So we started building out a binding server. Workers has this concept of bindings where you can create and bind these primitives to your workers.
So like a database, a key value store.
And we started thinking about how MCP could sort of create tools so that you could just create those things within Claude.
And then from there we thought, well, what if I could, what if we could just create a container for a user and then they could literally just run and install any packages that they need.
They could run commands. They could talk to Git. And then we thought, well, what if, what else would a developer need, right?
They would need observability.
So now what you can do as a developer using our observability server is you can say, Hey, I deployed my worker, but I'm running into this weird error.
How can I fix it?
Right? Then it can read the workers logs and it can gather all of that information.
So we basically, without any rhyme or reason, we basically started creating MCP servers, right?
And I think now we're sort of thinking about what the strategy behind, well, how many MCP servers do we need and how many MCP servers are too many MCP servers and how many tools are too many tools.
But I think that the good part about this blog post is it kind of shows the broad use cases of everything that MCP can support.
So everywhere from our digital experience monitoring to our AI products, to our storage products, our DNS products, like all of those MCP servers are kind of ones that we launched today.
And we're really looking to expand our use cases and kind of see how people are using them.
But I will kind of just, maybe I can share my screen and just sort of show the different ones that we launched.
So yeah, the radar server is super cool. I think that's one that you actually don't need to be authenticated in, but I love sort of using the radar server to think about accessibility of websites.
So radar is basically something that can sort of scan a website and think about the underlying technology.
So sometimes I'll, on my team, on the growth team, we'll build a new external website, like a marketing site or something.
And I'll drop the link into Claude and I'll say, hey Radar, can you tell me what the accessibility is like?
How's the performance of this?
Do you have any recommendations of how I can make this better?
And so really cool tools like that, that are just super easily accessible to product teams and engineering teams.
So one of the other things that I think is really, yeah, sorry.
One of the things that I think is really interesting as well is as we were building out these MCP servers, we actually learned a lot about what makes a good MCP server, a good MCP server.
Like, so for example, when we first started building out the MCP server, we had a ton of tools and in this MCP server, we basically made this a one-to-one mapping of our API.
And then we realized with that, the LLM actually got pretty confused.
It would just call a bunch of random tools and it wouldn't actually be an effective output.
And so we realized it's not actually a one-to-one mapping of the API that you need.
It's not just a wrapper around your API. We're actually taking use cases and maybe calling multiple APIs, right?
Maybe we're just thinking about the jobs to be done.
What does the user actually want to do? And then thinking about whether the APIs would then support that use case.
And so we've instead kind of built out a bunch of MCP servers with a specific use case in mind.
We've also made authentication specific to different servers because different servers need different permissions, right?
You may want to have write permissions on workers for one server, but you may not want to have write permissions on another server for a different user.
And so I think kind of important to sort of separate that.
So if you notice when you're using some of these Cloudflare MCP servers, or when you're using any MCP server, you have to authenticate with every server that you use.
And I think the last thing that I would say that we learned is when you're actually building out these MCP servers, every tool has a description.
So we found that tool descriptions were super, super important when actually writing the functions of your MCP server.
So every time there was a tool, for example, for our binding server, creating a KV namespace is a tool: create KV namespace.
Then we basically had to write a super good tool description that would match what the user would maybe put in for their prompt, so that the LLM knows to go find the create KV namespace tool to use.
So it's really important that the tool descriptions, when you're building out your MCP servers, are good natural language and can match whatever the user is putting in as well.
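As a rough illustration of that point (a sketch with the MCP TypeScript SDK, not Cloudflare's actual bindings code; the tool name and wording are made up), the description you register alongside a tool is exactly the text the model matches against the user's prompt:

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

const server = new McpServer({ name: "bindings-demo", version: "1.0.0" });

// A terse description like "create kv ns" gives the model very little to match on.
// A natural-language description that mirrors how users actually ask works better.
server.tool(
  "create_kv_namespace", // hypothetical tool name for illustration
  "Create a new Workers KV namespace for storing key-value data. " +
    "Use this when the user asks to create, add, or set up a KV namespace or key-value store.",
  { title: z.string().describe("Human-readable name for the new KV namespace") },
  async ({ title }) => {
    // A real server would call the Cloudflare API here; this sketch just echoes.
    return { content: [{ type: "text", text: `Created KV namespace "${title}"` }] };
  }
);
```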
That helps the user, but also the LLM, right?
Exactly, exactly. And Dina, the blog post you wrote was more about how 10 leading AI companies built MCP servers on Cloudflare.
And there's a lot there. Entropic, we already mentioned, Asana, Atlassian, Block, Intercom, PayPal, already mentioned Stripe.
There's so many. But that was, like, May 1st. We're now several days later.
So things have changed quickly, right? Yes. So we had this initial launch of remote MCP servers.
And Anthropic, they launched the first big remote MCP client, claude.ai.
And before this, if you used MCP, it was very much bound to your device.
You had to find the MCP servers somewhere in a GitHub repo. You had to essentially almost install it locally, have some kind of API token.
And there was, of course, we started making progress in the remote world, but I know there was still a lot of skepticism.
And so I think having all of these companies adopt it, and having Anthropic put it out there and not even labeling it as MCP, but labeling it as integrations, really takes a big step toward having it be this service that is used by anybody, not just developers.
And by making it remote, it is a lot more secure because before that- The remote versus local?
So local means that it is running on your machine.
So before, if you used MCP, you would have to use it from Claude Code, for example.
And if I log into my Claude Code on this laptop, and then let's say I use another computer, my MCP configuration would not persist because it is bound to my device.
And local was important to get the protocol out there, to get it in developers' hands, to get it to start, to evolve into what it is today.
But it did have a lot of security concerns because a lot of developers were creating MCP servers for services that they didn't own.
And so lots of people were nervous. And especially anybody on a security team at a company is like, what MCP servers are my employees using?
It says this is a Figma MCP server, but it's not the one that Figma made.
How do I know that it's not just taking API tokens and doing something malicious and grabbing user data?
And this is where remote is now like everything runs over the Internet.
It is exposed over an endpoint. So it can be mcp.example.com.
But it's actually a lot more similar to how we trust a website or an API endpoint.
It's hosted by the actual service provider. It integrates with their actual authentication system and authorization.
And it removes a lot of these concerns. It's still great that people are developing these and that it's open source, but we're moving from a bit more of a playground to these being legitimate services and integrations built by the providers. So remote is more the standard.
It's more the typical way people use now. Exactly. Exactly.
And the only reason why it is not, why we're still talking about local versus remote is because a lot of MCP clients like Cursor, Windsurf have yet to upgrade to support it.
And so Claude is a bit ahead of the curve. It also comes from Anthropic, who developed the protocol.
But once they do support it natively, I think local is only going to be used in cases where you do want to grab data from a specific device.
For example, terminal logs, use cases like that. But that's going to be, I would say like the tail end.
For those who just want to try it out, be they developers or not, because this is not only for developers, for sure:
what would be the use case? We already showed one with PayPal, but what's the go-to if you want to try it out and see it in action? What can people potentially use?
Yes. I would honestly recommend starting with this. Like I said, my whole job these last few months has been to make it as easy as possible to deploy these, and our engineering team absolutely crushed it.
And so you can use this Deploy to Cloudflare button, which Nevi's team actually built out.
And it takes you straight to your account.
And from there, it takes a boilerplate example of a remote MCP server.
It already has two hello world tools built into it. And really all you have to do here is add a name.
So I can add, you know, remote MCP net.
Sorry, you guys did not miss anything; it just took me to the docs page when I clicked it.
That's right, it opened this. Exactly. You can go see it for yourself and double-check my work, but it is already connected to my GitHub repo, and it already had a name prefilled.
I've just deployed that one many times.
So I just added net. There's really nothing else that I have to click on.
Just create and deploy. And so now it's going to build the MCP server and deploy it.
And the nice thing about building your MCP server on Cloudflare is that it's automatically deployed to our global network.
So that makes it highly available.
It also makes it very performant because it is automatically, not just closer to the end user, but closer to wherever AI agents that are going to be talking to it may be hosted.
And so we will, we will let this build real quick.
But honestly, my biggest advice is: whatever idea you may have, take this boilerplate and then go ask Claude. Give it your idea and say, hey, I want to build an MCP server that does X, Y, and Z.
It can write out the code for you to build out those tools.
And the cool thing is that you can use this to bring any functionality that you want directly into Claude.
So I'll even give you an example. I still need to get my demo working, so I won't showcase it today. Yesterday, I built a book recommender MCP server, and it uses Workers AI to grab the recommendations.
But the nice thing is that it can, it's logged in with my GitHub account.
It remembers me as a user and I can tell it, here's what I like. Here's what I don't like.
Here's what I've read. And it stores that history and it keeps it stored.
And so every time I ask it for a recommendation, it gives me a better and better result.
And so I can close out my tabs, leave my computer. I can come back a month later.
Let's say I read a book. I can tell it that I read this book. It will pull up the conversation history from its state and it will add it to the list of, you know, preferences and knowledge that it has.
And when I ask it for a recommendation, it'll give me something even better.
So just an example, a personal application specifically there, right?
Exactly. But in this example, so it was deployed here.
So I'm going to copy this and then bring this over to our playground.
This is a remote MCP client that we built out. We mostly built this because we continue to follow the protocol very closely.
And we are almost the first every time to adopt the latest changes to the standard.
And of course, as soon as we bring support, for example, in on the server side, we instantly want to be able to test it out.
And so instead of waiting on clients to add that support themselves, we keep this playground up to date so that anybody that's future-proofing their MCP server can instantly test this out and not have to wait for Claude to support it.
But I'm going to add the endpoint here and then I'm going to add /sse.
That is just where MCP servers run.
And then I will connect it. And here you can see it now has two available tools, one to add numbers, one to calculate.
And so I can ask it, can you add two and three?
And this will call the tool and it will add the numbers and it will give me the response.
And so this is, again, just very boilerplate.
Think of this as a hello world. But if you actually look at the script, the actual GitHub repo, all this is, is we've added the rest of the code so you don't have to worry about transport or MCP protocol specifics.
All you have to define as a developer is the tool itself.
And so it can really be as simple as, this tool is called add.
This is what it does. Here's some code for calculate. It can now multiply.
And so you can add anything in here. It's just code. What's cool is you can basically, you can also paste in an API spec, right?
And you can say, here's what I want the MCP server to be able to do.
Can you pull these APIs? It's really fun to build tools for LLMs while also using LLMs to build those tools.
And Dina, I know there are a couple of players right now that are also kind of in the server generation space.
Do you want to talk about any of this? Sure. In terms of server generation, I mean, of course there are different companies.
I think anybody that's investing in their own agents SDK is definitely looking at MCP server generation.
Right now we are very focused on MCP, but I think very soon in the future, it is all just going to be AI agents that we interact with.
And MCP is just going to be a feature that they have.
And so an AI agent can itself be an MCP server that you talk to, but it itself can also be a client that can then talk to other agents and get tools from them.
And, you know, this can continue to chain. But I think right now MCP is very focused on user talking to an agent using a service.
But I think we're quickly going to start seeing it evolve to agent to agent interactions, but also just making sure to be able to support both types of interfaces use cases.
Makes sense. This week there were several events, not only related to MCP, but MCP was around in a way.
We had the Microsoft Build event. We had Google's big developer event, full of announcements.
Yesterday, OpenAI announced remote MCP servers.
What can we say in terms of this week of the new things on MCP?
So the two big things are OpenAI. They launched their MCP support for their responses API.
So think of this as you can use their API endpoint to be able to talk to their models, but you can also now use it to be able to connect to MCP servers.
And so they handle, you don't have to worry about the transport piece, the off piece.
All you have to do is add the URL of the MCP server you'd like to connect to and the token.
And so that is one way that you can now bring MCP into the OpenAI ecosystem.
And they are working on native support in ChatGPT.
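For developers who want to try that, the call looks roughly like this with the OpenAI TypeScript SDK. It's a sketch of the hosted MCP tool based on the launch documentation; the server URL and label are placeholders, and field names may have evolved since.

```typescript
import OpenAI from "openai";

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

const response = await client.responses.create({
  model: "gpt-4.1",
  // Point the Responses API at a remote MCP server; OpenAI handles the
  // transport and the tool-calling loop for you.
  tools: [
    {
      type: "mcp",
      server_label: "demo",                      // placeholder label
      server_url: "https://mcp.example.com/sse", // placeholder remote MCP server
      require_approval: "never",
    },
  ],
  input: "Use the MCP tools to add 2 and 3.",
});

console.log(response.output_text);
```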
And then Anthropic, they just announced support in their connector API. I'm just double checking.
Yes, it is called the connector API. And it's the same thing as OpenAI's API endpoint.
It allows you to essentially, it's like the Anthropic API that allows you to interact with their models.
But now through this API endpoint, you can also connect to remote MCP servers and be able to do even more if you're already using those APIs to build different applications or your agents, for example.
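And the Anthropic side looks roughly like this: a sketch of the MCP connector in the Messages API as announced that week. The beta flag and field names below are assumptions based on the announcement, so check the current API reference before relying on them.

```typescript
import Anthropic from "@anthropic-ai/sdk";

const client = new Anthropic(); // reads ANTHROPIC_API_KEY from the environment

const message = await client.beta.messages.create({
  model: "claude-sonnet-4-20250514",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Use the MCP tools to add 2 and 3." }],
  // Point the API at a remote MCP server; Anthropic handles the tool calls.
  mcp_servers: [
    {
      type: "url",
      url: "https://mcp.example.com/sse", // placeholder remote MCP server
      name: "demo",
    },
  ],
  betas: ["mcp-client-2025-04-04"], // assumed beta flag for the MCP connector
});

console.log(message.content);
```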
And so I hope this serves as a motivation for even more people to go and start building remote MCP servers.
Both the APIs actually rely on remote.
Neither of them use local. So, you know, if you want to get started, go to our documentation.
I'm also really excited about seeing more MCP clients adopting remote MCP because I think that seems to be like a little bit of a bottleneck today.
So I think as we see more MCP servers being built, as we see more demand for MCP servers, I think it's important that MCP clients are also adopting the remote MCP kind of setup.
The possibility is there specifically, right? People can pick and choose different aspects and have agents interact with different clients in a sense, right?
Yeah, I mean, think about Claude. If you think about, like, claude.ai today, right?
It's a basic chat, and not basic, it's not just basic, but there's more that it can do.
But if you think about something like a cursor or a windsurf where it's actually generating code and it's an IDE, if you think about the possibilities of remote MCP servers, right?
I think right now it only supports local.
But if you think about what a user could do and how much faster a developer could go in sort of an IDE experience, it's really quite exciting.
I have a quick-questions segment here for you.
So one can reply, one can add. So feel free to answer.
But let's start. So biggest technical challenge with MCP today?
Biggest, you go, Nevi. I was just going to say, kind of reiterating what I had previously said, which is that the availability, or not the availability, but the clients that exist today, right?
There's not many clients that kind of are adopting the remote MCP.
And I think as servers are being built out, there's a handful of clients, right?
And our playground that we've built is kind of one of them.
So I'm really interested to see what other players are sort of entering the market to build different good MCP client to MCP server relationships.
I would say one of the technical challenges, in a way, is just that the spec is still evolving and this is a new protocol.
But there's a lot of things, and Nevi was talking earlier about this: what does MCP server design look like?
Ideally, actually, we don't have maybe 20 Cloudflare MCP servers.
Ideally, everybody connects to mcp.cloudflare.com.
But from there, the tools can be dynamic based on your permissions, based on what you're trying to do.
And so I think there's a lot more to be thought out there.
Also, how do you make it personalized?
There's the concept of prompts, but prompts are also not really adopted across clients yet.
But prompts are kind of like templates that you can feed the server.
If, for example, you have some kind of workflow that you wanted to follow for specific tasks that the user asks for.
I think the other part of it is the authentication and authorization piece.
I think what was initially developed uses OAuth, but a lot of providers have their own existing auth upstream.
So they may use API tokens, they may use an auth provider, some login password.
And so we've made it easy to allow you to plug in your own auth system, but then still have it be compatible.
But there's still some limitations around you authorize yourself to the server.
And so it's almost like from the start, you have to know here are all the permissions I would need to have.
And for example, if you only granted read permissions, and then later you want to call a tool that requires write permissions, there is not a really great way for the user to go back and reconsent.
And so the auth spec is being changed right now.
And it aims to solve a lot of these problems to just make it a better user experience, but also just make it security first as well, but also more flexible so more people can adopt it.
I'm surprised, first, by how fast the protocol came around, and then by how quickly people started doing something with it.
It's surprising, it's incredible.
But to your point, there's still several bottlenecks that are needed to be taken care of.
But even thinking of that specifically, and it's very difficult to do, let's think about the future in this area for sure.
But where do you see MCP integrations clients in one year's time, like the evolution?
Where do you see it going?
I think there's going to be a lot more companies that adopt MCP, not just for their end users, but internally to boost productivity.
Even internally right now, we're thinking about we have all of these services that our teams interact with.
How do we put an MCP server on them so that account teams no longer have to figure out how to use these different things to get what they need?
I think we're all about to see it.
There are so many tasks that are in a way going to be automated, but in a way we're going to be able to get so much further, because we're going to be able to rely on LLMs to process these large amounts of information to get us what we need.
I also hope that clients start supporting remote and that it becomes a priority.
But also I think this is where we're going to start moving closer and closer to the agentic world where you eventually do take the human out of the loop and you have agents that can talk to one another.
For example, I have a travel booking agent. Maybe it can talk to Expedia and then it can talk to Google Flights and it can talk to another MCP server and all of these things can be chained together so that you're always getting the best results in a way as a user.
But I think there's a lot of security considerations as well and privacy and I hope the spec makes progress in that direction.
I know people are thinking about it. Also just directories. How do you discover what MCP servers are out there?
Who is going to manage these directories?
Lots of open questions, but everybody's actively working on it. Yeah, I think that the thing that really resonates with me is that the spec is always changing, but what I love so much about Cloudflare is we're really dedicated to helping find that vision and kind of helping to craft that story as well.
So internally, we're also doing a bunch of experimentation, always trying to figure out what it means to build good MCP servers, what it means to build a good client-server relationship, and so we're kind of excited to also help in shaping that spec.
And it's open source, so all of the feedback that we have, that the community has, I like that it's being worked on in the open.
Anybody can contribute to it and I think that's what made it a really great protocol.
And so fast: being integrated, being around, exciting developers. The amount of excitement is almost ChatGPT-type excitement, which is kind of surprising.
It's addicting. Once you start using it, you're like, oh my gosh, there's so much more I can do.
And my favorite thing is always asking Claude or whichever client I'm using, I'm like, what's next, what else?
And it's like, oh, okay. And what Nevi was talking about earlier, it was nice building out an MCP server, but then using MCP to be able to debug and fix that MCP server to give it more tools, it gets meta very fast.
It makes perfect sense, actually, especially when you start to interact with these systems; you can see that they can help you deal with the actual system.
So it's like an ecosystem of sorts.
Just to wrap up the fast questions: best surprise you've seen from a developer using MCP?
From a specific developer? I think in general, I think the concept of bringing AI into development workflows has been interesting to see across the industry.
But I think when you think about specific companies, there's a little bit of like, oh, I don't know, should we do it?
Should I trust it? And I work with one engineer on my team who was very resistant to it at the beginning.
Now he's building out an agent for Cloudflare that auto fixes different build issues.
And I remember exactly what he said to me. He was like, I have never felt so happy to be doing work after building with agents and kind of getting involved in the AI space.
So I think it's exciting. It's an exciting time. And I think when you start building with agents, when you start kind of understanding like the future and the possibilities, people's eyes really, really open.
And you kind of realize like, wow, like this is the future.
Like this is where we're headed. That was a fun kind of tidbit.
Yeah. Dina, do you have one? One of my favorite ones that I've seen, and I think this is a community favorite if you're on Twitter, is the Blender MCP server.
So Blender allows you to do modeling, like 3D modeling.
And so it was really cool because you just watch somebody that's chatting with it, just using sentences.
And all of a sudden, it's creating these very advanced 3D models, which is incredible because if you've ever tried to use one of these tools, I did back when I did engineering in college, it is very difficult, very steep learning curve.
But another fun one that I've been seeing on Twitter is people are creating personal MCP servers as their resumes and giving it information.
You can chat with it. And so I think that's just been very creative. Favorite new server or integration?
New one. I have not seen this one, but I will put it out there.
I think it would be very fun to be able to order groceries or food directly from Claude.
It's funny. I've been on customer calls and they're like, are you talking to these food delivery services?
Because I want to tell it, like, I'm trying to be healthy.
Just order this for me. So, somebody out there, build it. That would be useful.
Definitely useful. Can I have one, Nevi? I think that I'm really excited about the possibility of MCP and what that brings to support.
I'm so excited about what Intercom's MCP server is sort of bringing to the table.
I think that I've definitely been in the loop before where I'm just talking to a random robot on the other side, or maybe someone who doesn't have knowledge about my account, and it takes forever for me to get the answers that I need.
And so kind of excited to see how this sort of spreads into the support space.
Actually, one I forgot to mention, but this has been one of my favorites, is Block, their product Square.
They have an MCP server and it allows somebody to take a picture of a physical menu and it will, out of that photo, spin up a whole storefront for you.
And so I just think that is so incredible, especially for business owners who are not technically savvy.
Makes me think of my friends' parents who own a restaurant to be able to spin something up like that and instantly have an online presence.
Lots of possibilities there. Absolutely. Last but not least, in one sentence, why should people try MCP today?
I think you have to see it to believe it, but honestly, think of anything that you would like to bring into Claude or ChatGPT and you can go and build that out yourself.
And the best thing is you can ask one of these agents to give you the tool, to give you the code to build it out.
So let your imagination run wild. Yeah, I was explaining this to my dad the other day when I was talking to him about MCP, and I think it gives you the power to access information that would otherwise have you wasting time trying to understand what it is that you have to do.
I think I got a message the other day about like, you know, you need to upgrade your phone, you can get a new iPhone, whatever.
And I was like, this is like a perfect example of why MCP could be so cool, right?
Verizon could have an MCP server and that MCP server could read my account and see what iPhone I have and then give me the recommendation as to which one I should get.
And I kind of just feel like it can enable so many different workflows, but also give your users access to information and understanding information that otherwise would be really difficult for them to parse.
So forget, you know, putting a lot of stress on like having the perfect marketing site, having documentation be perfect.
The world is sort of changing in the way that users are interacting with your tool and with your information.
You can put it out there in a way that your customers, your users, someone you want to show something off to, will have access to the best information, even someone who would read it without being knowledgeable about that information, or who wants to make something from...
Exactly, in the languages that they want, in the way that they're asking, right?
Like you can ask like whatever client you're using, like simplify this, break this down for me, say it in Spanish, say it in French.
Like, I think it's so cool.
Like the access to information, I feel like it's really pushing.
You can take any service and essentially tailor it to you. And so if you think about a dashboard, it's the same experience for everyone.
And, you know, as a product manager, it is, I can tell you, it's very difficult to design the perfect experience for every user.
Everybody has their things they care about, their own persona.
And so, and APIs, they also themselves are, they're not super flexible.
You have to build integrations. And what's amazing is that when you're interacting with an MCP server, especially one that's, for example, built on Cloudflare that has memory, it can keep track of the conversation history.
It can understand you better.
So every time that you make a tool call, it gives you better and better results.
And I think even also from a company perspective, if you think about it as an upsell tool: when you show a user a list of features and you're like, this is what's on the free plan, this is what's on the paid plan.
A lot of the times the user's like, I don't exactly know what that means for me.
But the really cool thing is that, you know, MCP plus LLM, it can explain to you as a user, Hey Dina, you like to do this and that.
Do you want to be able to do all of these things? And it's tailored to me, to my use case, to the projects that I've been building.
And, you know, I can be like, yes, actually that is what I want to do and see the value in the service and get to that aha moment much faster.
I was a journalist for several years writing about tech.
And I remember some of these ideas coming about several years ago.
We'll be able to personalize your own experience at a store, things like that.
It was already there, the ideas, but the execution was not.
And what this seems to enable is execution beyond your wildest dreams, in a sense.
Be that for companies having happy customers, or for people interacting with services and having a very personalized experience that knows that person really well.
It's really powerful. You can see it. Exactly. Thank you so much.
This was great. And there will be much more opportunity to talk about MCPs, LLMs, AI, the future for sure.
But you can see that is definitely something that is growing.
It is. Every week there's a new update. Quite incredible.
This was great. Thank you, Dina. Thank you, Nevi. Thank you. Thank you.
And that's a wrap. Here are some recent highlights from the Cloudflare blog.
On the security front, we recently patched a request smuggling vulnerability in our Pingora framework, CVE-2025-4366.
We also emphasized vulnerability transparency, detailing our responsible disclosure process.
For those interested in network insights, real-time BGP route visibility is now available on Cloudflare Radar, providing immediate insights into global Internet routing.
We also explored the challenges of performance measurements at scale.
Finally, regarding efficiency, we've blogged about making it easier for customers to use their own IP addresses across various Cloudflare services, allowing for more efficient address space usage and smoother IPv4 management.
Check blog.cloudflare.com for more.