Cloudflare TV

🏗 On the Edge of My Server

Presented by Brian Rinaldi
Originally aired on 

Cloudflare Platform Week: Developer Speaker Series

As part of Cloudflare's Platform Week, we're thrilled to feature an array of expert web dev speakers, developers, and educators here on Cloudflare TV.

Edge functions can be potentially game changing. You get the power of serverless functions but running at the CDN level - meaning the response is incredibly fast. With Cloudflare Workers, every worker is an edge function. In this talk, we’ll explore why edge functions can be powerful and explore examples of how to use them to do things a normal serverless function can't do.

Visit the Platform Week Hub for every announcement and CFTV episode — check back all week for more!

And join the community and members of the Cloudflare team at the Cloudflare Developer Discord


Transcript (Beta)

Hey, everybody. My name is Brian, and for the next 30 minutes, I'm going to be talking to you about edge functions.

I'm really excited to talk about this topic today because I think there's been a ton of buzz.

You hear people talking about Edge functions all the time. But I think there's also a lot of confusion around what they are, what they're good for, and how they're different from your regular serverless functions.

So what we're going to cover today: first of all, we're going to talk, obviously, about what an edge function is, what makes them different from a regular serverless function, and how that can be useful.

We're also going to cover some unique use cases for edge functions.

I think there are things that edge functions can do that are really kind of difficult to do with a regular serverless function or any other kind of code.

Edge functions are particularly useful for some unique use cases. And we're going to talk a little bit about what those are.

And then finally, we're going to look at what is an edge function within Cloudflare and what can I do with that?

So before I get started on all that, let me just give a brief introduction.

Again, my name is Brian Rinaldi.

I'm a developer experience engineer at LaunchDarkly.

We do feature flags, if you haven't heard of us before.

I'm also very interested in Jamstack.

So I'm a co-author of The Jamstack Book from Manning, which just came out.

I'm an editor of the JAMstacked newsletter from Cooper Press, and I run a community.

It's not specifically Jamstack, it's everything about code.

It's just an online community you can join, as well as the Orlando Devs.

If any of you watching are in the Orlando area.

I run the local Orlando devs meetup.

Speaking of the Jamstack book, I'm giving away three copies; I can't do it right here, obviously.

The first three people to DM @remotesynth will get a free copy, or you can buy a copy at 35% off with that code.

Okay, enough about me stuff.

What are edge functions?

This is my own personal definition.

I didn't pull this from anywhere, but an edge function is essentially a serverless cloud function that runs on, and is replicated across, CDN/edge nodes.

So to better understand what that means, I want to take a quick trip back in time to understand how that's different from what's come before.

So much of the history of the Web is based upon a user on the browser connecting to a single location.

This could have been a single server, a rack of servers or even a server region, right?

So this is a fun example. This is the earliest example, actually.

It's the CERN computer that ran the first website.

As you can see, it's got a fun little sticker there.

It says, "This machine is a server. DO NOT POWER IT DOWN!!"

Because, well, there really weren't web servers before this.

So while this is a unique example, it was pretty common even in my early days of development that we'd have a rack of servers in the office, or even a single server in the office in the very early days, or we'd start co-locating servers somewhere.

But these all existed in a single location. Then cloud computing came along, and it helped us with things like auto scaling, so that you didn't overuse expensive resources on your least busy day, but you still had enough resources to handle your busiest day, even if that busiest day exceeded expectations.

You didn't have to add servers to a rack.

It auto scaled for you to take care of that.

But in the end, even when your code was running in the cloud, it was still sitting in a server region, in a particular location.

For me, for instance on AWS, I tended to deploy to US East 1. A lot of you on the West Coast probably used US West 2; I think that's a common one.

It's all still sitting in a single region or location.

So even with things like Jamstack, we would push the static assets to a CDN.

So when the user connected and pulled the static assets, they would get those from the CDN.

But then any API calls that were made from the client, or even API calls made by the Jamstack application when it built, were all calling a server region.

So you still had whatever latency is involved with calling the back end of your application in that server region.

So edge functions aim to solve this problem by bringing some or all of the back end processing up to the user, to the CDN or edge node level, in proximity to the user.

This can help remove any latency based on where the user is geographically, because the back end is coming from wherever is closest to them.

So imagining an SSR or Jamstack application, this gets very, very simple.

The user still gets their static assets from the CDN, but then the data and back end processing all come from the CDN as well.

This can help you provide consistent performance across the globe, because no matter where the user is connecting from, they'll get the back end from whatever CDN node is closest to them.

So for example, we're here on Cloudflare TV.

Cloudflare has 250 data centers across the world, at least according to the map that I pulled most recently.

This is a map of all of the locations.

You can see that from pretty much wherever you're connecting in the world, you're going to have a CDN location that's close to you.

That way, the back end of your application, your Worker code, is basically replicated across the CDN, and you can get it from any of these particular nodes.

One of the interesting things you get, though, that you don't get with a regular serverless function, is that you can intercept the request and the response.

This applies more when your application is still served from a server region, but you put a Cloudflare Worker in front of it.

You can intercept that request going in or you can intercept the response coming out.

Now, that actually can do some really, really powerful things.

Some of the use cases we're going to talk about in a moment are enabled by this particular capability of edge functions.
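As a rough sketch of what that interception looks like in practice (the health-check path and header name here are invented for illustration, not from the talk), a Worker in front of an origin can short-circuit a request or decorate the response:

```javascript
// Hypothetical Worker in front of an origin. The /healthz path and
// x-served-via header are illustrative, not from the talk.
const worker = {
  async fetch(request) {
    const url = new URL(request.url);

    // Intercept the request: answer some paths directly from the edge
    if (url.pathname === "/healthz") {
      return new Response("ok", { status: 200 });
    }

    // Otherwise pass the request through to the origin...
    const originResponse = await fetch(request);

    // ...and intercept the response on the way back out
    const response = new Response(originResponse.body, originResponse);
    response.headers.set("x-served-via", "cloudflare-worker");
    return response;
  },
};

export default worker;
```

The same handler sees every request before the origin does and every response before the browser does, which is what enables the use cases below.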

So what makes an edge function different from a regular serverless function?

Just to summarize, they're replicated across edge nodes.

Typically, this is a CDN.

Now I say edge nodes and I don't just say CDN because some providers actually it's replicated across an edge region.

So particularly AWS replicates their lambda edges across edge regions, whereas cloud front functions are deployed to CDN.

So, so this really depends on the provider.

With Cloudflare, it is the CDN.

So the edge functions have this proximity to the end user because they're served from the edge.

That's the whole point of them. So this removes any latency having to do with whatever distance the location of your servers might be from the user.

And as I mentioned, they can intercept the request and response.

And we'll talk about some of how that's useful in a moment.

So let's just cover some common use cases.

And what I'm talking about here is common use cases that are particular to edge functions; as we'll see in a moment, edge functions on Cloudflare are somewhat unique.

So these are particular uses for edge functions regardless of which provider you're deploying to.

So you can do A/B testing without a flash of rendering.

Now, sometimes you'll see A/B tests done on the client side, where the text will suddenly, quickly replace itself for you.

It's not an ideal experience, whereas here you can replace it on the spot by modifying the response and just changing the text.

And the user never sees that flash and you don't actually have to do a round trip to the server to do this.

So it's a great use case for it.
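One way this can look on Cloudflare, sketched with the HTMLRewriter API (the selector and headline copy are made-up examples), is rewriting the markup in the response before it ever reaches the browser:

```javascript
// Hypothetical A/B test: swap the headline at the edge so neither
// variant ever flashes in the browser. Selector and copy are invented.
function applyVariant(response, variant) {
  const headline = variant === "b" ? "Try it free today" : "Sign up now";
  return new HTMLRewriter()
    .on("h1.headline", {
      element(el) {
        el.setInnerContent(headline);
      },
    })
    .transform(response);
}
```

This runs inside the Workers runtime, where HTMLRewriter is a built-in global; the browser only ever sees the chosen variant.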

You can modify HTTP headers, for instance, depending on the user's information, you might want to throw some custom HTTP headers in there.

You can do conditional routing: route a user to a new location without any latency, so you don't have to wait for a server-side or client-side redirect.

This is done immediately during the request and you catch that request and reroute them.

For instance, you might route them to a login page if they're not signed in.
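A minimal sketch of that login redirect (the `session` cookie name and the `/login` path are assumptions for illustration, not from the talk):

```javascript
// Hypothetical conditional routing at the edge: unauthenticated users
// are redirected to /login before the request ever reaches the origin.
function routeRequest(request) {
  const cookies = request.headers.get("Cookie") || "";
  if (!cookies.includes("session=")) {
    // 302 straight from the edge node; no origin round trip needed
    return Response.redirect(new URL("/login", request.url).toString(), 302);
  }
  return null; // signed in: fall through to normal handling
}
```

A Worker would call this at the top of its fetch handler and return the redirect immediately when it gets one back.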

You can serve up different versions of your site for search crawlers. This is an interesting use case.

I've never tried this myself, but I know of people who have talked about doing this.

In particular, you can check for user authentication or authorization. And as we talked about, you can route them or otherwise redirect them.

You can remove important latency for IoT and gaming.

So IoT and gaming, in particular, that little bit of latency you might get by crossing the globe to wherever your server region is, could be critical.

These are applications where the timing is critical.

And so like by moving this to the edge, you can remove that latency and have a more responsive IoT or gaming app.

You can do personalization: by modifying the response, again, we can personalize the content without any flash of rendering.

For instance, sometimes you'll go to a page and you'll see that the login state suddenly flash because that's changed client side.

You can do that, you can intercept the response, see if they're logged in and change that before the response is ever received.

Eliminating that flash of rendering.

And this is another interesting one.

When I was doing some research on this, I found that you can actually serve up different versions of your site depending on the user's location to meet location-specific compliance requirements; I haven't tried this myself.

So that's another interesting use case specifically for edge functions.

So when I said it's somewhat unique on Cloudflare: every Cloudflare Worker is effectively an edge function.

That's whether you're using it as a standard serverless function or whether you're doing any intercepting of the request and response.

A Cloudflare Worker is an edge function.

That's different from pretty much every other provider. Every other provider you'll see has their serverless function offering, and then they have their edge function offering.

Even on AWS, they have their serverless functions, which is Lambda, and then they have their edge node functions, which is Lambda@Edge.

And then they have their CDN-level functions, which is CloudFront Functions. So you have all this separation, and even other providers tend to have an edge function and a serverless function as two different things.

On Cloudflare, you don't have that distinction, and that's actually really interesting, because other providers also put a lot of restrictions on what you can do in an edge function versus what you can do in a serverless function.

Not every provider has a ton of restrictions on them, but some of them do. Again, AWS is the kind of canonical example here.

You can't do certain things even in a Lambda@Edge function.

You can't actually modify the response.

And the CloudFront Function is very, very limited.

It can only do things like modify headers and stuff like that.

It can't even intercept a request in the same manner that a Lambda@Edge function does, and other providers have similar types of restrictions, whereas you can pretty much do whatever you want in your Cloudflare Worker, which is really, really cool, even though it's living at the edge.

But the one message I want to get across here is that it's not all like, oh hey, I just start moving everything to the edge,

and I don't really have to change the way I do anything.

Moving things to the edge is super powerful, but it also requires you to think a little bit differently sometimes about how you might architect your functions, particularly when it comes to interacting with data and APIs.

So let's talk a little bit about that.

So imagine my application and its assets are living on the CDN, and my function is living on the CDN, but my data is still living off in, say, US East 1.

So any data calls or API calls I might make from my function would have that latency, and if I make API calls from the client side, they would also have that latency.

Now, I'm not making a big deal of this; it may or may not be critical to your application, and it might not be terribly different from deploying your functions to a server region in a traditional serverless setup. But it is something to consider, because you want to think about how you can move more things to the edge, remove that latency, get better performance, and take full advantage of the edge.

So the bad news, to use that example again, is that edge functions, as Sam says, are cool.

But your database is probably in US East 1, and this is something that a lot of the buzz around edge functions tends to gloss over.

So we're going to talk about that a little bit, and ways you can mitigate it.

Again, it may not be a huge issue, but there are ways you can even mitigate that.

And the good news is that Cloudflare actually gives you multiple tools to mitigate that data latency.

They have a cache API, where you can cache the response for a period of time and then serve it up from the edge via that cache.

They have a KV, a key-value store, where you can store any arbitrary data in key-value pairs and retrieve that data from the edge.

They also have Durable Objects and the transactional storage API, something I personally don't know a ton about, but I know it does give you another option for storing data at the edge.

We are going to talk about those first two, which is why they're kind of highlighted.

So I'm going to transition off the slides for a moment here, and we're going to look at some examples very quickly.

I know we only have like 15 minutes left, but I do have quite a few examples I want to show you before closing out.

So we're going to look at using the cache API.

We're going to look at using the KV, the key value store.

And then I'm also going to use LaunchDarkly as kind of a use case for some of these examples, to show you how we're doing that in our Cloudflare integration.

So I'm going to get off the slides here and I'm going to look at this is a sample application here.

I'm actually going to show you it running. I'm not going to go into the beautiful CSS and other stuff that I did here.

Obviously, my design skills are superior. But this is a very simple application.

All it does is it gets characters from the Rick and Morty API, which you've probably seen, used a lot in examples.

And then I can click through and get a detail page which has just their name and photo.

Very, very simple application. But I'm using the Rick and Morty API here as kind of a proxy for wherever my data might be coming from; that might be an API, that might be a database.

How can I mitigate some of the issues of calling that data, which is not necessarily living on the edge, it may be living anywhere in the globe, and mitigate some of the latency in calling it?

So I've built an example here.

This example here is built on Cloudflare Pages with Astro, which I've been toying with a lot lately, and it's really, really fascinating, worth checking out. But it's a simple application.

Again, two pages: one that just displays all the characters, which come from the API that I've built here, as well as the character detail page.

The API is all built in as well.

For those of you who haven't messed with Cloudflare Pages recently, you can actually easily add in Cloudflare Workers by simply adding a functions folder, and everything in that functions folder will get deployed as a Cloudflare Worker.

It's really, really powerful. So I've got a couple of examples here.

In this case, I am just getting the character data from the Rick and Morty API and returning the JSON response to the user.
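A Pages Function along those lines might look like this (the file path is how Pages maps functions to routes; error handling is omitted for brevity, and this is a sketch rather than the demo's exact code):

```javascript
// functions/api/characters.js → served at /api/characters on the site.
// Proxies the Rick and Morty API and returns its JSON to the caller.
export async function onRequest() {
  const upstream = await fetch("https://rickandmortyapi.com/api/character");
  const data = await upstream.json();
  return new Response(JSON.stringify(data), {
    headers: { "Content-Type": "application/json" },
  });
}
```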

But again, I don't want to have to go to the API every time. This data isn't going to change that much; this isn't something where I need to provide live data all the time, so I can actually cache this response. But I'm not going to cache it right here in this function.

I'm going to use what's called middleware within Cloudflare Pages, which allows you to have middleware functions; that's actually the part that intercepts the request and response.

So I've created this _middleware.js, which gets deployed as middleware. First of all, I'm getting a cache URL by using the request URL, and then I'm checking to see: does that cache entry actually exist already?

So I check to see if I get a cache match on that URL.

If I do, I'm going to go ahead and skip all this and just return the API response from the cache.

But if I don't have it in the cache already, I'm going to go get the response. That's something specific to middleware; it basically continues the sequence and gets the response here.

So in this case, it would get me the response data.

I'm going to store that in a variable here.

I'm going to set some cache control headers so that this will only cache for a certain amount of time.

Obviously, in this example, you could cache this for a really long time, this character data doesn't change much at all, but I'm only caching it for a short period of time here for the purposes of example.

And then I'm going to put that response in the cache, so that the next time the user comes, they'll get this data from the cache at the edge instead of having to go all the way to the API.
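Putting those steps together, the middleware might be sketched like this (the 60-second TTL is an arbitrary choice for illustration, and this is my reconstruction of the flow rather than the demo's exact code):

```javascript
// Sketch of functions/_middleware.js: check the edge cache first,
// otherwise run the function, mark the response cacheable, and store it.
export async function onRequest(context) {
  const cache = caches.default;
  const cacheKey = new Request(context.request.url, context.request);

  // Cache hit: serve straight from the edge, skipping the function
  const cached = await cache.match(cacheKey);
  if (cached) return cached;

  // Cache miss: continue the middleware chain to get the real response
  const response = await context.next();

  // Copy the response so headers are mutable, then store it for 60s
  const cacheable = new Response(response.body, response);
  cacheable.headers.set("Cache-Control", "public, max-age=60");
  context.waitUntil(cache.put(cacheKey, cacheable.clone()));

  return cacheable;
}
```

`context.next()` is the middleware-specific call mentioned above that continues the sequence and hands back the downstream function's response.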

And then I want to quickly show you this one, which is basically a different approach for dealing with this.

In this case, I'm getting all the character data.

This isn't...

I wouldn't necessarily recommend this for working with an API, but you could follow this approach.

I'm basically taking the character data, and this environment variable represents my KV; I'm checking to see whether the characters are already in the KV.

If they're not in the KV, I'm actually going to loop through the results and insert them in the KV because my page only needs the ID and the name.

So that's all I'm going to insert in this KV.

I'm going to use the ID as the key and the name as the value, and then I'm going to go ahead and pull that all back again.

This is me pulling it back from the KV, so in the case of it already existing, we would skip that part.

We just pull the data from the KV, which exists on the edge.

I do need to sort it because it doesn't always come back from the KV in the proper sort order.

And then I return the response which I have assembled for you as a JSON string.

So in this case, instead of going back to the API every time, I'm actually returning data from the KV at the edge. I could have stored the full character data JSON in the KV, but I chose not to here.
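The KV variant described above might be sketched like this (the `CHARACTERS` binding name and the id-to-name layout follow my reading of the walkthrough, not the demo's exact code):

```javascript
// Sketch: seed a KV namespace with id → name pairs on first request,
// then serve everything from the KV at the edge afterwards.
export async function onRequest(context) {
  const kv = context.env.CHARACTERS; // KV namespace binding (illustrative name)

  const existing = await kv.list();
  if (existing.keys.length === 0) {
    // Cold start: fetch from the API and store only what the page needs
    const res = await fetch("https://rickandmortyapi.com/api/character");
    const { results } = await res.json();
    await Promise.all(results.map((c) => kv.put(String(c.id), c.name)));
  }

  // Read back from the KV; list order isn't numeric, so sort by id
  const list = await kv.list();
  const characters = await Promise.all(
    list.keys.map(async (k) => ({ id: Number(k.name), name: await kv.get(k.name) }))
  );
  characters.sort((a, b) => a.id - b.id);

  return new Response(JSON.stringify(characters), {
    headers: { "Content-Type": "application/json" },
  });
}
```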

So I am also going to switch here quickly to another application that I built.

This one is using the Cloudflare Workers integration.

If you're a LaunchDarkly user, you can go to Integrations and set up this Cloudflare Workers integration.

As you can see, I tie it to a particular KV namespace.

I can do that for both the preview and the live API.

So now what happens is LaunchDarkly is going to synchronize all of your flag data automatically into the KV for your Cloudflare Worker, and that will allow you to get flag data without ever having to go to LaunchDarkly directly.

So we have an edge SDK that does all this for you.

I've got to reload this page, but you'll see this is a page that uses the Cloudflare integration with LaunchDarkly.

I want to show you, if you look down here, I know it happens really fast, but you'll see that this result coming back, theoretically from LaunchDarkly, happens immediately.

That's because we're just getting this from the Cloudflare Worker.

We're going to show you really, really quickly how that's done.

And just to show you: even though it's coming from the Cloudflare Worker, I've integrated it so that if I want to show the About Us page, I flip this flag, and then suddenly, live, that About Us flag comes back true and the About Us page content is actually added onto the page.

So we're low on time.

So I'm going to quickly switch and show you how this is done very, very quickly.

I'm using the Cloudflare Edge SDK from LaunchDarkly, which we provide.

And what I'm doing, in this particular line, is getting all of the flags that are marked as client-side flags.

So flags that will affect my client side interactions.

I'm getting those and I'm actually writing those out to a string here, a JSON string, to the client.

And what I'm able to do with that is actually bootstrap my LaunchDarkly initialization here with those flags.

So it never actually goes back to the server other than to synchronize any updates.
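On the client side, that bootstrap step looks roughly like the following; the client-side ID, user key, and the way the flags reach the page are placeholders, so treat this as a sketch and check the LaunchDarkly JS SDK docs for the exact shape:

```javascript
// Sketch: initialize the LaunchDarkly JS client with flag values the
// Worker already serialized into the page, avoiding a flag request on load.
// "YOUR_CLIENT_SIDE_ID" and the bootstrapFlags source are placeholders.
function initLaunchDarkly(bootstrapFlags) {
  return LDClient.initialize(
    "YOUR_CLIENT_SIDE_ID",
    { key: "anonymous-user" },
    { bootstrap: bootstrapFlags } // use the edge-provided flag values
  );
}
```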

But those are synchronized within my Cloudflare Worker, so the most recent LaunchDarkly data is always synchronized with that Cloudflare Worker via the KV.

So at no point do I really need to go back to the server to get that information other than for any updates that happen after the page is loaded.

But even then, take calls where, for instance, in this case, I would perhaps do some A/B testing.

I'm getting this server-side flag value from LaunchDarkly.

But even that is actually pulling from the KV, because this is using the edge SDK, and that KV data is always up to date.

So this is a particularly interesting use case where like getting flag data can add a little bit of latency, particularly when you're getting client side flags.

But doing this, we've now eliminated any latency in getting the data by moving a lot of that data to the edge.

So the point here is you can do some really cool stuff at the edge, but also take advantage of tools like the cache and the KV to move what data you can, because you have to think about that: where is my data coming from?

Is it worth moving that to the edge? And how can I achieve that?

So I'm just going to go back really quickly because we're kind of finishing up here with just 5 minutes remaining.

So I want to share some resources if you want to learn more, particularly about the LaunchDarkly stuff, but they actually cover a lot about using Cloudflare Workers.

That site I built was actually built with Cloudflare Workers Sites as opposed to Cloudflare Pages.

So you can see an example of using that.

I've got links in these to both of those repositories. Using LaunchDarkly with Cloudflare Workers is a full guide.

It goes through the entire application I just showed you and shows you how to set it up, how to use the integration, and also how all of this works behind the scenes.

I did a blog post called Flagging at the Edge, which is again about LaunchDarkly, but this looks at other providers as well as Cloudflare.

We only have the Cloudflare integration right now for the Edge.

So if you're a Cloudflare fan, yay. And then there are the Cloudflare Edge SDK docs; again, all of these are LaunchDarkly-specific, but there is some cool stuff in there.

If you want to just learn more about edge functions, about using things like the KV and the cache as well.

So that's it for my slides, for everything else.

I can't obviously easily answer your questions right here, so I'm happy to answer them if you reach out to me.

My DMs are open on Twitter @remotesynth.

Again, reach out if you want to get one of those three copies of the Jamstack book as well.

But if you have any questions, feel free to reach out on Twitter and then my email is

I'm happy to take your emails again.

I hope that this was useful to you.

The key thing I want to point out here is edge functions are actually really, really cool.

They do deserve a lot of the hype they're getting. But you do need to think about where your data is coming from, because you don't automatically remove all of the latency in your application by moving things to the edge; parts of that application may not exist at the edge.

So third point, take advantage of things like the cache API and the KV to basically move what data you can to the edge to further enhance the performance of your application using Cloudflare.

So thank you all.

I hope that you enjoyed this again.

Reach out to me on Twitter or email.

Bye.

Q2 customers love our ability to innovate quickly and deliver what were traditionally very static, old-school banking applications into more modern technologies and integrations in the marketplace.

Our customers are banks, credit unions and fintech clients.

We really focus on providing end to end solutions for the account holders throughout the course of their financial lives.

Our availability is super important to our customers here at Q2.

Even one minute of downtime can have an economic impact.

So we specifically chose Cloudflare for their Magic Transit solution because it offered a way for us to displace legacy vendors in the layer 3 and layer 4 space, but also extend layer 7 services to some of our cloud native products and more traditional infrastructure.

I think one of the things that separates Magic Transit from some of the legacy solutions that we had leveraged in the past is the ability to manage policy from a single place.

What I love about Cloudflare for Q2 is it allows us to get ten times the coverage we previously could with legacy technologies.

I think one of the many benefits of Cloudflare is just how quickly the solution allows us to scale and deliver solutions across multiple platforms.

My favorite thing about Cloudflare is that they keep developing solutions to problems.

They keep providing solutions, they keep investing in technology. They keep making the Internet safe.

Security has always been looked at as a friction point, but I feel like with Cloudflare it doesn't need to be; you can deliver innovation quickly, but also have those innovative solutions be secure.
