Cloudflare TV

💻 Why Edge Computing Is The Future of Writing Software

Presented by Kristian Freeman, Jay Phelps
Originally aired on

Edge computing represents a paradigm shift in software development. Learn how to take advantage of this technology to solve some of your toughest challenges when building applications.

Jay Phelps, Co-founder of Outsmartly, and Kristian Freeman, Cloudflare Developer Advocate, will discuss how edge computing is different than anything before it, and will review what's new in the space – from deploying static sites to the edge, to writing powerful serverless functions.

English
Developer Week
JAMstack

Transcript (Beta)

Hey, everyone. Welcome to another Developer Week segment. I hope you're enjoying your Developer Week so far.

There's a lot of blog posts and all kinds of other stuff to check out this week.

But I'm really excited about this panel. My name is Kristian Freeman.

I'm the Developer Advocate for Cloudflare Workers and Pages.

You'll see me probably a couple times this week in various videos here and there.

And I'm joined by Jay Phelps. Jay, do you want to introduce yourself real quick?

Sure. Thank you. I'm Jay Phelps, and I'm one of the co-founders at a company called Outsmartly.

What Outsmartly does is we are a platform for doing digital optimizations for your website.

So things like dynamic content, A/B testing, personalization, optimization, those sorts of things.

And we run at the edge. Yeah, totally.

Yeah, you're the perfect person, I think, to talk about this subject. So yeah, this session is about edge computing and why, what did we put?

Why edge computing is the future of writing software.

So maybe to start, we should explain kind of what edge computing is.

So if you've been watching the segments so far up until now, there's probably been, I would imagine, some talk about Cloudflare Workers and particularly kind of the function of having workers run at the edge.

So network applications, you have a client and you have an origin. By the way, Jay, if I explain anything in a way where you feel like this is confusing, feel free to augment any of the stuff I'm saying.

But yeah, basically, you have a client, which is like your computer, your phone, et cetera.

And then you have an origin in like traditional network applications.

So the origin is like a server running somewhere in like some big data center or something like that.

And with Cloudflare, we were providing like caching and like firewalls and all this stuff on all of these servers around the world.

So we have hundreds of them all over the place. And those servers are called like the edge, right?

So there's one, I don't know, Jay, if you've ever checked, but I think the closest Cloudflare edge server to us in Austin is Dallas, last I checked, but- Yeah.

That was the last I checked as well.

Yeah. There is no Austin one, which is interesting. That is interesting.

You'd think that like in the back closet somewhere or something that you all would have- At the Cloudflare office.

Yeah. That is interesting. I've never really thought about that.

Anyway, so, you know, we have all of these servers all over the place, right?

Which we're doing, you know, firewalls and caching and all kinds of stuff that I don't really understand from a network perspective, but it seems very impressive.

And then, you know, someone was like, hey, we have all these servers.

What if we let people deploy JavaScript code onto it? And so now you have this idea of you're not only running code, you know, at the origin and like maybe the client is doing some stuff as well.

If you're writing something that's like a client side application, but now you have this kind of middle place where you can run stuff, you know, whether that's in Jay's case, like optimization stuff, I imagine we'll talk about kind of the details of what you guys do.

But the idea is, you know, now we have this kind of central place where we can also run code, whether that's like augmenting existing code or just running code entirely on the edge and just skipping having an origin entirely.

There's all kinds of different ways to do that.
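
For readers following along, here's what "deploy JavaScript code onto it" looks like in practice, a minimal Cloudflare Worker sketch. The /hello route and its response are hypothetical placeholders, not anything from the discussion:

```javascript
// A minimal Cloudflare Worker: this code runs at every edge location,
// so the response is generated close to the visitor instead of at an origin.
addEventListener('fetch', (event) => {
  event.respondWith(handleRequest(event.request));
});

async function handleRequest(request) {
  const url = new URL(request.url);

  // Hypothetical route handled entirely at the edge, no origin involved.
  if (url.pathname === '/hello') {
    return new Response(JSON.stringify({ message: 'Hello from the edge!' }), {
      headers: { 'content-type': 'application/json' },
    });
  }

  // Everything else falls through to the origin as usual.
  return fetch(request);
}
```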

How did I do Jay? Did I miss anything? It's wonderful.

Yeah. The edge is just a term that unfortunately gets used a little bit loosely, depending on which circle of people you're in. Like, in the IoT world, edge means something very different.

Right. But the important thing is, when we're talking to people, I'm actually surprised that most people are still not yet familiar with the term edge.

Yeah. So we typically like to say it runs in the CDN. At least that's usually a term they've heard.

And that kind of gets the light bulb.

But yeah, no, you did a great job. So what excited you about that?

So I guess, with your background, I was familiar with your work before. You're the co-founder of Outsmartly, right?

So I was familiar with your work before that. What excited you about the edge initially?

What was kind of the thing that got you thinking about and thinking, like, this is something kind of, well, I don't know if you would say it's the future of writing software.

But that's what we're saying.

What made you kind of be interested in it? Well, the biggest thing is that Outsmartly is kind of founded on that principle.

I guess the realization that edge compute is the first time where you're able to kind of have your cake and eat it too, in some respects.

So previously, you had to choose between dynamic content and caching.

Now, "dynamic" is actually a little deceptive. It's not just real-time things like carts.

It could be anything that literally needs to be changed dynamically. You usually had to choose between dynamic content and caching. Which one do you want? Do you want dynamic content?

Well, you got to go back to the origin, you can't cache that, or the other way around.

And so Outsmartly allows you to combine the two. Edge computing in general, I mean, we make it easier, but edge computing in general allows you to combine the two together. We can have caching and computation and be able to do some dynamic stuff.
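
To make Jay's point concrete, here's a minimal sketch (not Outsmartly's code) of "caching plus computation" in a single Cloudflare Worker: the page itself stays a cache hit at the edge, while a small dynamic decision, a hypothetical A/B bucket cookie in this example, is computed per request without going back to the origin.

```javascript
// Sketch: the cacheable page is served from the edge cache, while a dynamic
// A/B bucket is still computed fresh for every request. Names are illustrative.
addEventListener('fetch', (event) => {
  event.respondWith(handleRequest(event));
});

async function handleRequest(event) {
  const request = event.request;
  const cache = caches.default;

  // Static, cacheable part: served from the nearest edge location when possible.
  let response = await cache.match(request);
  if (!response) {
    response = await fetch(request);
    event.waitUntil(cache.put(request, response.clone()));
  }

  // Dynamic part: per-request computation at the edge, no origin round trip.
  response = new Response(response.body, response); // make headers mutable
  if (!(request.headers.get('Cookie') || '').includes('ab_bucket=')) {
    const bucket = Math.random() < 0.5 ? 'a' : 'b';
    response.headers.append('Set-Cookie', `ab_bucket=${bucket}; Path=/`);
  }
  return response;
}
```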

And so for us, you know, we were working on A/B testing related and personalization related stuff.

And it was very obvious that people were adding these tools to their website and immediately slowing it down, sometimes by like 10 seconds, which is like crazy, right?

Like, you know, you're trying to do these things to increase conversions.

But then when you slow your site down that much, you're actually decreasing conversions.

So you needed an even bigger lift to overcome that. So that's kind of what we set out to do.

And then along the way, figured out, you know, got more and more in love with edge compute.

And you know, in the platform expanded on other optimizations as well.

For me, it's just, you know, that's the that's the biggest thing is being able to have your cache and your dynamic content in the same place.

Yeah, yeah. What I've done in the past, you know, is having your own smaller CDN network that's not even really a CDN.

It's more like multi-origin, sort of. Yeah, right. Before Cloudflare, you know, clever.

I mean, I know this isn't necessarily supposed to be a Cloudflare-specific discussion, but it's kind of hard not to talk about Cloudflare, because you guys really have pioneered it, you know, and really paved the way on this edge compute, and all the other CDNs are playing catch up. They'll catch up, don't get me wrong, I mean, without a doubt, they will catch up.

But maybe you guys will keep going ahead, you know, doing things before they do.

Yeah, I guess I should say they'll catch up to where you are now.

How about that? That's a good way to put it. Oh, yeah.

But I'm just excited about that, just because, and you mentioned something kind of in passing about this.

Some people won't even need an origin at some point, you know. They'll be able to just deploy directly to the edge, and it's completely originless. You can mentally just think about it as the edge is your origin.

And sure, yeah, no need to go anywhere else.

You know, and I know, I think you guys just released Cloudflare Pages into general availability today.

Is that right? Yeah, yeah.

Cool. I should say, yes, Cloudflare Pages is in general availability today. If you haven't seen the announcement, go to Cloudflare's blog.

Very excited about it. But yeah, anyway, sorry to interrupt.

Yes. Yep. Cool. Yeah. So I think yeah, there's there's a lot of stuff to unpack there.

Frankly enough, I think we could probably just pick any one of those topics and talk about it for the next 20 minutes.

But I think the first thing that I kind of hear you saying is, you know, that compromise between dynamic and static is really interesting, right?

Like that's, that's very tricky. It makes me think a lot about like the sort of recent push to like Jamstack and stuff like that, where like, people are recognizing the advantages of making as much of your stuff static as possible, right?

If you can get away with doing static, like a static application of some kind, it's going to be a lot faster than, you know, having everything dynamic, both from a bandwidth perspective, right?

Like going back and forth from an origin and stuff like that.

And also, in many ways, if you can build it in a way that that makes sense, it is a lot easier to reason about as well.

But I'm curious, let's start from kind of the some of the stuff you mentioned, like A-B testing and personalization and stuff like that.

Like, what kind of things do you think edge computing is solving from like a business use case?

So like, there's things that you mentioned, like you can have your cake and eat it too, which is great.

Like those, there's things that you couldn't do in the past that you can do now, right?

And so like, what kind of things are you guys thinking about or trying to solve there?

Yeah, I mean, there's really everything you can do.

I mean, like anything you could have done on the origin, now you can do it at the edge to some extent. You know, there are still some limitations and stuff like that, especially because of the nature of it being distributed, right?

And also like your databases aren't running at the edge.

And, you know, if you can get away with a key value store, you can use KV and Durable Objects and these sorts of things, but they're still limited in scope.

You know, at the end of the day, a lot of people still have these origin databases that they have to go back to.

So there's still, you know, there's still room for improvement in that respect, but you can do a lot at the edge. You know, even your API calls could be at the edge and get better performance, frankly. Because, you know, if your API endpoints are at the edge, then they can...

Like, so if your connection between the browser and the edge is not great, or, so how do I say this?

If the device's Internet connection is not great, it's better to get to the edge, terminate at the edge, and then let it go through Cloudflare's, you know, tremendous tunneling stuff to your origin, than it would be to have it go all the way to the origin directly.
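
A rough sketch of the edge API endpoint idea Jay describes, using Workers KV as the key-value store. The PRODUCTS namespace binding and the route are hypothetical, invented purely for illustration:

```javascript
// Sketch: an API route that never touches the origin. It reads from Workers KV,
// which is replicated toward the edge. "PRODUCTS" is a hypothetical KV namespace
// binding configured in wrangler.toml.
addEventListener('fetch', (event) => {
  event.respondWith(handleRequest(event.request));
});

async function handleRequest(request) {
  const url = new URL(request.url);
  const match = url.pathname.match(/^\/api\/products\/(\w+)$/);

  if (match) {
    // KV get: eventually consistent, but served from a nearby edge location.
    const product = await PRODUCTS.get(match[1], 'json');
    if (!product) {
      return new Response('Not found', { status: 404 });
    }
    return new Response(JSON.stringify(product), {
      headers: { 'content-type': 'application/json' },
    });
  }

  // Anything that still needs the real database goes back to the origin.
  return fetch(request);
}
```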

Right. Right. But, you know, the difference is not like crazy, like we're going to blow your mind.

So the things that we focused on: you mentioned Jamstack, and there's been a huge push to Jamstack because you can do most of your work ahead of time, right?

Like, why make the server do it just in time? Do the work ahead of time, build the page, and then serve it statically.

That all sounds nice in theory, but in practice, there are business requirements that are dynamic in nature, A/B testing and personalization being the two most obvious.

So what does everyone do? Well, historically, what everyone has done is they do their static site generation, then they slap on a client-side JavaScript snippet, right?

And the, like I was saying, I mean, I don't want to beat it to death, but that ironically just ends up actually making things slower than if you'd just done server-side rendering to begin with.

And so you lose a lot of those benefits. So that's kind of where Outsmartly is trying to solve that problem: you can keep static site generation but just do the minimum amount of re-rendering on the page that you need to.

I'm not going to turn this into a commercial for Outsmartly, but the gist is, and it's kind of cool to hear about: we do static analysis, compiler magic essentially, ahead of time to figure out the minimum amount of changes that would be required if you change a component's props.

And then that's how we're able to just do the minimum amount of re-rendering at the edge.

So we don't do, like, full server-side rendering.

I mean, it's server-side rendering in the vague sense, but it's definitely a different paradigm.

But because of the static analysis, it just works, you know, magically.

You don't have to, to handle the logistics and all that.

Yeah, and, and I think what's interesting from, you know, we're in, like, developer week or whatever.

So, like, from a sort of technical perspective, like, it really is the kind of thing where you are full-on, like, running code on the edge and, like, changing stuff about, like, an already architected application.

Like, that's, that's something that just, like, really wasn't possible before, right?

So I think it's useful to kind of help people understand, because Workers, I think, up until now has been thought of a lot as just kind of a serverless function space to deploy things, right?

Like, I can deploy these API routes or whatever, right?

And, and that is cool, and it works well, and all of that stuff is still very, you know, very valid use case.

But there's also this thing where, when you have a full platform, like a JavaScript platform, sitting in between your client and your origin, you can do things like what you guys are doing, which is full-on, I don't want to say code replacement.

I don't know how you would, how would you refer to it to simplify it?

It's something that makes no sense anywhere else other than a CDN.

And if we're talking about the technical details, using it is easy, and you don't have to know how it works, sort of thing.

But the technical details are somewhat confusing to understand, because essentially, we're able to take the cached HTML, re-render just the minimum amount of your application, and then apply that difference to the HTML document while it's actually being streamed to the browser.

So we can start the stream and concurrently start figuring out what the minimum amount of changes are.

And then the streaming HTML parser will pause if it needs to. We're basically creating a race intentionally.

Whoever wins has to pause and wait for the other one, right?

So if we've been streaming the HTML and we don't have the result of this re-render yet, which is very rare, but it could happen, then we pause the HTML streaming and wait for it, and the other way around, right?
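
This is not Outsmartly's actual pipeline, but the underlying primitive can be sketched with Cloudflare's HTMLRewriter: the cached HTML streams to the browser while the dynamic piece is computed concurrently, and the output pauses on the affected element only if that result isn't ready yet. The #greeting selector and the lookupGreeting helper are invented for illustration.

```javascript
// Sketch of streaming HTML transformation at the edge (not Outsmartly's real
// pipeline). The response body is rewritten as it streams to the browser.
addEventListener('fetch', (event) => {
  event.respondWith(handleRequest(event.request));
});

async function handleRequest(request) {
  // Kick off the (likely cached) HTML fetch and the dynamic lookup
  // concurrently; neither blocks the other from starting.
  const pagePromise = fetch(request);
  const greetingPromise = lookupGreeting(request); // hypothetical helper

  const page = await pagePromise;

  return new HTMLRewriter()
    .on('#greeting', {
      async element(el) {
        // If the dynamic result isn't ready when the parser reaches this
        // element, the output stream pauses here until it is.
        const greeting = await greetingPromise;
        el.setInnerContent(greeting);
      },
    })
    .transform(page);
}

// Hypothetical stand-in for the "minimum re-render" computation.
async function lookupGreeting(request) {
  const country = request.cf && request.cf.country;
  return country ? `Hello, visitor from ${country}!` : 'Hello!';
}
```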

But yeah, I think the more important takeaway from that is just that this is a use case that was physically not possible prior to these edge compute workers.

And it's, it's a different paradigm.

And I think we'll see a lot of that sort of thing where, you know, durable objects being a great example of that, of a paradigm that really just doesn't make any sense in any other way.

And so like, and, and new applications, I think, I think that's going to be the funnest thing for you all.

It's also been fun for me to play with. Traditionally, like, how do I say this? I'm trying to get my thoughts out.

If you've been a developer for a while, you've learned about the limitations of servers and the limitations of distributed computing and stuff, right?

I like to kind of unlearn some of those things that we've learned and come and realize, like, there are new opportunities, new businesses that could be developed around this, these new emerging technologies, not just the stuff that Akamai and Fastly and Netlify and all these other folks are doing too.

Sure. Yeah.

It's like kind of unlearning some laws of computing that you were taught or whatever.

Yeah. Yeah. That's, I think that's super important. So that's actually a great segue into my next question.

So we talked about from like a business use case, right?

Like, and I think we kind of nationally walked into like the developer experience part of it as well.

Like, you know, because you have this new place where you can deploy stuff, there's business implications, but there's also developer experience implications.

So like kind of, you know, you've, you've been building a product on this platform and not just on the platform, like it's not just workers, right?

As you've said, to be fair, like it's, it's kind of the mindset or the paradigm, like you said.

So what kind of things have been a shift for you there in terms of kind of writing code on the edge at the scale that you're doing and stuff like that?

I can tell you that that's probably the biggest frustration at this point, ironically.

And it's getting better. Every few months, every six months or so, there have been improvements in the various CDN worker providers, but no one's doing this well enough that you won't feel the pain, unless it's a very simple thing.

You know, if you're, if it's a very simple thing, of course it's fine.

It's no big deal, but the big pains are debugging. That's the biggest pain so far.

Because no edge cloud worker provider, Cloudflare included, lets you actually run your code locally with an actual debugger, you know, sort of thing.

So, you know, any kind of debugging, testing, any scenario where typically you have to have it running locally, like physically locally, not just fake it.

Because Cloudflare does a really good job with Wrangler of faking it, making it seem like it's running locally when in fact it's not; it's running in some, you know, sandbox, not actually on your machine.

I don't think you guys actually deploy it to the edge for that either, but anyway, that's a pretty big pain point, especially when you've got significant code that has a lot of different possible paths it can go down.

So console logging, you know, is kind of your friend, but even that story, unfortunately, isn't super solid in any of the cloud providers, Cloudflare included.

So I think that's a big one, and I would be shocked if you guys, and all the cloud providers, don't know it.

It's a challenge, right? Yeah, for sure. How do you ship Cloudflare's infrastructure to someone's computer?

You know, there are totally issues: there are logistical issues, there are security issues, there are, you know, IP issues.

There are so many issues with how you do this.

And so it's tough, you know. And testing is another thing. We've got this monstrosity of fake mocks to do local testing, because we've tried doing testing in the actual worker, and there are limitations that come with it.

But the other thing is just that we're at the bleeding edge.

When it comes to those things, I would say the use cases the vast majority of people are using them for today are very simple, like, I just need to do some basic logic to rewrite a URL, or replace these images, or replace this thing.

Basically, if you can put all the code in one or two pages, you're probably not going to feel this pain nearly as much, because you're probably doing console debugging anyway, even if you could do local debugging, because it's just so simple, right?

Yeah. And when I say simple, I don't want to discredit the power. You know, these are all things that you couldn't do before.
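
A sketch of the kind of "fits on one or two pages" Worker Jay has in mind, a URL rewrite plus an image swap. The paths and the asset URL are hypothetical examples, not from the conversation:

```javascript
// Sketch of a "simple" edge use case: rewrite a legacy path and swap an image,
// all in a handful of lines. The specific paths here are made up.
addEventListener('fetch', (event) => {
  event.respondWith(handleRequest(event.request));
});

async function handleRequest(request) {
  const url = new URL(request.url);

  // Rewrite an old URL to its new home before hitting the origin.
  if (url.pathname === '/old-pricing') {
    url.pathname = '/pricing';
  }

  // Serve a replacement image without touching the origin at all.
  if (url.pathname === '/images/hero.jpg') {
    return fetch('https://assets.example.com/hero-v2.jpg');
  }

  // Forward the (possibly rewritten) request to the origin.
  return fetch(new Request(url.toString(), request));
}
```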

Yeah. But when you build a full-fledged, edge-specific application, an application that's designed to run on the edge, and it's a big thing, then you definitely feel those pains.

And so that's, I mean, that's something where, yeah, no one's doing that well today.

Sure. Yeah, I think that totally makes sense. And yeah, like you said, when you have something running at the edge, it's not just JavaScript, right?

I mean, it is the code that you're writing is JavaScript.

There's also all this stuff around it, like KV or Durable Objects, or like Cloudflare's cache.

I mean, it's not just us, right? It's any of the providers. It's just really hard. You know, it's one of those things, like in It's Always Sunny in Philadelphia, where he has all the things plotted on the wall, like the diagram.

Yeah, it's like, if you were to plot out everything that's going on in the network, it's really like that, that would be really hard to model locally, right?

So I completely agree.

I think that's, that is definitely, that's tricky. Yeah. I mean, what have, I'd be curious more for you guys specifically, like what, so when you guys are doing stuff like replacing props on the edge, basically, which is crazy, like, that's, that's awesome.

Like, that's such, but that seems like such a wild thing, right?

Like, how, how did you even get started building? Like, how do you even begin approaching that problem?

Because like, you're not just taking in like a string of React code, right?

Like, it's not just a bundle, you're thinking of it as like, this is an application, we actually need to be able to like inspect and talk to and stuff.

So what was that process like? I mean, it was definitely iterative.

You know, I have a pretty substantial compiler background. And we really didn't actually even reach for the compiler stuff at first, just because it didn't ring any bells; I didn't think about it.

The first early iterations were manual, like, you know, you annotate things.

And then it was, you use helper components and wrap these things, not unlike a lot of existing solutions today.

But there were performance limitations with that.

And that involved full renders. So essentially, we were trying to optimize further and further and further.

And then, you know, we came up with an approach that worked, but it basically would only let you replace content, and you'd have to give it the replacement string, right?

And that's not a good user experience, like, developer experience, especially if it's some complicated thing.

It seems prone to error also. Oh, yeah. And then, you know, how do you deal with rehydration?

Like, that was one of the big problems that a lot of people just don't kind of realize.

How do you deal with client-side?

Like, what if they're doing client-side navigation onto the page?

Same thing has to work, right? So we needed to come up with an approach that worked no matter what: worked server-side, worked client-side, worked with rehydration.

And static analysis really was what allowed us to do that.

We decided to focus on React first because it's one of the hardest to support, because it's just JavaScript.

Like, you know, you have to be able to analyze JavaScript, which arbitrary JavaScript is hard.

And it's not that we can support literally anything.

I mean, you can't use eval and stuff like that.

But the big, the big thing that led us to kind of cheat is that we only support hooks, right?

Components that use hooks. Oh, it just, okay. Because there's rules around it.

Like, one of the rules is that you can't have any observable side effects outside of that thing unless you put them inside a useEffect.

And the useEffects just don't run, right? They don't run until after the component has been, you know, mounted.

And that allowed us to, you know, get away with it. One of the problems with static analysis in general is that side effects are really hard to follow without actually interpreting the application.
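
As a rough illustration of why the hooks-only rule helps (a generic React example, not Outsmartly's API): observable side effects live inside useEffect, which never runs during a server or edge render, so analyzing "what changes when this prop changes?" only has to consider the pure render path.

```javascript
import React, { useEffect, useState } from 'react';

// Generic example, not Outsmartly's code. Because the only observable side
// effect is inside useEffect, rendering this component on a server or at the
// edge is a pure function of its props and state, which keeps static analysis
// of prop changes tractable.
export function Greeting({ name }) {
  const [visits, setVisits] = useState(0);

  useEffect(() => {
    // Effects only run after the component mounts in the browser;
    // they never execute during server-side or edge rendering.
    setVisits((v) => v + 1);
    document.title = `Welcome back, ${name}`;
  }, [name]);

  // The render output depends only on props and state.
  return <p>Hello, {name}! Visits this session: {visits}</p>;
}
```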

I can't even imagine. Just give me a headache thinking about it.

But, I mean, don't get me wrong, there's still... we thought it would be hard.

Like, we thought it'd be really hard.

We didn't realize it was actually going to be way harder than we thought.

It's 10x harder than however hard you thought it was going to be.

Yeah. But we did it, you know. And we've also had help from some of the webpack community, and we're releasing a webpack plugin here shortly that'll allow it to integrate more with your existing setup. If you have weird loaders and all sorts of weird things, it will work with that too.

But yeah, so that's that. We just saw that there was a big opportunity for, you know, doing these things, and the power there. So that was the initial stuff.

And then, as we started working with more and more companies, we realized that almost all of them, if not all of them, I can only think of one example off the top of my head, had not yet actually been utilizing the edge in any capacity, other than just what was built into whatever their provider offers. You know, if they're using Vercel, it's like, oh, they've got an edge, and they're just like, I don't know what that is.

I don't know what it does. It's, it's doing stuff.

Right. And so we also set out to, you know, we're already at the edge.

We might as well help you solve that problem, make it easier for you to take advantage of these sorts of optimizations. Like, everyone's website, if you run it through Lighthouse, which is this tool that Google provides, I'm sure you're aware of it, that can give you more insights into your performance and your SEO and all these sorts of fun things.

Right.

I mean, almost all websites have just awful performance. Like, it's laughable.

I mean, even the people who should know better. The funny thing is to go to a blog post or go to one of the Google pages.

That's like telling you how important it is to have good performance and run that page through their own tool.

Like web.dev, which is where you can run this.

If you run web.dev through the performance tool itself, you'll see how dismal their performance is.

The point is not, it's not trying to be a knock on them.

It's just, it's hard. Yeah. Yeah. It's not just hard to get it fast, but I'd say even harder is to keep it fast.

You can do all this effort, move into Jamstack, you move all this stuff.

And then the performance slips. Marketing says, Hey, I need this script, you know, whatever this and that and the other, and there's, you add the fonts and you change this.

And before you know it, your, your performance is slow.

And so somehow you kind of have to be an expert on performance alone to know all the quirks.

So we're trying to solve that problem as well, and just optimization in general, because all those things affect your conversions, your performance, those sorts of things.

And in some cases they can affect it dramatically.

If you're a direct to consumer brand, a majority of your traffic is probably mobile these days.

And on mobile, it's even more important to have good performance.

They get killed on that. So, yeah. Yeah. I mean, you talked earlier about the idea of someone who doesn't have a great network connection. What if they can just terminate whatever request is coming from the phone at the edge?

That's a really big thing. There's some stat I always see about e-commerce: speeding up by some percent increases sales by some percent.

It's some cool number, but I don't remember it. Google did a study that everyone loves to quote.

I don't know the numbers off the top of my head, but yeah, Google did a study, and it's somewhat counterintuitive too. I expected, before I read the study, for it to have diminishing returns much sooner.

Like, in the sense of the difference between three seconds and two seconds being way less than the difference between three and four seconds.

But in fact, it's the complete opposite. I don't know why, but people who are smarter than me about those sorts of things did a study on it and showed that it's actually the opposite. And there are diminishing returns, you know, eventually, right.

You know, the difference between 30 milliseconds and 60 milliseconds is probably not, you know, like if you were able to actually display your page in that performance, which you can't, but.

Right. Yeah. Yeah. That, that totally makes sense.

So we only have a couple of minutes left. One, one more question for you.

So since we're in developer week, let's leave it with a kind of developer focused question.

Do you think there is a good first thing, like, where to start, right?

That's kind of the question: is there an obvious first thing that you should bring to the edge, or do you think it's broadly applicable enough that people should just try it?

What do you think about that, knowing what you know now?

Yeah. I mean, that's tough. Everyone's situation can be different, but the most important thing, to get the most bang for your buck when you're starting out: most people still haven't correctly set their caching headers and their caching policies and these sorts of things.

Before you do anything, I would do that first, to be honest.

Caching is voodoo, but if you did caching 10 years ago and you're now afraid because of what happened 10 years ago, it's a lot better these days.

Like, Cache-Control is almost unanimously supported, the Cache-Control header, whereas before, you had, you know, not 50 different behaviors, but the browser support was weird.

And now, anyway, there are still tricky parts, right?

So Cloudflare and Netlify and Akamai, I'm trying to remember which one.

So I think Fastly is the only CDN provider that supports Surrogate-Control, which is basically the ability to do cache control on your CDN differently from what you want on the browser, driven by a header.

I think they're the only ones. Now you can do it manually, but anyway, my point in bringing that up was just that that's one place where you can use a worker.

If you're like, let's say you're on Cloudflare, like, you know, you can work around that, right?

You can, you can do caching yourself within the worker. So you can write your own policies essentially in the worker and cache things differently, however you want.

So I think caching is probably one of the most important places to start.

It's difficult. It is a little difficult, but it's a very, very important thing.
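
A sketch of the "write your own caching policy in the worker" idea Jay describes: cache aggressively at Cloudflare's edge while giving the browser a much shorter Cache-Control, roughly what a Surrogate-Control header would express. The TTL values are arbitrary examples:

```javascript
// Sketch: emulate Surrogate-Control with a Worker. The edge caches the asset
// for a day, but the browser is told to revalidate after 60 seconds.
addEventListener('fetch', (event) => {
  event.respondWith(handleRequest(event.request));
});

async function handleRequest(request) {
  // Ask Cloudflare's cache to hold this response for 24 hours at the edge.
  const originResponse = await fetch(request, {
    cf: { cacheTtl: 86400, cacheEverything: true },
  });

  // Make headers mutable, then give the browser its own, shorter policy.
  const response = new Response(originResponse.body, originResponse);
  response.headers.set('Cache-Control', 'public, max-age=60');
  return response;
}
```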

Okay. We've got just a couple of seconds left.

So people can find you on Twitter, underscore Jay Phelps (@_jayphelps), if they have any questions.

I'm SignalNerve on Twitter. Yeah. I think we're out of time. Thank you everyone.

And yeah, thank you, Jay.
