Yesterday, Today on the Cloudflare Community
Presented by: Tim Cloonan, Scott Dayman
Originally aired on September 27, 2021 @ 10:30 AM - 11:00 AM EDT
A fast-paced look at Cloudflare Community activity, a deep dive into the hot issues from yesterday -- and related Community Tips and tutorials. Featuring an interactive troubleshooting session led by a Community MVP.
Original Airdate: July 17, 2020
English
Tutorials
Community
Transcript (Beta)
Good morning. Welcome to Yesterday Today on the Cloudflare community. I'm your host, Tim Cloonan.
If you'd like to know more about the Cloudflare community, join us every Friday for a new edition of Yesterday Today.
Each week, we start by looking at a summary of popular topics on the community and site traffic from last week with This Community Day and the Community Traffic Report, starting at 10 a.m. Pacific.
That's followed by a review of our top community issues from last week, the ever-informative Using the Community tip, and occasional interviews with community MVPs, customer support engineers, or Cloudflare product managers.
And every week, we conclude with In Class with Cloudflare, where we learn a few things from the community.
Turning to the traffic report, overall community traffic was up versus the seasonal slowness that we saw last week, with new posts and new topics both up over the prior week.
The number of new customers joining the community was flat week over week, but remains up for the year.
For This Community Day last week, the top three searches on the community were for maximum file upload, that's the setting in the Network app of the Cloudflare dashboard; DNS questions ranging from how to change my name servers to gray- and orange-clouding DNS records; and searches regarding WordPress speed.
The most popular category for discussion with the community last week was the developers category with questions about workers leading the space.
If you remember two weeks ago, all of the categories were about equally active, and we had several ties.
Activity was close this past week.
Security was followed by Performance, and they both actively trailed Developers in terms of popularity.
And that leads us to our top story today.
Today, we're going to talk about a topic that has come up frequently over the past several months; it comes up often on the community as customers look to optimize their site performance.
Before we dig in, we'll be referencing a number of sites, posts, tips, and tutorials throughout the show today.
Go to the link that's shown on the screen to follow along or go to it later to review the assets that we're going to be talking about today.
Next, let me ask you to join the show by submitting your questions to livestudio@cloudflare.tv, or just hit the Email Into the Show button on Cloudflare.tv, and we'll take your questions live on the show.
And today, I'm joined by one of our long-time Cloudflare MVPs, Scott Dayman.
Scott, welcome.
We appreciate you being here today. Well, thank you, Tim. I really like watching Cloudflare TV.
I've been watching quite a few episodes lately. Fantastic. Yeah, I didn't realize I had quite as much time available in the day to watch it as I do, and it's quite enjoyable.
So we're going to dig into our top story today, performance 101.
And we're going to be talking primarily about speeding up your website with caching and caching options using page rules, Workers, minification, and Rocket Loader.
And we'll probably touch a bit on origin performance optimization as well, because that dramatically affects overall site performance.
Scott, can you talk us through some of the issues and options with performance optimization?
Sure, Tim. I was going to start with Cloudflare's default configuration.
So if you just add a fresh domain to Cloudflare, there's a bunch of default settings that you get with that.
And then I'm going to go into some more settings and options to improve your performance by using Cloudflare settings, page rules, and workers.
The websites that we look at generally fall into two categories.
We've got static websites where the content really doesn't change from page to page.
And then there's dynamic, which frequently changes per viewer.
Dynamic sites usually are using some software to generate that HTML code on the page every single time somebody visits the page.
And again, sometimes that page content never changes, even though it's, quote unquote, a dynamic site.
Other times, websites vary the content for every user, for example when there's a comment on a post, or especially if you're using a shopping cart on an e-commerce site.
But again, most sites that we look at have files that never change, like CSS style sheets that control the look and layout of the page, JavaScript that tells the browser to execute some commands, and of course, images.
For this demonstration, I've configured a virtual private server in Los Angeles that's close to me for better performance.
This server has four identically configured websites running WordPress with the Twenty Twenty theme and is populated with WordPress's theme unit test data.
You can download that file if you want to populate a sample site.
The origin site itself has no special optimizations for caching or any plugins at all.
It's just a straight vanilla Twenty Twenty theme with nothing else going on.
The server itself is configured to use HTTP/2, so all the resources are served in parallel. With the old HTTP/1.1, the server would send you one file, then the next file, then the next; with HTTP/2, it can send them all at the same time, which greatly reduces the time you spend waiting for the site.
And Cloudflare itself also offers HTTP/2 delivery of content.
So the first server that I'll cover is a direct connection that does not use Cloudflare at all, except for DNS.
My domain is going through DNS at Cloudflare, but this first site is a straight IP address to the server.
The other three, they go through Cloudflare with their own unique settings.
One has the default configuration, again, the standard settings you get when you add a site to Cloudflare; then I have one that uses a cache everything page rule; and on the final one, I get a little help from Workers.
So for sites going through Cloudflare, users can choose from many settings.
Unless otherwise configured, all sites have standard caching. Again, that covers images, CSS, and JavaScript, and those will be cached at Cloudflare data centers near your visitors after the first visit.
At first the cache is empty, but once a visitor requests a resource, Cloudflare pulls it from the origin and saves it for a while.
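As a quick way to see that first-visit behavior for yourself, here is a minimal sketch, not something from the show, that requests a static asset twice and prints the CF-Cache-Status header Cloudflare adds to proxied responses; the URL is a placeholder for any orange-clouded asset on your own zone.

```typescript
// Hedged sketch: watch the edge cache fill on the first request.
// The asset URL below is a placeholder, not one of the demo sites.
async function checkCacheStatus(assetUrl: string): Promise<void> {
  for (let attempt = 1; attempt <= 2; attempt++) {
    const res = await fetch(assetUrl);
    // Typical values: MISS (or EXPIRED) while the edge cache fills,
    // HIT once it's populated, DYNAMIC for HTML that Cloudflare does
    // not cache by default.
    console.log(`request ${attempt}: CF-Cache-Status =`, res.headers.get("cf-cache-status"));
  }
}

checkCacheStatus("https://www.example.com/wp-content/themes/twentytwenty/style.css")
  .catch(console.error);
```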
Other options include minification to compact the text in your website, Rocket Loader, which helps speed up and optimize JavaScript loading, and page rules that can toggle these options on or off for different parts of your site.
And Cloudflare also offers the Workers product, which will let you store and execute portions of your site in Cloudflare data centers.
So for our first one, it is, let's see, it is demoflare.cf.
And that one, again, is my raw server straight exposed to the Internet.
And that's my baseline test for my performance testing.
Again, it's using DNS, but DNS is pretty quick for most sites anyhow.
So visitors will connect directly to the origin server for all the content, the HTML, the images, the CSS, and the JS.
And for setting up my baseline test, I'm using fastorslow.com to test the site performance from 18 locations around the world.
And I ran it a bunch of times to get the best score, and the highest global average was 59.
And that's not so great. You can see a lot of yellow bars for kind of median performance and only the few green bars that are physically closest to my data center.
But if we take a look at the time to first byte, that's TTFB, it's about 2.1 seconds.
Again, that's a global average. So in some areas which are closer to the server, it's going to be pretty fast.
But farther away, you got to traverse the entire globe to get to the server.
And again, TTFB is when you're first starting getting data from the website.
And then we get the first CPU idle.
And that means, all right, your browser has downloaded the site and it's rendered the whole thing.
And that's kind of the finish line for when you're looking at it.
And in this example, the first CPU idle is at 2.74 seconds.
So during this test, it's taken 2.74 seconds before your visitor's browser can see the entire site.
So keep those numbers in mind: TTFB for this raw server was about 2.1 seconds and completion at 2.74 seconds.
That's an elapsed time of about 0.6 seconds between first byte and completion for that content to finally make the trip.
For my next test, I've set up another site.
Again, same server, same identical configuration, except this time it's going through Cloudflare with the default configuration.
And as I said, standard configuration will cache just the static files, CSS, JavaScript, and the images themselves.
And I ran through this test several times, just like the last time, because, as I mentioned earlier, it takes a few requests for the cache at the edge servers to populate.
So the very first visitor to your site in that area, through that data center, they're not getting the cache content.
It's got to pull it from your origin server.
After a few hits over there, the Cloudflare cache for images and the static files is already full.
So anybody who comes in later, they're getting much better performance.
But still, the HTML has to come from your origin server.
And that can certainly slow things down a little bit.
However, because Cloudflare has cached all the static files, fewer requests hit the origin; as you see in the data, it's 16 requests to get the website.
Now, 15 of those requests are handled by Cloudflare because it's caching the static files, and only that one request makes it through to the server.
So now the server's got a lot less work to do.
So the overall time looks a lot better because of that improvement in time to first byte.
It does seem though, nonetheless, I mean, we see the improvement, but we do see a lot of questions in the community about people that don't seem to notice the performance improvements when they set up Cloudflare.
Can you talk to that?
Sure. I know a lot of people take a look at one specific data point, such as time to first byte, and they're like, oh, well, my time to first byte hasn't improved.
And generally, that's because, as I said, it's HTML, and it still has to go to the origin server.
And nothing happens until the visitor's browser gets that HTML to start pulling in the other static files.
So TTFB doesn't change a whole lot for a lot of visitors.
In this case, the server's pretty fast, and TTFB has improved a little bit because it's not as loaded up.
But it can get better if I can avoid that long trip to the server for it.
What people don't realize often is that if you take a look at the end time, again, that first CPU idle, we've shaved off over a second of that.
It's almost cut it in half because Cloudflare has cached all that static data.
And now it's just shooting it to your browser as quick as it can without having to grab it from your origin server.
So the numbers in this one, if you look at the elapsed time, it's about the same because we're still running HTTP/2.
Everything runs in parallel, and it happens a little bit faster.
However, that TTFB has gone down quite a bit because the server's not working as hard.
So that's certainly a good thing. And you'll also notice all the green bars, which is a little bit surprising because, again, the only thing that's changed is that TTFB has lowered.
So everything is happening a little bit earlier.
And just because you get a little jump start on that, now you're getting all green bars on that, and our score is a 90.
If you look at the only yellow number up there for the data, it's first meaningful paint.
So it's taking a little bit of time to get the content still, but we can do a little bit better.
So for my next one, we've done standard caching.
And if you looked at a WordPress blog site, it's pretty static.
It's just your posts, and sometimes you get comments, and I'll talk about that in a little bit because comments introduce some dynamic data.
Now, with cache everything, it's certainly something that you've got to be really careful about because you're caching that first visit.
So whatever a person sees is what the next person sees.
Now, if somebody is going to see an image file, that's not going to change at all for the next visitor.
The CSS doesn't change, and the JavaScript doesn't change.
However, it's the HTML that you have to be careful about.
On this site, because I don't have comments turned on, and there's really nothing else changing from view to view, it's safe to cache everything.
Now, a lot of people have landing pages where it doesn't change.
Anybody comes and visits the landing page, it's just kind of a little sales pitch for their product.
And there's not comments on there.
There's not a shopping cart. It's just information, and that doesn't change.
So you can use the cache everything page rule to set that up for you.
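For reference, a Cache Everything rule can also be created through the Cloudflare v4 API instead of the dashboard. The sketch below is a hedged example rather than anything from the demo: the zone ID, API token, and hostname are placeholders, and on a dynamic site you would want a much narrower URL pattern.

```typescript
// Hedged sketch: creating a Cache Everything page rule via the Cloudflare
// v4 API. ZONE_ID, API_TOKEN, and the hostname are placeholders.
const ZONE_ID = "your-zone-id";
const API_TOKEN = "your-api-token";

async function addCacheEverythingRule(pattern: string): Promise<void> {
  const res = await fetch(
    `https://api.cloudflare.com/client/v4/zones/${ZONE_ID}/pagerules`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${API_TOKEN}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        // Which URLs the rule matches; keep this narrow on dynamic sites.
        targets: [
          { target: "url", constraint: { operator: "matches", value: pattern } },
        ],
        // cache_everything caches HTML as well as the static files.
        actions: [{ id: "cache_level", value: "cache_everything" }],
        status: "active",
      }),
    }
  );
  console.log(await res.json());
}

// Example: match the whole hostname, as on the cache-everything demo site.
addCacheEverythingRule("cache.example.com/*").catch(console.error);
```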
So if you've got cache everything, well, congratulations. Now, every single resource from your website is cached at the edge close to your visitor.
Now, if you look at time to first byte, it has dropped significantly more.
It's down to about 103 milliseconds. And that's because now the HTML, which is the first thing that your browser needs, is right at that Cloudflare data center very close to you.
And then thanks to HTTP2, it's grabbing everything else in parallel as fast as it can.
So not surprisingly, it's still about 0.5 seconds of elapsed time to grab all that data using HTTP2.
But now, if you look at the scores on fast or slow, it's almost straight 100s.
Just that very first one on first contentful paint is about a 99, I believe.
And that's fantastic. But again, using cache everything, you got to be really careful about that.
And just keep in mind that what one person sees is what everybody's going to see.
And so now if you look at the first CPU idle, it is down to about 650 milliseconds.
And that's just amazingly fast.
If you go to that site, cache.demoflare.cf, you'll see it's a pretty long page of content, along with quite a few images as well.
And to get a full page like that loading in 650 milliseconds is pretty amazing.
So cache everything seems like a panacea.
That is, I mean, we have folks that struggle on the community.
It's a nice alternative to standard caching. The performance bump is fantastic.
What am I missing? What should I be worrying about? Well, as I mentioned, cache everything really needs to be used cautiously.
Whoever looks at your site, they're getting presented with some information on the page.
Now, on this demo site, that doesn't change at all.
But if you look at something like a WooCommerce site, where you've got that little shopping cart at the top of the screen, that's going to change depending on what's going on with that visitor.
So, you know, anybody who's not logged into WooCommerce or like a membership site or like a forum site, it's just kind of generic text on there.
It doesn't change. They don't have any special privileges in the access.
So, they're just getting kind of the plain vanilla view.
But just imagine if you have a shopper who goes into a page and they click on an item and they go, oh, that's pretty neat.
I'm going to add this to my cart.
Well, now that page has a shopping cart at the top that says, oh, you've got an item in your shopping cart.
Now, if you use cache everything, Cloudflare is going to cache the view of that page, with the shopping cart that has one item in it.
Now, if the next person comes up and looks at that item and they happen to hit that page that's been cached, they're going to go, wait a minute.
One item in my shopping cart?
I haven't been shopping. I haven't done any of this. You're looking at a version that was cached for somebody else.
And even a little bit more. Sorry? I was going to say that's, yeah.
And now I understand why we've seen a few comments on that in the community where people are surprised, right?
And it's like, oh, I've been hacked or something, right?
And they interpret it in all sorts of different ways, but it seems to be correct.
Right. So imagine you're a WordPress admin and you happen to have that toolbar option turned on when you're visiting your site after you're logged in. If you're using cache everything, every page you go to, you're going to load the cache with that toolbar at the top.
And so some visitor is going to come along and then go hit the front page of your website and like, wait a minute, I can see an admin bar on this page.
Do I have admin access?
Well, thanks to cookies and security within WordPress, you're not going to be able to get in.
You can click those things and all of a sudden WordPress is going to go, oh, sorry, we can't let you do anything because you're not logged in.
But somebody who's on Cloudflare who decides to turn on cache everything because they saw it in a tutorial somewhere, they're going to get a little bit unnerved because they go, wait a minute.
My visitor is saying they're seeing the admin bar or they go to one of my stories and there's an edit link.
Well, right.
Unfortunately, if you're trying to use cache everything on a site that really is dynamic, you're going to run into some snags like that.
There's still some safety.
For example, take the WooCommerce shopper who sees something in the shopping cart that they didn't order.
Well, when it comes time to check out, that whole thing is going to fall apart because they're not going to be able to check out accurately, and it's really just going to bomb out.
So in examples like that, cache everything really isn't the solution for them.
What is the solution?
A really fast server. If you're going to be running WooCommerce, and that's where a lot of people in the community go, hey, I'm trying to speed up my WooCommerce site, well, again, time to first byte is the thing a lot of people look at, and they notice, yeah, my time to first byte is still five seconds.
But your product page or your catalog page that has 50 images and all this text, those images are going to come through really quickly because they're going to be in the Cloudflare cache.
So you're going to speed up the overall time. But if your server still takes five to ten seconds to generate that WooCommerce page, visitors are going to be a little bit unhappy.
It's just, it's really a server performance issue.
Anything that's dynamic needs to be generated dynamically. So the server's got to be fast to generate that stuff dynamically.
Now there are some servers that are out there where internally they can do their own special caching, but if you're going to use cache everything on Cloudflare, it's kind of an all or nothing for that page.
Some fancy servers can do partial caching. So that little shopping cart at the top is not going to be cached, but the rest of that product page will be cached, but that's a server issue.
You really need a specialized server setup for WooCommerce if you want high performance.
Gotcha. All right. Now my final setup is kind of like my little cheater setup.
Thanks to Cloudflare workers, you can speed up a lot of activity on your website.
Now this one, you're looking at the numbers like, oh, that's awful.
Look, it's only a 67. I see some red numbers out there.
Now, this is the very first load of the site, so the Cloudflare cache has not been populated at all.
And it looks almost like the straight-to-origin setup from the very beginning, a mix of green bars and yellow bars.
However, if you take a look at time to first byte, it's amazing.
It's 106 milliseconds. But then from there to first CPU idle is close to two seconds.
So that elapsed time has gone from that 0.5 seconds up to two seconds.
Now, that's because in this one, my little cheater mode for Workers is that I have the HTML pre-cached at the edge, so to speak.
I've loaded up the HTML content as a Worker and pushed it out to every edge location at Cloudflare.
So that's one of the nice things about workers.
With standard Workers, the script that you push out is preloaded at every edge location at Cloudflare.
There's a tiny bit of a, I forget what they call it.
It's kind of a cold start for that script to run, but it's there.
It's out there, but look, it's 106 milliseconds for that worker to load up the HTML.
But now your browser has got to hit Cloudflare and grab all that content back from your origin server.
And we're kind of back to the beginning again, where we're not caching anything and all those hits have to go to the origin server.
So that score was a 67.
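To make that cheater setup concrete, here is a minimal sketch of a Worker of that kind, assuming the type definitions from @cloudflare/workers-types; the embedded markup is placeholder content, not the actual demo page, and anything other than the homepage simply passes through to the origin.

```typescript
// Hedged sketch of HTML-at-the-edge: the homepage is baked into the Worker
// script, which Cloudflare deploys to every edge location, so serving it
// needs no round trip to the origin. Assumes @cloudflare/workers-types.
const HOMEPAGE_HTML = `<!DOCTYPE html>
<html>
  <head><title>Landing page</title></head>
  <body><h1>Served straight from the edge</h1></body>
</html>`;

addEventListener("fetch", (event: FetchEvent) => {
  const url = new URL(event.request.url);
  if (url.pathname === "/") {
    // Answer the homepage from the script itself.
    event.respondWith(
      new Response(HOMEPAGE_HTML, {
        headers: { "Content-Type": "text/html;charset=UTF-8" },
      })
    );
  } else {
    // Images, CSS, and JS still go back to the origin, where the normal
    // Cloudflare static-file caching applies.
    event.respondWith(fetch(event.request));
  }
});
```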
And again, I talked about some of the numbers. Now in our final example, this is what it looks like when your HTML is out at the edge.
And it looks very much like the cache everything rule.
And kind of the same rules apply as well: your HTML is already pushed out there.
Now this one is really just a landing page demo.
As I said, landing page quite often just has no dynamic content at all.
Anybody comes out and visit it, it's going to look the same for everybody.
So this is a safe approach. It's similar to if you did a cache everything rule, but your matching parameter is just worker.demoflare.cf/ and nothing after that.
Your homepage would cache out to the edge, but like all the other cached resources, it takes a visitor hitting it the first time to load up that cache.
Not so with a worker though.
The cache is effectively preloaded as a Worker. So first byte on this one is again about the same time, 106 milliseconds.
But now thanks to having everything else preloaded in the cache, our first CPU idle is down at 680 milliseconds.
So now this site that originally loaded in about two and three quarter seconds loads in less than one second.
We've shaved off two seconds of load time, and that is phenomenally fast.
Again, if you take a look at that page, there's a lot of text, but there's also a fair amount of images.
From the data in here, the transfer size is 564 KB, and it's all about the same; if you go back and look through the other stats, the transfer size is about the same, but thanks to caching, or having things out at the edge of Cloudflare, it's a lot faster.
So the goal here is to take advantage of edge resources at Cloudflare, whether it's caching your static data or finding a way to cache some of your HTML or put it in a Worker.
Now, again, the downside is that this caching everything or caching the HTML really depends on the site being pretty static for that page.
Now, again, if your site's content varies when users are logged in, like a membership site, an active bulletin board, commenting, or especially e-commerce, you're really going to have to get all that data straight from the origin and find ways to speed up that origin server.
Now, following up on that, other things that you can do is go through some of the options that are already in Cloudflare.
Again, in all of these examples, that entire domain, I have not changed any of the default settings at Cloudflare, except for when I added a page rule for cache everything.
So if you go into the speed section of the Cloudflare dashboard, you're going to see some options for minifying your text.
Again, that's compacting your HTML, your CSS, and your JS, your JavaScript files.
And that's going to shave off a little bit of time, not a whole lot.
If you go to some page testers, like, you know, PageSpeed Insights or GTmetrix, some of them might say, Hey, you really need to minify your text.
It'll help you a little bit, and especially for mobile browsers, which are slower, every little bit helps.
Rocket Loader is also nice to use for JavaScript.
It usually works; on most sites, just straight-ahead WordPress sites, Rocket Loader works pretty well.
But if you start using more complex plugins and page builders, Rocket Loader kind of throws a little wrench into some of that, and it's not quite compatible.
If you find that Rocket Loader interferes with your admin panel, you can start adding page rules to disable Rocket Loader for certain parts of your site.
Generally, though, I have Rocket Loader disabled because I do a lot of my optimizations on the server side, so I've got my JavaScript already optimized. If you look at PageSpeed Insights from Google, a lot of times it says, oh, your JavaScript is slowing your rendering down a little bit; you should defer some of it until later.
Now, Rocket Loader can help with some of that, or you can do it on your site.
As far as the images go, a lot of times I optimize most of my images on the server.
But if you just want a kind of plug-and-play sort of thing, once you get onto a paid Cloudflare plan, it opens up some more options for you, such as Polish, which compresses your images and does a pretty nice job.
And it also can deliver them as a WebP, which is even more efficient for browsers.
And then Mirage, which is really a mobile-friendly way to scale down the images or lazy load them to make the mobile experience better.
Generally, though, once I get a site up and running, I run it through Google PageSpeed Insights.
And sometimes within the Chrome browser, you can use Lighthouse to audit a site, or GTmetrix.
And I look at all of them because they all have suggestions on what you can do to your site to make it better.
And I'm certainly open to suggestions. So I do what I can based on their suggestions.
Cloudflare handles quite a bit of it, but I do tweak things once I've got everything else up and running.
That is fascinating. I've not played with Workers Sites at all, though I've played a bit with Workers. What types of sites work best with Workers Sites?
Well, similar to my little cheater method of using Workers to preload that HTML out at the edge, a very basic Worker script works best if you have one resource that you're generating with the Worker.
Now, in my example, that one resource I chose to serve with Workers is just the HTML, the homepage.
But Workers themselves, as a baseline, are pretty standard, pretty static.
And if you take a look at Cloudflare's Wrangler tool, that's going to grab an entire static website, one that you generated with something like a page builder, where the content doesn't change and there's no database driving it.
And Wrangler will take all that, package it up, and push it up to Workers, but it'll do it in a couple of ways.
You have the Worker itself, which is just the basic HTML code, but Wrangler also packages things up using Workers KV.
Now, the Workers themselves are stored at the edge.
The KV is a key-value store that's not always stored at the edge, so it's a little bit slower.
It's still faster than coming from your server, though.
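As a rough illustration of that split, here is a hedged sketch of a Worker reading files out of a Workers KV namespace, similar in spirit to what Wrangler's packaging does; the STATIC_CONTENT binding name and the key layout are assumptions for the example, not details from the show.

```typescript
// Hedged sketch: serving files from Workers KV. STATIC_CONTENT is an
// assumed KV namespace binding (configured in wrangler.toml), and the
// types come from @cloudflare/workers-types.
declare const STATIC_CONTENT: KVNamespace;

addEventListener("fetch", (event: FetchEvent) => {
  event.respondWith(handleRequest(event.request));
});

async function handleRequest(request: Request): Promise<Response> {
  const path = new URL(request.url).pathname;
  const key = path === "/" ? "index.html" : path.slice(1);

  // KV reads come from the edge when the key is hot there; otherwise the
  // value is fetched from central storage, the slower case mentioned above,
  // though usually still faster than a trip to the origin.
  const body = await STATIC_CONTENT.get(key, "arrayBuffer");
  if (body === null) {
    return new Response("Not found", { status: 404 });
  }
  return new Response(body, { headers: { "Content-Type": contentTypeFor(key) } });
}

function contentTypeFor(key: string): string {
  if (key.endsWith(".html")) return "text/html;charset=UTF-8";
  if (key.endsWith(".css")) return "text/css";
  if (key.endsWith(".js")) return "application/javascript";
  return "application/octet-stream";
}
```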
Now, if you look at a blog post from today, you're going to see quite a complex setup for workers that's kind of database driven.
Yeah, that was a fantastic article.
So we are coming up to the end of our time today.
Scott, thank you very much. Thank you for joining Yesterday Today on the Cloudflare community.
We appreciate your time. And I'm your host, Tim Cloonan. We'll see you next Friday at 10 a.m.
Pacific for a new edition of Yesterday Today on the Cloudflare community.
Thank you, Tim. Thanks, Scott.