Originally aired on January 2 @ 12:30 PM - 1:00 PM EST
Welcome to our weekly review of stories from our blog and other sources, covering a range of topics from product announcements, tools and features to disruptions on the Internet. João Tomé is joined by our CTO, John Graham-Cumming.
In this week's program, we go over some blog posts we wrote about Internet speed, but also our Cloudflare One Zero Trust platform being named in the Gartner® Magic Quadrant™ for Security Service Edge. We talk about some announcements, such as our new Network Analytics dashboard, and how it provides security professionals with insights into their DDoS attack and traffic landscape. There's also a hardware deep dive related to memory bandwidth performance, and some Easter, Passover and Ramadan trends.
In our “Around NET” short segment, we “travel” to Perth, Australia, to hear from Damian Matacz, from our Network Strategy team.
Hello everyone, and welcome to This Week in Net. It's the April 21st, 2023 edition. I'm João Tomé, based in Lisbon, Portugal, and with me I have, as usual, our CTO, John Graham-Cumming.
Hello, John. How are you? I'm fine, thank you. And I'm also in Lisbon.
Exactly, so two people in Lisbon in this case, and today is a little bit rainy, so it's different from the previous days.
It's true, it's been really pretty nice here, hasn't it?
It's been pretty warm, a bit unusual because normally in April, "Abril, águas mil" ("April, a thousand waters"), it should rain a lot, and it hasn't.
Exactly, we have that tradition, that saying, where in April it rains a lot, which in the past few years, that's not really the case.
It rains a bit, but not a lot. Oh well, we didn't do it last week, so we have a few blog posts that were out in these past two weeks.
DDoS report, our DDoS report for Q1 2023 was out, also Internet disruptions Q1 2023 was out, and of course a few announcements.
I can share my screen and we'll take it from there.
Yeah, because we skipped a week. We skipped a week and it was a busy time on the blog too, so we've got a bit of a backlog of blogs.
I don't think we'll manage to go through all of them today, but give us a quick scroll and just take a look at what we have.
Exactly, so the last ones that we highlighted on this show were from when these were coming out.
So the DDoS report I just mentioned and the Internet disruption overview with a lot of disruptions that we monitor on Cloudflare radar, but also a few announcements.
For example, Cloudflare One was named in the Gartner Magic Quadrant for Security Service Edge, which is always important.
We also introduced Cloudflare's new network analytics dashboard, which is also something in terms of DDoS attacks that is important for security professionals, for example.
And interestingly enough, it wasn't Speed Week, but we had a few blog posts related to speed, right?
Network quality and speed, so that's also interesting in a sense.
But Cloudflare Zaraz also had an announcement: consent management made easy and clear with a new tool.
And we also have this deep dive about DDR4 memory organization and how it affects memory bandwidth.
And a few trends. Where should we go? Where should we start? Exactly. Well, let's just scroll down, because I don't think we can talk about all of these, but there's a couple I'd like to briefly cover without clicking through to them. Just go back to the Gartner one.
We don't need to talk about all the details of it, but just to say that I think the important thing here is that Cloudflare One, which is our Zero Trust, Security Service Edge (SSE), SASE offering, to use all of those Gartner-type terms; Cloudflare is on the Magic Quadrant.
And so you're going to see us making lots of moves around Cloudflare One and Zero Trust.
So if you want to understand more about that, that blog post is about it.
And you're going to see this area develop enormously for Cloudflare because it's an enormous part of our business and something that we've been investing a great deal in.
And we think, because of the size of our network, we're uniquely placed to be a huge, huge vendor in Security Service Edge, SASE, whatever the thing is being called today.
Exactly. And in this case, it's a validation. The service is already there, but validations are also important in a sense, in terms of trust.
Yep, absolutely.
And of course, in this day and age, in terms of security and vulnerabilities and email phishing, all that, Zero Trust is a good way for companies to be protected.
We mentioned this often. That's right. That's right. So that was one thing.
You mentioned the analytics. It's one of those very much sort of behind the scenes pieces of work going on to create this new analytics dashboard.
If you're used to our analytics, they've got a big upgrade, particularly for people who use Magic Transit and Spectrum.
So, the people who are sending traffic through us that we are not necessarily doing any inspection on, because it's not like web traffic where we're running our WAF or something, but traffic that's going through at the network level, and being able to understand those flows.
We have a whole new UI for this and a whole new backend, which really explains how this works.
It's always important to understand it better, how your services are running and working.
So it's about information, right? A good dashboard is good for those who are managing a network.
Well, I mean, the goal of this is, you know, one thing is you just want to understand what happens in general.
But I think the most important thing is like, there's something happening on my network.
Show me what it is. And when Cloudflare becomes part of your network through Spectrum or through Magic Transit, you want to have that ability to see it.
And so this is what the new Network Analytics v2 is all about.
And we can show you exactly what happened to your traffic, how we handle it, did we drop it?
And you can then mix and match and then you can go in and dig into that.
There's also a mention here of DDoS attacks: looking at a DDoS attack and trying to see what it's all about, where it's coming from, and how you might use the new UI.
So built on a lot of technologies that Cloudflare has built over the years, but it's very powerful for those people who are using Cloudflare as an extension of their network or as their network.
Exactly. I like this blog post, not only because it's a good tool, but because it explains well the tool.
It's a lot of graphics, a lot of examples. It's very, very good.
And there is an accompanying post, which I don't think is out yet, which explains how we built this behind the scenes, which I think Alex is writing.
That will also be a good one. Where should we go next? Well, you mentioned there are two posts about speed, and maybe people don't know that we have our own speed test, which is speed.cloudflare.com.
And from that speed test, people can go in and here's a great screenshot of it, can understand their home network and how it operates.
And it will give you information about lots of things about the network, but there's this new thing on here, which is the network quality score, which is this thing.
And so one of the things that's interesting about network performance is we can give you a load of statistics, right?
So we can tell you what the bandwidth is, and obviously your ISP is probably selling you bandwidth, right?
I know that I have a one gigabit connection here.
You probably have something similar. We're very blessed in Portugal with very good Internet connectivity, but there's also latency.
There's also what happens under load and there's packet loss and all those sort of things.
Now there's this new network quality score, which we've added, which looks at given all that, is my connection good for video streaming?
Is my connection good for online gaming?
Is my connection good for video chatting? And so it gives these measures here.
And this is actually pretty important because like right now, look, you and I, we're video chatting.
So it's important that we have a good connection for that.
But there are different characteristics of bandwidth and latency particularly, and also jitter.
And jitter is related to latency. Latency, right, is how long it takes the data to get from one place to another, and often back again.
It's often measured as a back and forth.
And you'd kind of like that to be consistent. You can think about it like this.
I was driving this morning. If you want to drive from, say, Lisbon to Cascais, which is not a very far journey, the latency is how long it takes you to get there, right?
And hopefully you wouldn't have the latency change, right?
But then traffic happens, right? So you get variable latency.
One day is different from another. That change in the latency is the jitter.
It's like how much it varies. And one of the things that's interesting is, and you'll know this if you drive, jitter on the roadway is very frustrating.
It normally takes me 30 minutes, but now today it took me an hour.
Well, the same thing causes a problem on the Internet: the latency gives you the time to get to Cascais, and the jitter tells you how much it changes.
And this can cause a problem for certain sorts of things.
For example, what we're doing right now, video chatting, if there's a lot of change in the latency, it can be difficult to deal with.
If the latency is very poor, then it makes things like real-time gaming very hard because, you know, you do an action that has to go to some server somewhere that incurs the latency cost.
And, you know, maybe you're dead because some other bullet arrived from some other player and didn't realize, right?
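For illustration, here is a minimal sketch, not the actual speed.cloudflare.com implementation, of how latency and jitter might be derived from a handful of round-trip-time samples and turned into a rough per-scenario rating; the thresholds are invented for the example.

```python
# Minimal sketch: derive latency and jitter from RTT samples and rate a scenario.
# Not Cloudflare's implementation; the thresholds below are illustrative guesses.
import statistics

def summarize(rtt_ms: list[float]) -> dict:
    latency = statistics.mean(rtt_ms)
    # Jitter: how much consecutive measurements differ from one another.
    jitter = statistics.mean(abs(a - b) for a, b in zip(rtt_ms, rtt_ms[1:]))
    return {"latency_ms": latency, "jitter_ms": jitter}

def rate_for_video_chat(latency_ms: float, jitter_ms: float) -> str:
    if latency_ms < 50 and jitter_ms < 10:
        return "good"
    if latency_ms < 150 and jitter_ms < 30:
        return "ok"
    return "poor"

samples = [23.1, 25.4, 22.8, 31.0, 24.2, 26.7]
m = summarize(samples)
print(m, rate_for_video_chat(m["latency_ms"], m["jitter_ms"]))
```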
And so we've wrapped all this up into these measures and we're working with an outside party to put together this database of real data about how your Internet connection is working.
And it makes a really great point in here about being far from the access point; like right now, I'm actually quite far from the access point in my home here.
And I suspect that my latency is a little bit worse and probably my bandwidth is a bit worse.
And so it's not just your ISP, right?
There are lots of things that go into this. One of them is where you are relative to the Wi-Fi access point.
And one of the reasons why I love ethernet and actually plug in quite often is you actually control all of these issues.
And so the idea of this score is to help you understand that.
And actually, if you walk around your apartment running speed.cloudflare.com, you can actually see, hey, I'd be better off doing gaming here, for example.
That's really interesting. Exactly.
You can even make it better just by moving around in your house, because your ISP is not the only thing responsible for your Internet quality.
It's not only your ISP, and most people know this, I think, but also where you are in your house.
I had enormous problems with this. I have, you know, an Internet connection and with the Internet connection came a TV connection over Internet, of course, now, and it just didn't work properly in my place.
Now, I'm not in a big place at all. It's really, I'm not very far away, but I'm guessing that in the walls here, there may be some metal causing problems.
And eventually I used Ethernet, and this is all about making decisions like that: what am I going to do?
And I actually, you know, I actually wired up Ethernet just so that the TV thing works, and I never watch TV.
So, but at least I know it works. So anyway, this is all about that. We're publishing this database of measurements so people can get an understanding, around the world, in different cities and different locations, of, you know, how the Internet is performing for different areas and for different types of tasks.
Well, one of the things I really enjoy about looking at this is that a non-expert user will quickly understand what those network quality scores mean.
Video streaming, good, bad, whatever; online gaming; video chatting. There are a lot of measurements there, but the average user doesn't always need to know the names, like jitter, the technicalities.
Someone just wants to know, is my Internet good for this or that different scenario?
So having a score, a proxy for that, I think is really clever and interesting.
Yeah. I mean, what I hope is that if you're not an expert in those things, you nevertheless read the blog post and you start to gain an expertise, particularly because of the other blog post, the one about Internet speed, where we say that making your home Internet faster has little to do with "speed"; it's about latency.
And so this is about the time it takes to get from one place to another on the Internet, and it actually refers to a very old blog of mine, which asked: what's the fastest way to get data from one location to another?
So this is from about 11 years ago, and it was about latency versus bandwidth, right?
So it's one of these counterintuitive things. What do you think is the fastest way to get a very large amount of data from San Francisco to London, right?
And at the time I wrote this, I was saying, well, you could put it on BA286, a flight I knew and loved and took very often, or you could transfer it across the Internet at a hundred megabits per second, which at the time, 11 years ago, was a pretty fast connection.
And here's the difference. The Internet connection would take 22 hours to transfer the data and the BA flight was done in 10 hours.
So it was actually quicker to use an aircraft. And that just doesn't seem to make sense.
The problem is, while that's great if you want to transfer a load of data in one direction, the BA flight has horribly high latency, right?
It takes about 10 hours to get to London, sits around on the ground for a few hours, and then takes another 10 hours to get back to San Francisco.
So its latency is probably a day. Whereas that Internet connection is probably a few hundred milliseconds at the worst.
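As a back-of-envelope check on those numbers, assuming the data set in the old post was roughly a terabyte (an assumption that matches the quoted 22 hours), the arithmetic looks like this:

```python
# Rough arithmetic only; the ~1 TB figure is an assumption inferred from the quoted 22 hours.
data_bits = 1e12 * 8        # ~1 TB of data, in bits
link_bps = 100e6            # 100 megabits per second
hours = data_bits / link_bps / 3600
print(f"{hours:.1f} hours over the wire")  # ~22.2 hours, vs ~10 hours in the aircraft,
                                           # but the aircraft's "latency" is about a day
```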
So these two numbers have different impacts, right?
If you really want to just move a lot of data and you see this, actually, people like Amazon, they have a service where they'll come to your business and take your hard drives and copy them because that's quicker than using the Internet.
And so the latency is an important factor.
And in fact, this new blog post, which this links to, is really about the fact that latency matters, and it's one of the reasons why we focus on trying to reduce latency where we can.
And one of the reasons why our network is so large, so we're near the end user, because one of the big parts of latency is the speed of light.
And we cannot make the speed of light faster; we can only get closer. Yeah, it's a very interesting topic.
And it has a lot of research behind it. And one of the things that I always find fascinating is everything around robot cars and automated cars and all that, and even health, telehealth, like someone performing surgery from one hospital on a patient a thousand miles away, just using the Internet.
And there was a lot of talk, like two or three years ago in the tech sector, that latency would be the thing that would enable all of that.
Because if you have a fast connection and the latency is low, you can support those very critical situations, like robot cars, which have to react quickly, or telehealth.
If a surgeon is doing something over the Internet, it has to be quick.
All of that has to be quick, the reactions. So low latency is good for the Internet in general, but also for those cases.
Yes. And particularly for very interactive things. So, most of us won't be doing remote brain surgery, but many of us will be gaming over the Internet, and the latency will make a big difference here.
So this is a quite in-depth thing by Mike about the impact of latency.
This graph actually, I think, is really interesting, the second one here, on the relationship between latency and page load time. If you look at it, the X-axis is in the inverse direction, right?
So the left-hand side is the largest value and the right-hand side is the lowest value, because you want low latencies.
What you see is that the load time for a page on the website is actually affected by the latency of the connection.
And so actually, if you can reduce the latency, pages load faster; we do that, Cloudflare does it, and the 5G phone folks, because of the way they interact with the Internet, should be getting better latency.
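A simplified way to see why, not the methodology behind the chart but just a toy model: a fresh page load needs several sequential round trips (DNS, TCP, TLS, the HTTP request itself) before content starts to flow, so load time scales with latency even when bandwidth is plentiful.

```python
# Toy model: page load time grows with RTT because of sequential round trips.
# The round-trip count and server time are illustrative assumptions.
def first_byte_estimate_ms(rtt_ms: float, round_trips: int = 4, server_ms: float = 50) -> float:
    return round_trips * rtt_ms + server_ms

for rtt in (10, 50, 100, 200):
    print(f"RTT {rtt} ms -> ~{first_byte_estimate_ms(rtt):.0f} ms to first byte")
```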
And if anybody's watching this from Australia, it's not uncommon for Australians who leave Australia to go to other parts of the world to say the Internet works so much better here.
And that's because Australia is a long way from other places. Now we obviously have data centers in Australia, which brings stuff locally, but if anybody has to leave Australia to go to something that's hosted elsewhere, it can be very, very slow.
Cunningly, the Australians have actually got a plan for this, which is Australia is actually slowly moving northwards at a few centimeters per year.
So actually the latency is going to get less, but it's probably a little bit too long for anyone to wait for that slow migration of the entire landmass.
So we have to move our stuff to Australia to make things faster.
But this really kind of shows what the issue is here.
And so I think the next big frontier for applications on the Internet, in terms of how to make them fast, is to look at the latency.
And so one day, I hope that ISPs will say things like: we'll give you a one gigabit per second connection, and latency to, let's say, certain great services of less than 10 milliseconds, et cetera, because that will also be a good way of understanding that it's a good Internet connection.
That's interesting. And you spoke about Australia.
At the end of this episode, in our "Around NET" short segment, we'll have someone from Australia, actually, who will be speaking and sharing some thoughts and suggestions and all that.
From our network team, actually. Yeah. I mean, we have folks, in Australia, in our Sydney office and elsewhere, in fact.
So yes, it's a problem for all of us.
It's a problem that particularly shows up in Australia because it's remote, relatively.
But yes, we all want better latency. Where should we go next?
Well, that was all about latency. Why don't we deep dive into another thing, also about speed, which is this blog post by Xiaomin Chen about DDR4 memory organization.
And this is actually one of those things that's really interesting. I mean, if you're interested in how we make our servers faster, one of the things that you obviously think about in servers is that people want faster and faster CPUs, right?
It's like, I'm going to have a so-many-gigahertz kind of CPU, or they will look at the internal buses and stuff like that, trying to make that faster.
Or it's like, how big is the cache in the CPU? How does that make the thing faster, et cetera?
Slightly counter-intuitively, although actually once you read the details, it makes sense.
Depending on the layout of the memory module itself, the RAM sticks you put in the machine, there's actually a performance difference.
And the performance difference is if you scroll down, there's an example in here of this thing here.
So on a memory module, one of those things you plug in, there are a bunch of chips.
And in order to get data out, there is a signal which comes in, which selects which one of the chips is going to get data or write data, right?
So it's this CS_N thing here. And it controls which memory chip has its data pins, the DQ pins, connected out to the bus, right?
And depending on the layout of the thing, you will have memory modules with more or less of these sets of chips.
And within them, it depends on how those actual chips are laid out.
So if you scroll down, there's a really interesting question, which is, there's a timing.
So here's the great example.
So this shows you, on the left hand side, you've got the address register.
So something wants access to this piece of memory. And then within the memory, there are banks of memory within the actual thing.
And the banks are laid out into these different chips, right?
And so what happens is, whichever one of these you want to select, will then get the data on the right hand side.
But depending on how this is actually laid out, because it's kind of a grid, if you like, there are basically rows, columns, and what they call ranks, or banks, which are very closely related.
There is a timing difference. And if you scroll down, there's this thing called tFAW.
Find it.
This one? No, keep going. You're going to come across a timing diagram. There you go.
Activate timing. Okay. So just to scroll down slightly, there is timing information, this thing called tFAW, the four activate window, which is the length of time within which you're only allowed to do four activates, which is when you're activating memory on a chip, saying, I'm going to handle this piece of memory.
And depending on the rank of the memory module, even though it can still have the same total amount of actual memory, the timing window will be greater or smaller. And actually, if the timing window is larger, you end up with slightly slower memory accesses.
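As a hedged back-of-envelope illustration of why that window matters (the nanosecond values below are invented examples, not the figures measured in the blog post): within any tFAW window at most four row activates may be issued, so an access pattern that touches one 64-byte cache line per activation is roughly bounded as follows.

```python
# Back-of-envelope bound: a larger tFAW allows fewer activates per second,
# which caps throughput for activation-heavy access patterns.
# Example tFAW values are illustrative, not measured.
def activate_limited_gb_s(t_faw_ns: float, bytes_per_activate: int = 64) -> float:
    activates_per_sec = 4 / (t_faw_ns * 1e-9)   # at most 4 activates per tFAW window
    return activates_per_sec * bytes_per_activate / 1e9

for t_faw in (21.0, 30.0):
    print(f"tFAW={t_faw} ns -> ~{activate_limited_gb_s(t_faw):.1f} GB/s upper bound")
```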
And so if you scroll all the way down, there's all these calculations in here, and you can go through it.
This is one of our servers right here. Sorry, I'm going to make you go back.
This one? So we have this experiment, right? So we have different sorts of memory in here.
And it's installed like this. And we look at some of the memory to see whether there are some which have a rank of four and some which have a rank of eight.
And so you have the memory in there, and then we are able to do tests.
It doesn't matter which slot they're in. We're able to do tests of the bandwidth of that memory.
So as we scroll down, you're going to see we did some latency checking.
So we did checks with different tools. The write performance actually is quite different.
And as we go, keep scrolling down.
So you can start to see some of these percentages, 2%, 3%, almost 4% differences in terms of write bandwidth.
And so if there's a write-heavy workload on the machine, then the overall effect is that the memory that has the bigger rank actually has about 4% lower write performance.
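For a sense of how a write-bandwidth test like this can be run, here is a crude sketch (not the benchmarking tools used in the post, and assuming NumPy is available): it times how quickly a large buffer can be filled and reports megabytes per second.

```python
# Crude write-bandwidth probe: fill a ~1 GiB buffer and report MB/s.
# Not the tooling used in the blog post; results mix cache and DRAM effects.
import time
import numpy as np

def write_bandwidth_mb_s(size_mb: int = 1024, repeats: int = 5) -> float:
    buf = np.empty(size_mb * 1024 * 1024, dtype=np.uint8)
    best = 0.0
    for _ in range(repeats):
        start = time.perf_counter()
        buf[:] = 0xAB                      # stream a constant through the whole buffer
        best = max(best, size_mb / (time.perf_counter() - start))
    return best

if __name__ == "__main__":
    print(f"approx. write bandwidth: {write_bandwidth_mb_s():.0f} MB/s")
```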
And so for us, we decided this didn't matter, partly because we have a very heavy read workload.
We're reading cache very often, right?
Or we're reading workers, which is code that has to be executed.
But it could make a difference. And if you had something that was very write heavy into RAM, you might actually end up wanting to use different physical memory modules, even if they had the same amount of gigabytes.
So for example, here, this is comparing 32 gigabyte modules and telling you the megabytes per second you can actually get out of them, or write into them.
And there's a significant difference depending on what the actual layout on the actual memory card is.
So I thought that was pretty fascinating because this wouldn't be obvious.
Well, it's 32 gigabytes. What's the difference? Well, there's actually...
There's a difference, yeah. And maybe it could be helpful even for us in the future.
It's always good to know. And in this case, information is out there.
Anyone can use it. But it's interesting that you can take different lessons from this, even for things like GPUs, in terms of usage, right?
It's a lesson learned. So here we are in the world of ChatGPT and other very large language models, and they are heavily dependent on these billions of parameters, which have got to be in memory somewhere.
And I think if anyone who's played around with these things, they're looking at the amount of memory there is on their GPU card or on their A100 card or whatever they have to store that data.
And so actually, memory bandwidth is going to be an important thing.
And actually, one of the neat things that Apple has done is that their processors, which also have a machine learning component, have what they call unified memory, which means the same RAM can be accessed by the CPU or by the GPU and ML components, so the same data only has to be loaded into RAM once.
That's one of the reasons why you've seen quite a lot of experimentation with LLMs and other things on Apple hardware, because you're not having to go out and say, okay, I've got to buy an NVIDIA GPU for my device, because actually Apple provides me some of that, and I can use the RAM I have, et cetera.
So interesting world we're getting into. But anyway, one of the things we like to do is optimize our machines.
It's better for the planet.
Tomorrow is Earth Day. We want to use less power, and we want to use less power per request or per transaction that we process through our systems.
And this is one way we do it, by trying to optimize memory.
Exactly. And those percentages are low, but multiplied across a lot of servers, it makes a big difference, in a sense.
It can make a big difference, yeah.
Well, where should we go next? Should we talk about Easter, Passover, Ramadan, Orthodox Easter?
Sure. I'm biased. Yeah, this is one you worked on.
We talked a little bit before about how Ramadan shows up, particularly early in the morning and after sundown, you can really see Internet traffic dropping.
And then we looked a little bit at Passover, which you looked at, particularly in Israel.
And then we also looked at Easter, which comes in two flavors, right?
There's Easter and Orthodox Easter, depending on whether you're using the Julian or the Gregorian calendar.
Anyway, you tell us about this. Easter shows up on the charts.
It shows up on the charts. To be honest, I was a bit surprised by the impact.
I was thinking, oh, maybe during Easter, and this is Easter, like, April 9th.
But like you said, the Orthodox Easter also shows up, and that was this past weekend, April 16th.
I was thinking that, like, over the last month, maybe that day was the day with the least traffic in a few countries.
But I was surprised because it was the day with the least traffic in several countries for the whole year so far, four months, more than 100 days.
And you can see in some of the charts, for example, Italy is one of those cases, no surprise there.
But you can see that April 9th was only competing with January 1st in terms of days with the least traffic.
And I also tried to put some percentages there, the drop compared with the previous Sunday, given that weekends are typically days with less traffic.
And Sunday specifically is sometimes the one with the least traffic.
In Italy, for example, April 10th, which was also a holiday there, was also really impacted.
So it's more than just Easter Sunday. It's also, in this case, the whole of that period, because there's also Holy Week, and there are different patterns there.
I learned from reading about this that there are varying observances of Easter depending on the country, whether it's Good Friday or Easter Sunday, and, coming from the UK, I know Easter Monday is a holiday there as well.
And so it depends a little bit on the place, right?
Exactly. And several countries follow the typical pattern. Others have those differences.
Spain is one of those cases where Good Friday is the one that has the bigger drop in traffic.
So in terms of people using the Internet less, it's even more pronounced there than Easter Sunday, for example.
Mexico is also one of those cases.
So traditions hold up even when looking at Internet patterns. And you were talking about the UK: April 8th, which was Saturday, Easter Saturday, was the day that had the bigger drop, which is interesting.
I'm surprised by that. Yeah, yeah.
And then I also tried to put the time of the Easter Sunday lunch, which is the time where sometimes the traffic drops more, depending on the country.
This is Mexico.
I was mentioning Mexico before. Yeah, lots of drops there, yeah. So the US also.
We have a different pattern there, and also Australia. The Philippines also saw some drop, not the biggest drop of the year, but at least over the past month, Easter Sunday was clearly the day that had the least traffic.
Before we leave Easter, wasn't there something funny with Brazil?
I seem to remember that Brazil was the exception about Easter, wasn't it?
It wasn't Easter, no. In that case, the Carnival days, February 18th and 19th, were the days with the lowest traffic in the country.
No surprise there.
If you think about it, those are the days when people are really more interested in what is happening with Carnival than in the Internet.
But it was a good trend.
A lot to unpack here in terms of trends, so anyone can dig in. Israel shows up and we see the beginning of Passover; for Ramadan, we see prayer times; and there's Orthodox Easter.
So you see similar sorts of effects in Orthodox Christian populations around the world.
So yes, once again, what we do as humans shows up on the Internet. Exactly. And our time is up, John.
Thank you so much and see you next week. Yeah, I'll see you next week as well.
Have a good weekend. You too. Bye-bye. Last but not least, we travel now to Perth, Australia, for our short segment, Around NET.
So here's Damian from our Network Strategy team. Hi, my name is Damian Matacz and I am the Global Lead for the Commercial Interconnection Portfolio, working as part of the Network Strategy team for Cloudflare.
I'm based out of Perth, Australia.
And as part of my role, I get to work with our interconnection managers on our commercial and cost negotiations with our Internet exchange, peering and transit providers globally.
I love what we do because really I think that at the foundation of helping build a better Internet is running the world's best network.
And the work that we do in building the interconnections with our partners is really at the core of that.
A few fun facts about me.
First up is one of my favorite books that I've been reading recently, Never Split the Difference by Chris Voss, thoroughly recommended.
Chris Voss is a former FBI hostage negotiator.
So he talks about some of his experiences in the field, negotiation strategies, what works and doesn't.
Obviously very relevant to the role that I do and the work that we do with our partners in negotiating the best deals on behalf of Cloudflare.
My favorite blog post would have to be Nitin's post from a few years ago around bandwidth costs around the world.
Obviously it's very relevant to what I do.
He talks about some of the challenging markets that we deal with in terms of the cost to serve local eyeballs and basically that not all things are created equal.
What I love about that is that these are the challenges, the tough markets that we deal with like Korea, Indonesia, India that are crucial to our success.
They're tough problems to deal with, but we really enjoy the challenge of helping contribute to making Cloudflare successful in these markets, and really that comes down to the deals that we strike and that help us build infrastructure locally at a good cost.
A fun fact about me: in the age of video calls and doing things like this where you're working remotely, people often don't realize that I'm six foot eight tall, two meters and four centimeters.
So that comes as a surprise when I finally get to meet some of my colleagues in person. Naturally, with that height, I love to play basketball, and I was fortunate enough to spend a few years, many years ago, playing in some small professional leagues in Europe, Germany, Ireland, New Zealand and Australia.
So yes that's a few things about me.
So being based out of Perth, one of two employees here along with my colleague Adrian Aragharam, means that basically it's work from home; there's no office here.
So this is my home office as you can see, I'm lucky enough to be working from home and it's a nice cool wintry day here in Perth.
But since I have such a love of basketball, there's a basketball court right here on standby.