Cloudflare TV

Exploring AI development one year after ChatGPT and analyzing Cyber Week insights

Presented by João Tomé, Michelle Chen
Originally aired on 

In this week's program, João Tomé is joined by Michelle Chen, Chief of Staff of our Emerging Technology and Incubation team. We go over ChatGPT's one-year anniversary and the recent turmoil at OpenAI. Back on December 2, 2022 (you can check that episode here ), we put OpenAI's new ChatGPT chatbot (with Cloudflare Workers code) to an initial test.

Other related topics include how to build an AI application using our developer platform, Workers, and its complete ecosystem; the difference between AI Gateway and Workers AI; cost savings, reliability, and efficiency without being a distributed systems expert; and keeping privacy and security in check.

We also highlight how Forrester has recognized Cloudflare as a leader in The Forrester Wave™: Edge Development Platforms, Q4 2023, with the top score in the current offering category.

There are also some answers related to these Cyber Week days on the Internet. Is it a global phenomenon? Does e-commerce interest peak on Black Friday or Cyber Monday, and are attacks increasing during this time?

In the short segment "A Bit of History," our CTO, John Graham-Cumming, goes over how and when Cloudflare built its Zero Trust platform.

You can check the mentioned blog posts:

Transcript (Beta)

Hello, everyone, and welcome to This Week in Net. It's the December 1st, 2023 edition, and this week we're going to talk about Australia, Cyber Week trends, Cloudflare's peak numbers and achievements, and building AI applications.

I'm João Tomé, based in Lisbon, Portugal, and with me I have Michelle Chen, based in New York, Chief of Staff of our Emerging Technology and Incubation team.

Hello, Michelle, how are you? Hi, I'm great. Thanks for having me on. Crazy that it's December already, last month of the year.

True, it's crazy. And in Lisbon, I'm not sure about New York, but Lisbon is raining cats and dogs.

Is it raining in New York?

It's actually pretty sunny out. I heard it gets really slippery in Lisbon when it's raining because of the cobblestones, but thankfully none of that in New York just yet.

True, it really does, and sometimes there's a lot of flooding when the rain gets heavy.

Before we go into our topics, did you celebrate Thanksgiving last week?

I'm actually Canadian, so I typically celebrated Canadian Thanksgiving back in October, but I've lived in America for about three years now, so I'm slowly getting accustomed to American Thanksgiving.

So I did actually get to take some time off and I went to Orlando to visit some family, but no turkey or anything.

I don't think I've ever done a turkey before, maybe one day, but none of that for me.

Same for me. I'm based in Portugal; we never celebrate Thanksgiving, mostly Christmas.

And what about the Cyber Week, like Black Friday, Cyber Monday?

We have actually a blog post about that. Do you use that time for shopping?

For sure. I have a running list of things I need to buy and I'm always hunting for good deals.

So Black Friday, I was in Orlando, I think just like mostly online shopping and scrolling.

I actually didn't really buy anything, I think, but I was definitely looking.

Actually, we have a blog post.

Just to get your personal trend: do you do more of your shopping searches offline or online, and more on Black Friday or Cyber Monday?

I think now it feels like the entire weekend is like shopping, like they've extended starting from Black Friday all the way to Cyber Monday.

So it feels like the entire weekend was just scrolling.

I definitely do look more this weekend than like other weekends.

That's interesting because we have like a ranking of is Cyber Monday more popular than Black Friday for different countries, actually, including the US.

And apparently, in terms of Internet traffic at least, Cyber Monday seemed busier for online Internet traffic, but also for e-commerce shopping.

But we'll get into those trends next.

That's funny. It's kind of a generational thing as well, because my dad was like, oh, it's Black Friday, we have to go in store to go buy things.

And I was like, shopping in person, like I feel like I haven't done that in ages.

So it's interesting to see how like Internet trends and like human trends change as well.

True. And one of the things I noticed, especially in the US, is that Black Friday seems more about offline shopping.

So you go to the actual stores. So mobile traffic is highest on Black Friday.

And you could see that Internet traffic is higher than usual, but not as high as Cyber Monday.

Let me see if there's something appearing online.

And this is actually a trend from the past two or three years I've seen, which is interesting in a sense.

Yeah, that is really interesting and probably confirms everything we've seen in real life too.

But in Europe, it's more Black Friday all the time. It's interesting.

You have the role of Chief of Staff of our Emerging Technology and Incubation team.

Before you were Product Manager of AI Gateway, right? What is your role in terms of Chief of Staff?

What do you do really? Yeah, I do a little bit of everything.

It's a very unique role. And I really like it because I get to do a breadth of things.

And one of the things like you mentioned was getting to PM some products that were just starting to spin up.

So AI Gateway is this product that we were thinking of before birthday week.

We were like, what should we launch with all this new AI stuff?

And we had so many different things going on. We had our regular birthday week launches, and then we had new AI products we wanted to launch.

So they just needed a helping hand. And I have a PM background. I actually was an intern in 2019, and I worked on the WARP client.

So being the Chief of Staff now, I get to jump in wherever I find something interesting.

So I got to jump in and PM AI Gateway and help start it up from the ground.

And it's all about, in this case, something that's really popular these days, although it's not new, which is AI, building stuff on AI.

Especially the more specific aspect of generative AI, since ChatGPT was launched actually one year ago.

Yeah, I saw that today.

One year ago, it was November the 30th. And so one year ago, on December the 2nd, John Graham-Cumming, our CTO, and I did a This Week in Net show where John was showing off ChatGPT's capabilities, even generating code to build applications with Workers.

He did some tests. So there's an episode of early stage ChatGPT from one year ago about that.

And what I want to ask you specifically, since you have a talk in Las Vegas during December, if I'm not mistaken, about building with AI while keeping privacy and security in check: what does it mean to build AI applications with Workers, our developer platform?

What is the specific aspect to that, really? Yeah, I think there's a lot of talk about AI and building applications.

And one of the things that was really heavily discussed, especially during the OpenAI saga a few weeks ago, was open-source models: building on open-source models so that you are not relying on one company and the direction they want to go in, but actually leveraging open-source data and open-source models.

And I think Workers AI is really great at that because you have access to all these different open-source models.

We have a partnership with Hugging Face coming out where you can basically deploy any of the Hugging Face models to Workers AI.

And I think that's really interesting because you now have a bigger selection of models that you can choose from that you know are reliable and open-source and building on top of that, I think is really interesting.

And then you can also use products like the one that I created with AI Gateway, where you can proxy your requests so that you can reach different models and get a lot of cost savings and performance savings as well.

You can cache, you can rate limit, and then you can get a lot of performance benefits out of that.

So you can actually build AI applications really quickly now.

Similar to how you can spin up a Worker really quickly, getting that to connect with Workers AI and then putting AI Gateway in front of it means you can build a really great, robust application really quickly and for very little cost.
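To make that concrete, here's a rough sketch of the pattern Michelle describes. The model name, account tag, and gateway name are placeholders, and the binding shape and gateway URL format are assumptions based on the Workers AI and AI Gateway announcements (the exact APIs may have evolved), so treat this as an illustration rather than a copy-paste recipe:

```javascript
// Hypothetical Worker that answers a prompt with Workers AI.
// `env.AI` assumes an AI binding configured in wrangler.toml;
// the model name is illustrative and may change over time.
const worker = {
  async fetch(request, env) {
    const { prompt } = await request.json();
    const answer = await env.AI.run("@cf/meta/llama-2-7b-chat-int8", { prompt });
    return new Response(JSON.stringify(answer), {
      headers: { "content-type": "application/json" },
    });
  },
};

// AI Gateway sits in front of a provider by swapping the base URL.
// The path shape (account tag / gateway name / provider / endpoint)
// is an assumption based on the AI Gateway launch post.
function gatewayUrl(accountTag, gateway, provider, endpoint) {
  return `https://gateway.ai.cloudflare.com/v1/${accountTag}/${gateway}/${provider}/${endpoint}`;
}
```

For example, `gatewayUrl("my-account", "my-gateway", "openai", "chat/completions")` would point an existing OpenAI integration through the gateway with only a base-URL change.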

So I think it's great that we have these new tools in our toolkit so that people can develop AI applications, which I really think are the future.

The building blocks are a lot bigger now, I think with AI being capable of so much.

So it's a lot faster and easier to just spin up your custom GPTs or new AI applications.

So I think it's great that we're helping people accelerate and build the future.

You were mentioning AI Gateway in specific. It was during late September, our birthday week, that we announced AI Gateway.

Here is the blog post, making AI applications more observable, reliable, and scalable.

For those who don't know, what is AI Gateway?

And, for example, we recently launched Workers AI. What's the difference between the two?

Yeah, so AI Gateway is actually a proxy for your application's requests.

Normally, if someone uses OpenAI and they want to connect to GPT-3 or GPT-4, they send a request from their AI application to OpenAI's API.

And AI Gateway sits between your application and the APIs, so that when a user sends a request, like a chat message, the request goes to OpenAI.

But we sit in between, so that if multiple users are asking the same question, we'll actually return the answer from the cache.

Or we can rate limit, or we can just make it so that you can actually see your requests a lot better, because historically that's actually not been very good for different providers.

It's hard to visualize how many requests you've made to different providers, what the error rates are, the cache hit results, the tokens used, your costs, and things like that.

So it's basically just a control plane where you can cache and rate limit and see the different requests going out.
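A toy model of that control plane, just to make the caching and rate-limiting ideas concrete (this simulates the behavior in-process; it is not the actual AI Gateway implementation, and real rate limits would use time windows rather than a simple counter):

```javascript
// Toy gateway: cache identical prompts, rate-limit per caller.
class ToyGateway {
  constructor(model, { limit = 3 } = {}) {
    this.model = model;     // function: prompt -> answer (the "upstream")
    this.cache = new Map(); // prompt -> cached answer
    this.counts = new Map(); // caller -> requests used so far
    this.limit = limit;
  }
  request(caller, prompt) {
    const used = this.counts.get(caller) ?? 0;
    if (used >= this.limit) return { status: 429, cached: false };
    this.counts.set(caller, used + 1);
    if (this.cache.has(prompt)) {
      return { status: 200, cached: true, answer: this.cache.get(prompt) };
    }
    const answer = this.model(prompt); // the expensive upstream call
    this.cache.set(prompt, answer);
    return { status: 200, cached: false, answer };
  }
}
```

With this, two users asking the same question only hit the upstream model once; the second answer comes straight from the cache, which is where the cost and latency savings come from.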

And it actually connects really well with Workers AI.

And Workers AI is basically like our model APIs, where we have these open-source models and you can connect to them.

And these models are actually running on GPUs deployed globally on the edge.

So they're really fast, they're close to users. And you can use AI Gateway in conjunction with Workers AI.

So your request can go from the user to AI Gateway and be routed to the Workers AI APIs.

Exactly. So that's why it's important, as in this blog post we're showing now: Workers AI, serverless GPU-powered inference on Cloudflare's global network.

It's really important for those who are developing tools all over the world to have that AI inference GPU power near them, right?

It makes a difference in a sense.

For sure. I feel like everyone in the world is using AI now.

And I feel like having GPU-powered models close to users is really important for speed and latency.

So I am glad that we are leveraging our global network in order to make things run faster and closer to users.

Actually, in that regard, I think it was last week or two weeks ago that we launched some Workers AI updates.

One is Stable Diffusion.

Code Llama is also now available. So those are two very well-known models.

And also Mistral 7B. These types of specific models, why are they so relevant for developers building stuff?

Yeah. I think Stable Diffusion was our first image generation model, which is used in a lot of different use cases.

We talked to a lot of different customers who are building different things.

And image generation was one of the highly requested models.

So I'm glad we got one of these models in.

Code Llama has been really helpful for even developers at Cloudflare to try and get AI help with their code, which I think is really cool.

It'll also help developers build their applications easier.

You can basically build applications without having a computer science degree, right?

Or having attended any bootcamps or things like that.

I made this joke that AI is now your technical co-founder, where you really don't need to know that much.

So having Code Llama and having all these AI tools at your disposal to help you build things is really interesting.

And then we also announced Mistral 7B, a really cool model from a French company.

And we've seen some really interesting results when comparing Llama and Mistral.

And so I'm glad we have more options so that more people can try different things and see what they like and dip their toes into more open source models.

In that way, you were mentioning the OpenAI turmoil two weeks ago. Actually, Sam Altman is again the CEO of OpenAI officially since this week.

It was announced with a few details there, a new board and all that.

But what that brought to mind, possibly to several developers as you were saying, is you should have your options open, right?

More options than just one is good, right? Yeah, exactly. And actually, we made some tweets during this time as well, but in AI Gateway, you can actually use what we call our universal endpoint, where you can define your payloads for multiple providers, and it'll try multiple APIs at once.

So during the whole Sam Altman saga, I think there was some downtime for OpenAI as well.

And basically, if you were using AI Gateway at the time, you could define a payload for OpenAI, and then you could define a payload for Workers AI.

It would try the OpenAI one, and if that failed, it would then send the Workers AI one.

So your application isn't failing or relying on one provider; it's actually more reliable, so that if there's downtime anywhere, your users can still get a response to what they're asking for, and your application just stays online more.
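The fallback behavior Michelle describes can be sketched like this. It's a simplified, synchronous stand-in for the universal endpoint idea, where each "provider" is just a function that may throw (the real endpoint takes an ordered list of provider payloads over HTTP):

```javascript
// Try each provider in order; return the first successful result.
// If OpenAI errors or is down, the request falls through to the
// next entry, e.g. Workers AI.
function firstSuccessful(attempts) {
  const errors = [];
  for (const { provider, call } of attempts) {
    try {
      return { provider, result: call() };
    } catch (err) {
      errors.push({ provider, error: String(err) });
    }
  }
  throw new Error("all providers failed: " + JSON.stringify(errors));
}

// Example: the primary provider is down, so the fallback answers.
const reply = firstSuccessful([
  { provider: "openai", call: () => { throw new Error("503"); } },
  { provider: "workers-ai", call: () => "fallback answer" },
]);
```

Here `reply.provider` is `"workers-ai"`: the user still gets an answer even though the first provider errored.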

So I think that's actually a really cool feature from AI Gateway that was really relevant during the time as well.

Makes sense.

And I was pausing here on this blog post because it's about Workers AI inference now being available in 100 cities.

So there's a bunch of updates here in terms of cities where it's available, and it's continuing to grow, right?

We're putting GPUs in our data centers. We have more than 300 locations, and 100 of those are already AI-inference capable, right?

Yeah, we're ahead of schedule, and we have more coming.

So I'm excited at the pace that we're growing this presence.

And we like to tell the story of people actually getting on planes with GPUs in their suitcases, going to install these.

So it's funny just imagining that, but they're working really fast, and we're really ahead of schedule in getting our GPUs everywhere.

We already discussed some of these tools; in terms of cost, there are advantages in some of our tools: observability, reliability, efficiency.

In terms of safety, there's a lot of talk about AI safety, privacy, those relevant elements.

What can we offer there?

Yeah, there's a lot of different safety and privacy concerns going on, I think, with generative AI in general, which I think there's no perfect solution for right now.

A lot of people are just applying their judgment and seeing what makes sense.

There's a lot of copyright concerns or ownership concerns for generative AI material.

But I think what we are trying to do is adhere to the different standards that people are setting with Workers AI, with our generative AI models, and make sure that we are not saving any request data or prompt data, and we're not training on any of this data.

Actually, we have no business in training models, so we have no incentive to and don't plan on it.

So for any data or any requests sent to Workers AI or to AI Gateway, none of that gets saved on our end, except for the metrics that show you how many tokens you used or what your cost was; none of the user data is saved.

And that's the way we're trying to be more privacy-centric and secure.

We really don't have any business in training these models, which is quite different from, say, OpenAI.

So we are trying to just be a leader in inference rather than training.

So that's how we're positioning ourselves to be secure and be private, but still allow these developers to have these tools.

In this blog post about Workers AI, there's actually a mention of privacy specifically.

Many of the current solutions don't have privacy needs in place.

They're not worried about that. And in this case, Workers AI is trying to be privacy-focused, serverless, accessible.

So there's that element in the basis of this, right?

Totally. I think that's based on the fact that we want to power inference and we're not training our own models or anything like that.

So we're privacy-focused from the get-go. Absolutely.

On other topics, and I invite you to help me share them with our audience.

We had a few announcements related to this week in terms of blog posts.

One was about Steve Ray, who is now Cloudflare's new head of Australia and New Zealand.

There's a blog post from him. So we're down under with a more relevant presence than ever.

I hope I get to go down there. Me too, never went. And actually New Zealand is like the other side of the planet in terms of Portugal directly.

So there's always this element in terms, especially New Zealand, of hey, if we want to go to the other side of the planet, we go to New Zealand.

And there's also this blog post that is more about achievements.

Cloudflare was named a leader in The Forrester Wave: Edge Development Platforms, Q4 2023.

Those achievements are relevant, right?

I love this announcement. I think it's so great. When I found out about it, I was really excited.

If you look at the graph, it's so nice to see us at the top right corner.

I think we've been working really hard at improving. And if you look at the past reports for edge development, you can see how we've matured over time.

But having us in the top right is really rewarding, I think, for the teams to see.

And finally, for people building on us, and for people to know about us.

So this is really exciting for me. I was blown away by this news.

And it's also something that highlights the fact that we have over 1 million developers building applications.

And this includes Workers, Pages, R2, KV.

So all of these different products make a real ecosystem where developers can really easily deploy their applications on this very distributed network that we have, right?

For sure. And you can see a lot of the products that we launched, like KV, Durable Objects, R2, D1, even Vectorize.

They help people build more than just static applications: really stateful, full-stack applications.

So I feel like we're continually adding and improving our toolkit, and excited to see what gets built on us next.

Absolutely. There are a few very nice comments from the Forrester Wave in here, in terms of interoperability, which is a difficult word for me.

But there's a lot of really nice things here that also shows all of the work that different teams at Cloudflare have been doing, right?

For sure. The first quote, I think, is a good testament to how strong the Cloudflare network is, and our CDN product is historically what we've been known for.

But I really like how we keep evolving and building on top of that. So a lot of these products that we build now, it's serverless by default, right?

Or global and running close to the users.

So we're really leveraging our network in order to develop new products.

And I think it's really fun being in, especially in the emerging technology and incubation group, thinking about what are our strengths and what can we leverage and what do users like about us, and continuing to build on that in order to build more for the future.

You don't need to be an expert.

This is also a good quote. You don't need to be an expert in distributed systems to deploy distributed systems.

You can have this ecosystem help in a sense.

Yeah, and you don't really have to think about it at all, right? You can just focus on shipping your code and building new features and things like that, and we'll help you distribute it and we'll help you deploy it globally.

So that anyone that wants to use it can use it, and you don't have to put any brain power into thinking about that.

Exactly. There's also a blog post here related to better debugging for Cloudflare Workers now with breakpoints.

This is actually from this week.

This is also a workers related blog post. A lot of workers related blog posts recently, right?

There always are. Yeah, this one's interesting. I'm excited to use it.

I definitely have some projects I need to debug better. So excited to try out the breakpoints and get my hands on that.

This also offers a deep dive into this very specific area.

Yeah, the breakpoint debugging. I've definitely used it a lot in the past, but not with Workers before.
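For readers who want to try it, the rough shape of the setup is attaching an editor's debugger to a running `wrangler dev` session. The configuration below is only a sketch: it assumes wrangler exposes its debugger on the default Node inspector port 9229, which may differ by version, so check the blog post for the exact config:

```jsonc
// .vscode/launch.json — a sketch, assuming `wrangler dev` is running
// and exposing its debugger on the default inspector port.
{
  "configurations": [
    {
      "name": "Attach to Wrangler",
      "type": "node",
      "request": "attach",
      "port": 9229
    }
  ]
}
```

Once attached, breakpoints set in the Worker's source pause execution on incoming requests, instead of debugging by sprinkling `console.log` calls.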

So excited to try it out. And I wrote this blog post about Cyber Week, so Black Friday and Cyber Monday numbers.

There are some very high-level numbers here, such as Cloudflare processing a peak of 80 million HTTP requests per second, and that was reached on Cyber Monday at 4:10 p.m.

UTC time.

There's a few different numbers here, and that meant Cyber Monday had 4 trillion daily requests, which is a lot.

5% of those were blocked attacks. So a little bit over Black Friday, but Black Friday was also a big day in terms of requests.

And there's also some DNS queries here, 1.68 trillion queries per day on Cyber Monday.

So a lot. Yeah. It's sometimes even hard, I feel, to comprehend these numbers.

What does 22 million queries a second even look like? But you know that that's crazy.

It's really, really high. And even years ago, when I worked on 1.1.1.1, the resolver, in 2019, the numbers were much lower.

So to see that growth be really explosive is really interesting.
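As a quick sanity check on those numbers (a back-of-envelope average, not an official Cloudflare figure): 1.68 trillion queries spread evenly over a day works out to roughly 19 million per second, so a busier-moment rate above 20 million per second is entirely plausible:

```javascript
// Back-of-envelope: average 1.1.1.1 queries per second on Cyber Monday.
const dailyQueries = 1.68e12;       // 1.68 trillion DNS queries per day
const secondsPerDay = 24 * 60 * 60; // 86,400
const avgPerSecond = dailyQueries / secondsPerDay;
console.log(Math.round(avgPerSecond / 1e6)); // millions of queries per second → 19
```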

Actually, for DNS, we have a percentage here: a 24% increase compared with late August.

So from three months ago to now, Cyber Monday, there was a 24% increase in daily queries to 1.1.1.1.

Yeah. So that's interesting. And also the queries per second also increasing.

And then there's a few charts here related to Black Friday week.

Where was Internet traffic more relevant? This is very specific to the US.

As we were discussing at the beginning, Cyber Monday clearly beats Black Friday in the US in terms of Internet traffic specifically.

We're using HTTP requests to calculate this.

And you can also see the Thanksgiving drop in the US there.

There's also some comparison with other countries: Black Friday was number one in the UK, in Canada actually, in Germany, in several countries in Europe, and Australia also.

But for example, France was Cyber Monday, Spain was Cyber Monday.

So there's a bit of a mix in Europe, actually. There are always different trends to look at.

This actually is also mobile traffic. Mobile traffic was at its highest during Black Friday, even higher than Thanksgiving, which is interesting.

Yeah, people are shopping, comparing prices in the store. This is in the US specifically.

And then there's some e-commerce, DNS trends, mostly focused here on the US.

Cyber Monday also won. And Black Friday was number two. And there's also the Europe perspective here.

Black Friday won, for example, in the UK in this situation.

So there's a lot to explore here for those who like these types of metrics, really.

Yeah. Essentially, you can see lots of electronics, fast fashion, etc.

Exactly. Fast fashion was really high too. But on cyber threats, there was not a big increase, especially DDoS attacks, during the Black Friday days, which is interesting too.

Yeah, it's always interesting: internally at Cloudflare, we know Cyber Monday is coming and Black Friday is coming.

And we know it's going to be a big day, but it's really just another business as usual day for us as well.

We have all these requests coming in and mitigating threats and attacks.

It's just another day in our life. Absolutely. Yeah. So it's fun to see how we can handle all these different loads.

But it's also fun. And I really love that, because it's my job to look at the trends and see differences between countries, human patterns.

What surprised me more, writing some of these blog posts, was how similar countries are from one year to the next.

So you could see really similar Internet traffic from one year to the next, related to Thanksgiving or Black Friday.

The countries are really consistent, which is interesting.

Yeah. I'm excited to see how it grows or changes in the future as well. Cyber Monday technically is, I think, a newer phenomenon, I guess.

It is. Yeah. So it's interesting that it's gaining so much popularity over the years.

And maybe we'll invent another Saturday or Sunday as well.

This was great, Michelle. Thank you so much.

Thank you. It was great. Thank you for having me on. It's been a lot of fun.

And that's a wrap. And now it's time for A Bit of History, a segment with Cloudflare CTO, John Graham-Cumming.

So Zero Trust is one of those names where not everyone knows what it means.

It's a little bit of the opposite.

So in order to have more trust, you have to have your systems have zero trust, in a sense.

When did Cloudflare start to become interested and begin that journey in terms of Zero Trust, to also be more ready for business customers and the whole office experience, for example?

Well, I mean, the thing is, nobody really knows what Zero Trust means.

I mean, Google called this BeyondCorp and that's a really weird name too.

And Zero Trust sounds weird. And the funny thing is, what Zero Trust is really trying to express is doing things the way you thought they should be done, not in the really rubbish way where you had to use VPNs, right?

I mean, before you had Zero Trust, which is, Zero Trust just means using the Internet for your business purposes, right?

Before that you had to use a VPN, and we at Cloudflare had one: we used the Cisco VPN.

I had to get the VPN client out and it would break and all this kind of stuff.

And you were working from London at the time, so.

I was. I had to VPN in, and people had to VPN into the network. And the thing is, if you think about what was happening in parallel to the corporate world, where everyone's using VPNs or maybe even having to be physically in the office, they can't even get access to applications.

Maybe it was only the executives that have VPNs or whatever.

Parallel to that, everyone's doing everything else on the Internet.

They're going to their bank and they do critical banking things.

They're doing dating on the Internet. They're using the web for everything.

Whereas in the office, they're being tied to this ancient legacy architecture of the corporate network.

And so there was a real schism, right? You go to work and suddenly it's like going back in time, doing things the old way.

A type of dinosaur Internet, in a sense.

Yeah. It wasn't even the Internet, it was some ugly corporate network, right?

And so I think that it was obvious that everything had to switch to the other way of doing things, which we're now calling Zero Trust.

And for Cloudflare, we wanted to get off the VPN. And we're like, no, we have this global network.

Our customers rely on this global network to run their businesses.

We should be able to use it for our critical internal applications as well.

So our initial Zero Trust stuff, which was Cloudflare access, we built for our own use.

We built it so that we could do away with the Cisco VPN and be in a more flexible world.

And now I don't even think about it, right? Yeah, you don't think about it.

You just use the Internet. Just use it, right? And with the help of hardware keys, that also plays a role.

This blog post from Sam Rhea is back from 2018, about Cloudflare Access.

We weren't specifically calling it Zero Trust at the time.

Well, no, because Access was the first product and the term Zero Trust had not become widespread in the industry.

And so we were not ready. And there were other components to Zero Trust, such as having a web gateway.

Because remember, that's the other thing, right?

What happened was companies were using their corporate networks.

Corporate networks eventually got connected to the Internet. And the moment they got connected to the Internet (I remember this in the 90s, when it was happening), the people connecting them were very worried about what was going to happen.

And so they had these gateways. First of all, they had firewalls, they had limited stuff.

And then people started creating these gateways where people could get to the Internet, but through this gateway, they could control who was getting access to stuff.

And so that's another component of the Zero Trust suite, and of course, we built that too.

And you and I use that every day.

Every day, exactly. So this was in 2018. So a few years ago already.

And the Zero Trust architecture is much more complete now. It's an ecosystem of its own in a sense also, right?

Exactly. It's become its whole own thing.

In terms of Zero Trust, just to end the Zero Trust part: was it difficult to find innovation through machine learning in this space, for example?

Because we discuss machine learning a lot these days, and that is a very specific use case, but are automations, things like that, at work making it better?

The thing is, what machine learning is very good at in our environment is looking at a flow of traffic and looking for security problems.

And so one of the things you can do, if you have a gateway where traffic is passing through, is start looking at it and saying, hey, does this look like DNS exfiltration of data?

Or does this look like malware?

Or does this look like an attack? And so the thing is, it's very much the case that machine learning is good at filtering through a lot of data and looking for anomalous things.

So I feel like we use machine learning all over the place.

And that is just part of what we do to protect our customers and to protect the Internet at large.
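As a toy illustration of the anomaly-spotting idea John describes (a simple z-score over traffic samples, nothing like Cloudflare's production models, which operate on far richer signals):

```javascript
// Flag traffic samples that sit far from the mean (z-score test).
// `samples` could be, say, DNS query counts per minute; a sudden
// exfiltration burst or attack spike stands out as an outlier.
function anomalies(samples, threshold = 3) {
  const mean = samples.reduce((a, b) => a + b, 0) / samples.length;
  const variance =
    samples.reduce((a, b) => a + (b - mean) ** 2, 0) / samples.length;
  const std = Math.sqrt(variance) || 1; // avoid divide-by-zero on flat traffic
  return samples
    .map((value, i) => ({ i, value, z: (value - mean) / std }))
    .filter(({ z }) => Math.abs(z) > threshold);
}
```

For instance, given mostly steady counts with one huge spike, only the spike is returned; real systems then feed such outliers into further classification (malware, DDoS, exfiltration, and so on).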

Exactly. So it's embedded, in a sense, because it's all about what you do with the global network, and that plays a role in different products, including Cloudflare One, Zero Trust, those types of products.

Yep.

Yep. That's right. So that's Zero Trust. Zero Trust is such a weird name. It is.

It's the name we've got. Hopefully someone else... The problem is, if you look at the other names, like SASE, there are all these other names for this stuff.

They're not very good either. There were CASBs and there were SASEs and SSEs.

True.

Acronyms. And yeah, it's not easy. It's not an easy game. I remember, a few years ago, Google actually did a webinar for journalists to explain what Zero Trust was all about.

And it was not very easy to understand initially, but they did try. It's just: use the Internet as your corporate network, whatever the acronym is.

Weird stuff. Oh, well, that's Zero Trust.
