If you’re a founder building on Cloudflare Workers, head to cfl.re/launchpad to learn more about the program and apply to be a part of Cohort #2 — applications are now open!
If you’re an investor or a potential customer looking to get in touch with any of the presenting companies, please reach out to the Workers Launchpad team.
Welcome to GPUX, serverless GPU. Let me show you how we're going to enable planetary scalability.
So pretty much the biggest problem right now is finding GPUs. Take somebody who's worked at NVIDIA and leads a reinforcement learning library called CleanRL. In case you're not familiar with reinforcement learning, it's all the rage these days: it's what powers ChatGPT and other AI chatbots, and it's often considered the missing link that makes them work.
And this user tried twenty times to get quotas from AWS. These are large quotas, right? It's a lot of GPUs, and it's very hard. So our solution is simple: training and inference for AI.
Very simple: it works, you pay money. And you're saying, yeah, but all the major clouds have this.
Why would you build this? This is so dumb.
Just use AWS, GCP, or Azure, right? And yeah, this user did just that.
So we reached out to this user about two or three months ago, back in December.
And we said, hey, we've got GPUs and we have software to make them easy to use.
And they said, no, I'm good. We already have the cloud. We're using this, we're using that.
Okay. Then two months later, I see they're in a bind, and I ask: are you looking for service providers, for more GPUs?
And they're like, hey, yeah, but it doesn't look like you have an API.
And we don't have an API yet. It's almost ready, but not quite.
And pretty much that lead turned into a prospect two months later, after they went out into the market, looked at the hundred companies providing GPUs in the cloud, and realized nothing really works.
And yeah, so the traditional cloud does not fit large-model, large-dataset AI workloads.
But don't just take my word for it. Go try it.
Maybe it'll work for you, right? Now look at the TAM. I know this slide looks ridiculous, right?
It's like we got lazy. But if you actually think about it, robotics needs a place to train.
For example, it couldn't even get to the point where the legs are a model, the arms are a model, different systems in the robot are different, differently trained models.
And AI is turning out to be really good at medicine and protein folding, things like finding cures for diseases and degenerative illnesses.
And of course, small businesses can really increase their margins by using AI in their franchises.
But okay, the direct TAM is probably more interesting. The direct TAM is the money we can capture directly.
For example, one of our targets is startups, new companies, because the AI space is so new
that pretty much every company in it is a startup.
They require about $800K just to reach an MVP, and that's $800K that goes straight to us as training time.
And of course we need your help, because we can't move fast enough.
When we advertise, users just come to us saying, hey, where is it?
We need it now. We have this money we need to give you.
We need to give you this money now. We need these GPUs now.
Where are they? And we're like, hey, we need some more time. We can't deploy that fast.
We're time constrained and budget constrained. And they're like, hey, we're time crunched.
Come on, hurry up. They're trying to give us this money, and we can't accept it yet.
So yeah, stay in touch. And that's GPUX.