👩💻 *Women in Tech EMEA* Tech Talks: Support and Solutions
Presented by: Lindsey Monyelle, Kamilla Amirova
Originally aired on August 9 @ 6:30 PM - 7:00 PM EDT
Want to know more about what our fierce women are doing? Join Lindsey and Kamilla as they walk you through their projects in Customer Support and Engineering.
If you want to apply for a role at Cloudflare but are unsure of what the best fit would be for your skillset, please apply using this link and the Recruiting Team will follow up - https://boards.greenhouse.io/careersday/jobs/3085504?gh_jid=3085504
English
EMEA
Women in Tech
Transcript (Beta)
All right. Hi. How's everyone doing? Hi. Kamilla, how are you today? Hello, hello, Lindsey.
I'm doing well. Thank you. How about you? Is everybody all right? Yeah, doing well.
I'm excited for our segment today and sharing my presentation with you, hearing the information you've got to share with me and with everybody else.
So let's get started.
We are here with the Women in Tech programming today, and I know you've got some information to share with us about integrating Cloudflare with SIEMs.
Is that correct? Correct, Lindsey. I'm going to say a few words about this project, and maybe a few words about us first.
I'm Kamilla, Partner Solutions Engineer.
I've worked at Cloudflare for over seven years now, and I've also had the pleasure of working in other departments, like Special Projects.
And when I was part of the Special Projects team, I worked on the project that I actually want to share with you today.
So as you might know, Cloudflare already has a dashboard with very comprehensive analytics.
And it's great.
They help you a lot to understand where the problem is or what needs to be fixed.
Or maybe there's something that is really not working or not working well.
You can understand it by looking at it. But customers also need to combine the information they're getting from Cloudflare with other sources, for example from their back end, and compare them together.
And this is where the need comes in to expose this data and make it available in other tools, like security information and event management tools, or SIEM tools.
And this is where I was involved. My task was to recreate the Cloudflare dashboards within the most popular SIEM providers, like Splunk, Elastic, Sumo Logic, Datadog, et cetera.
So let me start then from the beginning. So this is Cloudflare dashboard, and this is our analytics.
And all the products that we have usually come with that analytics package, which is fantastic.
The data is already pre-calculated for you.
So you can understand, for example, how your caching options are working, or whether some threats are coming up; you can look at it and search it.
So how am I able to get that type of information, which is available and exposed in the dashboard, out to customers in a third-party tool?
Those are the challenges I actually met.
First of all, you need to decide which are the most popular tools your customers are using, so that you know you can reach most of them with the important stuff.
Also, decide what type of information you want to expose or put into these dashboards.
Is it the security aspect, or maybe performance, or something else? Because some customers want to look at performance, others at security or reliability.
So that was a challenge that I had. Also figuring out the formula, and I'll tell you what it is.
When you're looking at analytics, it's already all calculated for you.
But in the outside world, you work with the data that is exposed to you in the raw logs.
So in Cloudflare Enterprise we do have raw logs where you can see it, but you still need to figure out how to combine the fields in order to show, for example, the bytes that were saved, or the other types of threats that were identified.
So I needed to make sure to define the right formula on the data that is exposed in the logs.
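Working from raw logs rather than pre-computed analytics means deriving the dashboard-style numbers yourself. As a minimal sketch of that kind of formula, assuming NDJSON log lines with the CacheCacheStatus and EdgeResponseBytes fields from the HTTP requests dataset (treat the field names and the "hit" check as assumptions if your log configuration differs):

```javascript
// Hedged sketch: deriving dashboard-style metrics from Cloudflare raw logs.
// Field names (CacheCacheStatus, EdgeResponseBytes) follow the Logpush
// "http_requests" dataset; adjust if your dataset differs.
const rawLines = [
  '{"CacheCacheStatus":"hit","EdgeResponseBytes":5120}',
  '{"CacheCacheStatus":"miss","EdgeResponseBytes":2048}',
  '{"CacheCacheStatus":"hit","EdgeResponseBytes":1024}',
];

function summarize(ndjsonLines) {
  let cached = 0, total = 0, bytesSaved = 0;
  for (const line of ndjsonLines) {
    const rec = JSON.parse(line);
    total += 1;
    if (rec.CacheCacheStatus === "hit") { // you may also want to count "revalidated", etc.
      cached += 1;
      bytesSaved += rec.EdgeResponseBytes; // bytes served from cache, not from the origin
    }
  }
  return { cached, total, cacheHitRatio: cached / total, bytesSaved };
}

const stats = summarize(rawLines);
console.log(stats); // cached: 2 of 3 requests, bytesSaved: 6144
```

A SIEM tool would express the same formula in its own query language, but the arithmetic over the raw fields is identical.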
And of course, it all comes with the same question, right? We have that dashboard, it shows you what was saved or cached, or the threats, but what does it actually mean?
So we also needed to come up with a glossary to make sure that everybody who looks at this dashboard understands what that means.
So it was a great exercise and challenge to go through.
And on top of it, you want to make sure that the process is simple: they can connect to the bucket, for example an S3 bucket, get the logs flowing, and immediately use a template that is going to be populated with this data.
That's what helps people start adopting the technology: it's simple to use, and it shows the right information that you want.
Right. So what I ended up doing, based on looking at what customers were using, and obviously the feedback they gave, is that we narrowed the SIEM tools down to Google Data Studio, Looker, Splunk, Sumo Logic, Elastic, and Datadog.
So that's what I was focused on.
And one by one, we started building them. And of course, each of them had a different language to use, a different approach.
For example, Looker works with databases you can connect to.
So the reports that populate there are all running queries behind them.
So you want to make sure you're working with it.
But if you look at Splunk or Sumo Logic, they have their own specific languages or syntaxes that you need to write the reports in.
So that was another interesting challenge. But once I figured out a good core set of dashboard reports for one, it was much easier to figure out how to create them in the others.
And what I wanted to show you is just a couple of examples of what it actually ended up looking like.
This, for example, is Google Data Studio, where you can find this report in the report gallery.
And it's got different information. One part is about global HTTP and HTTPS traffic, where you can see how many visits you have, which countries they come from, and client IPs. You can click on each of these data points, and it will immediately filter based on that IP, for example, if you want to see where a request is coming from, or which error type you want to look at.
That's what actually helped them dig even deeper into the data.
And the dashboard we created was just the first step to get them going, because many of them took what they wanted from it and created a combined view for themselves.
For example, one of our customers, they've been using Sumo Logic.
And they used only a few reports that were very important for them from our security dashboard, combined them with the reports they had from other tool providers, and put it all in there.
So it was easier for them to see a holistic view across the edge, the backend services they were running, and the network, of course.
And this is actually one of the reports in Data Studio, with the button you can see in the top right corner. When they click it, it uses a template.
And what it does is populate that dashboard with your data, and you keep seeing it for as long as you need to.
As you might know, in Cloudflare dashboards this data might be limited in how long it is kept.
But in the other tools you're using, you can decide what time frame you actually want to store your data for.
So that was another way they managed to get the data through and look even further back into it.
And just to give you an idea of the glossary that came out of this as well: you need to understand what an origin request is, what a cached or uncached request is, what the value of the data populated in each field means, et cetera.
We had to come up with it in order to help them understand and be able to read the reports.
So this is an example of Looker. That's the one that got acquired by Google, and they work with databases, which was also fantastic.
And we got great feedback from people using it for performance, to see, for example, the pages that had the slowest performance.
So we were able to pull out all the static content that was loading slowly.
And another thing we pulled out was dynamic content.
So we could, for example, improve it with one of the tools that we've got.
So they can utilize it even more and benefit more from it.
Right. And this is Sumo Logic. So you can see they're all different.
They all run on something different behind the scenes. But the idea behind them is to make it as simple as possible to get the data exposed.
And from there, you can get started and pull out the information that's really important to you to create your own unique dashboard.
Right. I think with this, I will stop sharing and give the word to Lindsey.
I thought that was really interesting.
So just so you know, I work in support. I've been with Cloudflare for the past year.
A lot of the content you're talking about, I see from more of the support side.
So the things that customers write in about are the questions or issues they're having with their logs.
And one of the things I noticed is that a few recent questions have been along the lines of how that information is parsed together and reported.
OK, so for an example, to get my head straight here: say I'm seeing different logs for Workers, for individual Workers versus all of them, and the numbers are adding up differently.
Has this been a challenge, presenting that information in a clear, concise way across different types of information?
That's absolutely a wonderful question, because with some of the products we have, you may enable logs for them, for example Workers requests.
Workers is the Cloudflare serverless platform we're talking about right now.
So when you use them and expose them in the raw logs, they actually create, let's say, extra data that comes with the Workers themselves.
And some of the content or some of the records might be multiplied just because of the way Workers work.
And this is where we come up with the filter.
And if you filter this data and say, OK, then this data should be considered this way, the numbers come out absolutely correctly.
And this is something we worked on with Sumo Logic and Splunk, to make sure they are aware that if you work with a specific project or you have Workers enabled, you don't forget about that condition that needs to be enabled.
Otherwise, it shouldn't affect you if you don't use Workers; somebody who is just starting out won't have it immediately.
But those who use Workers might be affected by that difference in the statistics.
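As an illustration of that filter condition, assuming the WorkerSubrequest field from the HTTP requests log dataset (an assumption; check your own log schema), excluding subrequests keeps the counts in line with the dashboard:

```javascript
// Hedged sketch: Workers subrequests can add extra records to the raw logs,
// so analytics-style counts should exclude them. WorkerSubrequest is the
// assumed field name from the "http_requests" dataset.
const records = [
  { ClientRequestHost: "example.com", WorkerSubrequest: false },
  { ClientRequestHost: "example.com", WorkerSubrequest: true }, // extra record
  { ClientRequestHost: "example.com", WorkerSubrequest: false },
];

// Count only top-level requests so the numbers line up with the dashboard.
const topLevel = records.filter((r) => !r.WorkerSubrequest);
console.log(topLevel.length); // 2
```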
That's right. OK, OK, cool. And you actually answered one of my questions while you were presenting, which was: I was curious whether it was easier to translate the work you did for one system to another.
Is that generally across the board or has it been like maybe some of them are easier than others to translate that data?
So certainly those that run queries behind the scenes and connect to databases are more or less standard, and you don't need to figure out the formula, because all you need is the query to make sure it's integrated.
Looker, for example, works with queries, which is absolutely great, and you can replicate them across different environments and they act almost the same.
But the others also have their own specific syntaxes, especially when you have some complex formulas, let's say, defining what is going to count as part of the bots or other things like that.
So yeah, it took me a while, and I'm really grateful that we had good support when I needed it, through our partnerships with Splunk or Datadog, to look into it and dive into the materials.
But it was challenging with the first two or three.
But once you figure it out, once you have the knowledge of how it works here and there, you already think, OK, it might work like that.
So with each one we added, it got easier and easier.
OK, yeah. And that totally makes sense. The other thing I'm kind of curious about is that you're pulling these different pieces of data when we're looking at the information that comes through Cloudflare.
I'm also curious, if we're talking about firewall logs versus network analytics, we have our flow, right, where these different products come into place.
Now, was there some maneuvering to figure out, OK, we should collect this log from this part of the flow, and that log from that part, so it makes the most sense?
Well, at the time when I was creating that dashboard, we had just one set of Enterprise raw logs, with all the HTTP requests, firewall events, everything in one go.
So we already had all the information, let's say, exposed in one place.
I didn't have to refer to other logs or anything, whereas now I see a really good benefit in having firewall logs separately, where you can just focus on the security events, alongside network events or error logs.
But when I was working on it, we just had one combined all together.
So it was kind of easier for me to piece it together and pull the information out of it.
OK, cool. And I think this may work as a segue.
I didn't even mean for it to. But one of the things we tell customers who are not on our Enterprise plan, when they start asking for logs, is that anybody can access their logs with Workers.
So is this something that someone even on a free plan could do if they needed to, if they were interested in this kind of setup? Is this related to Logpush, where they're able to push it through Workers to an external tool, or is this Enterprise only?
So the one I was focusing on, it was enterprise only.
And when we were building it, it was all through integrations with the buckets.
So you first set up your Logpush job, pushing logs to an S3 bucket or Google storage or any other storage.
And from there, you can find Splunk or Sumo Logic plugins, or whatever applications they have, that will transfer the data in.
And all you need is just to connect to the source.
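The bucket-based setup described above can be sketched roughly like this, modeled on the Cloudflare v4 API's Logpush jobs endpoint. The bucket name, region, job name, and field list are placeholders, so check everything against your own account before using anything like this:

```javascript
// Hedged sketch: create a Logpush job that writes HTTP request logs to an
// S3 bucket, based on the Cloudflare v4 API (POST /zones/:id/logpush/jobs).
// Bucket, region, and field list here are placeholders.
function buildLogpushJob(bucket, region) {
  return {
    name: "http-requests-to-s3", // placeholder job name
    dataset: "http_requests",
    destination_conf: `s3://${bucket}/logs?region=${region}`,
    logpull_options:
      "fields=ClientIP,ClientRequestHost,EdgeResponseBytes,CacheCacheStatus&timestamps=rfc3339",
    enabled: true,
  };
}

async function createJob(zoneId, apiToken, job) {
  const resp = await fetch(
    `https://api.cloudflare.com/client/v4/zones/${zoneId}/logpush/jobs`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${apiToken}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify(job),
    }
  );
  return resp.json();
}

// Usage (requires a real zone ID and API token):
// createJob("ZONE_ID", "API_TOKEN", buildLogpushJob("my-bucket", "us-east-1"));
```

Once the job is running, the SIEM-side plugin only needs the bucket as its source.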
But later on, there was great progress where we actually added a direct push of the logs to those providers.
So Splunk, Sumo Logic, yeah.
They're all getting data right away. But the funny thing is that some people actually don't want it to go directly, because processing and retention of the data with those SIEM tool providers can get quite expensive.
That's why they use the storage buckets: the data just sits there for one or two years, and when they need to pull it out, they can, but at least they don't have to pay for it in the meantime.
In the SIEM tools, they keep it for maybe a few months at most, depending on their needs.
Okay, cool. Awesome. Well, thank you for answering those questions for me, because I was very interested in what you do.
And I actually have more, but we're running out of time. I'm curious to hear your project.
I saw it already, but I'm really looking forward to hearing about it and see it.
Yeah, yeah, sure. So I am going to be sharing with everybody a project that I did for myself.
This definitely started out as a personal project, and then I opened it up to share with the team, because I was studying for the CompTIA exams.
The Network+ and the Security+ were the two that I fortunately passed this past summer.
And while I was studying, I created a study bot, because basically the point of this for me was that I wanted questions given to me sort of randomly throughout the day, as randomly as I could manage at the time.
And then having this sort of interactive presentation helped me with my studying.
So I am going to kind of walk through the basics of it. I've stripped out a lot of the flavor text and just kind of have the basic functionalities.
And right now, this is totally on Workers.
It utilizes a minimum of two Workers, but it can use more depending on how you have it set up.
So I am just going to start sharing my screen here.
This one. All right. Can you see appropriately?
Absolutely, I can see it. Okay, great. Our pictures got very small. I have a multi-monitor set up.
So I will be kind of looking to the side. I apologize for that.
So basically I am going to just ignore the typo.
I resaved this one because I had an extra N there. So the first thing that I'm going to do is I'm going to open the answer page integration.
Now, this is, these have already been created, but if you want to make a new worker, you just click up here, create a worker.
It's pretty straightforward. But this is one it's already created.
And you can go into quick edit to get the code. Now, what I'm going to do is I'm just going to open this side by side and cut this out and paste this in here.
So what this is, is sorry, I did these backwards.
This is not the one I wanted first. Um, the questions list, is that the one I just opened?
Here, we will open these in the background. So I know which code I'm looking at.
Okay, great.
This is the one I wanted first was the questions list. So what this does is these are the actual questions.
I have it set up integrated into Google Chat, but it doesn't have to be that way.
It can be integrated into, you know, Discord or anything like that.
But it basically starts with a questions list.
And this is just, you know, we have our different options here in an array.
That's the word I was looking for. So each instance in the array is a different question, separated by these commas.
I broke it down to five, but I had, I think, like 25 in this section.
So this goes in first. And then the next part is I just have a template.
This makes it really easy when I add questions on the fly; I don't have to type them in manually.
I can just use the template as my quick add.
So this is where it starts to get into the actual functionality of how it works.
So this is my randomizer. I have a big love of writing code that involves randomizers.
It's sort of a personal preference of mine.
But we basically just use a random number generator.
Then we parse the integer out of the index.
And then we create this new index, which is a different variable now, with the index plus one.
And we'll see how that gets used. But this basically allows me to pull a random sentence out of that array from the earlier part.
I wanted to mention, by the way, as I mentioned earlier, I work in support.
So I am not a developer.
This is in fact a pet project for me as just a hobbyist. And I mentioned this just in the way to be encouraging to people.
Like you don't have to be a developer to use these tools or to have fun and, you know, expose yourself to something new.
Now, this is the actual function that gets the random sentences. So when this runs, it goes through sentences and it pulls out index.
And then index is a random number.
So this pulls that random sentence. I have to be a little bit mindful of my time.
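The randomizer described here boils down to a few lines. This is a hedged sketch with placeholder questions, not the exact code from the demo:

```javascript
// Minimal sketch of the randomizer: pick a random entry from the questions
// array and keep a 1-based "page number" for the answer URL.
const sentences = [
  "Q1: What does DNS stand for?",
  "Q2: Which port does HTTPS use by default?",
  "Q3: Which OSI layer does TCP operate at?",
]; // placeholder questions

function pickRandom(arr) {
  const index = Math.floor(Math.random() * arr.length); // 0-based array index
  return { sentence: arr[index], page: index + 1 };     // page is 1-based for the URL
}

const { sentence, page } = pickRandom(sentences);
console.log(`${sentence} -> answer at /1/${page}`);
```

The `index + 1` is the "new index" mentioned above: arrays start at zero, but the answer pages are numbered from one.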
Oh, my God.
I keep pressing copy when I mean paste. Okay. So this is just where it integrates into the app.
So we have the messaging app integration. We have the web hook, which connects here.
This would be the URL to connect into Google chat. And then this is the text of the body of the message that gets sent.
So for me, it would send the section being studied, where I just wrote flavor text here.
And then "find the answer at", and this was the URL of the CNAME that I have pointing to the other Worker.
And then this new index, which was the variable with index plus one, because you start at zero.
So this is to give me a page number that you can append to a URL.
So it would be questions.com slash one slash two.
This gives us a different page number for each one.
And then the get random sentence, that prints the question. So you can actually see the question that's picked in the G chat or Discord or whatever message.
And then the rest of this is the functionality needed for the cron.
Now, most of this actually comes with the Worker when you open it.
When you create a new Worker, this is the text that comes with it.
And this demonstrates how to create an event listener so that the Worker knows how to function.
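Putting the webhook call and the cron trigger together, a rough sketch might look like the following. The webhook URL is a placeholder, and the `{ text: ... }` message shape follows incoming-webhook-style chat APIs such as Google Chat's; treat the details as assumptions:

```javascript
// Hedged sketch: send a quiz question to a chat webhook on a cron schedule.
// WEBHOOK_URL and the answer-page domain are placeholders.
const WEBHOOK_URL = "https://chat.example.com/v1/spaces/EXAMPLE/messages"; // placeholder

function buildMessage(question, page) {
  // The message body: the question plus a link to the answer page,
  // using the 1-based page number from the randomizer.
  return {
    text: `${question}\nFind the answer at https://questions.example.com/1/${page}`,
  };
}

async function postQuestion(question, page) {
  await fetch(WEBHOOK_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildMessage(question, page)),
  });
}

// In a Worker, a cron trigger would invoke this via the scheduled event:
// addEventListener("scheduled", (event) => {
//   event.waitUntil(postQuestion("Sample question?", 1));
// });
```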
I'm not going to get into that too much, because I'll maybe have some explanation in my soft plug here.
I'm going to be linking some of this to my page, Get Started in Tech.
Which is let me see here. Right here.
This getstartedintech.pages.dev. I'm going to be doing a bit of a write-up of this here as well.
So in the next week, maybe. Not immediately. So we've got that.
Now, the next section that we have here, this is going to be what it links to.
In my functional version of this, I have it CNAMEd to a domain that I own.
But it can just go straight to the URL of this Worker up here.
Now, I'm going to do a quick breakdown of this.
We have the HTML content. This is all stuff that can now be integrated into Pages, which would be really interesting.
And that's going to be my next stage of development: integrating this into Pages.
So right now we have the HTML, the CSS, and a functional script that impacts the behavior of the page.
All of this is added to the Worker.
But it could all be done using Cloudflare Pages instead. Now, this then goes into the individual HTML of the content.
So we have question 1, question 2, question 3.
So most of this is style. And then we also have page-handling variables. Now, I've described a bit how this works.
Basically, we have the bot on Google Chat.
It creates a link. When you click on that link with that URL path, that URI, I've been using it here where it's domain 1, slash, then the question.
And the CompTIA exams have six domains. So that's how they've been grouped.
And what this does is: the URL goes here. It says, OK, this is the URI of the path.
And then it slices everything up to here.
So that prints out just the one for this next part. This next part then turns it into an integer, because it can't be a string number.
It has to be an integer.
And then, with that as an integer, we can now say, OK, we've been taken to domain 1 slash 1.
We're going to use that in this function down here to find the correct question up here.
Because this is another array. So we'll go to this one. And once again, we have to have these plus 1s because arrays start at 0.
They don't start at 1. So you have to find a way to manipulate that.
And then we have total pages, which is questions.length.
So then this is going to allow us to write ways to handle exceptions. So I wrote, "Jeepers, it looks like we don't have a question" plus the page number.
So it says we don't have a question 70 yet, for example.
"Why not submit some?" I have a silly mouse that has a lot of fun features.
And then I just have "invalid selection" if it's something like negative 7.
So this is my exception handling.
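The path slicing, integer conversion, and exception handling described in this walkthrough can be sketched as follows, with placeholder answers and hypothetical error messages:

```javascript
// Hedged sketch of the answer-page routing: take the request path
// (e.g. "/1/2"), slice out the question number, convert it to an integer,
// and index into the answers array with 1-based pages and friendly errors.
const questions = ["Answer one", "Answer two", "Answer three"]; // placeholders

function answerForPath(pathname) {
  const page = parseInt(pathname.split("/").pop(), 10); // "/1/2" -> 2
  const totalPages = questions.length;
  if (Number.isNaN(page) || page < 1) return "Invalid selection.";
  if (page > totalPages)
    return `Jeepers, it looks like we don't have a question ${page} yet. Why not submit some?`;
  return questions[page - 1]; // arrays start at 0, pages at 1
}

console.log(answerForPath("/1/2"));  // "Answer two"
console.log(answerForPath("/1/70")); // out-of-range message
```

In the Worker, the same function would be called with the URL's pathname from the incoming request.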
And that's basically how this page works. Fantastic. Now, the very last thing I can show you is if we go to this preview tab. It's not going to work yet.
We have to refresh the page, and we'll see it.
But it's not actually going to work until we launch it first. OK. This is showing us what the preview tab would show us as well.
It's showing invalid selection.
And the reason is because it does not have the forward slash 1 forward slash 1.
So now we have these. And just quickly, to show you the JavaScript inside the page: this is the hiding and showing function for the answer.
So this has a lot of potential.
I've been wanting to implement scoring, you know, session cookies, maybe the ability to have a leaderboard.
I had the idea earlier that maybe we could have something like this integrated to the Cloudflare community.
So this is my project.
That's fantastic, Lindsay. It's really exciting what you can do with it.
It's just the tip of the iceberg. And you said you can find it somewhere on GitHub?
Yes, thank you so much. We have 54 seconds left. I have also uploaded this already to my personal GitHub.
It does have a lot of places where it could grow and get better.
And I welcome any sort of contribution. My username on GitHub is Linzineha, L-I-N-Z-I-N-E-H-A, and it's under QuizBotDemo.
So please contribute, anybody.
If you find this project interesting, fork it and make something similar for yourself.
And I welcome you to join me. Fantastic. Fantastic demo.
Thank you so much, Lindsay. Thanks, everyone, for listening. Yeah.
Thanks so much for meeting up with me today and for us to be able to do this session together.
I really had a lot of fun. Awesome. Likewise. Have a great day, everyone.
Thank you so much. Bye.