Cloudflare TV

Create with Conscience: Healthier Tech for a Digitally Distracted World

Presented by Bethany Sonefeld
Originally aired on 

Dark patterns, bottomless feeds, and manipulative software—we are surrounded by addictive and toxic technology. As creators, we have a tremendous responsibility to build tech that respects our users' time, mental space, and well-being. As consumers, we must begin to build balance with the technology in our own lives. It's time we create with conscience.

During this talk, I’ll discuss the ways technology is controlling our time, emotions, and attention. I’ll outline the tools and techniques companies use to keep us hooked and engaged. Then, I’ll discuss the ways in which we can commit to and build healthier technology—for ourselves and our end users.

English
Design
Ethics
Attention Economy

Transcript (Beta)

All right, so thank you everyone for tuning in wherever you are out there in the world.

I'm super excited to be here today to talk to you all about a topic that is really near and dear to my heart.

And it's been something that I've been working on over the past year.

And so I'm really excited to share with you. It's related to this concept of creating with conscience, which is really all about how we as creators can begin to build healthier technology for a world that we live in that's pretty digitally distracted. Right. So before I get into all of those things, I want to introduce myself.

So my name is Bethany. I'm currently working as a product designer at Cloudflare.

And during this talk, I'll be answering questions at the end and then you can feel free to tweet me.

I have a handle dedicated to create with conscience.

So I'm happy to hear your feedback or answer any questions that you may have afterwards.

So I've had a lot of people ask me, you know, what made you interested in this topic of digital responsibility and, you know, creating with conscience. And I think when I was doing a lot of research, I thought back to this paper that I wrote in my college writing class when I was a high school senior, called "The Dangers of Multitasking."

And so I want to read you an excerpt from it.

As I sit down to write this paper, I find my mind drifting even as my hand continues to write. The TV is on, my iPhone is right next to me, and I sit in front of the computer while multiple tabs are open.

What am I doing? I ask myself. I am multitasking.

The very thing I'm about to prove is bad for you.

So I feel like high school senior Bethany had some sort of inkling in her mind that multitasking wasn't what our brains were designed to do.

It wasn't healthy for me and I started to realize that having, you know, a computer with multiple tabs open and an iPhone right next to me probably wasn't the healthiest thing for me.

And so if we look at this study that was conducted in 2019, it shows basically the percentage of US adults who own devices.

Right.

And since the iPhone was released in, I think, 2007, we've seen a steady uptick in the percentage of US adults who own smartphones.

So much so that in 2019 that number reached about 81%. So I started to think to myself, what happens when we get to a point where that number is 100% and every US adult has a smartphone in their pocket?

Right. Because we've come to the point where we've become pretty tethered to our devices, and it sometimes begs the question of who's actually in control, because it feels like technology is in control a lot. Technology is in control of our time. Studies have shown that, you know, the average person spends about three to five hours a day on their phone.

And that can equate to almost seven years of your total life, which is the same amount of time it takes to obtain a law degree in the United States. It's wild to think that we're spending that much time on our devices.

And I think, given the nature that we're isolating and we're in quarantine, that number has probably increased. Technology is in control of our emotions.

So we know that depressive symptoms have increased about 33% in teens between 2010 and 2015. And what we also know is that the percentage of teens that own smartphones increased about 48% during that time span.

So there has to be some sort of correlation where those two are related. And technology is in control of our attention. You know, on average, humans spend about three minutes on a task before switching to something else, either checking email, checking their phone, or getting distracted by something else.

Right.

And, you know, we live in this world where we're so distracted by these devices that it's really changing our personal selves and how we are in tune with our own minds, but it's also changing our social selves and how we interact with each other.

And within society. Right. But what I want you to realize is that it's not our fault.

It's not our fault as humans that we have become so tethered to these devices.

I think it's the software that inhabits these devices that has really kind of led us to become so addicted and has really reeled us in.

And that's what I want to focus on talking about today is specific tactics that a lot of companies use to do that particular thing.

So before I get started, I want to start off with a little disclaimer.

I am going to be pointing out a lot of negative patterns and a lot of manipulative software that we probably use in our everyday lives, and that's not to say that I don't believe in the power of technology or that I don't believe technology can be used for good, because I do believe those things.

But my point here is that I want to highlight these things so we can become more aware. And that's really what the Create with Conscience movement is about: making sure that we as creators, people building technology, and as consumers, people who are using that technology, are becoming more aware of the effects that it has on our lives.

So now that I've gotten that out of the way, let's go in and talk about the ways that technology is beginning to control our time.

And I think the primary way that technology does this is through a psychological tactic called variable rewards.

So variable rewards are a schedule of reinforcement, basically, where the rewards that you receive arrive at random based on a particular behavior.

And this theory was studied by an American psychologist by the name of B.F. Skinner, and you may remember him from, like, middle school or high school psychology. But what he's famous for is discovering this schedule of reinforcement.

So he's most famous for this Skinner box experiment, where he would put lab mice in a box, they would press a lever, and they would receive a treat.

And what he found is that the mice that would receive treats at random, so maybe a small treat one time, then the next time they pushed the lever they'd get a large treat, and other times they would get no treats at all, the mice that received those at random would literally push the lever super compulsively because they couldn't anticipate what was coming next.

And it led them to want to, you know, continue to push that lever.

Versus the mice who would receive the same damn thing every time.
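The difference between those two schedules can be sketched in a few lines of code. This is a hypothetical simulation, not something from the talk; the reward names and probabilities are invented purely for illustration.

```python
import random

# Variable-ratio schedule: each lever press pays out at random, so the
# subject can never predict what the next press will bring. This is the
# schedule Skinner found to produce compulsive lever-pressing.
def press_lever() -> str:
    roll = random.random()
    if roll < 0.5:
        return "nothing"      # most presses pay out nothing at all
    elif roll < 0.9:
        return "small treat"  # sometimes a small reward
    else:
        return "large treat"  # occasionally a big one

# Fixed schedule, by contrast: perfectly predictable, and far less
# compulsion-forming, like the mice who got the same thing every time.
def press_lever_fixed() -> str:
    return "small treat"
```

Pull-to-refresh, swiping on a dating app, and checking a post for new comments all map onto `press_lever`: the payout is unpredictable, and that unpredictability is the hook.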

So the way that we see this translate into things that we experience in our human lives is the idea of a slot machine.

For those that aren't familiar, how a slot machine works is you push a button or you press a lever.

And what happens is you receive a small prize, or you receive a large prize, or sometimes you don't receive anything at all.

If you're not lucky. But that is why slot machines are so addictive is because we can't really control what we're going to be receiving in return.

And so it's that anticipation that keeps us really tied to it.

And it's fascinating because we see this translate into software that we use. So when you pull to refresh.

That's an iOS pattern. I'm not sure if Google has that as well.

When you pull to refresh on an app to load new email or get something new in your feed, you're literally like playing the slot machine game to see if you're going to receive any rewards in return.

And when we are swiping on dating apps, we're essentially playing this game to see if we're going to receive a match based on our behavior.

And when we post on social media and we wait to, you know, see if people leave us comments or give us feedback.

We're essentially playing this game to see how many rewards we can receive for the behavior that we've done.

And this is something that's deeply tied to our psychology.

So what ends up happening is when you receive a variable reward, your brain is releasing dopamine.

Dopamine is tied to feelings of pleasure and satisfaction, which is why it's released by a lot of addictive substances like nicotine, caffeine, and alcohol.

Right. So what ends up happening is that when you look at your phone and you see all of these notifications, all of these essentially rewards, your brain is equating that with receiving a reward, and dopamine is tied to it.

So it's very much a cyclical tie between the two: you receive a variable reward, dopamine is released, and it's an endless cycle at that point.

Okay, so enough about variable rewards.

So another way that technology is beginning to control our time is through a tactic called a bottomless bowl.

Bottomless bowls are essentially an experience that really tricks you into consuming more by showing you an infinite amount of options.

And this theory was discovered by a Cornell professor by the name of Brian Wansink.

So what he's holding up in this photo is a bowl of soup that automatically refills itself.

And what he found is that people tended to eat about 73% more calories when the bowl of soup automatically refilled itself.

And we see this translate into a lot of things we experience in our everyday, like Netflix autoplay, for example.

Autoplay has become the industry standard for a lot of streaming platforms and a lot of video streaming services because they know that when autoplay is enabled, you will binge watch TV.

Right. And it's fascinating to me that the term binge watching is something that is widely accepted in our society.

But the term binge is very negatively associated with many behaviors such as binge drinking and binge eating.

But yet when we're binge watching a show, it seems okay.

Right. Because as a society, we've accepted that. Also, I might point out, this is a great episode of The Office, Scott's Tots.

Highly recommend.

But many of us can relate. Right. Like I've been there before.

I have binge watched shows and then I see the screen that Netflix popped up and it's like, how long have I been here?

How did I get here? I need to do something with my life.

Right. And I think the point I'm trying to make is that these technology companies are not your friend.

They typically run algorithms that are designed to learn your watching behavior and then show you videos that are designed to keep you engaged and keep you on their platform for longer, because it's good for these businesses.

Right. So aside from autoplay, infinite scroll is another example of a bottomless bowl.

Infinite scroll is bad for usability because it doesn't leave users with a stopping cue.

It leads us to continually just keep going without giving users a point where you can say, hey, you probably need to pause or you know, there's X amount of products left to be shown.

And I'll show a better example of this later on.

But infinite scroll is seen on a lot of social media platforms, which is why it's very easy to spend hours and hours on those things.

So by technology controlling your time, this is something that's really deeply rooted in our psyche.

But I want to talk about the ways that technology is beginning to control our emotions.

And I think the primary way that our digital experiences in our lives are doing this is through the use of FOMO.

So FOMO is the fear of missing out.

It's the fear that you are not involved in something that you would like to be a part of.

And social media has just put a spotlight on FOMO and really heightened this feeling because, you know, many of us can relate where we're just having a cozy Friday night and then you open social media and you're all of a sudden met with everyone out having fun and you feel sort of guilty for it.

And that's something that we didn't really have to experience even five years ago, right?

We weren't as in tune with what other people were doing. But when we're shown a constant feed of picture perfect lives of people having way more fun than us, it just gets super overwhelming and we feel like our lives aren't good enough.

But FOMO is actually used as a tactic in a lot of the software that we use on a day to day.

So I want to talk about some specific examples of different patterns that companies use to invoke FOMO within you.

One of the ways is when you're shopping online and you see this thing that says, hey, this item is selling fast.

It makes you want to purchase that thing right away, right?

Because you don't want to miss out.

Or when we're shopping for flights and they're letting us know, hey, there's only three tickets left at this price.

So make sure that you purchase it right away.

And one travel website in particular takes us a step further.

So they have this timer asking me to book before the tickets run out.

And then they have a counter saying, you know, how many people are looking at this flight.

As a user, when I see this, it makes me a little nervous.

And that's FOMO that is being created by this experience.

So an eagle-eyed Twitter user decided to do some research and actually found that this number is randomly generated.

And we know this because they named their span class view notification random.

So we know that many companies out there are using this as a tactic.

They're not actually showing you accurate data for how many people are looking at this flight.

They're only showing you this number so that it will get you to purchase the ticket.
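Based on the talk's description, that trick might have looked something like the sketch below. To be clear, this is a hypothetical reconstruction: the function name, the number range, and the exact class name are all invented here for illustration; the giveaway in the real case was simply that the span's class name contained the word "random."

```python
import random

# Hypothetical sketch of the dark pattern described above: the
# "people looking at this flight" counter is just a random integer
# with no connection to real traffic, rendered into a span whose
# class name gives the trick away.
def viewer_notification_html() -> str:
    fake_viewers = random.randint(20, 45)  # invented range, not real data
    return (
        f'<span class="view-notification-random">'
        f'{fake_viewers} people are looking at this flight</span>'
    )
```

The number changes on every page load, but it never reflects anything the site actually measured; its only job is to manufacture urgency.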

So aside from FOMO, one other thing I wanted to mention about technology controlling your emotions is that I do believe we've become very tethered to particular digital devices in our life to control how we're eating, how we're sleeping, how we're breathing, and even how much we're drinking in a day.

And while I think that these apps can absolutely be beneficial to our well-being and our health, I think that we have to remain in tune with our bodies, right?

So if I only rely on a sleep cycle app to tell me how I'm feeling and how much sleep I got during the night, I'm not actually in tune with my body and how I'm actually feeling.

So I want us to keep in mind that using these apps can be beneficial, but making sure that we're not getting so out of tune with our own bodies that we're so far removed from those things.

So by controlling your emotions, I think technology is suggesting that you should always be in tune, right?

You should never miss out on information. And that leads me to my next point, which is that technology is beginning to control our attention.

We all know there have been studies showing that attention spans are in decline, and I think there was even something mentioned out there that humans have an attention span that's shorter than a goldfish's.

I don't know the accuracy of that statement, but I wouldn't put it past us at this point.

It's not our fault. I think phones are technically always vying for our attention.

So let's talk about specific examples.

We've all seen something like this happen before, and we can sit back and laugh at this thing where people are distracted by their phone and they're not paying attention.

But the reality is this has become such a problem in our society. But what I want you to realize is that when you spend time on your device and on your phone, it's good for these apps and services and businesses because it's a successful business metric that companies can then point to and say, our users are spending this much time on our app.

That's good for our business. Why do you think social media is free?

A study found that in 2016 global ad spend on social media totaled around $31 billion.

$31 billion was spent by companies with the goal in mind of vying for your attention, of making sure that they are getting eyes on these ads.

And this is why we don't pay for our social media accounts, right, is because we're shown this constant stream of ads that's vying for our attention.

And it's fascinating to me that how to get users attention is literally studied as a science.

So this is a book by an author by the name of Nir Eyal.

It's all about, basically, how to build habit-forming products. And in this book, he outlines the Hook Model.

You can see here, he literally mentions that you should use variable rewards as a tactic to build products that are habit-forming, which you can argue means that you're trying to hook your users and get them to spend lots and lots of time on your app.

And this book sold thousands of copies, right?

So people are fascinated with this concept. Okay, let's talk about some specific patterns that we see in our everyday software that we use that is really tied to companies trying to gain your attention and control your attention.

And I think notifications is definitely one of the primary ways that companies do this.

So when Facebook first launched, they had their notification color blue to match the Facebook branding.

And what they found was that no one was clicking on this, no one was interacting with it.

And so they're like, well, let's change it to red, you know, maybe people will notice it at this point.

And clicking skyrocketed. And what ended up happening is that this has become a very common pattern for how we handle notifications.

It's known as a badging system. And if you're like me, I can't stand to see this red number, and I want to get rid of it as soon as possible.

And it is scientifically designed to do that.

A study found that the average smartphone user receives 46 push notifications a day.

So that is 46 times in your day that your phone is lighting up, asking you to pay attention to it.

And I get it.

Like, I've been away from my phone for a bit, and then come back and had a slew of notifications on my home screen.

It's just super overwhelming. You're like, oh my god, I missed out on so much stuff, right?

Again, relating back to FOMO.

And how many of us have seen this pop up before when we download an app for the first time, and they're asking us, hey, you know, we would like to send you push notifications.

And you're like, I barely know you. Like, I haven't even used your product yet.

And you already want to, like, bother me. So even if you decide, you know, "I'll set this up later," or "Don't allow," companies continue to ask you to turn on notifications.

And even some of the language that you see here is almost like they're pleading a little bit, right?

Like, please turn on notifications. Be in the know for when people follow you.

It's because they know that this is designed to get your attention and to keep your attention.

So a little story about an incident that happened in 2016 with Tumblr.

It had a push notification that would inform users about new and trending tags.

So it was designed to get you to open it and then interact with that particular tag so that they could get more engagement and increase views on that particular tag.

So one user received this notification one day.

I'll pause for dramatic effect. It says, beep, beep, hashtag neo-Nazis is here.

So the trending tag was neo-Nazis. And obviously, this is not okay.

So this particular user contacted Tumblr about it, and they went on Twitter.

And the head of content at Tumblr responded saying, yeah, it's for tracked tags.

We talked about getting rid of it, but it performs kind of great. It performs kind of great.

This is a prime example of a company that has its business goals in mind rather than doing what's right for its users.

Okay. So aside from notifications, modals are another pattern that I feel very strongly about.

If you know me as a product designer, you know I could talk for hours about modals.

And it's one of those patterns that honestly I feel like has been bastardized on the Internet.

And so I think that they honestly get a bad rap. They started out being used in a lot of operating systems as they should be used.

And then people have just taken them and abused them to no end.

And I'll show you an example of this. Because they're used to get your attention, oftentimes they pop up very obtrusively.

Like when we're on a website, and all of a sudden it's just like, very important message.

Like, please pay attention to me, right? Push this button. Give me your email.

And you're just like, I'm trying to focus on what I'm trying to get done.

And a study found that when external factors like modals interrupt the execution of a primary task, users require about 15% more time to complete the task at hand.

So this is not something that we should be proud of. And I want us to be more mindful about how we're using this pattern.

But a lot of modals use a pattern called confirmshaming.

Confirmshaming, I'm sure you've seen it before.

It's used a lot in opt-in and opt-out experiences where they want you to enter your email or sign up for something.

And option A is like, yes, sign me up.

Option B is the confirmshaming that happens. And it's usually something that makes you feel like a flaming hot piece of garbage.

So what I did was I collected some of the best examples that I could find on the Internet, and I'm really excited to share them with you.

This one is from the website Delish, and it's saying, enter your email, show me 14 simple dinners.

And the confirmshaming that happens is, no thanks, I'll have microwave dinner tonight.

This one from Gmail, which was quickly pulled after people spoke out about it, is asking me to download the latest version of Gmail on your iPhone.

Option A says, yes, I want it.

And the confirmshaming is, I don't want smarter email.

And then the last one, which is my all -time favorite from Esquire, is asking me to enter my email so I can see 75 movies every man must see.

And the confirmshaming says, no thanks, I'll stick to the latest Adam Sandler films.

Sorry for any Adam Sandler fans out there.

Okay, so I guess what I'm trying to get at is that by technology controlling our time, our emotions, and our attention, it's fundamentally changing our personal selves, our brain, it's changing our social selves as well.

And it's become almost impossible for us to disconnect completely with so many pings, notifications, things vying for our attention.

And I get it, y'all. I've been there before. It's very, very difficult to completely tune out from everything and turn off.

And I was thinking back, like, the concept of working wherever we want is a concept that's new, right?

Working remotely.

And there's a lot of beautiful things about that. But there's also a lot of negative parts about that.

You can check your phone, you can check your work email while you're laying in bed at 11 at night, right?

Our parents' generation literally had to clock in and clock out of work.

So they could not bring work home with them.

And this isn't something that we deal with now on a day to day. So we're relearning these patterns of making sure that we leave work at work.

And that's become very hard for us to do.

Even when we're, you know, waiting in line at a coffee shop or waiting for the elevator or just, you know, hanging out at home on the couch, our go-to instinct is normally to pull out our cell phone or our smartphone or our devices, right?

Because I think we've come to almost fear our own minds. We've come to fear being bored.

But what this actually does is it fuels a cycle of dependency.

And this term was coined by a psychologist by the name of Doreen Dodgen-Magee.

She's done a ton of really, really awesome work around this topic. So how this works is basically you think to yourself, let's say you've been away from your phone for a while and you start thinking, hmm, I haven't checked my phone.

I wonder if like anyone's trying to get a hold of me.

I wonder if I have any new emails.

What if I have to respond to something on work chat? Did anyone like my newest Instagram post?

So those thoughts start circulating in your brain and your brain starts to generate cortisol.

Cortisol is a chemical that's tied to your mood and your motivation and your fear.

So it controls those things. And you start to get anxious.

You're like, I need to check my phone. Like, it's not near me. Like, I haven't checked it.

Like, I need to just make sure that no one's trying to get a hold of me, right?

Whenever you start to feel anxious or any sort of particular emotion, your body has an innate need to self-soothe.

So in order to self-soothe, it wants to get rid of those feelings, right?

And if the only way that you know how to do that is to pick up your phone to get rid of that anxiety or to get rid of that boredom, you continue to fuel this cycle of dependency and it makes it very, very difficult to break this.

And I'm guilty of this too. Like, whenever I need a break from work or I'm, you know, out on a walk, it's very, very hard for me to just put my phone away and not check it.

But this is going to be something that I talk about a little bit later about trying to break this cycle.

And like I mentioned earlier, it's almost as if we fear our own minds.

Like, think about the last time you truly did nothing.

Like, sat in a park, stared at the trees, stared at the sky, or just sat on your couch.

Because the concept of that nowadays is just so foreign to us.

You're like, why would you want to do that? You know, like, why wouldn't you want to, like, play a game or watch TV or be on your phone?

And I think there's a ton of added benefits to letting our mind wander, letting our mind get into a state of idleness, which is why a lot of people talk about the power of meditation and why that's so important to our mental well-being.

Have you ever been, like, driving a car and suddenly you don't remember the past, like, three or four miles?

That is because that's your mind wandering. It's letting your mind get into a state of idleness where you're able to actually think really creatively.

And there's a ton of added benefits to this.

A psychologist by the name of Amy Fries says this.

When your mind is able to wander, it is assessing memories, emotions, and random bits of stored knowledge.

And there's been a lot of, like, really inventive, creative things that have come out of people just, like, you know, being alone with their own thoughts and letting their mind wander.

The story goes that Isaac Newton was sitting in his mother's backyard pondering life, as he is doing in this illustration, and he saw an apple fall from a tree, and that's where he started thinking about the theory of gravity and where it led him to discover that theory.

J.K. Rowling, famous author of the Harry Potter series, was waiting on a delayed train in Manchester when she came up with the idea of the wizarding world of Harry Potter.

And I don't think this would have happened had she been, you know, nose to her phone, scrolling through Instagram.

So aside from technology distracting us from our own thoughts, it's also beginning to distract us from the people around us.

And I'm sure many of us can relate where we've been with someone and they're just on their phone the whole time.

I love this photo series by a photographer by the name of Eric Pickersgill.

He has photoshopped out the devices in some of these photographs he's taken, and I think it really highlights the power that these things have on us, right?

I started thinking about the other day, too, with quarantine and COVID, you know, how this is probably isolating us even more.

And I wonder if we're going to have to relearn some of our social patterns when that day comes and we get back to being in a more social setting again.

This has become such a phenomenon that there's now a term for it.

It's known as phubbing. And it's basically the act of someone snubbing you in a social setting by looking at their phone instead of paying attention to you.

And I get it.

I've been on the giving and receiving end of this before. My friend took this photo of me at a concert and she was like, you're not even watching.

You're not even paying attention.

Like, get off your phone. What are you doing? And I was like, you're right.

Like, thank you for keeping me in check. Right? Almost 64% of couples therapists have noted that mobile phone usage was raised as a common problem in their clients' lives.

Because this is a new phenomenon. Technology has advanced so much just in the past five years that as a society, we're still learning how it affects us socially and how it affects our personal relationships.

To quote Doreen Dodgen-Magee again, she says this: we don't engage the people and places surrounding us because we have come to prefer the company of devices that ask so little of us in return.

And this quote, out of all of the quotes that I share in this presentation, I think hits me the deepest, because we know it's true and we know it's an issue that needs to be fixed.

Okay, so I want to kind of shift gears. I know that was a lot of heavy stuff to talk about.

But I do think there's, there's hope. I think there's a light at the end of the tunnel.

And that's what I want to talk about right now: we're beginning to become more aware that digital well-being is something that we need to prioritize.

And you can see from these Google search trends that the term digital well-being spiked around August of 2018.

That was due to Facebook and Google and Apple announcing that they were going to be prioritizing this in some of their products.

So Google has released a series of experiments related to digital well being.

The one on the left is called Envelope, and it's literally, I think, a physical cardboard enclosure that only allows you to use your phone for calls.

So just one purpose. And the one on the right is called stopwatch and it basically tracks how much time you have spent on your phone for that day.

We've also seen this seep into literature. So there's a lot of books, podcasts, a lot of education out there around how to navigate, you know, raising children in a digital world, how to navigate digital minimalism and well being.

And I think these resources are all great. I, I can recommend a few at the end of this if you're interested.

There's also, like, vacations for sale that focus on digital well-being where, I don't know, my assumption is they lock your phone in a box and they don't have Wi-Fi on this tropical island.

But these are the extreme measures that we're having to take in order to actually disconnect from our digital lives.

It's also seeped into politics. So this is an act known as the Detour Act, the deceptive.

Sorry, I can't read that. Deceptive Experiences to Online Users Reduction Act.

This was introduced back in April of 2019 and some of the things that this highlights is it would make it illegal to design with the purpose of obscuring decision making.

So the confirmed shaming that I was mentioning earlier, it would make it illegal for companies or websites to use that.

It would also ban UI design that creates compulsive usage in users under 13.

So we would have to be a lot more mindful about the software we're creating for, you know, our kids' generation. For example, YouTube Kids probably wouldn't be allowed to have autoplay enabled by default.

And to me, all of these things are a step in the right direction.

But I really truly do believe that we have to fix this starting at the root of the problem.

We have to fix this starting with the people who are creating software.

And so that's really what Create with Conscience is about: being more mindful, being more aware of the technology and the software that we're creating.

And I want to reiterate, right, it's not these devices that are inherently bad.

It's the software that lives in these devices.

It's the people that are building the software that may not always have your best interests in mind.

I love this quote by a design ethicist by the name of Mike Monteiro.

He says, we're no longer pushing pixels around. We're building complex systems that touch people's lives, affect their personal relationships, and undeniably affect their mental health.

And he's right. To me, that looks like taking responsibility for the tech we're creating and what we're building.

So in order to do that, I've put together four principles that I think really highlight ways that we can start doing that.

Number one is to build with respectful boundaries. So what that looks like to me is giving users the option to choose, which is something you should always be enabling.

So Slack does this really well. They have a variety of ways for you to turn off notifications.

If you want to be notified for every little conversation happening, you can do that as well.

But what they're doing here is giving users the option to choose.

So a load more button is a much better alternative to the infinite scroll.

Because in this example in particular, they show me how many products I've viewed and then how many I have left to view.

So my brain can kind of equate that with how much time I will be spending here instead of just like I need to get to the end.
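As a rough sketch, and purely to illustrate the point (none of these names come from the example shown), the "load more" pattern boils down to tracking how many items have been viewed out of a known total, so progress is always visible and the list has an end:

```typescript
// Hypothetical sketch: a "Load more" control that surfaces progress so users
// can gauge how much is left, instead of an infinite scroll with no visible end.
interface PageState {
  viewed: number; // items already shown
  total: number;  // total items available
}

// e.g. "You've viewed 12 of 48 products"
function progressLabel(state: PageState): string {
  return `You've viewed ${state.viewed} of ${state.total} products`;
}

// Advancing is an explicit user choice, and the count never exceeds the total.
function loadMore(state: PageState, pageSize: number): PageState {
  return { ...state, viewed: Math.min(state.viewed + pageSize, state.total) };
}

// The "Load more" button can be hidden once the list truly ends.
function hasMore(state: PageState): boolean {
  return state.viewed < state.total;
}
```

The design choice here is that each page load costs the user a deliberate click, so their brain can budget time against the remaining count.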

And as much as I criticized social media earlier, I do think a lot of social media companies are taking measures to prioritize digital well-being.

So I'm sure many of you have seen this. This came out I think a year and a half ago.

It's basically a message saying you're all caught up so you don't feel the need to keep endlessly scrolling and get to the bottom of your feed.

So the second principle to create with conscience is to encourage well-being.

And we've seen this more and more, I think, especially with COVID happening.

I think a lot of apps out there are encouraging you to take some time away from their experience.

So the dating app Bumble has a feature called snooze where you basically hide your profile and take some time away from the app.

You can set a time limit and add a message about why you're spending time away, or you can leave it blank.

And TikTok of all apps, which I started using during quarantine, has a screen time management feature.

So it allows you to set a time limit for how much time you want to spend on the app.

And then in order to proceed, you enter a passcode.
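A minimal sketch of that mechanic might look like the following. The class and method names are my assumptions for illustration, not TikTok's actual implementation:

```typescript
// Hypothetical sketch of a TikTok-style screen time limit: the user picks a
// daily limit, and once usage passes it, continuing requires a passcode.
class ScreenTimeLimit {
  private usedMinutes = 0;

  constructor(
    private readonly limitMinutes: number,
    private readonly passcode: string,
  ) {}

  // Called as the app accumulates foreground time.
  recordUsage(minutes: number): void {
    this.usedMinutes += minutes;
  }

  // Under the limit, the app proceeds freely; over it, only the user's
  // explicit, correct passcode unlocks further use.
  canProceed(passcodeAttempt?: string): boolean {
    if (this.usedMinutes < this.limitMinutes) return true;
    return passcodeAttempt === this.passcode;
  }
}
```

The friction is the point: entering a passcode turns mindless continued scrolling into a conscious decision.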

And they also took it a step further. And I'm going to show you on the next slide a video that is shown in your feed if you've spent too much time on the app.

Hold on.

You've been scrolling for way too long now. Maybe you should get some food, get some water, and then come back later.

So I really like that they built it into their feed of regular videos rather than having the user go to a different part of the app in order to configure their digital well-being settings.

And then the last thing I wanted to highlight here is that we've seen a lot of digital well-being initiatives built into the software or the OS, the operating systems, that we're using.

So I know Apple has screen time. Google has digital well-being. And what this allows you to do is basically set time limits on apps.

You can set downtime, things like that.

And you can see weekly reports about those things. So this is definitely a step in the right direction.

The third principle is anticipating unhealthy behaviors.

And I think as designers, we tend to always focus on happy paths and, you know, what's this experience going to be for a user who goes through this and everything works out great for them, right?

But we have to start considering really deep, dark things that people might be using our products for, such as depression.

Are the products that you're building stimulating, elevating, or potentially triggering symptoms of depression?

And how might someone with a mental illness potentially use your product to hurt themselves?

Addiction is another unhealthy behavior to consider.

So addiction can present itself in many forms.

I'm not just talking about screen addiction here. Does your product promote long-term use?

And what's the worst-case scenario for users who are susceptible to addiction and have a history of addiction?

And then exclusion. You know, is your product accessible to users of all groups, regardless of disabilities, culture, language, education level, race?

And really making sure that your product is not being used to exclude certain groups.

If these are not questions you are already asking your product team, you need to start asking these questions.

You need to start considering this because I think this is something, if you get anything out of this talk, please take away this slide as an action item that you are going to follow up on.

So what I'm trying to say with all these is that it's not easy to have these conversations, right?

Because I think a lot of the time people are like, well, I've brought this point up, and these are the responses I get back.

We don't have time to consider that.

Or that'll never happen. Or, well, we'll think about that in version two.

That's my favorite one. But we can't accept these responses, because we need to be considering these unhealthy behaviors from the very start of building our products.

And I'll show you an example of what happens when we don't do this.

So back in 2015, Facebook introduced an authentic name policy.

The intention behind this was to make sure that there were not spam or fake accounts that existed out there or people posing as someone else, right?

All good intentions.

But once they implemented this policy, what ended up happening is that a lot of drag queens that were using Facebook who were not using their real name were suddenly getting bullied and reported and their accounts were taken down because they weren't using their real names.

And so they protested and started this movement known as the #MyNameIs movement.

And it forced Facebook to have to relook at this policy because they didn't consider that in the beginning.

So I think it's really important that we're anticipating these unhealthy behaviors from the very beginning, the inception of an idea of an experience or a feature.

Okay, so last principle I want to talk about is changing how we measure success.

Because a lot of the ways that we measure success of our products is based on clicks, shares, views, signups, page views.

These things are all a fancy way of saying this user is engaging with our product.

And I don't think that's necessarily a bad thing.

But I do think we need to focus our success metrics away from just engagement.

And a typical scenario goes like this, right, a product manager will come to the designer and say, we need to increase our conversion rates.

And designers have had to get more creative with the ways we implement these patterns, and we've given them gimmicky names, right, like gamification.

Gamification is essentially a roundabout way of saying you're trying to get into your user's psyche to trick them into using your product more and more, right, again, focus on engagement.

And I want to share with you all a story about how I was in this exact scenario at a previous job.

So what ended up happening is that we had a section of our product that we wanted users to interact with.

And for months and months, they just weren't clicking on this particular thing.

They were ignoring it. They weren't paying attention to it. And so, you know, the PM came to me and said, what can we do?

So we ended up implementing this experiment known as mouse out.

And how it works is that if a user's cursor is focused on the area you want them to click on and then moves away, you pop up a modal that says, like, you know, hey, did you notice this thing?

Or like, please pay attention to this thing.
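For illustration only, the mouse-out trigger being described reduces to something like this. The names are hypothetical, and this is the pattern being critiqued, not a recommendation:

```typescript
// The "mouse out" dark pattern: the cursor leaving a promoted area is
// treated as a signal of drifting attention and triggers a nagging modal.
interface Rect { left: number; top: number; right: number; bottom: number }

function isInside(x: number, y: number, area: Rect): boolean {
  return x >= area.left && x <= area.right && y >= area.top && y <= area.bottom;
}

// The modal fires only on the transition from inside the area to outside it.
function shouldShowNagModal(wasInside: boolean, x: number, y: number, area: Rect): boolean {
  return wasInside && !isInside(x, y, area);
}
```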

I don't remember the verbiage we use. But what I'm trying to highlight here is that, you know, I'm not proud of this experiment.

I'm not proud of what we built. But looking back, I know I will not let this happen again.

I will try to shift the conversation around conversion rates and engagement and focus it more around metrics that truly matter to us and matter to our user's experience.

Because short-term product gains, and implementing dark patterns in order to achieve those short-term gains, are not beneficial to our long-term product health.

Because what they end up doing is eroding trust that we built with our users.

So what I've started doing, and what I want to share with y'all, is a template that I put together called a Goals, Signals, Metrics template.

And it's not anything that's, like, groundbreaking. I adapted it from Google's HEART framework.

But I really think it's beneficial to use because it forces us to really focus in on what are the things that we're measuring our success on.

So let's look at an example. Say I am measuring the task success of search results in my product.

The goals that you're going to map out are super high level, you know, the outcomes that you want to map to the metrics that you're tracking.

So here I could say users are finding what they're looking for effectively and efficiently.

My signals are the measures of success or failure.

So they're mapped to user behaviors and attitudes.

So here I could say they find the result of their search query on the first page.

And then the metrics are things that you're using to actually track those.

So you can measure your search results on exit rate, successful search queries, or even an SEO metric.
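The template rows from this example could be captured as a simple structure like the one below. The field names are my assumptions for illustration, not part of the downloadable template:

```typescript
// A minimal sketch of the Goals > Signals > Metrics template, filled in
// with the search-results example from the talk.
interface SuccessDefinition {
  goal: string;      // the high-level outcome you actually care about
  signals: string[]; // observable behaviors/attitudes that indicate it
  metrics: string[]; // concrete numbers you can track for those signals
}

const searchTaskSuccess: SuccessDefinition = {
  goal: "Users find what they're looking for effectively and efficiently",
  signals: ["They find the result of their search query on the first page"],
  metrics: ["Exit rate", "Successful search queries", "SEO metrics"],
};
```

Writing it down this way forces each metric to trace back through a signal to a goal, rather than defaulting to raw engagement.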

So I think my point here is that you, you want to really hone in on what it is that you're measuring success on instead of engagement, which is super high level.

And what these four principles will allow us to do is empower our users to build healthy habits and healthy relationships with the technology in their own lives.

Okay, so to wrap up, the last thing I want to share with y'all is how to consume with conscience.

So I talked about how, as creators, we can be more responsible. But as consumers, as people using devices in our everyday lives, how can we be more mindful of what we're consuming? And I want to reiterate, like, I love technology.

I'm not perfect.

I probably spend too much time on TikTok, but I'm aware of these things and I'm actively working on improving those.

So here are some of the ways that I'm going about doing that.

Number one is just minimizing the noise. I think that we get so distracted with things around us that it's very easy.

Like I said, 46 push notifications a day is a lot to handle.

So what I've started doing is turning on Do Not Disturb.

It makes it so that your phone won't light up and you're not getting those constant desktop notifications that are distracting you during meetings.

Minimizing the amount of push notifications that I allow on my phone and that I'm receiving has really been beneficial for me.

I have about 60 apps on my phone and I only allow notifications from these five because I asked myself one day, is there any notification out there that's that important to take me away from what I'm currently doing?

Probably not. And then by nature of getting rid of things, it minimizes the amount of things that you have to check or keep up on.

So I think maybe two or three years ago, I got rid of Facebook and Snapchat.

I was like, these aren't really beneficial for me anymore. And I found myself literally setting aside time every day to sit down and go through my queue of social media apps.

And it just felt super draining. So getting rid of accounts that you don't use, or that don't provide value to you, will help minimize the noise.

Second way to consume with conscience is to set boundaries for yourself.

So if you use Google Calendar or Gmail, there's a way for you to set your working hours so that people know not to schedule meetings with you after a particular time in the day.

Super nice. There's also an app called Flipd that a friend recommended to me.

It allows you to, I believe, lock an app for a particular amount of time so that you don't feel distracted or feel the need to check that app.

And then lastly, it's about being mindful. So, you know, you can set up Apple screen time and you can, you can know that you're spending 50 minutes on Instagram a day.

But it's up to you to practice that habit of distancing yourself from that and setting boundaries for yourself.

So next time you're out to eat. I don't know when that will be.

Or maybe you're on a walk with your dog or maybe you go to pick up coffee or even if you're at a stoplight and you feel the need to check your phone.

I want you to just pause and recognize how you feel. Recognize how hard it is to break this cycle of dependency.

Because you'll notice yourself starting to feel anxious.

But I'm telling you, y'all, like the only way that we can improve upon this and break this cycle of dependency is to practice these habits on a daily basis.

Because habits take time to break and new habits take time to form. So it's not going to be easy when you feel the need to check your phone and you just sit there instead at a stoplight.

But try it sometime. Notice how you feel. Because what ends up happening when you step away from being like so digitally distracted is that you'll be more in tune with yourself.

You'll be more in tune with your surroundings.

I truly believe you'll be more content and you'll be more in tune with the people surrounding you who matter to you the most.

So you can find everything I just talked about on createwithconscience.com.

It has the goals, signals, metric framework on there if you'd like to download that.

And I just really want to thank you for your time and attention.

I know this was a lot to get through.

So if you stuck with me this long, I really appreciate it. And I'm really excited to hear your feedback.

And let's see if we have any questions that have come in.

It doesn't look like it.

So I'm going to sign off and thank you all for your time and attention.

Hi, we're Cloudflare.

We're building one of the world's largest global cloud networks to help make the Internet faster, more secure, and more reliable.

Meet our customer, Falabella. They're South America's largest department store chain, with over 100 locations and operations in over six countries.

My name is Karan Tiwari.

I work as a lead architect in Adesa e-commerce at Falabella.

Like many other retailers in the industry, Falabella is in the midst of a digital transformation to evolve their business culture to maintain their competitive advantage and to better serve their customers.

We have a store legacy that we have to adapt to the digital culture.

A logistical legacy, a legacy of operations, a legacy that works very well.

It hasn't worked very well, but the challenge now is to transform it.

Cloudflare was an important step towards not only accelerating their website properties, but also increasing their organization's operational efficiencies and agility.

So, Cloudflare, for example, wasn't just an IT decision.

It was also a business decision. I mean, the faster we can deliver the data to our customers, the more loading time, in seconds, we can cut from our site.

So, I think we were looking at better agility, better response time in terms of support, better operational capabilities.

Earlier, for a cache purge, it used to take around two hours.

Today, it takes around 20 milliseconds, 30 milliseconds, to do a cache purge.

Home page loads faster. Your first view is much faster. It's fast. Cloudflare plays an important role in safeguarding customer information and improving the efficiencies of all of their web properties.

Cloudflare, for me, is a perfect illustration of how we can deliver value to our customers quickly.

With customers like Falabella and over 10 million other domains that trust Cloudflare with their security and performance, we're making the Internet of Things a reality.

Making the Internet fast, secure, and reliable for everyone. Cloudflare, helping build a better Internet.

Cloudflare.

Optimizely is the world's leading experimentation platform.

Our customers come to Optimizely, quite frankly, to grow their business.

They are able to test all of their assumptions and make more decisions based on insights and data.

We serve some of the largest enterprises in the world, and those enterprises have quite high standards for the scalability and performance of the products that Optimizely is bringing into their organization.

We have a JavaScript snippet that goes on customers' websites that executes all the experiments that they have configured, all the changes that they have configured for any of the experiments.

That JavaScript takes time to download, to parse, and also to execute, and so customers have become increasingly performance-conscious.

The reason we partnered with Cloudflare is to improve the performance aspects of some of our core experimentation products.

We needed a way to push this type of decision-making and computation out to the edge, and Workers ultimately surfaced as the no-brainer tool of choice there.

Once we started using Workers, it was really fast to get up to speed.

It was like, oh, I can just go into this playground and write JavaScript, which I totally know how to do, and then it just works.

So that was pretty cool.

Our customers will be able to run 10x, 100x the number of experiments, and from our perspective, that ultimately means they'll get more value out of it, and the business impact for our bottom line and our top line will also start to mirror that as well.

Workers has allowed us to accelerate our product velocity around performance innovation, which I'm very excited about, but that's just the beginning.

There's a lot that Cloudflare is doing from a technology perspective that we're really excited to partner on so that we can bring our innovation to market faster.

What is a WAF?

A WAF is a security system that uses a set of rules to filter and monitor HTTP traffic between web applications and the Internet.

Just as a toll booth allows paying customers to drive across a toll road and prevents non-paying customers from accessing the roadway, network traffic must pass through a firewall before it is allowed to reach the server.

WAFs use adaptable policies to defend vulnerabilities in a web application, allowing for easy policy modification and faster responses to new attack vectors.

By quickly adjusting their policies to address new threats, WAFs protect against cyberattacks like cross-site request forgery, file inclusion, cross-site scripting, and SQL injection.
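As a toy sketch of the rule-based filtering described above (not Cloudflare's actual rule engine, and with deliberately simplistic signatures), each rule inspects an incoming request, and a request only reaches the server if no rule matches, like the toll booth in the analogy:

```typescript
// A toy WAF: a request passes only if no rule matches it.
interface HttpRequest {
  path: string;
  query: string;
  body: string;
}

interface WafRule {
  name: string;
  matches: (req: HttpRequest) => boolean;
}

// Two illustrative rules, loosely aimed at SQL injection and XSS payloads.
// Real WAF signatures are far more sophisticated than these regexes.
const rules: WafRule[] = [
  { name: "sqli", matches: (r) => /('|--|\bunion\b|\bor\b\s+1=1)/i.test(r.query + r.body) },
  { name: "xss", matches: (r) => /<script\b/i.test(r.query + r.body) },
];

function filterRequest(req: HttpRequest): { allowed: boolean; rule?: string } {
  for (const rule of rules) {
    if (rule.matches(req)) return { allowed: false, rule: rule.name };
  }
  return { allowed: true };
}
```

Because the rule list is just data, a new attack vector can be addressed by adding a rule, which is the adaptability the passage describes.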