This is Your Tech Leader Speaking
Presented by: Gretchen Scott, Sam Floreani
Originally aired on June 4 @ 8:30 PM - 9:00 PM EDT
Samantha Floreani is a well known and respected privacy specialist in Australia. We get to chat with her about power asymmetries and the societal implications of the ways we collect, use and share data.
Transcript (Beta)
Hi there, I'm Gretchen. Thanks for joining us for today's session of This Is Your Technologist Speaking.
This Is Your Technologist Speaking is a series of talks with local APAC industry leaders and we're nearing the end of our six-week series.
Looking back, though, we've had, and continue to have, because today's guest is no different, a huge list of amazing people doing interesting things in the technology world.
So today we're with Samantha. She's a privacy and technology specialist with a background in politics and international studies, plus experience and a whole heap of education in data science.
She spent time as an independent consultant and she also works at Digital Rights Watch.
Thanks for getting up early this morning, Sam.
It's great to have you here. Thank you. I have my coffee. I'm ready to go.
And you also have a beautiful cat just hiding. Yeah, just peeking. I've got to say, a cat viewing on a video call is one of the things that's a 2020-2021 expectation, isn't it?
Well, she was born into lockdown. So I feel like, you know, one day I'm going to have to go back into an office and she'll be like, oh no.
She'll either hate it or love it.
Exactly. She's very beautiful. We were just talking in the green room and I was like, digital privacy, oh my gosh, it's huge.
There is so much in it and there's a whole heap I want to learn.
So let's just dive right in and stop being polite.
It's like it is such a big concept. I was trying to explain to people I was talking to yesterday what it is and I failed spectacularly.
Could you explain to me what you think or consider digital privacy to be?
Yeah, absolutely. It is such a huge area and it means different things to different people, which is part of the challenge, I think.
And you can kind of see that when we look at laws around the world.
You can really see that people are grappling with like, what is this?
What does it mean and how do we protect it? So yeah, it's what I would call a non-fixed social construct.
So that means it changes and ebbs and flows over time and it changes according to our worldview as well.
So like, for example, a teenager might have a very different concept of what privacy means versus an adult and all of their life experience will feed into that.
But so historically, when we think about privacy, it's very much this like the right to be let alone, you know, being able to keep things, keep my business to myself.
It's very, like, individualistic, almost a libertarian approach: I should be at liberty to have all of my information kept to myself, away from the prying eyes of the government and businesses and whatnot.
And that still holds true to a certain extent.
But the reason that I am so passionate and interested in privacy is more of this kind of collective idea of privacy, which I think is a bit more interesting and a bit more radical and a bit more political.
So what does collective privacy mean?
So basically, it's this idea that privacy as a value and as a right has collective worth.
So one way to look at it is, you know, for democracy to function, we need elements of privacy in order to be able to organize, to be able to push forward social movements, to be able to develop our ideas and our personality, like away from the prying eyes of the state.
And the idea being that like, if you think about any social movement, women's rights, civil rights, all of these movements needed a degree of privacy to be able to get them off the ground because, you know, they were in opposition to the status quo at the time.
And so I really love this idea of privacy being something that enables us to realize ourselves and our ideas and collect and push back against power abuses or issues that we want to see changed in the world.
And that's, that's starting to be big.
Yeah, it's, it's a pretty big way to look at it. But I think that's starting to become a bit more realized rather than just being like, my data, it's mine, like, how do I keep myself safe?
It's more about how can we keep each other and our communities safe in the face of surveillance capitalism, and all of these data hungry companies and governments.
So yeah, it's really about pushing back.
There's a saying, I want some space. So what would you say to the argument that I've often heard that's like, I've got nothing to hide.
So it doesn't matter to me.
What are you moaning about? Oh, so many things I would say to that.
Thankfully, it's starting to become, I feel like there's a bit of a change happening.
Like, I don't hear it quite as often as I used to. That said, it does still come up, often with people in my peer group who will say something like, oh, but I'm not interesting.
Like, it's all of my stuff is boring.
Which is, I think, the same kind of argument, just rephrased less combatively, I guess.
Yeah, exactly. So to that, I would say a few things.
Firstly, it's an immensely privileged thing to say. Just because you don't have anything of yours that you're concerned about falling into the wrong hands doesn't mean that there aren't plenty of other people who do.
And for very valid reasons. You can want to keep things to yourself without them being bad; this idea of something being a secret has connotations of it being nefarious.
Right. And naughty. Yeah. Nefarious is the word.
But like, you know, if you went and got an STI check, for example, maybe you want to keep that to yourself.
And that's entirely reasonable.
If you're part of the LGBTQ community and you're not out yet, or you don't plan on coming out at all for whatever reasons, then keeping that to yourself is entirely reasonable.
So when people say like, well, I have nothing to hide.
It's kind of like, well, it's not really about you. I like that. Can anything actually be private anymore?
Oh, I, I have to believe so. Otherwise, what am I doing?
This is true. It is increasingly hard for sure. Especially in online spaces.
But even in offline spaces, like, I think there's less of a divide than there used to be.
And I think we all, myself included, and I work in this space, underestimate just how much information we are generating.
And, you know, there's this idea of data crumbs trailing behind us. There's so much more than we can really fathom on a day-to-day basis.
So it is certainly increasingly hard to keep things private for sure.
It's not impossible. Not impossible. What would you say to, I'm going to be really specific here, say a parent of a 14-year-old who wants them to be a bit more aware of their privacy?
Are there any little takeaway tips you'd give that direction?
That's a great question. And there are some resources online, which I'm happy to share that are probably better at explaining this than I am, because I don't have kids.
So I, you know, it's kind of a bit more theoretical. But I always think that having conversations with them regularly is really important, about what it's like to be online, what the risks are, things like that.
At 14 years old, for example, I think they're starting to be old enough to start to understand the issues.
And there are, for example, some podcast series out there that really lay out the consequences, which I would probably try to engage with together.
Yeah. Because I don't think you understand, well, I'm not sure I understand it for me.
Well, yeah. And this is how I see it; I mean, there are other people who would be like, no, just tell them to get off social media. But I think the really key message is: this is about making sure that you understand, so that you can make choices about how you interact online.
So that you're empowered to do that, rather than coming down with an iron fist and trying to strip them of their autonomy.
Because at that age, of course you want, you want to be online.
Everyone's online. So I would definitely come at it from a kind of empowering angle rather than like a disciplinary.
Yeah, absolutely.
Privacy has a bad enough name as it is. We don't need to add to that. Exactly.
Exactly. It's a rough time. So a lot of privacy comes down to data and you mentioned data crumbs, but it's not just how we collect it.
It's also kind of how we use it as organizations and then how we share it.
There are so many steps, and my take on it is that you could, as an organization, make a mistake at any point. Not an intentional, evil decision; you could just do something thoughtless and it would have quite significant implications.
Is that something you'd tend to agree with? It's more of a statement than a question. Yeah, absolutely.
And that is, that is reflected in the way that our privacy laws in Australia are written.
Like, for example, the Privacy Act has a set of Australian Privacy Principles, and they deal with personal information across what we would call the information life cycle.
Because you're right, it's not just about collecting it, although that is a big area, it's also how it's used, how it's stored, whether it's stored securely.
So there's kind of an overlap with, with security there as well, how we share it, even how we destroy the information or de-identify the information.
All of these points in the life cycle can go wrong.
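One lifecycle step mentioned above is de-identification before sharing. A minimal sketch of what that can look like in practice, with entirely hypothetical field names and invented data: drop direct identifiers and coarsen quasi-identifiers while keeping the useful signal.

```python
def deidentify(record):
    """Strip direct identifiers and coarsen quasi-identifiers.

    A toy illustration only; real de-identification needs far more
    care (aggregation, k-anonymity checks, re-identification testing).
    """
    return {
        # name is dropped entirely (a direct identifier)
        "birth_year": record["dob"][:4],            # keep year only
        "postcode": record["postcode"][:2] + "XX",  # coarsen location
        "purchase": record["purchase"],             # retain the useful signal
    }

raw = {"name": "Alex Smith", "dob": "1990-07-14",
       "postcode": "3056", "purchase": "bread"}
print(deidentify(raw))
# -> {'birth_year': '1990', 'postcode': '30XX', 'purchase': 'bread'}
```

Even this simple step shows why the lifecycle framing matters: a mistake here (say, keeping the full date of birth) quietly undermines everything downstream.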
So, historically I've been of the opinion that the legal structures we have in Australia haven't caught up with technological advances; technology is moving so fast.
Do you think we're closing that gap a little bit with the regulation?
Sorry, that was a loaded question, wasn't it?
Yeah, so in general, no. But if we narrow that down to privacy specifically, the privacy laws that we have in Australia were based on a set of principles that were written in 1980.
And they really tried to be technology agnostic, or neutral; technology neutral is probably the better word.
They wanted it to apply to whatever could come. But of course, in 1980, they didn't know what was necessarily to come.
And so I think the law that we have in place tries to be flexible enough, it's principles-based, so you can kind of make it work.
But the flip side also happens where you see companies and organizations kind of like massage the law to be like, yeah, well, we've technically complied.
But it's kind of disingenuous. Something like really long terms and conditions saying, this is what we're doing, is, I think, an example of technically complying with the law while being completely disingenuous about actually communicating to people what they're signing up to.
But yeah. And then emerging technologies. I mean, it's funny calling AI emerging, because it's been around for yonks, but it's just kind of having a resurgence, I think.
Technologies like that do pose really particular challenges to the way that privacy law is written.
And so internationally, and I think also in Australia, especially as we're in the process of reviewing the Privacy Act, there is this building pressure to update these protections with privacy at the heart of that.
So fingers crossed, we get some improvements, but Australia is generally pretty terrible at it.
So it's like the school report that says, yeah, B for effort, but a C in execution.
Yeah, try harder. But on the upward slope, we're slowly improving.
So when it comes to data, I mean, organizations have collected it.
There was a book maybe five years ago, suggesting that data is the new oil and very firmly arguing that case.
But the moment you're talking about things like oil and data and power, there's massive asymmetries in play as to who gets the last say and what the intentions are.
Are there any way to mitigate the impact of the power asymmetries in this?
So you were talking about a community collective or a collective group of privacy.
And I'm probably more interested in it from that take than just the individual, because as an individual against the world, you're not going to win that battle, are you?
Well, that's it.
So there's a fair bit there to unpick. The first thing I would say is, the data-is-the-new-oil thing frustrates me a little bit, because it squarely puts it into this very capitalistic, very monetised frame: it's worth something because it's worth money, which is true.
It is worth money. And that's why we are where we are today in a bubbling hotbed of surveillance capitalism.
Which, by the way, is a great book, isn't it?
The Zuboff book, is it? I always forget the name. The Age of Surveillance Capitalism.
Yeah, an excellent read. A challenging read, but an excellent one.
So, yes, I think that's a challenge, this idea of it having solely a monetary value, because what that means is that we lose sight of the fact that this is connected to people.
It directly impacts people's lives, how they're able to access services, how they're able to interact with each other.
So by just framing it as this monetary thing, I think we lose sight of that human element, which then allows us to mishandle it, because it's abstracted away from the human realities of what it means.
Yeah, I agree. It becomes a set of numbers or points or dots on a graph as opposed to an actual human person, heaven forbid.
Yeah, exactly. That's exactly right.
And so when we're thinking about these power asymmetries, I think that privacy is a really fundamental space. Because if we think of information as power, which is a common adage, then the data flows between individuals, communities, companies and governments are really unidirectional.
This is true.
And so as they amass more and more information and become more and more powerful, it's us as citizens, as community members, as individuals who are left in this position where we don't have much power.
We don't have much bargaining power.
We don't even necessarily know what they know about us, and we know very little about them.
Or even who it is that has the data, right? Exactly. And so they're able to do things like manipulate choices, not just purchasing and commercial things; I think the Cambridge Analytica stuff is an interesting look at how you can also manipulate democratic outcomes.
So there are huge power imbalances. I would say that privacy is one of the key ways we can examine those power structures and be like, hang on, that's actually not fair and it's not ethical, and start to put in protections so that we can try and restore some balance.
I mean, in many ways so much of what organisations do with data is phenomenal, right?
Like I love the fact that I can go online and try and buy something that I know I need or I want.
And based on my search history, everyone knows what I want and what I'm looking for and will push me options.
I love the convenience of that. So there is some goodness that comes with, well, I find that useful.
I think sometimes I get quite focused on the negative or the potential downsides.
Is there, what do you think of the good things that come out of, I guess, sharing little bits of data here and there?
Oh, totally. Like, I mean, don't get me wrong. I certainly am very passionate about privacy, but that doesn't mean that I don't want to, you know, ever share any of my data.
Like, unless I went and lived on an island, you know, that would be completely unrealistic and unreasonable.
And also it wouldn't be any fun. You wouldn't get to enjoy any of the things.
And, you know, I'm also very, very drawn to the tech and to data.
Like I've studied data science, you know, I'm very interested in the power of what we can achieve with technology and with data.
And I think that it's entirely possible for it to be done well and to be done with like social issues and social progression kind of at its core.
What I don't think is that that's possible while it's still so heavily profit-driven. If your goal is to turn a profit, then it's very hard to reconcile that with good human outcomes, in my opinion.
I'm revealing my cards here as like such a lefty.
I think the push and pull on that is quite hard. If the aim is just to make money in the short term, then the motivations aren't always pure.
I think there's also this idea that it's inevitable, that we have to trade privacy in order to get this tech advancement or this convenience.
And I just don't buy it. I think we need to really rigorously question that, because I think it has served a particular purpose for big tech companies to be able to sell this narrative.
It's like, well, if you want it for free, you're going to have to just give up your privacy.
And that's so much a part of our collective thinking now.
But I think we need to really, yeah, question that and be like, well, actually, is it a technological necessity or is it just a commercial necessity?
And how might we like reimagine technology to be, to offer us convenience and to offer us, you know, good outcomes and good social outcomes without it, you know, undermining our human rights, because at the end of the day, that's what it's doing.
And we have been sold that as an absolute truth, haven't we, that in order to access tech, the cost is our privacy and our data.
Yeah, absolutely. And I just don't think it needs to be that way, you know, not to the same extreme.
Hey, this is a space that interests me: that intersection of your physical privacy with digital privacy.
And I was reading this morning about an overseas startup, that's really trying to change how supermarket shopping is experienced.
And what they've done, it's an app on your phone.
But as you go through these supermarkets, there are video cameras that know who you are and track your shopping trolley.
And as you put everything in, it tracks it, right.
And then you get to the checkout, and it's all done, you just scan your app and walk out the door.
And they got something like 39 million US dollars in their Series B funding.
So that's a huge amount of money. But there's part of me that just wants to scream and go, I just want to get bread and milk.
Yeah, how does physical privacy tie into the digital privacy space? Yeah, it's a really fascinating area.
And yeah, I think the divide is maybe less than we would initially think.
Supermarkets are a really interesting example because, you know, a lot of people don't think about this, but a loyalty card program is actually a very early-days example of surveillance capitalism, this idea of trading your data for access to something.
And so supermarkets have been at the forefront for a really long time when it comes to pushing this specific form of data extraction and monetizing it, because they wouldn't do it if it wasn't making them money, right?
They don't actually, they don't care about giving you like, a special deal.
It's not a community service. Yeah. So I think that's really interesting to keep in mind, the history of that.
And yeah, I think we're finding new and interesting ways to digitize our real-world experiences: through tracking on our phones how we move through spaces; through cameras, which may or may not have facial recognition technology; and, if it's tied to your credit card or your debit card, being able to pinpoint, oh, you've bought this here and you've bought that there.
All of those things are related to the physical world and how we move around in it. Which, again, comes back to those examples of how we might not necessarily realize that just living our day-to-day life, doing something as inane as buying milk and bread, we're adding to this huge collection of data.
And it seems kind of irrelevant and kind of pointless.
And I can, I can understand why someone might be like, oh, that's boring.
Like, it doesn't matter. It's just me buying milk and bread.
But when you take all of these parts and they're aggregated together, and there's a huge market for data aggregators who bring it all together and then sell it off as data sets.
That's when it starts to become an issue.
Like the sum of the whole is greater than the, I've just butchered that saying.
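The aggregation point above can be made concrete with a toy sketch: two datasets that each look harmless on their own, joined on shared quasi-identifiers the way a data broker might. All names, fields, and values here are invented for illustration.

```python
# Each dataset alone seems boring; together they single a person out.
loyalty = [
    {"postcode": "3056", "birth_year": 1990, "buys": "bread, milk"},
    {"postcode": "3056", "birth_year": 1975, "buys": "coffee"},
]
fitness = [
    {"postcode": "3056", "birth_year": 1990, "runs_at": "6am, Princes Park"},
]

# A crude "aggregator join": match records sharing quasi-identifiers.
merged = [
    {**a, **b}
    for a in loyalty for b in fitness
    if a["postcode"] == b["postcode"] and a["birth_year"] == b["birth_year"]
]
print(merged)
# The 1990-born resident of postcode 3056 is now linkable across
# both datasets: shopping habits plus daily movement patterns.
```

Neither input contains a name, yet the joined profile is far more identifying than either source, which is exactly why the whole exceeds the sum of its parts.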
I know what you mean. That's brilliant. What can we do? Like, do we just roll into this future of my supermarket will know every single thing about me?
Oh, that's a bit bleak, isn't it? I mean. Sorry. Early morning chats.
Don't worry. I live in this space. I think that there are a few things.
So obviously there are measures that people can take as an individual to protect their own stuff.
So, you know, things like using VPNs, things like using password managers to bolster your security, things like using encrypted messaging services and things like that, and really normalizing those processes.
I like that, because there was a time when that was seen as extremism.
Yeah. And I think we're starting to see that that's shifting.
Like, you know, I don't know if you use Signal, but every now and then I'll be like, it'll pop up and be like, Gran has joined Signal.
Gran, cool. But really push like the encryption thing in particular, there's this, you know, the government even the last couple of weeks has been pushing this idea that only criminals would use encryption, for example, which is ridiculous because encryption is a fundamental part of the Internet and how we.
Sounds like they've never bought anything online.
Yeah. Right. So I think normalizing the use of end-to-end encrypted messaging services is a nice way to push back against that and be like, no, it's completely reasonable for me to want this.
It doesn't matter what I'm talking about.
I could be texting you and be like, you know, it's raining.
Yeah, it's raining. And that we should still have the right to keep that between you and I.
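The "it's raining" example captures the core idea of end-to-end encryption: the network only ever sees ciphertext. As a toy illustration, here is a one-time pad, the simplest provably secure cipher. This is not how Signal or any real messenger works (they use far more sophisticated key-exchange and ratcheting protocols); it only sketches the principle.

```python
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    """XOR each byte with a random key of the same length."""
    assert len(key) == len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

message = b"it's raining"
key = secrets.token_bytes(len(message))  # shared secret, used once

ciphertext = encrypt(message, key)        # what the network carries
print(decrypt(ciphertext, key).decode())  # -> it's raining
```

Only the two endpoints hold the key, so an intermediary who intercepts `ciphertext` learns nothing about the mundane contents, which is precisely the point: privacy shouldn't depend on the message being interesting.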
So yeah, those are individual things people can do. Then there are bigger things.
So, for example, you could, and this is a very shameless plug, support organizations like Digital Rights Watch or Electronic Frontiers Australia, these civil society groups that are really pushing back and trying to work with government and companies to get to a middle ground.
Right.
Yeah, exactly. And then, of course, we can be much more vocal. Like, for example, if your local supermarket is installing X, Y, Z thing that you're uncomfortable with, tell them, or write to, you know, head office or whatever.
Like, I think we forget that we can speak up about the things that we're not comfortable with.
And, you know, if enough of us were to do that, then I think we would see a bit more change.
But it's very easy to be like, oh, the horse is bolted.
There's no point even trying. And I think that that's a depressingly cynical view.
I think we've been somewhat conditioned to that fatalistic take on digital privacy, though, haven't we?
Like, I think it's been a conscious marketing choice by some organizations.
I would agree, because if we believe that, then we won't resist it.
We won't question. Hey, I've got one more question.
I think you maybe alluded to an answer to this earlier on. What do you think the biggest myth in tech is, in reference to digital privacy, given our conversation so far?
I mean, my, my gut reaction to that is that technology is not neutral.
Like, I think that gets pushed a lot, especially in like an AI kind of space.
And that certainly is not the case. Like, it is inherently political.
It is inherently to do with us as human beings and the society that we live in. There's no way to make it neutral, so definitely that. Um, yeah, and we're told to believe that numbers are somewhat magical, right?
Like they have power and they don't have opinions and they're always right and true.
But I don't think that really holds when you talk about numbers about people.
Absolutely. I mean, the data never tells the full story; you can never fully encapsulate a person in data, as far as I'm concerned.
Too complex, yeah. I think we remember, when we go into these code-writing processes, that we're abstracting, but then at the other side we forget that we did that at the start.
Yeah, absolutely. I think it's really, really easy to forget that and then be like, oh, this is this is this is magic.
We're out of time. See you next time.