Securing Sexuality is the podcast and conference promoting sex positive, science based, and secure interpersonal relationships. We give people tips for safer sex in a digital age. We help sextech innovators and toy designers produce safer products. And we educate mental health and medical professionals on these topics so they can better advise their clients. Securing Sexuality provides sex therapists with continuing education (CEs) for AASECT, SSTAR, and SASH around cyber sexuality and social media, and more.
Links from this week’s episode:
Understanding the Need for Regulation of Facial Recognition, Exploring Technology Ethics and Power Dynamics, and Examining How AI Decisions Affect People's Lives
In today’s digital world, data protection and technology ethics are becoming increasingly important. With the rise of social media, online dating apps, and other digital platforms, it is essential to understand how to protect your data and ensure that your sexuality is secure. This guide will provide an overview of the steps you can take to protect yourself from potential risks associated with technology and data protection.
First, it is important to understand the basics of data protection and technology ethics. Data protection refers to the process of protecting personal information from unauthorized access or use. Technology ethics involve understanding how technology can be used ethically in order to ensure that people’s rights are respected and protected.
When it comes to protecting your sexuality online, there are a few key steps you should take:
1) Be aware of what you share online: It is important to be mindful about what information you share on social media or other digital platforms. Be sure not to post anything that could potentially put your safety at risk or compromise your privacy. Additionally, consider using strong passwords for all accounts related to sexual activities or interests in order to keep them secure from potential hackers or malicious actors.
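To illustrate the strong-password advice above, here is a minimal Python sketch that generates a random password with the standard library's cryptographically secure `secrets` module. The function name and default length are illustrative choices, not anything prescribed in this guide:

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Generate a random password using a cryptographically secure RNG."""
    # Draw each character independently from a large alphabet so the
    # password cannot be guessed from dictionary words or personal details.
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))
```

A password manager can do this for you, but the principle is the same: long, random, and unique per account.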
2) Use encryption: Encryption is a great way to protect sensitive information such as passwords or credit card numbers when communicating over the internet. It scrambles messages so that they cannot be read by anyone except those who hold the keys needed to decrypt them. Consider using encryption when sending messages related to sexual activities or interests in order to keep them secure from potential hackers or malicious actors who may try accessing them without authorization.
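As a toy illustration of what "scrambling" means, here is a one-time-pad sketch in Python: the message is XORed with a random key of the same length, and only someone holding that key can reverse it. This is a teaching example only, with hypothetical function names; for real communication, use vetted end-to-end encrypted tools (such as Signal) rather than rolling your own cryptography:

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """Encrypt with a one-time pad: a random key as long as the message."""
    key = secrets.token_bytes(len(plaintext))
    # XOR each message byte with the corresponding key byte.
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def otp_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    """XOR with the same key again to recover the original message."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))
```

Without the key, the ciphertext is indistinguishable from random noise; with it, decryption is trivial. Real protocols add key exchange, authentication, and integrity checks on top of this core idea.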
3) Utilize two-factor authentication (2FA): Two-factor authentication adds an extra layer of security by requiring users to enter their username/password combination along with another form of verification, such as a code sent via text message or generated by an authenticator app, before they can gain access to an account, platform, or website. This helps prevent unauthorized access even if someone obtains a user's password through a hacking attempt. Consider using 2FA for any accounts related to sexual activities or interests in order to keep them secure from potential hackers or malicious actors who may try accessing them without authorization.
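For the curious, the rotating six-digit codes that authenticator apps produce follow a public standard (TOTP, RFC 6238) and can be sketched in a few lines of standard-library Python. This is an illustrative implementation, not code from this guide or any particular app:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6, now=None) -> str:
    """Generate an RFC 6238 time-based one-time password."""
    key = base64.b32decode(secret_b32, casefold=True)
    # Count how many 30-second intervals have elapsed since the Unix epoch.
    counter = int((time.time() if now is None else now) // interval)
    msg = struct.pack(">Q", counter)
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): choose 4 bytes based on the last nibble.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)
```

Both your phone and the server derive the same code from a shared secret and the current time, which is why the code works without any message being sent — and why stealing a password alone is not enough.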
4) Be mindful when meeting people online: When meeting people online through dating apps and similar platforms, always practice caution by taking necessary safety precautions, such as meeting in public places initially rather than private residences until both parties feel comfortable with one another. Additionally, never give out too much personal information until trust has been established between both parties.
5) Report any suspicious activity: If something doesn't seem right, don't hesitate to report it immediately. Whether it's inappropriate content being shared on social media platforms, suspicious emails, or phishing attempts, always report suspicious activity right away so that appropriate action can be taken quickly. Taking action on data protection and technology ethics is essential for ensuring our safety and privacy when engaging with digital platforms, especially those related to sexual activities or interests.
By following the tips outlined above, we can help ensure our sexuality remains secure while also protecting ourselves against potential risks associated with technology misuse.
Hello and welcome to Securing Sexuality, the podcast where we discuss the intersection of intimacy and information security. I'm Wolf Goerlich.
He's a hacker and I'm Stefani Goerlich.
And she's a sex therapist. And together we're going to discuss what safe sex looks like in the digital age. And today we're talking with Stephanie Hare, author of Technology is Not Neutral, a short guide to technology ethics.
So as one writer, Stephanie, to another, I have to say your writing is fantastic. I describe myself as tech adjacent. I'm not really techie. That's him. And the way that you break down everything that you talk about, the ethical and the tech concepts for your reader, I thought was so accessible and so engaging. And I'm super excited to be talking to you.
I mean, anytime someone compliments your writing, it's just a gift as a writer because you're always alone in your house for years. So thank you for that. That's really kind and generous. I'm so happy someone read it and found it useful. And you're right. I didn't write this for people who are already interested in tech, although I hope they will gain a lot from it.
I really wanted to open this conversation up to everyone because technology affects all of us. And we all need to think about this question of how do we make it work for us and maximize the benefits and minimize those harms, particularly for vulnerable populations or in this case, vulnerable human interactions. And intimacy is exactly that: it requires vulnerability.
So how do you secure that?
You know, there's one aspect of the book that just keeps rattling around in my head and I am going to sum it up. So for the listener, please get the book and read the quote exactly. But you mentioned, you know, you don't invent electricity without inventing electrocution. You don't invent the train without the train wreck.
And in a very real way, we've invented a lot of technology to connect, to form relationships, to sustain relationships over years and years and years.
And so I wanted to talk to you through your learnings, your thinking, the way you approach this, because I really like to get a sense of, I mean, of course, the focus of our podcast is on intimacy, but a sense of what you're seeing in terms of how modern technology is playing out and where we're seeing some benefits, but also where we're running into train wrecks. Yeah.
I mean, gosh, where to start. So that quote that you mentioned at the beginning is by Paul Virilio, a French cultural theorist and philosopher. And I only mention that because I had come to technology by way of studying the humanities, specifically history and French.
So long before my interest in technology, I had an interest in languages and history and other cultures, dialoguing between cultures, which has been a fantastic foundation upon which to build a career in tech. And that's kind of the approach with the book is like, you don't just have to be a coder or a developer or working in hardware, which is also really fascinating.
It's really for everyone from the person who has the idea to like the finance teams or the HR people who are helping to recruit and staff all the way through to when we roll products and services out into the wild and test them and you get the intended consequences and the unintended consequences. And as a researcher, all of that interests me.
So when I saw that quote where he was like, you invent the ship, the shipwreck, the plane, the plane crash, electricity, electrocution, the key bit he says is every technology carries its own negativity, which is invented at the same time as technological progress. And I had that pinned on my wall and like a three by five note card for years.
And it was like my guiding star while writing the book was like, how do I help anyone who's thinking about investing in a technology or buying it, bringing it into their home, installing it into their school or their hospital or whatever for really good reasons.
Are there moments in that chain between idea and execution where we need to stop and pause and go, wait a minute, who might be impacted here?
Not all technology affects the same people in the same ways. And we're going to be moving on to intimacy and just say intimacy tech or technology that we use in moments of intimacy is like just one example of where we might want to think about technology ethics.
And also like even what we mean by intimacy, because you could say that putting facial verification technology in your child's school so that they pay with their face instead of using money, you could argue that having a little child submit their face to a scan, teaching them to transact with their body so that they get food is like one of the most intimate things that you do.
When you stare into somebody's face, into their eyes, that's really intimate.
And that's what we let these machines do, right?
And with greater or lesser accuracy, and that creates different problems depending on how it's being used. In a security setting where you can be arrested for a misidentification, that obviously is like a violation of one's intimacy. One's very own body is being used against one by police.
But also, as I say, like teaching kids to transact with their bodies is really problematic in a world where we're trying to teach kids about boundaries and being safe and who is allowed to have access to your body and who isn't. We as parents and teachers, we give kids one message and then we put this technology in a school.
So that gets really tricky because then those kids get a little bit older and they start getting requests for certain nudes, for instance, or stuff that's happening on ephemeral messaging services or even information if they're being groomed.
How do you teach them to protect themselves if the messages you've been giving out since they were young are totally messed up?
So you can tell I have a view on this and again, I'm not pretending to be neutral. Having studied this for years, there are certain no-go areas for me and the human body is a very problematic one. And then anything that's affecting vulnerable people or people who are in a vulnerable situation, e.g.
you're in a school, you can't really say no, like the power dynamic there is totally messed up, is different than the message that technology companies love to give, which is like, well, if you don't want it, don't use it. You're consenting to this.
So it's like, I don't know how much consent a kid has actually, particularly if it's in a school and they need to go to this school to get their education or if they're like a vulnerable child on school meals.
And that is the basis on which they get fed by the state, right?
So like now we really start to get messy where it's like, oh, if you're of a certain class, you could afford to buy your way out of that versus if you're a poorer child who's relying on state meals and you're using facial verification technology to track all of that. Now you're affecting children who are vulnerable in a socioeconomic perspective as well as their age and their setting of education.
So it's fascinating and it's horrifying. And once you start seeing it, you kind of can't stop because you start to see it everywhere because what you're really talking about is power and power dynamics between people.
And that was one of the things that I thought was so fascinating in reading your book because not only am I a sex therapist, I specialize primarily in working with kinky clients, with people who spend a lot of time very carefully negotiating the power dynamics in their relationships and thinking very deeply about what authority do I want to cede over decision making over my body.
And I kept thinking this is so intrusive and it's so coercive and we don't get to negotiate any of these choices.
And at one point I was holding your book and I looked at Wolf and I'm like, I feel like terms of service should be check boxes, right?
It should be like a BDSM negotiations list. You could be like, I agree to that and I don't agree to that and you don't have to do that, but I'm fine with you doing the other.
And I found it fascinating to read your book through the lens of somebody who thinks a lot about power and control and the consensual power exchange and how little control and how little consent we really have over the choices that are being made for us. Okay.
First of all, I'm loving that and kind of wishing that we had spoken before I'd written the book because that would be so fascinating to weave in. So perhaps a supplementary chapter is required for the paperback.
I shall speak to you anon and talk to my publishers, because it's a great framework. And I think you named something really interesting there, which is that in the BDSM community, power relationships and the idea of consent and safe words — that at any point you can stop — that is perhaps not as explicit in other interactions and forums.
And I think that's something I wanted to cover in the book was I was really worried that so few of us study philosophy formally. When we think about philosophy, we might think of like ancient Greek men in like, you know, robes or something hanging out in Athens.
And it's really difficult if that's your mental image. When you say the word philosophy, what do you picture?
If you think of that, then it's like, well, how the hell does that have to do with technology, blockchain or the metaverse or just even Facebook?
And that's the whole thing is you need a little bit of a grounding. And what I hope to do is like, I did the hard lifting so that this should be easy for the reader. If you haven't had time to study philosophy, just read this book. You will get it in 206 crisp pages and then off you can go and pick it up from there.
You know, philosophy, I think I make it out like a Swiss army knife. There are six component parts of philosophy. You go through each one, and one of them is political philosophy. So that's where you get to talk about power, and you see how power relationships interact with ethics, which is values. You can start to then see politics as the expression of ethics.
You get into aesthetics, like what is beauty, but also like what is experience, which is quite relevant as well for intimacy tech and all of the other branches too, which are all fascinating in their own right.
And then once you just have that basic framework of the six branches of philosophy, you can kind of get going and it's up to you if you want to go really, really deep dive and like, you want to read Wittgenstein, go for it, knock yourself out.
Do you have to?
No, because there's an entire canon of thousands of years of human beings across all cultures thinking about it, including up until the present day. And then you start to see the examples in real time where you're like, that is weird.
Why do they say I'm consenting every time I want to access a website?
I'm made to have this impression I'm consenting to give my data. But the fact is it isn't consent. If I don't agree, I can't access the site.
What kind of consent is that?
Is that mafia style consent?
It's a con. It's gaslighting.
So it teaches people you've consented when actually, to what have you consented?
So here in the European Union, I shouldn't say here because obviously the United Kingdom has left, but old habits die hard. Next door in the European Union, we've had the GDPR. That was a big step towards enshrining more data protection, which is linked to privacy, but still separate.
And that has also inspired, therefore, the United States to start looking at, well, what would it mean to update our data protection and privacy laws, including for specific groups like kids.
For kids, the UK has actually been leading the way in something called the Children's Code or Age Appropriate Design Code, which California has just passed. So you start to see really interesting interactions, which again, and I only mention this because of political philosophy, different countries will regulate these questions differently. And it's not necessarily a question of like, who's right, who's wrong.
It's more just like, you know, I'm broadcasting to you today from the United Kingdom where we have a constitutional monarchy. You are in the United States where it is a federal republic.
You know, these are democracies, but it's a republic. It's 50 states and they've all got their own systems and they do power differently. Even in the UK, we have four devolved governments. So like in Scotland, it's going to be really different to England on some stuff. And then you've got the European Union, which is like a really complicated tapestry of 27 countries all trying to like march together.
So knowing political philosophy, understanding power and that ethics is different to law. So like just because something is enshrined in law or not is really separate to ethics and values and what's good and bad and right and wrong. Those questions are very much alive and everybody is still shaping them.
But the question is who has a voice, who doesn't have a voice, who's excluded, what is past precedent and where do we want to go in the future?
So it's super dynamic. We are kind of living that dynamism here in the US since the overturn of Roe this summer. And one of the things that Wolf read about that hit us both really hard and you know the details of the case more than I do. The lawsuit. So not only is it being decided, it's also being litigated.
In the US, there's a recent lawsuit against this marketing company.
And back to what are you consenting to?
Okay, I consent to put a cookie, no big deal. But what this marketing company has been doing is aggregating all this data, including device identification data, which is not hard as you know to go from device to human, including location data so that ostensibly you could do targeted marketing.
Now the problem with that was anyone can get a free account on AWS, anyone can get a free 30 day trial of this data and anyone can pull down anything they want.
Now what could an adversary do with that?
Well, very quickly they found that people are using that to identify, you know, doctors who were performing certain procedures. They're using it to identify people who were going into abortion clinics and started pulling on these pieces. And we can imagine also people might go to a club, you know, people who might go to a certain event.
All this was available to anyone who wanted to set up a free AWS account and had enough technical acumen to pull it down. So now you have that power and control piece and that intimacy piece and that bodily autonomy piece where, as a sexual health professional, anybody that wants to buy data about who's been visiting abortion clinics, I automatically assume they don't want it for good reasons.
I mean, there's an intrinsic malevolence in that search request. And so really being aware of how these monitoring options and how this data is being culled and used is really creepy and gross.
And you know, you used or you quoted a term in your book, the creepy line, and I kept coming back to that, right?
Like that to me is the definition of the creepy line. Like when you are aggregating data for market research, that's intrusive. When you're selling visitors to abortion clinics, that's creepy. Exactly. And the creepy line moves. So you know, what I find creepy might be different to what you find creepy, and what one country finds creepy, another country is totally fine with.
So that's tricky if you're a tech company who's trying to operate globally.
So like where is data stored?
What are the rules around it?
And then you get into those concepts of transparency, explainability, accountability.
So transparency would be like, can I, Citizen Hare, find out, I don't know, can I get some sort of like report, you know, an annual report?
Who has my data?
What is out there about me?
If I wanted to do opposition research on myself, on behalf of my enemies, like where am I?
What's out there?
And they would, you know, what could they pull up?
So that first of all, I just get that like, oh shit moment to be really frank, but I think most people would be like, oh my God, pictures of you that you didn't even realize were online that like other people had uploaded, but now with facial recognition, we can like, we can put you in that picture, identify you, everyone there, where it was taken, blah, blah, blah.
So like, and we can do this historically. So we can do it for years going back.
So like who's in your network?
And then you feed that into certain really big companies that love to play with people's social graphs, their social networks. And now you're starting to learn who you're friends with, who you drink with, who you maybe do drugs with, who you have sex with, who you dated and were like in a really good emotional relationship with. And then it went bad. And then you fought and sent all those texts.
Like, you know, it's all out there. It's all messy. So there's that. But then there's the question of like, okay, cool. Now that you've given me the full like picture of what's out there about me, sorry, do I have any rights to be forgotten, which the European Union has examples of that on its books and how you might do that.
But if that's only within the EU, you can just get a VPN and see Google from a different perspective and still see it. So that's not great for deep fakes or revenge porn or the like.
So it's a total mess because basically what you'd want to do is like shut down the market, right?
Shut down the third party data broker market entirely and be like, I never consented to give this data. I have no idea who's selling it and buying it. And I don't get a cut of that, by the way. These people are making tons of money on my data.
Now you'll get pushback from banks and credit rating agencies that are like, we need to know all this stuff about you so that we can give you a credit rating.
It's like, okay, I mean, do you?
And also if you do, why does it have to be sold?
Why don't we make this a public good and put it in a big data trust, right?
And go socialist on this, which they would then freak out about because they are worth so much money and they don't want to turn off the tap.
And even then we'd have to have a really nice discussion in front of everyone in Congress with the public going, what is in scope and what is out of scope?
What data do you need to determine a credit rating?
And what data is you just being creepy?
And that's the problem is that we've seen in the US, we've definitely seen it elsewhere, but we can stick to the United States for just a moment that can change overnight. So like what Americans thought was like a person's right to choose and privacy and bodily autonomy, et cetera. And people maybe took that right for granted overnight was changed.
And it maybe wasn't shocking if you were watching it or if you just grew up as a woman in the United States and knew that there's always been this group of people who's been working very, very hard to overturn that. Right. And now we have to decide if we want to, you know, it's a democracy.
Do we want to just let that stand?
Are we going to fight back?
Like there's all these questions that will now happen.
And one of the big things, I think, is going to be how bad does the nightmare scenario get?
Does the geolocation data that you have to give, I mean, your mobile phone won't work unless they can track where you are, because that's how the technology works.
But does that have to be handed over?
Well, if you've got police with a warrant, maybe.
Yeah, maybe it does.
So what do you do if you're all the apps and social media companies and email providers that are on people's phones?
They all have to make a "technology is neutral or not" call and let people know whether they're going to comply with any orders. And then you start having to watch. And I suspect we'll have to have a number of test cases.
The FTC lawsuit is a case in point, but there could also be other ones where, say, somebody has had an abortion or even just goes to visit an abortion clinic or is a doctor working in one where that kind of data is used against them to prosecute them successfully and, for instance, put them in jail.
Are we going to stand for that?
Are we OK with that?
Are companies OK being part of that?
Or can they voluntarily reconfigure their architectures and their data gathering practices to not?
So all of these are both ethical questions and deeply political, because it's like, what's your view on, you know, when does life start?
What's your view on privacy?
What's your view on human rights?
What's your view on women's rights?
All sorts of stuff or people who give birth rights. So it's tricky. And then what if you decide, OK, the United States has gone mad. Iowa will just cross over into the border into Canada or Mexico.
Who adjudicates those decisions then if you take them into extraterritoriality?
And it's a mess. So we have created such a problem with all of this incredible technology that in so many ways makes our lives better, easier, more convenient, but it has this terrible drag of digital dust behind us. It really does. Almost like pollution, if you think about it, like the Industrial Revolution, all the pollution. It's almost similar in my mind the way you describe it.
One of the things I liked about your title is, you know, technology is not neutral. I think the argument in that particular lawsuit I mentioned is, well, we're just neutral. What people do with this data is what they do with it. We're just providing and collecting. And oftentimes I think that argument can be made and it sort of tries to let them off the hook for the abuses of the technology.
And alternatively, another thing that I see as an argument that gets made is, well, we're the good guys. We just provide the service. And if other people misuse it, that's on them.
Your facial recognition chapter, you mentioned PimEyes, right?
And you just mentioned revenge porn. So one of the things that PimEyes says is, hey, we're out there. So if you want to know that your image is out there, upload your face and we'll help you find it.
I'm like, well, yeah, if that's you finding your data, but what's to stop me from finding someone else's data?
I mean, there's so many different decisions that get made around how these tools are used that I think it's impossible to say that they're neutral. And so I was wondering from all your research, if you had some sort of operating framework of navigating these decisions, if you're a product company or if you're building technology. Yeah.
I mean, so first of all, I always would want to know who makes money off of this?
Follow the money. Follow the money. That's my born-in-the-United-States characteristic; that thinking has never left. Someone somewhere is profiting from this, so follow that trail. And also, following fast on the heels of follow the money is: who gains in power, and who therefore loses power?
So if the only way you can stop revenge porn is to give someone else your data, and that company, in the case of PimEyes, is like super sketchy — as in they're domiciled in the Seychelles, last I checked, and you can't find out who works there or their governance structure or any accounts. You can't audit them.
That's really different than let's imagine, you know, I don't want to call it a brave new world, but a better world. Let's imagine a world in which you really did want to be able to let people find out just their images where it is.
And if you want, take it down. You might think: who are already trusted authorities in this space that have a history and a track record of working very well with vulnerable communities, or indeed any communities, and where, you know, it's transparent. It's explainable, in the sense that you understand what's happening with the data at any point. It's accountable, as in something happens if they screw up.
So like if the regulators, for instance, wanted to run that and again, you have to be really careful because as we have seen in the United States, you can have a beautiful democracy flourishing lovely and then all of a sudden elect an authoritarian who then controls all those state institutions.
So is the regulator the right body or do you have to create something totally new that perhaps is ephemeral?
It allows you to like check your own images, delete what is out there about you and then it just all disappears. Like it's like literally self-destructs.
You know, we do this with social messaging. It's possible. What you don't want is the data being retained and then used to train other algorithms for other purposes. And you just have no idea. Like that's just happening for you on the dark net or even just in normal life and you're just not aware of it and you have no control over it.
So I think these questions of control, transparency, explainability, accountability, trust, who and that thing of like how do you know, what is the safe word?
Like how can you just be like stop, stop or just like delete?
I mean, I would love the idea if there were all sorts of ways for us to just, you know, delete everything year after year. But then I trained as an historian. I also see like there's also an argument for keeping data, even stuff we don't like, perhaps especially stuff we don't like because we're trying to learn from the past.
So if we're constantly cleaning and expunging the past, that could create problems too.
Even in areas where we're like, what the hell does that have to do with the revenge porn?
We might still need to learn from that data. And I'm not saying how, I don't know necessarily have that answer, but I'm holding space for it mentally to go. It's possible I haven't thought that one through yet.
So we need to just be careful before we nuke all of the data sets on there or the learnings from it, because maybe we'll need it to understand a problem or to build something better in the future. So it becomes so messy. That's what I was saying. Like if we could solve this stuff fast, like we would have done it by now, I suspect.
The revenge porn piece, I think it's a lot of press, a lot of conversation right now.
When I was reading about PimEyes, when I was reading your facial recognition chapters, I think that a lot of people who in my world might have clients that are doing sex work or cam work, or heck, peers of mine who do sex work or cam work, they feel safer because the tech adjacent among us know that Google reverse image search exists. So they know not to reuse a picture.
But I don't necessarily know, I didn't until I read the book, that there is this sort of publicly accessible facial recognition software. Like the idea that somebody could think, you know, I'm safe because I use a scene name and I only sell original content. And not realizing that that original content can be scanned and then used to find them in other places really shook me.
Again, great example of that creepy line. You talk a lot about like context in which technology is used, and how the context can change impact, how the context can change outcomes. And you know, obviously, my perspective is thinking about sex and intimacy and the context of sort of very deeply personal behaviors.
And one of the stories that came through my world recently was a dad who had taken nude photos of their child because the pediatrician had requested them in order to assess and make a diagnosis. And that was caught by an AI filter, I think in Google, but I don't want to speak to that. And the father was reported for creating child sexual abuse material.
And one of the points in your book is that, you know, these are technologies that most people don't get to push back on. There's not necessarily this opportunity to rebut the AI or to rebut the algorithm, because the science is so fancy and important and refined that it's not going to make an error or it's not going to let things slip through.
And this idea that people are creating content that has a very specific purpose and a very specific intentionality behind it, that they think is safe, and they can end up in some really nightmarish situations was probably the creepiest part of what I read over the weekend.
Yes, and it's also a choice by certain companies not to make it easy for people to get help if they get into trouble, or if there's a misunderstanding because the AI didn't understand the context.
Like, who at Google are you supposed to call to be like, hi, I'm actually dealing with a legit medical... Maybe the doctor could call and be like, I, doctor, blah, blah, blah, have requested this photo because I'm attempting to do a diagnosis with a remote patient. Turn off the AI. I want to speak to a human. And then you sort that out.
How do you do that?
And in so many other industries, you have dedicated customer service, but it's expensive. It's a cost.
So that's the whole question again, where regulation comes in, which is like, if you are going to be making decisions that are affecting people's lives, the accountability piece to that is like, what are people supposed to do if they have a problem?
If a computer says no, who do you call?
I mean, Henry Kissinger once famously said, if I want to talk to Europe, who do I call?
And it's sort of like, if I want to talk to Google, who do I call?
There's no customer service line. And that's really tricky.
And it's not just tricky in the case for that father, but it's also tricky for the doctor who is like, oh God, now what?
Because you might not want to ask this of future patients, possibly legitimately, possibly not, I don't know. But there's a whole bunch of stakeholders that we would want to open that conversation up to, to be like, okay, pediatricians around the world, talk me through this. We've been talking about telemedicine here in the UK. We've got such a backlog with our National Health Service; I think it's months and months.
It can be really difficult to get in to see your general practitioner. They push you to do stuff through the website, to use telemedicine and to talk to people on phones, et cetera. So do we need to update our practices to recognize that if it's coming from a doctor's surgery, or going to a doctor's surgery, that's possibly a different use case than just random kid pictures being sent on the internet?
How would you code that?
How would you make it safe?
I guess that's the big question ultimately: if you've got a new use case coming up and you want it to be safe, but you also want to be able to check it, and you want recourse for both parties, for all parties really, to be able to discuss it, you need to build a new infrastructure for that.
The "how would you code it" question is fascinating, because we're assigning labels to people's behaviors. If we're looking at the data of people whose phones have been near abortion clinics, in one state we might be labeling them as felons now; in another state we're labeling them as patients. The dad is, hopefully temporarily, fighting the label of a sex offender. These are not neutral labels.
They are labels that have tremendous impact and detriment right off the bat. You said, and I underlined this and then I highlighted it, so you got double emphasis, and I believe I even put a star next to it: we will never know how we're being classified or for what purpose. When we think about people navigating intimate relationships, navigating parenting, navigating healthcare, they're not just having their behavior monitored.
Their behavior is being used to put them into categories, and so many people are already trying to navigate the world with stigmatized identities or in marginalized populations that having technology sort you into boxes you may or may not fit into is a really, really scary thought. Yes.
This is where, again, I think having the humanities background can be such an advantage when thinking about these things because what you've just done there so beautifully is highlight the role of language.
Are you a patient or a felon?
Someone has to code that. And that's not even just a creepy line; in that case, it's crossing a state line.
And what if you're going back and forth?
I'm thinking of here in the UK, we've got Northern Ireland and this crazy porous border that it has with the Republic of Ireland.
I mean, sometimes it cuts through people's houses. If you were geofencing something like that, that would be so tricky in that country or between those two countries on that border. Ditto in Europe. And obviously there's many US states where that's an issue.
But second, knowing that the machine, and the people building the machine behind it, are labeling you and categorizing you in ways that you will never know? To a student of literature who knows their Kafka or has read Solzhenitsyn, this is not new. We're acting as if these are new problems. They're not. They are straight out of the 1920s, the 1930s, the 1940s. We've seen it with medical experimentation.
We've seen it with all sorts of judgments about who's "crazy" versus who's just having postnatal depression. The way that we failed to understand women's health for a really long time was totally bonkers, and it would result in lobotomies, or prescribing women crazy medication, or saying that they had to be locked up.
I mean, here in the UK in the 19th century, you could just say a woman was being a bit stroppy to her father or brother or husband, basically her male guardian, declare her insane and pack her away. And that was the problem solved, because she had no rights. That's massive, and it's not new.
What I'm saying is like, sometimes I think in tech, we will also think, oh, God, we've created all these new problems. And I'm like, it's so linked to what we've already been doing. And that's kind of the point. That's why technology is not neutral. It's this product of humans. And we humans have a history. We have languages and sociology context. We have visual representation. We have norms.
And those norms are constantly shifting and being contested. And then someone has to encode them and enforce them. And often hide them because we hide them to hide the power structure. And we hide them to hide the money.
And so that's our job, I hope, as technology ethicists is to lift up the boot, lift up the lid and go, what is going on in this mess?
And show everyone and go, do you realize that's what you're contributing to?
Or do you realize that's what happened?
That's what happened in your kid's school?
Or to look at our lawmakers and go, this is what's happening. You've known about it for 10 years.
Why have you done nothing?
Or our regulators, who very often are quite toothless. That's why the FTC's lawsuit in this case, about this company, is so interesting, because the companies have immediately come back and been like, no, no, we're voluntarily changing our data collection policies. Because they're kind of like, please don't turn off the money tap. Don't regulate us. Don't change the law.
And that's why all these tech companies have spent more on lobbying than fossil fuel companies and pharmaceutical companies. There is a reason for that. There is a reason for that.
And that's the thing: how do you get the average person walking down the street, who's dealing with, you know, the pandemic, the energy crisis, inflation, the war in Ukraine, the cost of living, just trying to get the kids to school?
How do you get them to care about this, to understand how it affects them and to push for change?
Because it's not enough to just go, I'm really unhappy with the world as it is, it's a total dumpster fire. I think that feels accurate. It's: what can we do to build it so that it's a better world for the kids, but also for all of us as well? We'd quite enjoy living in a better world too.
So we have to do a lot of work and thinking, and that's really difficult to get on people's radar at the moment, I think, because of everything else that's going on.
So it's in a way very encouraging to see a regulator take this stance and push it, because we're going to have to update what we mean by privacy in the 21st century in the US, I suspect, to make a bold statement. Really broadly, so much of what we've talked about is systemic at this point.
It is, it is woven into government, into public policy, into marketing, into criminal justice. And it can be really overwhelming for individual people to think about, to want to try and protect themselves, to do something, to be a little bit more private, to retain some sense of agency or control over their data, over their bodies, their families, everything that goes with their data.
Are there any small steps?
Are there any simple things people can do to kind of unravel a little bit of the data that's being gathered about them?
So I would say in some senses, yes, there are things you can do, but also it's a little bit, I guess, like public health or climate change. When you're looking at these really big problems, individual action is only going to go so far because these are like structural and systemic problems.
Again, to make an analogy, here in the UK we were being told that our energy bills were going to, in some cases, not just quadruple but quintuple, multiples that we don't even have words for in everyday parlance. And you can tell people to turn off their electricity at the mains and put on an extra sweater and cuddle your pet.
That's the actual advice we were being given, by the way, which is just surreal. But ultimately it doesn't matter, because the factors that are creating the energy crisis and the inflation crisis here, the cost of living crisis, are these: Russia invaded Ukraine, that's one thing. The pandemic and supply chain issues are another. UK reliance on gas is a third. We can go on and on.
An individual family can't fight against those things. And I feel like we can look at that with the pandemic. You can wear your mask or wash your hands loads, but if everybody else is doing whatever they're doing and we don't have vaccines, you're going to have a very high rate of virus transmission.
And then as you get more and more people vaccinated and you ventilate and do all the good practices that we now know, we had to work together as a community, as a society, to try and bring that situation under control, which I'm not saying that we have, but I think it's better than it was in 2020. So with tech and with this question of data and privacy, I think it's a similar point.
You can spend days on this if you want; there are many, many guides online. And it's not just the basic thing of, like, maybe don't have a listening device in your home.
Or do you need to have every social media account going because that's just profiling you?
You are literally just feeding your stuff into their business and they're making money off of you.
Do you need to do all of those things?
You can do that and you can change settings on your devices and the like.
But the fact of the matter is companies want you to be doing all of that because what they don't want you to be doing is writing to your elected officials saying, for the love of Christ, could we please get a privacy law in this country?
I can't fight the third party data broker industry. It is a multi-million dollar industry. That is where lawmakers are supposed to step up and do their job and regulators have to enforce it. Here in the UK, we have a regulator, the ICO, the Information Commissioner's Office, which doesn't really do very much on facial recognition technology.
It keeps saying that it needs to study and look at stuff and then it doesn't shut it down.
Meanwhile, schools are putting it into classrooms and using it on kids. The London Metropolitan Police has permanently integrated it into its operations.
That's leading to people being stopped and racially profiled, misidentified, right?
So it's created a power vacuum.
So this whole thing of like, whose responsibility is it to fix it?
It's like a hot potato and lawmakers would then, I'm sure, come back to us and go, listen, we're responding to our voters' concerns and our voters aren't telling us that this is what they care about. They are telling us they care about the price of gas or whatever. So they'll put it back on us.
So it's this whole thing of like, we're going to have to organize and work together if we want that change, just like when we want to get anything else done in our democracies. You want to clean up your water, your air, you're going to have to be making a real case for it or improving education standards or getting seatbelts in your car. Like none of that stuff just happens.
It happens because citizens gather, work the system, and get a law passed. And even then, just because a law is passed doesn't fix everything; if it did, we wouldn't have the gender pay gap and racial discrimination on a structural level. Law is not enough.
So yes, you can do your bit. Every little bit helps, but it's not enough. And just know that the companies are happy to pay fines from regulators, or for you to change settings on certain devices, because what they don't want is anything that is going to interfere with their ability to make money. They're not here as social good vehicles. They are for-profit vehicles. And that's fine.
There's like zero judgment on that. Like that's just the capitalist society in which we have decided to live. Fine. But if we want to change that, most of them aren't going to do it out of the kindness of their hearts.
The only exception I would say to that is weirdly on facial recognition in the United States, you did see several big tech companies asking for it to be regulated even before the George Floyd murder. And then after George Floyd, several of them came out and said, we are halting development of this technology or we're not going to sell it to the police.
We're voluntarily pulling back, which is an example I love to use here in the United Kingdom all the time, because in the UK there has been no such self-restraint. So does that solve the technology problem? It doesn't solve it entirely; it helps a little bit in certain parts of the US where these companies have decided to do it. But it doesn't solve it worldwide.
And that's kind of the point.
What would solve it worldwide?
We're back to getting laws passed and strengthening regulation and having greater civic accountability on these issues. So it's fun. People are people, and a lot of these problems we've seen in other technological revolutions, though IT in particular would like to say it's never happened before, there's nothing we can do, there's no framework for it, it's all brand new. I like that you draw that clear line. And so I really appreciate the time today.
I think those are some great tips, acting collectively, doing what we can personally, but also acting collectively and raising awareness of all that's been collected on us and what it can mean.
So with that, any final thoughts?
Where can people find you?
Yes. So final thoughts would be: if you really want to see action on these issues, start local, in your own life, in your own kids' school, and write to your elected officials, because they will respond to what's in their email inbox or in their postal bag. And point two, if you want to find me, I am on LinkedIn under my name, Stephanie Hare. And I'm also on Twitter.
And my Twitter handle is a little joke: at Hare, H-A-R-E, underscore brain, so Hare_Brain, for being a nerd and also having the name Hare. It's a revenge-of-the-nerds handle. So I'm on there pretty often if I can ever answer any questions, or if people want to send stuff that they think is really interesting or raise issues. I'm constantly working on this topic. It's unfortunately not a solved problem.
So I will be very keen to hear thoughts from anyone listening to this on what they'd like to see covered in future or where to find resources of how to get started on taking control of your data and making your lawmakers work for you on protecting it in your communities. I'm sure you'll get some good articles, good suggestions. I hope so. Thanks so much for joining us, Stephanie.
And thank you to the listener for tuning in to Securing Sexuality, your source for information you need to protect yourself and your relationships. From the bedroom to the cloud, we're here to help you navigate safe sex in a digital age. Be sure to check out our website, Securing Sexuality, for links to more information about the topics, to Stephanie's book, and also of course for information about next year's Securing Sexuality Conference.
And join us again for more fascinating conversations about the intersection of sexuality and technology. Have a great week.