Securing Sexuality is the podcast and conference promoting sex-positive, science-based, and secure interpersonal relationships. We give people tips for safer sex in a digital age. We help sextech innovators and toy designers produce safer products. And we educate mental health and medical professionals on these topics so they can better advise their clients. Securing Sexuality provides sex therapists with continuing education (CEs) for AASECT, SSTAR, and SASH around cybersexuality, social media, and more.
Links from this week's episode:
The Impact of Technology on Gender and Sexuality: Insights from Information Governance
Technology has permeated nearly every aspect of our lives, including our intimate relationships. The emergence of the sex tech industry, which encompasses a range of products and services aimed at enhancing sexual experiences, has created a unique intersection between intimacy, technology, and data protection. As we navigate this rapidly evolving landscape, it is crucial to prioritize ethical considerations to ensure the privacy and well-being of individuals engaging with sex tech. This article delves into the various ethical considerations that must be addressed in the sex tech industry, highlighting the importance of data protection.
Understanding Sex Tech

Before diving into the ethical considerations, it is important to establish a clear understanding of what sex tech entails. Sex tech refers to a broad range of products and technologies that aim to enhance sexual experiences, including but not limited to sex toys, virtual reality (VR) pornography, sex robots, and apps for sexual health tracking. These innovations have the potential to revolutionize intimacy, providing opportunities for exploration and pleasure. However, as with any industry that deals with personal and sensitive information, ethical concerns arise.

Consent and Privacy

One of the primary ethical considerations in the sex tech industry is the issue of consent and privacy. When individuals engage with sex tech products or services, they often share personal data, such as their sexual preferences, health information, and usage patterns. It is crucial for companies in this industry to prioritize obtaining informed consent from their users and ensuring that their personal information is securely protected. This includes implementing robust data protection measures, such as encryption and anonymization, and being transparent about how data is collected, stored, and used.

Informed Decision-Making

Another ethical consideration in the sex tech industry is the need for informed decision-making. Companies must provide accurate and comprehensive information about their products and services, including potential risks and limitations. Users should have a clear understanding of how their data will be used and the potential consequences of engaging with sex tech. By ensuring that individuals are well-informed, companies can empower users to make choices that align with their values and preferences.

Addressing Bias and Inclusivity

The sex tech industry must also prioritize addressing bias and promoting inclusivity. Historically, the field of technology has been marred by biases, whether intentional or unintentional, which can perpetuate stereotypes and inequalities. It is crucial for companies to actively work towards eliminating biases in their products and services, ensuring that they are inclusive and respectful of diverse sexual orientations, genders, and cultural backgrounds. This includes involving diverse voices in the design and development process and regularly reviewing and addressing any biases that may emerge.

Consent and Coercion

In the realm of sex tech, issues of consent and coercion can become particularly complex. Consent should always be freely given and revocable, but with the advent of advanced technologies, such as sex robots or AI-powered chatbots, the lines can become blurred. Companies must ensure that their products and services do not encourage or facilitate non-consensual activities or perpetuate harmful power dynamics. This requires implementing safeguards and providing users with the tools to set and maintain boundaries within the digital realm.

Regulatory Frameworks

To effectively address the ethical considerations in the sex tech industry, a robust regulatory framework is essential. Governments and regulatory bodies must work in tandem with industry stakeholders to establish guidelines and standards that protect user privacy and ensure ethical practices. This includes implementing clear policies on data protection, consent, and transparency. Additionally, ongoing dialogue and collaboration between regulators, industry players, and advocacy groups are necessary to keep up with the rapid pace of technological advancements and emerging ethical concerns.

As technology continues to shape our intimate lives, it is imperative that ethical considerations are prioritized in the sex tech industry. By emphasizing consent, privacy, informed decision-making, addressing bias and inclusivity, and combating coercion, companies can create a safer and more responsible environment for users. Additionally, strong regulatory frameworks will play a crucial role in ensuring the protection of user data and upholding ethical standards. By embracing these considerations, the sex tech industry can continue to innovate while respecting the rights and well-being of individuals engaging with its products and services.
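The measures named above, encryption, anonymization, and transparency, can feel abstract, so here is a minimal Python sketch of two related ones, pseudonymization and data minimization. Everything in it is an illustrative assumption rather than any real product's schema: the field names, the record shape, and the hard-coded key (which in practice would come from a secrets manager). Keyed pseudonyms alone do not amount to full anonymization.

```python
import hmac
import hashlib

# Hypothetical example: strip a usage-analytics record down to what is
# actually needed, and replace the direct identifier with a keyed hash so
# events can be correlated per user without storing who the user is.
# In a real system this key would be loaded from a secrets manager.
PSEUDONYM_KEY = b"example-key-do-not-hardcode-in-production"

def pseudonymize(identifier: str) -> str:
    # An HMAC (rather than a bare hash) resists dictionary attacks on
    # small identifier spaces such as email addresses.
    return hmac.new(PSEUDONYM_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

def minimize_record(record: dict) -> dict:
    # Data minimization: keep only the fields the stated purpose needs,
    # coarsen the timestamp to a day, and drop everything else.
    return {
        "user": pseudonymize(record["email"]),
        "event": record["event"],
        "day": record["timestamp"][:10],
    }

raw = {
    "email": "user@example.com",
    "event": "session_start",
    "timestamp": "2024-06-01T22:14:09",
    "device_name": "Alice's phone",  # sensitive and unneeded: never stored
}
stored = minimize_record(raw)
# 'stored' now carries no direct identifier and no device name.
```

One caveat worth pairing with a sketch like this: under GDPR, pseudonymized data is still personal data, so transparency and minimization obligations continue to apply to the stored records.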
Hello and welcome to Securing Sexuality. The podcast where we discuss the intersection of intimacy-
-and information security. I'm Wolf Goerlich. He's a hacker, and I'm Stefani Goerlich. She's a sex therapist. And together we're going to discuss what safe sex looks like in a digital age. Today we're joined by Keigh-Lee Paroz. Hello. How's it going, Keigh-Lee? Hi, guys. Thank you for having me. It's so good to have you here. So, Keigh-Lee, you've worked in the public and private sectors in Scotland, the UK, and Australia. Uh, you've worked in information governance, uh, specialising in embedded digital teams. You've delivered projects for the Scottish and UK governments. Uh, you design training on data protection and information governance. Uh, you've used tools like gamification and AI and everything else. Um, this is like a smorgasbord of all my favourite stuff. So thank you so much for joining us to talk about, uh, all things data protection. Well, thank you. I think it's just really fascinating to look at what it is to be human in the digital age. And, you know, the impact that technology is having on us both as human beings and as sexual beings. Now, we connected with you through the Sex Tech School, right? So can you tell us a little bit about your journey into that space and some of the ways that you've been, uh, you know, participating and bringing a new lens to, uh, entrepreneurs and innovators? Yeah. So I have a, a nontraditional career trajectory, I think, is how it's described, where, you know, I've kind of come to the things that I really love by way of a process of elimination. And so, um, early on in my career, I was very interested in the legal sector. Um, I wanted to be a writer and a lawyer, basically. Um, but I wasn't convinced writing would make enough money, so I thought I should have a backup. Um, but I spent my early years, um, working in the courts in Australia, and I observed a lot of behaviour there that got me very interested in kind of the why of law and how it works and why it has authority.
And where does that power come from? And why do we give it that power? Kind of these very anthropological kinds of questions. And I just find it fascinating, you know, the way that human beings interact with each other, the way that rules are set up in society that we all choose to follow, or not, as the case may be. I suppose if you're in a criminal court. Um, but also, you know, things like gender and how that impacts treatment, both in terms of institutions but also kind of individuals. And, you know, there's a particular phrase that used to be used, I don't know if it's still used, but it used to be used in courts, where a judge would say, "I cannot see you," meaning you're not dressed appropriately for my courtroom. Go away and put on appropriate clothing, basically, you know. And so if a woman's skirt was too short, or, you know, I had a colleague who had multiple piercings in her ears, and she had to take them all out all of the time before she could go into a courtroom. There's this very, um, hyperreal, ritualistic sense of decorum in courtrooms, and that fascinates me. Um, so that kind of started a whole line of queries in my brain about everything from gender to class to normalised behaviour, or behaviour that's outwith kind of social norms as well as legal norms. Um, and eventually that turned into a lot of work in kind of gender, human rights, um, indigenous issues. So one of the great things about working in the court was I did a training on inclusive language, and there was a huge element of that not only about inclusive language, which I've actually carried into the entire rest of my career and found incredibly valuable, but also, uh, specifically on the way that indigenous people, um, tend to communicate and what that looks like in a legal setting. And, I mean, that kind of thing is fascinating.
Um, so it's all of these kinds of things that informed my early career, that then got kind of taken into different ways of working, and then travel, and finding myself in a more regulatory space in the UK and Scotland, where I'm looking at things like freedom of information, because we should know what our governments are doing, or data protection rights, because people should be protected and there should be some kind of balance between everyone who has our data and who we are as individuals. And I think there are really interesting threads to unpick there about where these concepts come from. But the thing that makes me passionate about it is this idea that it exists in order to protect people. You know, it's there for a reason. It's not just some red tape. It's not, I was going to say, just some pain in the ass, but I don't know if I'm allowed to swear. And, you know, that then becomes very fascinating: how do people perceive it? And GDPR is a really good example to talk about when it comes to how people perceive legislation that's there to keep them safe, um, and how organisations view it. I, um, also kind of started my career in the court system. I used to run community-based mediation programmes, so I worked with a lot of different populations in different contexts, all trying to resolve their disputes before they got to the judge. And I appreciate what you're saying about the formality of court and the language of court, because, um, especially, you know, the UK, Australia, you guys have, like, the powdered wigs and all of, like, the accoutrements of formality. And in America we're not quite there, but we still have language and decorum and expectations. And it was always fascinating for me to see, um, people that were coming in with, you know, maybe a $500 small claims case, and they wore their best ironed button-down polo shirt and cargo shorts because they were going straight from there to work, and it was always a shock to them to be told,
This isn't presentable; you need to leave. You know, you need to come back dressed appropriately. And they would say, but this is literally my work uniform, like, this is what I wear in customer-facing roles. Why wouldn't this be appropriate? And so just the cultural nuances of not just country differences between the UK, Australia, and the US, but courtroom versus non-legal settings. We could have a whole conversation just about that. But, um, that's not really what we're here to talk about. That's just what made my ears happy, because it's so rare that I get to talk to somebody that has that same sort of legal exposure background. Which does make me curious: how did you get from the law stuff to the sex stuff? All of that interest in gender, gender politics, how gender affects how one is allowed to act in society, is quite fascinating. And I think once you get into that, you eventually hit the point where you have to talk about sex and sexuality, because there are a lot of rules in society about what women are allowed and not allowed to be or do, or, you know. And again, I mean, in the court there's a lot of, um, and I suppose I should have given a trigger warning, because when we talk about this kind of thing, the topics of sexual assault and domestic violence, those kinds of issues may come up as examples. But seeing how that is dealt with through the courts, and the fact that we are all sexual beings, I think it becomes a very interesting area to look at in terms of what it is to be human. Um, and I don't think we can be fully human without exploring sexuality and who we are as sexual beings. I don't think we can be fully human without understanding pleasure and what impact pleasure has in our lives. And by pleasure, I don't just mean the physical act of sex. I mean pleasure in a much broader sense, you know. It brings me pleasure to sit by a cafe looking at an ocean, drinking a cup of hot chocolate.
You know, like, that's pleasure as well, you know. So it's not just about sex, but I think that the way that our world is structured doesn't leave a lot of room for, or doesn't put a lot of weight on, things like pleasure, whatever that might look like. And so it's kind of fascinating. And then there's also this really interesting and fascinating area where, again, working in the public sector, you know, there are these strange rules about whether or not you would fund anything to do with sex. Are you allowed to talk about sex? One of the really interesting pieces of work that I did in my career was, and again, I love that this came out through GDPR and data protection, um, I got a question about the retention of transgender people's records. And there are all of these different pieces of legislation in the UK, um, four main ones, and there are different overlaps that you need to be mindful of when it comes to transgender people, or people who are transgender. And this is where, again, you know, that inclusive language training, I'm so incredibly grateful to the Queensland court service, just a shout-out to them for that training. I mean, it stood me in really good stead. And I ended up, um, because it's government money, you do a proportionality exercise, and that was a massive education and learning curve for me, as well as the legal side of, well, what are our requirements when it comes to retaining information, and what will out a person as transgender? You know, if we look at an application from last year and it says Mr Smith, and this year it says Katie Smith, what does, you know... we're going to figure that out, you know? And also, if they are receiving public funding, then will that name change, and possibly bank account change, automatically flag them for fraud investigation? And what does that look like from a discrimination and equal rights point of view?
So it was an absolutely fascinating piece of work that I'm both really proud of and really grateful for, um, because what ended up happening was we had to look at that, we put in place processes around that, but also there was a great need for inclusive language training, and it is now mandatory at that particular organisation because of that piece of work. So that's the bit that I'm really proud of. And this was before the media tipping point, but I could see the media tipping point coming. This was before Laverne Cox was on the cover of Time and the big furore about that. Ah, so it was pre that particular moment in time. Um, and I was actually told at one point not to do that piece of work, not to continue with it, because it was still fairly contentious. And I took a step back and I thought about it, and I thought, well, one, my job is to identify and assess risks for this organisation, and I think this media tipping point is coming. I also have to be really open and admit that there was a personal element to it for me, because I felt like, being part of a public body and part of government, it was incredibly important to legitimise everybody's experiences, and that it wasn't right to not provide good support for elements of society because of some ideological difference. The public sector is there to provide good governance for everybody who lives in that country, all of the citizens of that country. And so I feel very strongly about that. And, um, you know, I mean, I'm admitting to a personal influence or personal opinion there, but that simply informed my absolute desire to make sure that we were compliant, but also that we treated people decently. And, fortunately, that particular organisation also really valued treating people decently.
So once we started having the really awkward conversations, that came across, and people did get on board with it. And the person who had told me to stop working on it eventually came and said to me, I'm really glad you ignored me, you know? So full credit to that particular organisation and to the people who were involved in that work. Um, but there are some really interesting things that happened, um, during that piece of work. So, you know, as I listen to you and as I think about your journey and some of these notes that you pulled out, one of the things that occurs to me is that, you know, from a legal system, from a language perspective, uh, certainly from the way we build technology, there's a lot of assumptions we make, right? Like, this is how you dress for court. Um, this is the appropriate, uh, box for gender. This is the appropriate way to communicate. This is the appropriate way to enforce some of these things. And when, you know, I think about, uh, the tech side, like AI and algorithms, you know, we take those assumptions and now they're being repeated ten-thousandfold, en masse. And so I was wondering, you know, in your experience, what are some of the ways that you find that organisations, institutions, and individuals can sort of get out of their heads and not fall prey to this? This is a boy, this is a girl, this is the toy for this, this is the toy for that, this is the box for this, this is the box for the other. Yeah, this is a really interesting question. And Scotland, fortunately, is quite progressive when it comes to legislation on this, which I think is a very good thing. Um, one of the things I'm always very conscious of is not only when am I making assumptions, but I'm also really aware of when we make assumptions and then we assume that those assumptions are correct, because that's when we get into a whole load of trouble.
Because not only have we made an assumption, but that assumption may not be correct, and we're acting as though it is. And that can be really hard to untangle in a digital product, especially if someone like me, who's in compliance, doesn't get invited to the table until they're about to launch. And then they just want their tick box, right, to say that it's compliant, because they've done all of the things and the product is amazing. Um, but you look at the product and then say, well, you've included, you know, highly sensitive information. I'm going to try not to use jargon. So there are categories of information under GDPR and the Data Protection Act in the UK where certain protected characteristics are treated as more sensitive than general information. So your name isn't sensitive at all, generally speaking, whereas your sexuality might be. But what I think is really interesting, on the point about assumptions, is there's a number of digital systems, especially in government, where you're collecting information, but this applies to app developers as well, where you're collecting information about an individual and their partner. And what is fascinating to me is we've been doing this for ages, because Mr Smith is married to Mrs Smith and they have three Smith children and they receive X amount of support for whatever. And we don't think of that as being information about sexuality. It's the increased interest in people who might be homosexual, or who are transgender, or who are bisexual, and the wide spectrum of different types of human sexuality and couplings. And our systems really struggle with that, because they're based on Mr and Mrs Smith. Um, and what I find really interesting is a lot of people that I've worked with over the last couple of decades haven't picked up on the fact that we're talking about sexuality until it's these other sexualities that come on board.
It doesn't occur to us with heterosexuality, because that's our norm, because that's just the assumed coupling. We don't think of that as being sensitive personal data, because that's just how things are, right? And so there's this incredible embedded assumption right at the outset of a process like that, and it doesn't necessarily occur to us to treat that as sensitive personal data, because it's not that other, which is why we have that legislation: it's to protect those people. But in a way, these regulations actually apply to everybody. And maybe we should question our heteronormative thinking and look at the potential unintended consequences of some of the data we're collecting, and how we're using that, and what that can also be used for, especially who we're sharing that information with. I love that you bring up sort of what, in my world, we would call, like, mononormativity. The Mr and Mrs Smith thing, because that's becoming less normal, right? Like, I just saw an article yesterday that something like 30% of Gen Z identifies as some form of queer. And in my world, um, I am the rare and elusive, like, monogamous sex therapist. Like, that's certainly not the norm in my community, professionally or even really socially, as I think about it. So we're entering this age where a lot of the things that have just been assumed as defaults when building databases, or creating health forms, or any number of these things, we can't do anymore. And what that means is that people are being asked to expose themselves a little bit more, in ways that I don't think people have thought about historically, especially not just sort of on the data collection and curation side. So how do you have those conversations with the technologists that you interact with? Carefully. My objective when I'm having those conversations is, as I said, for me it comes back to: there's a reason that we have these rules, and it is to protect people.
And so, as I said, I'm very big on the what-are-the-potential-unintended-consequences-of-this. You know, the Communications Act, when it came in in the UK, I don't think a lot of people thought of it or looked at it as a, this is a piece of surveillance legislation. This allows the government to collect what websites you're going to and store them for a certain period of time. For most people, that's not going to be an issue. But I have serious concerns about the homogenisation of experience. Um, who gets to decide what is other? Who gets to decide what is normal? Who gets to decide what's going to be criminal? You know, what does that look like? Because certainly, you know, and again, the piece of work that I did, which again came out of a question about the retention of transgender people's records, um, the kind of hate crime and the levels of hate crime, and the statistics on how many people have considered or attempted suicide, are horrific. I don't know how, as a human being, we can look at that and ignore it, or think that there won't be some potential unintended consequence from collecting that information and it falling into the wrong hands, or a system being hacked, or people being treated differently because of that information. Um, and so when I was doing that particular piece of work, I was delivering a data protection training session, and it was just normal data protection. It wasn't anything particular. It was the standard, this is your data protection training for this particular team. But people knew that I had been working on this piece in the background, and somebody took a particular point in that training to ask me outright. And this was when we were in the office, and I'm standing in front of a room full of people, and I'm trying to be really present so that I can be a good trainer and, you know, be really responsive to people. And, um, somebody asked me, so, did you used to be a man? And I found it incredibly confronting.
Um, and so part of me is standing there feeling, like, and in the back of my mind I'm like, oh my God, I'm really confronted by this. Does this mean I have unconscious bias that I'm not aware of? And the other part of me is standing there going, you've been really silent for a long time, answer the goddamn question. Oh, no. It wasn't as long as it felt like. And fortunately, because that part of my brain was distracted with the oh my God, do I have unconscious bias that I haven't sort of unpacked, the other part of me just looked back at this guy and said, does it matter? And I was so relieved, because that was the right thing to say and the right point to make. And it was amazing to me how it completely changed the room. Um, and it made me really keenly aware of how important good allyship is, and how important it is to be willing to feel the discomfort. And so when it comes to kind of those assumptions that we put into our digital products, or having those conversations with, um, you know, the designers of the system, you've got a real chance to look at: well, did we do user research? You know, have we had users involved in this testing? Where did we get this information from? What assumptions have we made? And what have we built in because it's just easier to interact with this other system, that interoperability piece, you know? So it's looking at what we're actually doing, but also how we're doing it. And again, being really aware of those embedded assumptions and just making them a bit more conscious. So the conversations that you have with the guys who are designing it, being able to approach that with curiosity and that mindset of, well, why did we do this particular thing? Where did this particular line of information gathering come from? Why are we collecting that?
I mean, that's an important data protection question, because we should always be able to explain what data we're actually collecting. You know, there shouldn't be any data hoarding. Um, you know, we should absolutely be able to know why we're collecting it, what we're doing with it, who we're sharing it with, how long we're storing it for, and where it is, because we need to know those things so that we can make sure that people can access their data subject rights, um, and that we're doing the right thing from a data protection point of view. But there are lots of hooks and levers you can use when you are having those conversations with the people who are designing and building the system. And the really important thing is to make those connections really early on and start asking the awkward questions before they've gotten to the point where they've actually delivered something. And, you know, a really good question is, you know, does your MVP include compliance, or to what level does it include compliance? Because quite often that's not an assumption. You know, it depends on your delivery timetable and all these other things, but sometimes compliance is not included in the MVP. And that always blows my mind, because surely you need it to be legal. Is that not the basis of any kind of product you want to introduce? Um, but, uh, I think sometimes, you know, for whatever reasons, it just doesn't quite make it into the MVP. And I think, oh my God, I could do a whole session on agile, um, and how people interpret the rules of agile. Um, that's a whole different podcast. I think I'll have you back to talk about agile, because agile is one of the few things that I get as the non-technologist. So I'm making a mental note: invite you back for agile. Continue. So, agile and MVP. MVP meaning minimum viable product, the first product you're producing. I'll spell that one out. Uh, it does remind me.
I know you work a lot with startups to help develop growth opportunities, uh, implement systems for growth, even to identify potential board members and advisors. And so one of the things I want to ask you around that is: are we starting to see a sea change in compliance and data protection, and really, um, embracing more of an inclusive future for technology? Are we starting to see that as part of a growth strategy? Because for the longest time, when I'm doing advisor work, they're like, yeah, yeah, yeah, we'll get to that, but that's a cost, we're not there yet. You know, your point about shouldn't it even be legal, it's like, well, let's see if we have market fit first. Ah, makes me want to pull my hair out. So, in the advisory work you do, uh, is this starting to be seen differently? Sometimes yes, sometimes no. But again, I think this comes back to Stefani's point about having a conversation. And so I'm really good at making myself a pain in the ass very early on. Um, so I will, you know, if I'm going to be working on a project, I will try and have coffee with people early. I will try and build those relationships before I have to start saying, guys, that's actually not OK. Um, or, I know you've just, you know, I know that I've come in at the last minute. I'm very... yeah, unfortunately, I've had a lot of instances and a lot of examples of coming in at the last minute and being that person who says we need to tweak a few things. But I think what I hope is changing, and I suppose one of the key messages that I have as a practitioner, is that being able to build trust with your customers is really vital. And one of the key ways you can do that is through your compliance regime. Because for me, it's all bound up. A good product to me is something that I can trust, something that I know.
If something goes wrong, they have a plan to fix it quickly, and they're not going to do stupid stuff with my information, you know. And that is all driven by your compliance side. And so for me, it's an aspect of customer service. It's an aspect of how you deliver well for your customers. It's not this thing that gets added on in the background. It needs to be in your DNA. One of the things that I think companies are becoming more conscious of is the, uh, there's a phrase in the legislation, but it's also sort of a catchphrase, which is data protection by design and information security by design. And what those two things mean is that these have to be considerations throughout your development process. And so the question then becomes, is that genuinely happening, you know, in large organisations? But I don't have a sense of the broader picture, whether that's happening for startups and SMEs, because usually if I'm involved in a startup or an SME, they're already conscious of it, which is why I'm there in the room, you know, or I am there in the room and I will have that conversation with them. So I don't have a broader view of whether that's happening. Um, but actually, there's a previous podcast that you guys have done, I think it was a poly one, I can't remember what number it was, um, but I loved that they had designed their product purely from that customer-centric point of view, where they built it from a, well, how is it easy for people to have control over their data? And I think the fact that that is unusual, I think, answers your question. But that way of, you know, designing a product... Yeah, for sure. I love that you bring that up, because they're a phenomenal example of kind of doing things the right way from a privacy and, um, agency sort of perspective. Not agency like a government agency, but agency as in the users' ability to make decisions for themselves.
And, I mean, they are doing a lot of good things. But I'm super curious to hear, sort of, like, what would your prescription be, as we head towards the end of our conversation? If you could wave a magic wand and everybody in sex tech started doing things better, what would you want them to be doing? Oh, goodness. Um, actually, if there was one golden egg, it would be to link in people's minds the idea that if they don't do this from the outset, it can kill their company. So instead of focusing on, we have to be profitable first. Yes, that is a driver, obviously. But if they don't do this properly, once they've started making that money, if they get hacked, the percentage of companies that go out of business once something goes wrong from a data breach point of view, it's phenomenal. It's huge. And if they don't deal with it well, that goes up to, like, the 90% kind of category. Um, in fact, it's over 90%. And so it's that link for me between the idea of, if it's not a moral standpoint to make sure that you're doing the right thing by your customers, then think of it in a bottom-line sense, because you end up in the same place. Um, because if you don't do this properly and do this well, not only will your customers eventually have problems, but you will eventually, potentially, go out of business, because you haven't done this from the outset. So I think there is a link there for people to educate themselves and make sure that they are, you know, getting on the right track right from the get-go, rather than thinking that it either doesn't matter or it can be kind of a bolt-on after the fact. So if you want to stay in business, follow this advice. Keigh-Lee, thank you so much for joining us today. It's been a pleasure. Thank you. It's been lovely to spend time with you both. This has been an absolute delight. And thank you for tuning into Securing Sexuality, your source of the information you need to protect yourself and your relationships.
Securing Sexuality is brought to you by the Bound Together Foundation, a 501(c)(3) nonprofit. From the bedroom to the cloud, we're here to help you navigate what safe sex looks like in a digital age. Be sure to check out our website, SecuringSexuality.com, for links to more information about the topics we discussed here today, as well as our live events. And join us again for more fascinating conversations about the intersection of sexuality and technology. Have a great week.