Securing Sexuality is the podcast and conference promoting sex positive, science based, and secure interpersonal relationships. We give people tips for safer sex in a digital age. We help sextech innovators and toy designers produce safer products. And we educate mental health and medical professionals on these topics so they can better advise their clients. Securing Sexuality provides sex therapists with continuing education (CEs) for AASECT, SSTAR, and SASH around cyber sexuality and social media, and more.
Links from this week’s episode:
Connected Sex Toys: Privacy and Data Protection Considerations, Risk-Aware Consensual Kink (RACK), and Understanding the Implications of Connected Sex Technology
As technology advances, so too do the products we use in our everyday lives. This includes sex toys, which are becoming increasingly connected to the internet and other devices. While this can provide an enhanced experience, it also raises important questions about privacy and security. With connected sex toys, there is a risk of data being collected without users’ knowledge or consent, as well as potential unauthorized hacking of these devices.
As such, it is essential that better regulation and enforcement of privacy policies be put in place when purchasing connected sex toys. Connected sex toys are becoming increasingly popular due to their ability to provide a more immersive experience for users. These devices can be controlled remotely via apps or even voice commands, allowing for greater control over the user’s pleasure.
However, with this increased level of connectivity comes an increased risk of data collection without users’ knowledge or consent. For example, some companies have been found to collect data on usage patterns and even store recordings from conversations between partners using the device – all without informing customers beforehand.
This type of data collection is highly intrusive and can leave users feeling violated if they are unaware that their information is being collected in this way. In addition to potential data collection issues, there is also a risk that these devices could be hacked by malicious actors who could gain access to sensitive information such as passwords or credit card numbers stored on the device itself or through its associated app.
This could lead to identity theft or other forms of financial fraud if not properly secured against such threats. Furthermore, hackers may also be able to gain access to intimate recordings made while using the device – something which could cause significant distress if made publicly available online without consent from those involved in making them.
Given these risks associated with connected sex toys, it is essential that better regulation and enforcement of privacy policies be put in place when purchasing them.
Companies should ensure that customers are aware of any data collection practices before they purchase a device and should make sure that all personal information collected is securely stored with appropriate encryption measures in place so as not to fall into the wrong hands if hacked by malicious actors online. Furthermore, companies should also ensure that any recordings made while using their products remain private unless explicitly shared by those involved.
Finally, governments should consider introducing legislation which requires companies selling connected sex toys to adhere strictly to certain standards regarding customer privacy. Such legislation would help protect consumers from potential abuses related to their personal information, while at the same time ensuring companies selling these products comply with necessary regulations. In conclusion, better regulation and enforcement of privacy policies when purchasing connected sex toys is essential given the risks associated with them.
Companies must ensure customers are aware of any data collection practices before they purchase a device, while governments should consider introducing legislation which requires companies selling such products to adhere strictly to certain standards regarding customer privacy. Doing so will help protect consumers from potential abuses related to their personal information, while at the same time ensuring companies comply with necessary regulations.
Hello and welcome to Securing Sexuality, the podcast where we discuss the intersection of intimacy and information security. I'm Wolf Goerlich.
He's a hacker, and I'm Stefani Goerlich.
She is a sex therapist, and together we're going to discuss what safe sex looks like in a digital age. Today we're talking to Rowenna Fielding.
Rowenna is a freelance professional privacy and data protection nerd with over a decade of experience helping organizations stitch together law, technology, and humanity as best they can manage. She obtained a data protection qualification before GDPR arrived and made data protection sexy. And since then she's worked with charities, public sector organizations, and commercial businesses.
She's written many articles, given plenty of talks, and has been proud to accept a couple of awards for services to geekery. Rowenna firmly believes that a strong commitment to human rights and welfare is all that stands between us at present and the future tech dystopia of sci-fi nightmares. And she is going to talk to us about sex, toys, and connected technology. Welcome! Thank you so much for joining us.
Hi, thank you so much for having me. It's great to be here. I am so excited to talk to you. I know you and I have connected on Twitter off and on in the, you know, before days when it was still a functional website. And I have been so looking forward to this because Wolf and I talk a lot about, obviously, sex, toys, and connected technologies.
But we haven't had somebody on that can really get into the details with us and can help us understand what we and our listeners need to know about how to use these toys in a way that is, you know, pleasurable and also private.
Well, I mean, it's the right time to be talking about it now, because the Christmas gift-buying season is nearly upon us.
So yeah, connected sex toys and privacy. So I work in data protection and privacy, and information security is a little part of that. But there is so much more to data protection and privacy than InfoSec. And what I've noticed, not just in, you know, connected sex toy selling, but also just everywhere, is that a lot of people confuse security with privacy.
So when it comes to connected sex toys, the connection between toy, phone, company servers, whatever, could be really, really secure and completely impervious to third-party access, if such a thing is possible. But what happens to the data that goes from the toy to the phone to the servers and then possibly onwards, that's a whole other thing. That's a privacy thing.
And even if it's secure, it doesn't necessarily mean it's privacy friendly. So think about privacy as the right to be unobserved and uninterfered with. And that applies whether it's your bodily privacy, being touched or moved about, or your associational privacy, like who you get to hang out with.
And so much more so when it comes to data, because data is pieces of us. It's just as important as our physical presence. So if someone's monkeying with your data, they're monkeying with you.
Yeah, I love that separation.
Actually, I don't love that separation because it's a seesaw. Like everything is security.
But you're so right, though, because oftentimes, I ask the question, okay, let's assume we secured everything, right?
Whatever we're talking about, we removed all vulnerabilities.
Have you thought about how these toys or tools or apps that we're building could be misused?
And people are like, well, why would they do that?
Because not everyone who can see this data may have the best intents around what this data is.
So what are some of the concerns from a privacy perspective with, specifically with these toys, but more broadly, of course?
Yeah. So from a privacy perspective, who's using the data and for what is massive. Because it's data about people's sexual activity. That's kind of, for most people, as sensitive and confidential as it gets. So the idea that someone else is kind of looking over your shoulder and taking notes while you're having sexy fun time.
I mean, even if they never use the data for anything but benign purposes, that's still something that in the offline world, you would expect to seek and require consent for, informed consent as well. So it's the principle of the thing, first and foremost. It's just basic manners not to watch somebody having sex unless they've invited you to. But then obviously with the data aspect, things get much scarier.
And again, it's not necessarily that people using the data have malicious intent. There's really no need for anyone to wish harm on anyone else in order to mess with their privacy. It's actually kind of the default. So digital marketers think, yay, clever analytics tools to help me do my job.
And either they don't have the knowledge to use it responsibly, or they have to weigh up abstract potentials for risk to strangers versus their KPIs and a stable income. There's a massive set of conflicting imperatives there. So what you get is toys and apps and websites that have been designed without thinking privacy, privacy, privacy all the way through.
They use kind of standard tools and data profiling and SDKs and ad tracking and make use of the telemetry for all sorts of reasons. And every single one of those secondary uses has the potential to have an impact on those people's lives. So what happens to the data and who's using it and what for is a really big question around sex toy privacy.
And I don't know if you remember, WeVibe actually got into a bit of trouble a few years back when it turned out that they had embedded advertising analytics in their connected toy app, which meant that basically people's sexual activity was being used to profile them in order to target advertising at them, and not necessarily just advertising of sex toys either.
So from an etiquette point of view, that's rude. From a data protection law point of view, that's illegal in a lot of places. And from a privacy point of view, privacy is a human right. It's part of the rules that we have in place to protect people from each other. Yeah. So from a privacy point of view, the more data you have, the more privacy risk is involved.
So there's a whole load to think about in connected sex toys. And it's mostly around who's doing what, why and how.
The idea that privacy is one of the ways that we protect ourselves from each other is, I think, a really powerful statement, especially here in the States right now. It's kind of an ongoing theme in the podcast, because what was going on when we launched it was the rollback of privacy protections.
First, under the auspices of abortion rights, but it seems like this is a creeping thing. There are a lot of people right now that are pushing back on the idea of people having a right to privacy.
And the notion that that extends into our intimate relationships, forget reproductive choices, but just what we do with our partners and what we choose to use when we do it is a really fascinating and somewhat concerning idea that I don't think a lot of people often think about.
Yeah, that's absolutely right. So you know that scene in Jurassic Park where Dr. Ian Malcolm says your scientists were so concerned with whether they could, they didn't stop to think about if they should. It's exactly the same sort of thing when it comes to data and technology.
And we have the ability to put each other under almost constant surveillance and then draw inferences and make judgments and then use that derived data to influence those people's existences. And the fact that we can do that seems to have been translated into an imperative to always do that as much as possible.
And it's got to a point now where when you're working with digital technology, especially when it comes to data, it's a lot more effort to use data technologies responsibly and compassionately and safely than it is to do it recklessly and harmfully. So in the interest of keeping costs down, that time and effort to do the sensible, responsible, ethical thing, it just doesn't really happen.
And then that shifts kind of social and cultural norms so that people expect to be under surveillance and generally people are quite happy to accept unlimited collateral damage as long as it's not them. And that's kind of the thing that's happening with privacy at the moment.
People who don't, who have enough social capital to be insulated from the effects of other people's judgment and influence don't rate privacy very highly because they kind of don't need it so much. Or they have the resources to build up around themselves. But in an interconnected world, we all matter. So it's turned into a power struggle, basically. It's really interesting and kind of terrifying. You mentioned Jurassic Park.
And of course, the old joke about Jurassic Park was, spare no expense, one IT guy.
I have to wonder, and I was curious in your experience and your research, if this popped out at you, are these sex toy companies even staffed to provide adequate privacy?
Well, I'm going to answer with the standard data protection person's go-to phrase, it depends.
So, I mean, yeah, big companies like WeVibe and Lelo and Doxy, the big vendors. I don't actually know if Doxy does a connected toy. I must look into that.
Yeah, they're big business. They have resources. They have staff. And certainly in WeVibe's case, they have a harsh lesson from a class action lawsuit from when they dropped the ball, as it were, last time. But startups and also companies that are put together in places where privacy and human rights don't have such cultural significance as they do for us.
Yeah, they're not necessarily going to have either the knowledge or the desire or the resources or the staff or the time to build a viable business model and do it ethically and responsibly. So there's a real tension there.
And then, of course, you get big businesses that just don't care.
So yeah, it's going to depend on the business and the people running the business. But it isn't easy, fast or cheap to make connected sex tech safe and respectful of human rights at all.
When you were talking about SDKs, I was starting to think, well, OK, if I'm a startup, I'm going to pull whatever is available, whatever is easy to get my hands on, whatever I can configure to send a signal over Bluetooth or light up a display on a phone or whatever.
And needless to say, a lot of these are open source components, right?
If you're building this infrastructure, think of it like Legos: the underlying Lego blocks have a lot of security and privacy concerns as well.
Yes, exactly. So for example, if you're a connected sex toy vendor and you've put out your product and you want to collect app telemetry for troubleshooting and bug hunting and performance enhancement and yada, yada, yada. So you pull off that telemetry and you bung it into Firebase. And Firebase is owned by Google. Maybe you are also embedding Google Analytics because you want to know who you can upsell features and upgrades to.
And maybe you're using Google Signals as part of that, which correlates via device fingerprinting to account information.
And bosh, suddenly you've told Google exactly who's having sex with whom and where and for how long and how often. And while Google might not give a damn about that on an individual basis, when it comes to profiling people for targeted advertising and digital marketing, not just of goods and services but of ideas, that suddenly gets really, really significant, especially on a large scale.
OK, I'm going to have to advocate for my fellow non-techies in the group. Sorry.
You say telemetry, what do you mean?
And what the heck is an SDK?
So telemetry is data from the app about how the app is being used: what the device is, maybe what other apps are on it, what operating system and versions of software it has. Some apps will demand access to the contacts list in order to set up that connection. But what that means is that sometimes the entire contacts list gets uploaded to the company's servers.
And that obviously has the potential to be used for profiling and targeting.
So yeah, telemetry is all the observations that can be made about the app and how it's being used in that particular instance. And that's obviously tied to a device, and potentially to a user registration, potentially to purchase information.
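To make that concrete, here's a minimal sketch in Python of what a single telemetry event from a connected-toy app might look like. Every field name here is hypothetical, invented for illustration rather than taken from any real vendor's app; the point is that an ordinary-looking device identifier is what ties the event to registrations, purchases, and third-party data streams.

```python
# Hypothetical telemetry event from a connected-toy companion app.
# All field names are invented for illustration, not any vendor's schema.
telemetry_event = {
    "device_id": "a3f9-redacted",          # advertising/device identifier
    "app_version": "2.4.1",
    "os": "Android 13",
    "event": "session_end",
    "session_minutes": 23,                  # how long the toy was in use
    "intensity_pattern": [3, 5, 7, 7, 4],   # usage data from the toy itself
    "timestamp": "2022-11-18T23:41:00Z",
}

def fields_enabling_linkage(event):
    """Return the keys that can join this event to third-party data streams.

    The privacy risk is less any single field than the linkage: the same
    device_id also appears in ad-network profiles and other apps' data.
    """
    linkable = {"device_id", "timestamp"}
    return sorted(k for k in event if k in linkable)

print(fields_enabling_linkage(telemetry_event))  # ['device_id', 'timestamp']
```

Even with the intimate fields stripped out, the identifier alone is enough to connect "this device ran this app at this time" to everything else keyed on that device ID.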
And from the point of view of the advertising networks, that data might be being spewed out to 100,000 different companies, all of whom have access to thousands and thousands of other third party data streams that are tied to that device ID.
So yeah, it's bigger than it looks. That's what telemetry means. Small word for a big scary thing. I'll let Wolf take the SDKs one because I'm too long out of the technical side to trust myself to explain it properly. I was mentioning Lego blocks earlier. So SDK stands for software development kit.
So if you want to write software, you pull down a bunch of modules and put them together, right?
It's not like the old days, where we started with a blinking cursor and wrote every line of code from scratch.
Nowadays, we consume a whole bunch of services like the ones off of Google that she mentioned. Thank you for that. It's always helpful when we have the techier conversations if somebody explains it to me. So break this down into like practical sort of scenarios or examples for me.
What are the real world concerns that sex toy users should be aware of?
So there's individual concerns and there's social concerns. So individually, you don't necessarily want someone peeping over your shoulder while you're getting off. You possibly don't want somebody logging the patterns of how and when you use the devices in order to try and influence you to spend more money on something, whether it's that device or other things.
You don't really want somebody using the patterns of your sexual activity to make inferences about your demographics or your character or your socioeconomic status in order for that information to be used to steer your choices. So that's the kind of individual level. And obviously, of course, there's the security issue as well. You don't want J-random third party to be able to look in on what you're up to.
And then from a social point of view, obviously, these massive data collection and profiling categorization engines, they have large scale effects. So a data harm example might be if you buy a toy and you have to put in user registration information before you can use it, which is a no-no. That's bad. Send it back if a toy does that to you.
But also, you might have to fill out a mandatory field about gender, and it might only be a binary male female. And if you are a trans, non-binary, genderqueer person, you've basically been misgendered by the device that you paid for, and you can't fix that. So that might not be a physical injury, but it's certainly, I think, a psychological injury. And that sort of thing shouldn't be underestimated.
And of course, then when you've got that sort of data, data that's been constructed with very heteronormative ideals being used to profile and influence other people and make observations about people in general, well, then you just end up with a whole load of stereotypes and biases and bad assumptions being replicated at scale, which is all very big and abstract.
I think for most people, it pretty much comes down to either, I don't want somebody looking at me while I'm getting off, or I don't want somebody making any money out of looking at me when I'm getting off unless I'm getting a cut, which is totally fair.
Yeah, that was one of the things I was thinking, because this is an aspect of privacy where I feel like the technology has outpaced humanity.
Because we know when someone's looking at us, when someone's in our house, even if we're just having dinner, it's like, what are you doing here?
We're very good as a species around like the, let's say, 10 meter or so radius.
I see you, you see me, that feels weird, I'm going to act a certain way. When things are invisible, as so much of this is, we lack those instincts, and we don't have those same intuitions. And I feel that that can get people into a lot of problems. We've built incredible feats of technology. And now we've got to a point where our technological capabilities exceed our personal capabilities.
So we've put so many layers of abstraction between ourselves and the people we interact with through technology, and the more layers of abstraction there are, the harder it is to think through all of them and consider risks, implications, responsibilities. So what happens is people end up reacting with their instincts and their guts.
And what might work out on the plains and the savannas and in the mountains, in a pre-technological society, which is as far as our evolutionary biology has got, doesn't work in a data-rich, high-tech, highly connected world, because that's a whole different set of survival needs.
And so I think we've bitten off more than we can chew as a species with technology, because we've made it far easier to do than to think. And when we do without thinking, we go with our primitive sides.
It's only thinking that lets us take a step back and go, actually, what are the consequences?
What are the ethics and the moralities here?
And technology just makes it easier not to do that.
Yeah, too easy. One additional question I had on the ethics and morality.
Why are people advertising off of sex toys?
Why does a dog lick its balls?
Because they can. And basically, it's that if you build it, they will come. It's that if you can generate data, then you'll find a use for it. So you get kind of digital marketers who go to digital marketing school and learn nothing about law and ethics. It's all about tools and techniques.
So a sex toy company gets big enough and thinks, oh, we need to hire a digital marketer to grow our business. They're bringing somebody into an incredibly sensitive space whose whole professional existence so far has been about getting as much data about people as possible and making as much use of it as possible in the company's interest. So there's just a massive kind of culture gap there.
And also, you know, data ethics, data harms, that's really complex stuff.
Most normal people don't go wandering around thinking about it all day like I do.
They expect technology to be safe and to have been developed by the best of the best with only the best intentions. And you know, that's kind of why we have consumer laws, because never in the history of technology have human beings ever gone to make something safe before monetizing it.
And I think your statement that it's easier to do than to think is really important because people tend to be far more impulsive when it comes to pleasurable things, whether that's food and alcohol and vices, or whether that's sexuality.
And I know a lot of people who are thoughtful and intentional, and are aware of the risks and fall on the side of, but if I act on that, I can't do or use this fun thing. And I really want to do and use this fun thing. So I am just going to lean into what you described as sort of the abstract nature of the threat.
And you know, it's all very, very fuzzy. And the likelihood is that if something happens, it won't be me. And I really want to do the fun and pleasurable thing.
How do we respond to that?
Well, that's just people being people, isn't it?
That's what people are like to a greater or lesser degree. We've got this set of wiring that we've built abstracts and ideals and things on top of.
But when it comes to making a choice between immediate physical gratification and long-term abstract potential adverse consequences, it's really hard not to go, yeah, but the thing that feels good is the thing I'm going to go for, because life is pain and the world is harsh and time's too short.
And why shouldn't I feel good?
Yeah, I don't think there's an answer to that except for really good governance. This is just me, a therapist, asking you, a technologist, to solve human nature for me quickly and easily and preferably painlessly. With a button, absolutely, yes.
Yeah, that's right. The more technology we've piled on, the more questions we've found, far more than answers. And I think personally, I think the responsible, compassionate use of technology has got to start with self-examination and understanding your own baseline and your triggers and your emotional impulses and your likes and dislikes and all that jazz.
Just like having good sex starts with understanding yourself, your body, your likes, desires, and communicating that with the person that you're with or people, whatever. It's the same with technology.
To do it in an ethical way, that's got to start with who do I want to be?
But obviously, that kind of discussion with oneself doesn't generate any money, so it doesn't happen often. And we've got a whole industry out there, an attention economy, trying to pull us this way and that, which doesn't leave an awful lot of time for self-reflection.
But if we're going to survive our own technology as a species, I think we need to do a lot more work on ourselves as individuals rather than trying to police each other. What I'm hearing you say is that more people need sex therapists.
Well, therapy in general, but particularly sex therapy, yes, because we're still carrying some really outdated and maladaptive attitudes and ideals around that, which again, might have worked in pre-civilization humanity, but now we've got all this complex stuff around us. I think we do need to do more thinking.
On the topic of sex therapists, Stefani, you've told me about something, was it RACK?
Yes, RACK, Risk-Aware Consensual Kink.
So, this is an idea within the BDSM community that it's disingenuous to try and say that things are safe. Even crossing the street intrinsically carries a certain amount of danger.
So the responsible thing to do is to be aware of the risks, to educate yourself about what possible outcomes there are, to educate yourself on how to mitigate those possible outcomes, and then to make sure that everybody is on that same page and has the knowledge and information that they need to engage in whatever you're going to do consensually. So it's pretty much as it says on the tin, risk-aware, consensual kink.
And I think that is a concept that is very relevant to the idea of connected sex tech, because so often, as you mentioned with the holidays, these are given as gifts, they are part of a relationship.
And sometimes for some people that might come with a certain sense of obligation, right?
Where I have many clients who aren't into sex toys at all, it's not their thing. But their partner gives them a present and they don't want to hurt their partner's feelings or let their partner down.
And so they're trying to figure out how to integrate this into their play and into their relationship and making sure that everybody is on the same page when it comes to consent and to consenting to risks around connected tech is, I think, a really important conversation that not many people are having. You would expect somebody to at least have read the safety instructions before using a machine on another person.
And it's exactly the same with computers. You're using machines on other people; same with data. So risk-aware consensual tech, I think, is a good philosophy for the future, certainly between individuals. When you get into the power dynamics of corporate versus individual, or political versus individual, it all starts to get a little bit more interesting. But it's the same with Infosec as it is with privacy.
There's no perfect state of perfect absolute security or perfect absolute privacy or perfect absolute interpersonal relations. It always takes compromise, communication, tradeoffs, negotiation. It's a process. It's not something you can install. But obviously, when it comes to sex tech, you've got that whole lizard hindbrain, let's get it on thing going on, gumming up the thought works.
So getting past the lizard brain and trying to get any sort of risk assessment into that is part one. Part two is the consensual side. I think you and I have talked about this on the Twitters once or twice. We had this great idea in tech, hey, people should consent to the data they're giving online.
And that got translated into that lovely button we all click on every single website that says yeah, accept all. It did not translate well, at least it hasn't yet.
Is it better in sex tech?
Am I able to consent at all with these toys?
Again, it depends. So the We-Vibe Rave, a connected sex toy, now has really good consent mechanisms built in, in terms of the data. When you install the app, it's really transparent about what's happening with the data. And if you want to send analytics back to the company's servers for them to do product improvement, et cetera, you opt in to that. And otherwise, it isn't collected from your phone.
So there are good examples. There are also some terrible examples where it just hasn't even been thought about. And it's very similar to things like sex tech websites or even sex bloggers. A sex blogger will write a whole post about the importance of consent, but not ask visitors to the site if it's OK to park analytics cookies on them.
You know, it's like walk the walk. So two things.
First, I'd love to hear like a horror story of like you said, some really bad examples.
I'd love to hear a good horror story because who doesn't like a good horror story?
Like the ghost visiting Scrooge ahead of Christmas horror story.
Second, as a follow up, my question is this. Sometimes you can glean a sense of how well a company cares by certain indicators.
Are those indicators things like, when I install the app, does it ask me these things?
Are there indicators that you can use as a consumer to decide, am I with a good company, or am I with a horror story that we're about to hear?
Well, first of all, I think being able to get access to the terms and conditions and the privacy info before unwrapping the toy is absolutely essential because you should have the opportunity to look at what's going on with the data and go, no, actually, I don't want this in my life and send it back and get your money back. You should be able to do that. Yeah.
Links to that sort of thing on product packaging, you know, QR code that flashes up, privacy notice, that sort of thing. That's like the gold standard. And then you've got, if you've got an app, what permissions it asks for. If an app's asking for access to your entire contact list, you would probably want to be a bit wary of saying yes.
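That permissions check can be sketched as a simple red-flag list. This is a hedged illustration: the permission strings mimic Android-style names, but the red-flag set itself is an assumption made up for this example, not an official taxonomy or any real app's manifest.

```python
# Hypothetical red-flag check on an app's requested permissions.
# The strings mimic Android-style permission names; the RED_FLAGS set is
# an illustrative assumption, not an official or exhaustive list.
RED_FLAGS = {
    "READ_CONTACTS",        # why would a toy need your contact list?
    "ACCESS_FINE_LOCATION", # or your precise location?
    "RECORD_AUDIO",         # or an open microphone?
}

def suspicious(requested_permissions):
    """Return the requested permissions a cautious buyer should question."""
    return sorted(set(requested_permissions) & RED_FLAGS)

print(suspicious(["BLUETOOTH", "READ_CONTACTS", "ACCESS_FINE_LOCATION"]))
# ['ACCESS_FINE_LOCATION', 'READ_CONTACTS']
```

Bluetooth access is what a connected toy genuinely needs; anything on the red-flag list deserves a "why?" before you tap accept.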
And the first thing that the app does should be to tell you all about the privacy, and that means telling you about the risks, not giving you the corporate PR fluff about how they've made everything peachy, which usually just means they've done the bare minimum that law and ethics require.
What you want to know is what do you need to look out for?
Of course, you know, it'll take a company with very strong ethics to actually, you know, admit to any kind of risk being involved in their product at all. And then you've got the sort of analytics and telemetry that should be on an opt-in basis only. Shouldn't be taking anything off the device without permission.
And the opt-in should be really detailed about, you know, what kind of data not just is collected, but also what sort of data linkage is going on at the back end, what other data sources are used, you know, whether or not that's being mapped to social media profiles, which is a really common thing in digital marketing. And there should be an offline only mode.
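The opt-in behaviour described here can be sketched in a few lines: analytics defaults to off, and nothing leaves the device until the user takes an explicit action. The names (TelemetryGate, maybe_send) are hypothetical, a minimal sketch of the principle rather than any vendor's actual implementation.

```python
# Minimal sketch of consent-gated telemetry: the privacy-respecting default
# is off, and events are dropped unless the user has explicitly opted in.
# Class and method names are invented for illustration.
class TelemetryGate:
    def __init__(self):
        self.opted_in = False  # default: collect nothing

    def opt_in(self):
        """Record an explicit, affirmative user choice."""
        self.opted_in = True

    def maybe_send(self, event, send):
        """Call send(event) only if the user opted in; otherwise drop it."""
        if self.opted_in:
            send(event)
            return True
        return False

sent = []
gate = TelemetryGate()
gate.maybe_send({"event": "session_end"}, sent.append)  # dropped: no consent
gate.opt_in()                                           # explicit user action
gate.maybe_send({"event": "session_end"}, sent.append)  # now it goes through
print(len(sent))  # 1
```

The design choice worth noting is the default: consent is something the user grants, not something they have to hunt down and revoke.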
So, you know, maybe you want to use the toy solo without hooking it up to the internet. It should be able to do that; it shouldn't refuse to function if your phone is in airplane mode or whatever. Not that you should be using devices on airplanes, unless it's a private flight, but you can have it in your hand luggage, whatever.
And the privacy info itself needs to be comprehensive and clear. It needs to be written in human language, not in legalese or technobabble. And it needs to be really specific about what data rights you can exercise and how you can exercise them. And just generally good, honest, good-faith communication.
So, no using weasel language like "for marketing purposes," because when you see that, you know that there is a buttload of tracking and profiling and all sorts of creepy shit going on underneath. They just don't want to come out and say so. You mentioned, you know, if you open the box and you see these things, or rather, if you don't see these things, send it back.
And that made me think of the process that people who don't purchase online go through when they purchase a toy, because the option to return things is actually incredibly rare. Usually there's a set protocol. You pick what you want by looking at floor models that don't have the booklet inserts with them. You take the box up to the counter.
The person at the counter takes the box, opens it, takes out the toy, puts in some batteries to confirm that it works. And then they make the sale, and it's no returns or exchanges.
It's been so long since I bought any kind of that sort of paraphernalia in person. It's all been online to me.
Do people still buy sex toys in person?
Is that a thing?
No, I'm kidding. I know they do. I will send them in to have the conversation with their partner because a lot of people are intimidated by these resources. And so being able to touch it, to see it, to hold it, to look at different choices makes them feel safer and more comfortable with the use of maybe a vibrator for the first time.
But we're trading one kind of safety and comfort for the loss of another kind. And that strikes me as profoundly unfair.
Oh, it is. Absolutely. I think that's an effect of the massive data entitlement attitude that is pervasive through business today: the idea that because it's possible to get data, there is a necessity to do so.
Yeah, I mean, the really ethical sex toy company, connected sex toy company would be massively diligent about minimization and pseudonymization and anonymization and put loads and loads of gates between anything identifiable and the kind of inferences being drawn.
But yeah, it's entirely reasonable that, you know, bricks and mortar stores say if the packaging's been undone, we're not accepting returns when it comes to something that you use so intimately. That's fair enough.
And, you know, being able to go back to a place and return it, that's really hard as well. I suppose the obvious answer is we need better regulation. But I think we all know just how effective regulation is in the real world. It's one thing to say thou shalt not. It's quite another to police it and enforce it appropriately and fairly. Yeah. Yes.
And that's how we end up with clicking the button that says allow all.
In the US, we have Consumer Reports.
Do you have something similar to that over your way?
We do, yeah. So we've got the Office of Fair Trading in the UK. And there are also nonprofit organizations like Which?, which do consumer reviews and battle unfair consumer practices, and TV programs that make for great entertainment, like Rip Off Britain. There's never any shortage of material for that. I almost think there needs to be a clearinghouse that's looking at these toys and giving some feedback.
But A, we've got the situation where a lot of people do want this to be private. So it's not like you would, you know, have a very public website that people could go to.
And B, again, a lot of these companies don't have significant investment. It's one thing to do it for a car. It's another thing to do it economically for a toy. And those things are only done for cars because of a history of harm to humans, followed by regulation, followed by culture shift. And with data tech, we're still kind of in the early stages of that: harm to humans, regulation, culture shift.
We're still going through that. We haven't actually really got to the culture shift yet. It would be great if there were more people willing and eager to hire people like me and pay them to do kind of privacy reviews of connected stuff.
But until, you know, there's regulation with teeth and consistent enforcement, there's kind of no imperative to do that.
What's the motivation?
Why spend money you don't have to on something that probably isn't going to get you into trouble anyway, but will harm people somewhere over the horizon?
Like what business is going to do that?
So let me ask this as we head towards kind of the end of our conversation. This episode is airing in the middle of December. Christmas always falls on the 25th. And this year Hanukkah falls rather late in the month as well. And there are lots of other reasons why people give holiday gifts this time of year.
So what would be your top recommendations for somebody who might be looking to buy their loved one, their beloved, a more intimate gift this year?
What should they be looking for?
Are there products or companies that you like or prefer?
Give us your holiday shopping guide. So I think, well, item number one is: do your research.
You know, if you're going to be reading, say, reviews and stuff about the look and feel, the features, whatever, look for reviews that talk about the privacy and the data stuff. Check out the company's website.
I mean, if it's loaded with tracking cookies and advertising stuff, and, you know, there's something like "we've just planted a whole load of surveillance on your device, click here to make this annoying message go away," without giving you a choice, then that's not really a good indicator. It's kind of a red flag. Check out the returns policy on anything you might be considering buying.
Just put "connected sex toy privacy" into a search engine, and a bunch of articles will come up. There are companies that are looking into these things, like Ken Munro at Pen Test Partners. He does some really interesting security teardowns of connected sex toys. And I mean, some of them are just shocking.
You know, if the privacy information is just a dense wall of legal babble, then it's an indication that the organization doesn't actually understand or care about privacy. So maybe consider not getting an internet-connected toy.
Maybe if you get something that only works Bluetooth to Bluetooth on paired devices and doesn't travel across the internet, that's always going to be safer, because you're not sending the data to a third party's servers and then on to a couple of billion advertisers and the usual stuff as well.
You know, look at whether the toy is body safe, whether it's been tested to certain recommendations because if it isn't made of body safe material or it hasn't been safety tested from an electrical point of view, you can pretty much bet that it doesn't have any kind of privacy protection either. Great tips. Great tips.
Well, I really appreciate you coming on today. This has been a great discussion.
Oh, thank you for asking me. It's a really interesting topic, and yeah, I can go on for days. Don't let me. We'll have to get you back on. I think the line that's going to stick in my head, and I'm going to be thinking about this well into the evening, is when you said never in human history have people worked to make something safe before they monetized it.
And such a good example with these toys and with these products.
Yeah, people will be people.
With that, thank you so much, audience, for tuning in to Securing Sexuality, your source for the information you need to protect yourself and your relationships. From the bedroom to the cloud, we're here to help you navigate safe sex in a digital age. Be sure to check out our website, securingsexuality.com, for links to more information about the topics we've discussed today, safer toy companies, and Rowenna's social media contact information.
Absolutely and join us again for more fascinating conversations around the intersection of sexuality and technology. Have a great week.