Securing Sexuality is the podcast and conference promoting sex-positive, science-based, and secure interpersonal relationships. We give people tips for safer sex in a digital age. We help sextech innovators and toy designers produce safer products. And we educate mental health and medical professionals on these topics so they can better advise their clients. Securing Sexuality provides sex therapists with continuing education (CEs) for AASECT, SSTAR, and SASH around cybersexuality, social media, and more.
Links from this week’s episode: Mitigating Risks: Implementing Security Measures in the Sex Tech Industry
In recent years, the sex tech industry has experienced a significant boom, with technological advancements providing new and exciting opportunities for individuals to explore their sexuality and enhance their intimate experiences. From sex toys equipped with remote control features to virtual reality adult content, the possibilities seem endless.
However, as the industry continues to evolve, it is crucial to recognize the importance of privacy and security in this rapidly expanding field. This blog post aims to shed light on the significance of safeguarding personal information and ensuring a secure environment within the sex tech industry.
By addressing these issues promptly, companies can stay one step ahead of potential threats and ensure ongoing privacy and security for their users. As the sex tech industry continues to evolve and thrive, prioritizing privacy and security is paramount. Protecting personal information, safeguarding against data breaches, and ensuring a secure environment are essential for user trust and empowerment, as well as for legal and ethical compliance. By implementing strong encryption, transparent privacy policies, and regular security audits, the sex tech industry can continue to grow while providing a safe and secure space for individuals to explore their sexual desires and enhance their intimate experiences.
Stefani Goerlich: Hello, and welcome to Securing Sexuality, the podcast where we discuss the intersection of intimacy-
Wolf Goerlich: -and information security. I'm Wolf Goerlich.

Stefani: He's a hacker, and I'm Stefani Goerlich.

Wolf: She is a sex therapist, and together we're going to discuss what safe sex looks like in a digital age. Today we're joined by Brad "Renderman" Haines: hacker, security researcher, and founder and chief researcher of the Internet of Dongs Project. How's it going, Renderman?

Brad "Renderman" Haines: It's going pretty good.

Wolf: It is always fascinating to me, because when we started on this path, everyone I talked to said, "Hey, did you talk to Render yet? Have you looked at Render's site yet? Did you see his talk?" So why don't you fill in our audience on what I've been hearing so much about over the past year?

Renderman: The Internet of Dongs Project is my research project to assess the state of Internet of Things security from the standpoint of internet-connected sex toys. Frankly, this is a field that has not been researched heavily, because of societal hang-ups; some companies just can't get over the giggles. I found it very interesting because it affects people in ways you don't ordinarily think about with other Internet of Things products, like your fridge or your thermostat. The risk landscape and threat assessment are very different, so it's been a very interesting exercise.

Stefani: For the non-hackers in the room: what do you do? What does the Internet of Dongs Project entail for you?

Renderman: Up until the last ten years or so, everything was essentially a manually operated device. Now they're so-called smart. They connect to your phone, which lets you control them from your phone, lets somebody else control them, or allows control at a distance over the internet. That last piece is what really got me interested, because suddenly the threat is not just somebody attacking your device in the room. As I like to say, forget the Mongol hordes at the gate: you're 100 milliseconds away from every jackass on the planet now, and they could all theoretically connect to your sex toy or your account. It's suddenly a very different world. These devices are not unlike a lot of other things you have in your home: there's Bluetooth that connects to your phone, there's a mobile app, and the mobile app talks to company servers for your account and connects you to other users. There are a lot of security aspects along that path. So, using the skills I have, I'm trying to assess these devices and work with the companies to get them as secure as they can be.

Wolf: Now, we've talked to folks on this podcast about the privacy issues with sex tech. Does the toy gather information it shouldn't? Is it collecting and sharing data in ways it shouldn't? Is it posting it in places it shouldn't? Can that data be aggregated? One person I was talking to said, "You know, I realized that not only could my Fitbit tell whether I had sex last night, it could also tell whether I enjoyed it." So there is this weird overlap on the privacy side, which of course makes it very creepy. But we really haven't had many conversations about the security side.
So I'd be curious to know: what are some of the common vulnerabilities you've run into in the security of these devices?

Renderman: When I started the project in 2016, it was frankly terrible. Lots of low-hanging fruit. But after I found a bunch of vulnerabilities and reached out to the industry, they took note; they woke up very quickly and have made things a lot better. There's still room for improvement, of course, just as with any product, but it is way better than it was. The most common kinds of vulnerabilities I find are authentication issues: passwords being sent improperly, and hard-coded passwords and API keys baked into the applications that govern how the app talks to the back end, which can let you get in and possibly do a heck of a lot more than you were ever supposed to as a user. Also poor data handling, like not using strong encryption between the app and the back-end servers. The vast majority of that, like I said, has been fixed. But there is a subgroup of vendors who will typically just build a device for another company, slap that company's name on the box, and call it a day. They aren't incentivized to secure their devices, because after they've sold the device there's no money to be made, no additional revenue path. They don't want to upgrade it; they don't care to. These are the ones that typically have a lot of problems with excess data collection and with some of the aforementioned authentication issues. And it gets downright scary when, as you mentioned with the Fitbit, I can tell a lot of interesting information about you. If your fridge reports to the company how many times you open it in a day, you might not be proud of how often you're going for a snack, but it's just a metric. A company receiving information from your sex toy's app about how many times a day you use it to masturbate takes on a whole different meaning; the moment you associate sex with a metric, people have much stronger opinions about it. Take the case of We-Vibe, which got sued five or six years ago now; it cost them $5 million. In that particular case, the privacy policy in the mobile app did not disclose that they were collecting certain information: when you logged in, what your favorite pattern was, who you connected with, and diagnostic information. There was a temperature sensor in the chipset used to tell if the thing was overheating, and that information was all sent to the company. The text they had in the app was actually the text from their website, which didn't mention any of the app's functionality. So they really got caught on a technicality: they were collecting this information without actually telling people, because somebody copied and pasted the wrong text. There was no evidence they were building major dossiers on their users, other than things like the most common pattern people use, which is the kind of information we let companies collect from us all the time. Why does it matter what mood-lighting color you prefer? They want to know your favorite setting so the next version of the product can do it better. But when you associate it with sex, suddenly knowing which particular vibration pattern you prefer takes on a whole other realm.
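Hard-coded credentials of the kind Renderman describes are typically dug out of a shipped app by extracting printable strings from the binary and flagging the high-entropy ones. The sketch below is a minimal, hypothetical illustration of that technique, not taken from any specific audit; the length and entropy thresholds are arbitrary assumptions.

```python
import math
import re
import sys

def shannon_entropy(s: str) -> float:
    """Bits per character; high values suggest keys or tokens rather than prose."""
    if not s:
        return 0.0
    freq = {c: s.count(c) / len(s) for c in set(s)}
    return -sum(p * math.log2(p) for p in freq.values())

def find_candidate_secrets(path: str, min_len: int = 20, threshold: float = 4.5):
    """Scan a binary (e.g., a file extracted from an APK) for printable
    strings that look like hard-coded API keys or passwords."""
    with open(path, "rb") as f:
        data = f.read()
    # Runs of printable, non-space ASCII at least min_len characters long.
    for match in re.finditer(rb"[\x21-\x7e]{%d,}" % min_len, data):
        text = match.group().decode("ascii")
        if shannon_entropy(text) >= threshold:
            yield text

if __name__ == "__main__":
    for candidate in find_candidate_secrets(sys.argv[1]):
        print(candidate)
```

Real secret scanners refine this with regexes for known key formats, but the entropy heuristic is the core idea: anything an app ships with, an attacker can read.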
Wolf: So I remember Peter Sandman talking about risk management way back in the day. I mean way back: VHS-tape days, he was giving lectures. What Peter Sandman was saying at the time was that we always talk about risk as the hazard, the harm, or as the vulnerability, but really, what drives a lot of the risk conversation is the outrage. How impacted do you feel if this information is shared and leaked? And you're right: I may not care at all if you know what water temperature I like, but I may be pretty outraged if you're sharing the patterns I'm using.

Renderman: Not even necessarily the patterns. The way some of the apps were coded, they could accidentally leak things like GPS information: knowing exactly where you are. That certainly takes on a whole other dimension. The outrage is in certain ways justified. In the case of We-Vibe, like I said, it was a screw-up in the privacy policy; they weren't telling people what they were doing or why. But at the same time, it's something we allow everybody else in the IoT space to do. So why does associating it with sex make it different? I always find this interesting, because when I give talks, I look at the audience and say: every one of you here, with maybe a few minor exceptions for in vitro, is here because people had sex. Get over it. But it's about getting past the giggles. I fully admit that when I first started the project, it was going to be for giggles, with talk titles that would get selected at conferences. But once I got into the research, I found how bad things were and what the implications were, and it took on a whole other level of serious very quickly.

Stefani: Can you give me some examples, or tell us some stories, about how these technologies have gone off the rails? How has this impacted the everyday user, the person who just wants to be in their bedroom with their vibrator?

Renderman: There were multiple devices I found that were accidentally leaking GPS coordinates, so I could tell which room of the house you were in. The scary part: one of them had a search function for finding other users and inviting them anonymously to connect to your device. The problem was, they wanted Tinder-like functionality that would show how far away you were from the other user. There are multiple ways to do this, and they, or their developers, took the easy way, which was to send the GPS coordinates between the users and just do the calculation locally. So if you just peeked under the hood at what was being transmitted, you could see the other user's GPS coordinates, and the search function would return GPS locations for every user it turned up. This is a stalker's-dream kind of situation, which is dangerous. It was supposed to be anonymous, but it very much was not, and I could tell which room of the house you were in.
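The "multiple ways to do this" that Renderman mentions include computing the distance on the server and returning only a coarse answer, so raw coordinates never reach another user's client. Here is a minimal sketch of that approach, assuming a standard haversine great-circle calculation; the bucket boundaries are made up for illustration.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def distance_label(lat1, lon1, lat2, lon2):
    """Compute the distance server-side and return only a coarse label,
    so raw coordinates never leave the back end."""
    km = haversine_km(lat1, lon1, lat2, lon2)
    if km < 5:
        return "nearby"
    if km < 50:
        return "< 50 km"
    if km < 500:
        return "< 500 km"
    return "far away"
```

The design point is that the API response carries only the label; the client never gets numbers precise enough to triangulate another user's home.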
Renderman: The most flagrant vulnerability, the one that actually did get exploited, was a company called QIUI. They had a cock cage, a male chastity device that locked around the genitals, with a mobile app to lock and unlock the device over Bluetooth. Their app, and particularly the back-end services, was terrible. You could ask it to perform certain functions, and in its response it would actually give you the user's password in the clear. If you iterated through the user IDs, it would tell you their passwords, and you could use that to access other functionality and do things like change the password on an account, effectively locking the user out, which is inconvenient. But in this particular case, the device required a key from the server, sent to the app and then to the device, in order to unlock. So if your phone couldn't connect to the server, or you were locked out of your account, you couldn't unlock the device. There was no emergency bypass or key or anything. If you got locked out of your account, you had to get really intimate with some bolt cutters. A researcher in Europe reached out to me to help interface with the company, and we tried to report these things. We got hold of the company and reported them; nothing happened. We asked for updates: "Oh yes, yes, we're working on this," and then nothing happened. Months went by while we were trying to disclose this responsibly. Then I noticed a very cryptic tweet by some other researchers I know who occasionally delve into the sex tech field, expressing frustration with a sex tech company. I asked them point blank: is it QIUI? And they said yes; they had been trying to get hold of the company to report things for even longer, and they had found the same issues. So we decided to combine forces: hey, we now know of each other, guys; you've really got to get on top of this. More months went by. We kept asking: do you even have a plan? This was during the height of COVID, so we just wanted to see a plan, some sort of momentum. Instead, they came out with a new version of the app that was actually worse in a lot of ways. Finally we said: you have yet to show us anything; give us a plan or we're going public. And so we went public. We didn't disclose any of the actual vulnerabilities, but when you shine a flashlight on something like that, and it's so blatantly simple, everybody finds it. There were a lot of reports on their forums and on Twitter of people having their account passwords changed and being stuck inside these devices. The company had to post a video showing how to essentially break out of the device with a screwdriver. The interesting part was that somebody wrote a Python script that iterated through all the user IDs, changed the passwords, and then sent an email to the address on each account demanding a certain amount of Bitcoin to unlock it. So we literally had ransomware for a sex toy.
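What Renderman describes is a textbook insecure direct object reference: the back end trusted a caller-supplied user ID and even echoed the stored password back in responses. Below is a minimal sketch of the two usual fixes, authorizing every request against a random session token and whitelisting the fields a response may contain. The stores and field names are hypothetical, not QIUI's actual schema.

```python
import secrets

# Hypothetical in-memory stores standing in for a real database.
SESSIONS = {}  # token -> user_id
ACCOUNTS = {}  # user_id -> {"email": ..., "password_hash": ..., "favorite_pattern": ...}

def login(user_id: str) -> str:
    """Issue a random session token; never derive it from the user ID."""
    token = secrets.token_urlsafe(32)
    SESSIONS[token] = user_id
    return token

def get_account(token: str, target_user_id: str) -> dict:
    """Return account data only to its owner, and never return secrets.

    The QIUI-style bug was trusting a caller-supplied user ID and echoing
    back the stored password. The fix is twofold: authorize against the
    session, and whitelist the fields in the response.
    """
    caller = SESSIONS.get(token)
    if caller is None or caller != target_user_id:
        raise PermissionError("not authorized for this account")
    record = ACCOUNTS[caller]
    # Password hashes (let alone plaintext passwords) never go in a response.
    return {"email": record["email"], "favorite_pattern": record["favorite_pattern"]}
```

With checks like these, iterating user IDs yields nothing: every request is scoped to the account the token was issued for.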
Wolf: I think for those people listening who may be building toys or working in the sex tech industry, one of the key takeaways I'm hearing from you, which we've talked about in so many other industries but which I don't think is being considered much in this space right now, is to have a process in place. Have someone responsible for checking those emails, for responding, for replying, for engaging with the research community, so you can take action when people like yourself find these things.

Renderman: Yeah, there are two major things I tell a lot of companies to do. One is a vulnerability disclosure program. Pretty much any company should be running one; they're not hard. The idea is that if somebody finds a problem, or even just something that doesn't look right, they don't have to prove it; give people a way to report it to someone who can actually do something. Not a general-delivery inbox that goes to a secretary who doesn't understand the technical side, but a route to an actual technical person who can triage it. This should be a no-brainer, because people want to do work for you for free. It's a simple business proposition: people want to make your product better; let them. Part of it is establishing an internal process for handling these reports and communicating with the researcher constantly: letting them know that within 48 hours you're going to triage this and test it, letting them know whether it's actually a problem, what you're going to do to fix it, and when. You don't even have to pay them or offer any remuneration or compensation; you could just have a kudos page or send them a t-shirt. It doesn't have to be expensive, but it's such a basic thing.
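One inexpensive way to stand up the reporting channel Renderman recommends is a security.txt file served at /.well-known/security.txt, as standardized in RFC 9116; many researchers check for it first. A minimal example, where the contact address and URLs are placeholders, not any real vendor's details:

```
Contact: mailto:[email protected]
Expires: 2026-01-01T00:00:00.000Z
Policy: https://example.com/vulnerability-disclosure
Preferred-Languages: en
```

The Contact and Expires fields are required by the RFC; the Policy line should point at the disclosure process itself, including the triage timelines Renderman describes.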
Renderman: The other thing I pointed out to a lot of the industry when I first started reaching out was very interesting to me: these companies did not think of themselves as tech companies or software companies. They previously had electrical designers and materials designers who built these devices: motors, batteries, et cetera. Then somebody got a chipset development kit from Nordic or Texas Instruments, put it in, made a basic app. Hey, it works; ship it. These companies did not realize they had just opened the door to a huge world of pain. They don't go to the conferences like I do, or Wolf does; they weren't aware of what they needed to do to secure this. They didn't think of it as something connected to a wider world. So you tell them: hey, you're a software company now, and much like sex, if you make one mistake, you're supporting it for the rest of your life. What's interesting is that they're making the mistakes we figured out 15 or 20 years ago. This is not new territory; it's like anything else in IoT, the same as a nanny cam, a thermostat, a fridge, a stove, a doorbell. Funny story about doorbells: I actually got my hands on one of the first cock cams, a body-mounted webcam, took apart the application, and found that the manufacturer had reused large chunks of the application over and over in multiple different products. The actual hardware was the same as you'd find in a D-Link webcam, and you could tell the application itself had previously been used in a video doorbell product, because they left the doorbell sounds, the MP3s, in the app. Which means that for the rest of my life, I will never have a shortage of ding-dong jokes.

Stefani: I love that. And I think what you're saying about sex toy manufacturers not thinking of themselves as tech companies is equally true on the user side. My clients will buy Bluetooth-enabled toys or connected devices for when they travel, and they rightly think of them as relationship enhancers, as tools to help preserve and protect intimacy across distance. But they don't necessarily think of those things as high tech, in the same way that the manufacturers don't necessarily recognize the data they're holding. I don't think users often consider just how intricate, and just how adaptable, as you've just shown, these devices can be. So what can end users, what can my clients, do to make sure that whatever they're bringing into their home and into their relationship is safe?

Renderman: This technology is amazing, as you said, for long-distance relationships and maintaining intimacy, as well as for exploring different elements of sexuality. It's a wonderful advancement. The problem, as with most high-tech stuff, is that people don't understand how their devices work. They're used to something like a remote-control car: the controller is in the same room, it's a short distance. But if you're connecting to somebody halfway around the world, that's not happening from a little battery-powered remote; it's got to go over the internet. And the internet is full of jerks who might want to mess with stuff, people always poking at things and finding things. The major factor is whether they can monetize it; bad guys who can't monetize something will generally leave it alone, but there are always people who just want to sow chaos. The other thing is the information these companies collect: even if they're collecting it for responsible purposes, how are they storing it? I'm waiting for the first divorce case that requests data from one of these companies to prove that the app on the guy's phone was paired with his secretary's toy instead of his wife's, something like that. But most people need to realize: you're not special. Nobody is going to target you specifically. You are not that interesting, I'm sorry. Attackers generally look at things in aggregate: can they get information out of all of the users, credit card numbers or other personal information they can use in some way? So look at the companies you buy these products from, the actual vendors and manufacturers. Do they have something about security on their web pages? Do they have a vulnerability disclosure program? Do they at least seem aware that such things exist and know what they need to do? You also need to consider for yourself what you find risky. Are you OK with a company knowing that every time you open the app it connects to their server, generating a log entry that suggests you're probably about to use the toy? On a technical level, they do need to know who you want to connect to if you're doing a remote session. Some of these apps have audio, video, and text capabilities, and again, all of that is being sent through a computer you do not control. Are you OK with that? My guess is that the majority of people are OK with it. But you need to ask yourself; and if you're not, there are still a ton of manually operated devices on the market that are great and do the trick. Stick to old-fashioned phone calls and imagination, things like that.
Wolf: When you say most people just don't care, or most people are pretty open to it, I often think part of that is that the risks are invisible. If the risks are invisible, we often don't know we're taking them. And part and parcel with that, and this is the dawning realization I've been having doing this podcast with Stefani, is that when we think about security, we think about things like vulnerabilities, credentials sitting in cleartext where people can reuse them, maybe assets and asset management. But when we talk to everyday people and ask, "What do you think about when you think about security?", they think about keeping their families safe, keeping their friends and loved ones safe, protecting their finances, protecting their homes. They don't often equate technical security with personal security.

Renderman: Yeah, and it's interesting. The Ashley Madison case shows this: there were people who committed suicide because they were outed as having been on that site, and in their social group that was a taboo thing. When I first started this research, I found a vulnerability where you could basically ask one of the vendors' login pages, "I forgot my password; does this email address have an account associated with it?" I took a list from the Ashley Madison dump of .gov and other interesting domains and didn't find anything. Then I dumped my own personal address book, about 300 addresses, and ran it through to see if anything hit. I actually got one, which led to a very awkward conversation to confirm my finding. Fortunately, it was somebody I knew would not be offended. It turned out they had bought some devices for some Bluetooth research, and they'd had to create an account to activate them. But I got a hit, and it could have been somebody else, somebody about whom I suddenly knew something I maybe never expected. That can make things awkward. So it's interesting what people want to keep private. Me, I've got 30 or 40 devices; I've lost count of how many I have now. Everybody knows I have these. I carry them around in a luggage bag as carry-on on flights, which makes going through TSA so much fun. They always want to inspect that bag.

Stefani: Been there. I have actually had that exact situation at Midway Airport, and it's interesting when they start laying everything out on the table. So: been there.

Wolf: A couple of things I want to tease out of that. One: if you're registering these tools, maybe don't use your main email. Definitely never, ever use your work email. Perhaps consider using a disposable email. But the other thing I wanted to jump back to: you said something about divorce cases.
How would you envision that? Someone saying, "Look, Stefani, we're going to send you all the evidence: the GPS tags off of the device." What would that look like?

Stefani: I mean, first, that would look like me referring them to Renderman, because I don't know anything about GPS tags. But it's not uncommon for me to do consultation work on a case, and I'm not speaking about any particular case, obviously. I've had people ask me to review text messages, to explain terms that are used, or to give my impressions of a pattern of behavior. One thing I find interesting, not from expert-witness work but from colleagues who work with couples around infidelity: I've heard more than one story about someone being outed when a spouse went to link, say, the doorbell or a new printer on their partner's phone and noticed a sex toy listed in the Bluetooth pairings, one they didn't own. And not all of these devices are wise enough to give the pairing a discreet name; some of them are pretty overt, on the theory that you want to be able to find your vibrator quickly. So there absolutely have been cases, not court cases but client cases I've heard of, where people were inadvertently outed because their phone was linked to a device it shouldn't have been. I've also had people driving down the highway when all of a sudden their partner's phone lights up, because it has paired with the WiFi signal of a nearby hotel, which, if you're local, is cause for questions. I've heard of that happening a couple of times. These are things people don't necessarily think about or consider until it happens.

Renderman: And this is why you should always stay at nice, higher-end places that use the same network name at all their locations.

Stefani: Excellent point, although that only works if you travel in general. If you are somebody who works down the street and doesn't typically need a hotel, it will still raise red flags. The one thing I am most curious about is how this will play out long term, as more and more things become app-ified, and not just sexual devices but health devices. When we were at AVN last month, we met the makers of what is effectively a new cockring programmed to give the wearer health data: erectile hardness, blood pressure, a lot of different things that seem super helpful on the surface, until you pause and think about the fact that all of that information has to go somewhere. It becomes very blurry very quickly. So I'm wondering if you could pretend it's the New Year and predict the future. What do you see coming down the pipeline in terms of IoT sexual devices of all kinds? Where are you excited, and where are you a little bit trepidatious?
Renderman: If I had to predict the future, a couple of things. I've said the data from a sex toy will be used in a court case in some way nobody would ever have thought of, like proving an alibi, something so completely off the wall that you'd go, "Well, OK, that actually does make sense." They've used Fitbits to establish alibis; they've used Fitbits to break alibis and prove murder. So this is going to be another data source. I highly encourage all the companies, much as rules like PCI or GDPR exist basically to say: if you don't need it, don't store it. Because if you don't have it, you don't have to worry about it. The other thing I think is unfortunately going to happen is that various countries and regimes are going to start using data from these companies to out homosexuals and others. Companies like Kiiroo have the ability to pair like devices, stroker to stroker, so you could have a long-distance homosexual interaction. Now, if one of those parties is in a country where homosexuality is illegal, and a lot of those countries filter or monitor internet usage, suddenly that's an interesting piece of information that could end very badly. And looking at the sort of things going on in the States, with anti-trans and anti-LGBTQ harassment being written into law, you could see law enforcement going to some of these companies and saying, "Tell us everybody who has ever identified as one gender and paired with the same gender."
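Renderman's "if you don't need it, don't store it" rule is data minimization, and the simplest way to enforce it is an allow-list applied before any telemetry is persisted. A minimal sketch, with hypothetical field names chosen for illustration:

```python
# Keep only what a feature actually needs before anything is persisted.
ALLOWED_TELEMETRY_FIELDS = {"firmware_version", "battery_level", "error_code"}

def minimize(event: dict) -> dict:
    """Drop everything not on the allow-list (usage counts, GPS, session
    timestamps, partner pairings) so it can never leak, be breached,
    or be subpoenaed."""
    return {k: v for k, v in event.items() if k in ALLOWED_TELEMETRY_FIELDS}

raw = {
    "firmware_version": "2.1.0",
    "battery_level": 84,
    "sessions_today": 3,        # not needed to support the product
    "gps": (53.54, -113.49),    # never needed
}
assert minimize(raw) == {"firmware_version": "2.1.0", "battery_level": 84}
```

An allow-list, unlike a block-list, fails safe: a new field added by a developer is dropped by default until someone argues it is actually needed.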
Stefani: It's interesting that you bring up the outing factor. On the positive side of that: Wolf and I were in Jamaica for a sexology training we attended, and some of our cohort were trans or gay. More than one of them logged into Tinder while they were there, and it was fascinating, because they immediately got a pop-up on their screen saying, "Your phone is telling us that you're in a location that might not be safe for you. Do you want us to hide the fact that you're online?" A, it's sad and awful that that even needs to be a thing. But B, I think that's a great example of what more companies, especially sex tech device companies, should be doing. There should be some way to take the data they're using and leverage it to protect their users, or at least give their users options to protect themselves.

Renderman: Yes, but the bigger issue is that they need to do this preemptively. They need to be aware of these threats and deal with them, build the security into the product from the beginning. Much like the warning label that says "do not iron pants while wearing them," the fact that Tinder does that means there was probably a reason: it probably means somebody got hurt, or worse, and they had to put it in after the fact, not preemptively.

Wolf: This dovetails into the last question we wanted to ask you. Obviously, many people who listen to this podcast are in sex tech, are building new toys, are building new apps. What advice would you give them, and how would you like them to reach out to you about the Internet of Dongs and your work?

Renderman: My major piece of advice for anybody in sex tech, and for users, is to not be afraid of the technology. Yes, some of the things we've discussed here, the vulnerabilities and so on, are frightening, but no more so than many, many other products you use daily without thinking about it. What you need to do is be aware that these potential issues exist, look at your situation, and judge whether it's something you can handle in your life. And like I said earlier, if you can't, there are still manually operated devices out there. These things were great during the pandemic especially: when people couldn't get together in person, they allowed a lot of long-distance relationships to either continue or blossom at a distance during the lockdowns. From a sex therapist's standpoint, they allow somebody to explore. Maybe they think they're homosexual; they can explore what it would feel like, emotionally, to connect with someone of the same gender, but at a distance. I'm astounded at the cam model industry, which exploded during the pandemic as well, and at the interactivity between viewers and models through these devices. It's the old telephone company slogan, "reach out and touch someone," just in a new, interesting, and private way. It really is an interesting technology; you just need to be aware of the risks and judge your own situation. We've got several articles on our website, which is internetofdon.gs: so "Internet of dongs," but with the .gs at the end. There's also [email protected], and I'm on Twitter as Internet of Dongs; I'm always interested in talking about this stuff. I don't make recommendations on toys; there are lots of sites that do those kinds of reviews, and I typically won't say one is more secure than another, but I can definitely point you at some very general resources. We have a bunch of articles about what you need to know if you're a customer or a cam model, and we're actually expanding to do more for cam models and general operational security: how to protect yourself. It's a very useful resource. We're hoping soon to have a one-pager that, if you run an adult toy store, you could just keep on the counter: what do you need to know? You've heard the stories; what's the truth? And hopefully one day we can get the manufacturers together to agree on a standard, to work together on some common criteria for securing things and keep each other secure. Because the last thing anyone wants is for me to be a kind of Underwriters Laboratories for sex toys; who wants to see my face on the side of the box?

Wolf: We'll put all those links in the show notes. So if you're listening to this and fiercely writing it down, or trying to figure out where the .gs goes, please check out our show notes. Thank you so much for joining us, Renderman, and thank you for tuning in to Securing Sexuality, your source for the information you need to protect yourself and your relationships.

Stefani: Securing Sexuality is brought to you by the Bound Together Foundation, a 501(c)(3) nonprofit. From the bedroom to the cloud, we're here to help you navigate safe sex in a digital age.

Wolf: Be sure to check out our website, securingsexuality.com, for links to the information we've talked about today, as well as our live conference in Detroit.

Stefani: And join us again for more fascinating conversations about the intersection of sexuality and technology. Have a great week, everyone.