Securing Sexuality is the podcast and conference promoting sex-positive, science-based, and secure interpersonal relationships. We give people tips for safer sex in a digital age. We help sextech innovators and toy designers produce safer products. And we educate mental health and medical professionals on these topics so they can better advise their clients. Securing Sexuality provides sex therapists with continuing education credits (CEs) for AASECT, SSTAR, and SASH on cybersexuality, social media, and more.
Links from this week’s episode: Sex Robots and the Impact on Human Sexuality and Relationships
In today's digital age, technology continues to advance at an unprecedented rate, permeating various aspects of our lives. One such advancement that has garnered significant attention is the development of sex robots. These humanoid robots, designed to simulate sexual encounters, raise numerous ethical and psychological questions. Here, we aim to explore the ethical implications of sex robots and analyze their potential psychological impact on individuals and society.
Ethical Implications of Sex Robots

The advent of sex robots brings forth a host of ethical concerns. One primary issue is the objectification of women and the reinforcement of harmful gender stereotypes. Critics argue that the creation and use of sex robots perpetuate the idea that women are mere sex objects, further entrenching inequality and contributing to the objectification of women in society. Furthermore, the design and programming of these robots can perpetuate unrealistic beauty standards, leading to body image issues and self-esteem problems among users.

Another ethical concern is the potential for harm and exploitation. Sex robots blur the line between consent and non-consent, as they are programmed to fulfill any sexual desire without the need for mutual agreement. This raises questions about the importance of human connection and the potential for individuals to develop unhealthy and unrealistic expectations of sexual encounters.

The Psychological Impact of Sex Robots

Beyond the ethical implications, the development and use of sex robots also have psychological consequences. One potential impact is the erosion of social skills and emotional intelligence. As individuals engage more with robots for companionship and sexual gratification, they may become less adept at forming and maintaining meaningful relationships with other humans. This could lead to increased loneliness, social isolation, and a lack of empathy.

Furthermore, sex robots may exacerbate existing sexual dysfunctions and addictions. Individuals with pre-existing issues, such as pornography addiction or difficulties forming intimate relationships, may rely on sex robots as a substitute for human interaction. This reliance could hinder personal growth and inhibit the development of healthy coping mechanisms and attachments.

Additionally, the use of sex robots may contribute to the desensitization of individuals toward sexual acts. Overexposure to explicit content and the ability to customize and manipulate a robot's appearance and behavior may reduce the ability to find satisfaction in real-life sexual encounters. This could lead to declining sexual satisfaction and intimacy within society.

Conclusion

The ethical and psychological implications of sex robots in the digital age are complex and multifaceted. While proponents argue that these robots provide an outlet for sexual expression and companionship, the potential harm and negative consequences should not be ignored. Society must engage in open and thoughtful discussions to ensure that the development and use of sex robots are regulated and guided by ethical considerations. Continued research, dialogue, and vigilance are necessary to navigate the evolving landscape of human-robot interactions and safeguard the well-being of individuals and society.
Stefani Goerlich: Hello and welcome to Securing Sexuality, the podcast where we discuss the intersection of intimacy-
Wolf Goerlich: -and information security. I'm Wolf Goerlich.

Stefani: He's a hacker. And I'm Stefani Goerlich.

Wolf: She's a sex therapist. And together we're going to discuss what safe sex looks like in a digital age. Today we're talking with Straithe. Many, many things can be said about you, Straithe. One of my favorites is "a William Gibson character that has walked off the page." I know you're doing work as a technical community manager, and you've got a background in researching human-robot interactions. I'm really excited to talk with you today. Thanks for joining us.

Straithe: Thanks for inviting me.

Wolf: So I want to start with the scene at the diner. We're at a diner, having breakfast, the three of us. This was pre-pandemic, the early days. "I remember the sky was the color of a television, tuned to a dead channel." Maybe, maybe not that one. I'm still hung up on the William Gibson thing. But we were having a conversation, and you were telling me about these robots, and the example you gave, as I recall it, was a sex robot that was brought into an expo and was being manhandled, was being, you know, molested. And it didn't fare well. That always stuck out to me, because your research is around human-robot interactions, and we're not often kind to our technologies, to say the least.

Straithe: Yeah, you're right. Especially when it comes to robots that are embodied in a more human form. There are a lot of different ways that people will approach them, ranging from treating them like your best friend to treating them like just a tool. And when some people treat them like a tool and not like a person, you get really opposing opinions about whether the actions people are taking are actually OK.

Stefani: So how do we decide what is an OK way to interact with a robot or with technology? We've talked to philosophers about this, but I'm curious to hear your take on it. How do we know what's an OK interaction?

Straithe: Well, that's what the whole field of human-robot interaction is about: looking at how people treat people and seeing if we can apply those same principles to humans interacting with robots. And it's pretty much the same in most research that I've read and reviewed, and even done myself: if you have a robot and it has eyes, and the eyes move and indicate emotion like you would expect a human's to, people instantly anthropomorphize the robot and respond to those visual cues like they would with a human. So a lot of it is anthropomorphization, which is when you treat an inanimate object as though it's human. And that really is kind of a guideline for me: if you would do it to a human, would you do it to a robot? But, you know, I strongly anthropomorphize things and other people don't. So there is a bit of a scale on what's OK or not, and I think it's up to us as a society to really decide what that is.

Wolf: Do you think we're having enough conversations about that as a society, though? Because when I give talks about ChatGPT, for example, just as a rough example, some people are like, oh, it's so funny, I convinced the little bot to go insane, I got it all confused. And I'm like, yeah, but should you have? I mean, yes, it's just a text message, but should you feel good about that? I feel like you should just leave ChatGPT alone rather than going from text box to text box trying to confuse it.
Straithe: So there are a couple of points there. There's a lot of difference between AI and social robots. The embodiment of the robot makes such a difference in the ethical and moral conversations that we're having. One thing that a lot of researchers bring up when they're talking about robots, especially in the sex space, is how you treat them physically: would that end up being how you treat humans physically? So if you have a robot that acts humanoid and is able to do sexual acts with you, and it listens to you — can a robot give consent? Would giving somebody this robot teach them that they don't have to seek consent? Or would it be a good place to practice consent? There's stuff like that to think about: how much that physicality changes the relationship. And then you have the AI component, which is where the consent comes in. So you have the physicality, you have the mind, you have where your data is stored to decide all this. There's a lot that comes in, and I think we're having parts of conversations that are useful — talks about how AI impacts social robotics and how we'll think of that tech in, you know, a sex situation. But so does how we talk about people, and the laws that we have to protect people, and the ways that we talk about sex therapy. All of these conversations are basically joining into this area, because it's very complex.

Stefani: I am curious to know more about how robots and AI are being used in sexual health. This is an incredibly emerging discipline in our world, to the point where, just in the last few days on the listserv for my credentialing body, there have been conversations flying around about sexting with ChatGPT or forming emotional relationships with AIs. But I don't think that many sexual health professionals really know what to do with, I'm going to say, more tangible examples, like some of the RealDoll robotics that are really advanced in terms of their responsiveness and their capacity to interact with their owners, users, partners — I'm not even sure what the right term would be there. It's something that people are doing, whether or not my world is prepared to address it. And I'm wondering what insights you might have.

Straithe: Is this new? Because everything I had been reading about them is that they are just dolls — they don't have voice interaction or the ability to perform acts on you. They can only have acts performed on them.

Stefani: When I say responsive: I know that they were developing dolls that would literally, um, self-lubricate and would mimic a physiological arousal response to the touch or the interactions that their owners, users, partners — still not sure about the right term — might be engaging in. And then I know that they have been working on some more advanced AI aspects. RealDoll has both the doll company and its own separate robotics arm now, which was doing a lot around more interactive dolls.

Straithe: A lot of what I have done in my research focuses just on the social aspect of using a physical robot, because we can often fake it in a method called Wizard of Ozing, where we actually have a person in the background controlling the robot, so we don't have to worry about the AI. What you were saying about the RealDolls — to me, they're machines. At this point, they're not robots yet.
Because robots — the definition of social robots that most people accept, by Kate Darling, is that a robot is something with movement that is able to make choices based on its environment. And so it sounds like the RealDolls, as they are, are machines, because they don't physically get up and move around. And it doesn't sound like they're taking — well, I guess they are taking information from their environment and then, you know, lubricating in response. But whether they can do it on their own is interesting. I have so many questions; I need to go and research this more afterwards.

Stefani: I pulled up the ones that I was thinking of, and actually, I think my error might have been in the name of the company, because RealDoll had two that they're calling AI robots, Henry and Harmony. Those are the ones I was thinking of: they're self-warming, they're self-lubricating, they have defined erogenous zones, a programmable personality, memory, an animatronic talking head. But the one that I was thinking about in terms of interaction actually isn't RealDoll at all. It's from Synthea Amatus — the company, not a person — and it's called Samantha. They describe her as a robot capable of enjoying sex. So she vocalizes when she's touched, she can tell jokes, she can offer health facts. It's starting to bridge that divide between AI interaction and robot interaction. I think in my head I blended the two into one product.

Straithe: One of the interesting things about social robotics is, again, it is an amalgam of these things. We definitely need a robot that does what the Samantha doll does before we get a general AI robot that can do all the sex things you want, right? So it's a step in the right direction, but I wouldn't call it social robotics yet. And I think we're getting there, for sure. Part of social robotics is that you can do the things you'd expect in a social situation with the robot and get the result you expect. So I guess in this case, a robot responding to a touch in the way you expect it to is useful. And one way that is useful is, for example, people talk a lot about sex dolls for therapy, especially for people who've had traumatic sex in their life. Is it a good idea to help these people deal with their trauma, or process their trauma and get interested in sex with another person again, by first introducing a robot? Is that ethical? I don't know. Is it useful? Maybe. There's definitely a lot of research that can be done in that area. But one of the fun things is thinking about how we use that information. Who's going to do that research? Who are the participants going to be? What type of information needs to be collected, why, and how is it stored? Where is it going to be? You have the IT aspect of it, because you've got all of this personal information; you've got the AI aspect of it, because it's got to collect information and process it to be any good as a sex robot. And then you have to think about what behaviors we're reinforcing or changing, and why, in society too.

Wolf: And we could have an entire conversation, a very long, in-depth conversation, about ethical research and how to protect all that data. But I want to jump back. You said Wizard of Ozing, and I love that term.
So I'm assuming that means the person is controlling the device like a puppet.

Straithe: Yes.

Wolf: All right. So the experience of the person in the experiment is that this is a fully formed robot, because it's acting and interacting with them and everything, and then you have someone on the back end who's, you know, pulling the strings.

Straithe: Yeah.

Wolf: Love that, because that solves a lot of the computational challenges of trying to figure this out. What are some of the unique insights that you found when you were researching this? I mean, you already brought up consent, but very broadly, what are some of the things that popped out to you when you were running these experiments?

Straithe: So Wizard of Ozing is a pretty standard thing in human-robot interaction research; most of the experiments use it. And what really struck me when I was a new researcher in this space, learning about Wizard of Ozing, is that people are totally willing to suspend their disbelief. A big reason for that is the video games and movies and social media and all the other media we consume. People see Data from Star Trek and assume that, of course, robots will be able to interact with me — or Bender from Futurama, or all of these things. So a lot of what we do with robots is informed by the media we consume, and I think a lot of the ways we expect to interact with robots in the future, and even now, is influenced by the media. So researchers developing these robots can really draw on those expectations to make the robot that people are expecting. And that's one of the fun things about sex robots: there's not necessarily a set expectation. So there's actually, I think, more of a chance for sex robots to be successful, because right now, how often do people meet the perfect sexual partner? It takes work. It takes effort. You're usually not great the first time. There are always good stories about first times with someone, right? So there's leeway for a sex robot to not be great in the beginning, as long as it vaguely does what people expect. You're probably not going to try a sex robot for the first time and be like, that's it, I'm never going back to humans. So, Wizard of Ozing — same thing in the other experiments I did, which in that case was using robots to social engineer people. That's what my thesis is on: using robots to get information out of people and make them do things they otherwise wouldn't do. And especially with sex robots, there's so much potential there when you think of, like, the honeypot, right? It does excite me as a topic: what are the social engineering implications of a sex robot? Just all so exciting.

Stefani: I loved that idea of a honeypot. I don't know whether what I'm about to say will come out before or after your episode, but we are about to record an episode called Stefani Goes to the Cyber Brothel, about an adventure, a side quest we had while I was on sexology sabbatical. And until you said those words just now, it had never occurred to me that the environment that we were in, the room that we found ourselves in, the setup that was there, could totally be used to honeypot somebody. A little bit of foreshadowing.
One of the things that I was joking about the entire time we were there was, you know, at what point do we get kidnapped by the Russian mafia — like, this feels like at some point it's going to turn into a Liam Neeson Taken scenario. But I was joking. And then I heard you say the word honeypot, and I went, oh, I wonder who else is visiting that space. I wonder what kind of work they do. I wonder what kind of personal, emotional vulnerabilities they might have that could be exploited in a scenario like that. And now I kind of wish we had taken you with us when we went, because I think we would have had a very different conversation around what that environment was, or could potentially be.

Straithe: Well, we'll just have to do a follow-up episode and take a trip together to maybe a different cyber brothel, because I believe there's one here in Canada, in Toronto, as well. So maybe I can convince you to come up and visit me in cold Canada to visit another robot brothel.

Stefani: I would love to do that with a completely different perspective, because this one was truly a case of: I'm doing seven weeks of sexology research on sabbatical, clearly I need to have all of the experiences I can in this moment to really explore that space and that environment. But going with you, specifically thinking about it from a social engineering, honeypot, vulnerability perspective, would be super fun. I am down.

Straithe: I'm already thinking of the papers we can write.

Wolf: I'm watching both of you get very excited, and I love the academia. I can see the passion. And that's not a far city. I mean, I'm down. I'll drive, I'll fly.

Straithe: Perfect. BRDD.

Stefani: Yeah. When your wife and your friend are excitedly discussing an academic adventure to a robot brothel, I'm pretty sure you're going to be OK with that.

Wolf: No, no, I think Straithe had it: I'm the DD of this. I'm the designated driver for this adventure. This is way too exciting for me, but I'm in.

Straithe: You know, my thesis was on robot social engineering and defining the term. It's, what, 60 to 70 pages of just talking about how robots and their personality, how they look, the sensors on them, how people perceive them — even their clothes — can really affect how you interact with robots. And all of this impacts the information you can collect from people with these robots. So I think there's just so much space, especially in the sex space, where you already have different clothing that means different things or excites people differently, or different body morphs that excite people, and the ability to personalize your sex so strongly and not have retaliation, or someone thinking you're weird, because it's a robot. How much that might change comfort around sex in our society is really interesting, and I think that's great. But then it scares me when I think about all the sensors you need on a robot and all the processing power you're going to need to use all of that information, and where that data is stored and how it's stored and how you access it. All of that is the scary side, or the part we really need to be aware of, when you're thinking about getting a sex robot.

Stefani: How many people are actually getting sex robots these days?
I feel like this has become a very popular topic to explore academically, like you and I do, and to write about in these big sociology think pieces in magazines and on websites. But do you have any research or any numbers on how commonly these technologies, these devices, these tools, are being used right now?

Straithe: So that's the thing: I think we're still very much in the exploratory phase. We're still at the very, very beginnings of this. There are some folks with significant disabilities who need some form of intimacy who do have companion dolls. They're not robots yet, but they are dolls, and it lets them explore what intimacy might be like. And again, we have things like AIs that are also helping people dip their toes into the space. But I haven't seen a good combination of a doll with the right skin texture and looks, with the right AI, with the right — you know, unfortunately named — mechanical movements. Because that's not what you want to think of sex as, mechanical, but you do need those movements to make it real. So I think we're making progress in all three, but no one's merged them together well yet, so it is an impending issue. And again, I think it's still going to take about twenty years for a good doll to come out that even I might buy, if I have the money; we'll see. But I think it's right that we're exploring this before the robots come out, because I really want to know whether having a sex robot in your house will reinforce bad behavior or discourage it. So, for example, if someone buys a sex robot in the future and they treat it with sexual violence constantly, is that an "outlet" for them — which makes me sick just thinking about it — or is it just going to reinforce that bad behaviour and lead that person to abuse the human partners they then choose, after they have had violent, terrible, nonconsensual sex with a robot? These are things I want to solve before the issue becomes endemic, or at least know more about before it becomes more of a problem. Maybe that's why we need to think about this before sex robots are more common.

Wolf: I don't know that we can definitively weigh in yet in terms of: is it an outlet or is it a reinforcer? But I'm so glad that people like you and others are thinking about this ahead of time. So much of my career has been built on this: we willed the Internet into existence, and then, oops, things happened that we never stopped to think about, right? We never stopped to think about all the ways that the technology we're building could be misused. Which is one of the reasons why I've always been fascinated by your work and have kept an eye on some of the things you're doing, because I am really appreciative of the fact that we may have ten or twenty years to figure this out. When you were in the lab working through this, was there any indication as to which direction it would go, in terms of being an outlet versus being a reinforcer — something as simple as consent, or something as simple as manners? I've seen some research that people who say please to their Alexa are more likely to say please in public. Are there any early indications as to which way it may go?

Straithe: So one thing is children — teaching children to interact with robots.
If they don't respect robots and they kick them and hurt them, they are much more likely to hurt their peers, or to be violent towards their peers in the ways they're violent towards robots. So as a parent, or as a person in society who cares about kids and wants to see them grow up well, that's one of the things: if you see a child defacing property, you usually tell them no. If you see a child hurting another child, you will usually say, hey, stop that. And a robot is kind of a mix of the two. It's a person; it's property. Legally, we don't have that figured out yet, because it is complex where things switch from machine to personhood and all of that. So there's a lot of grey area in there, and it's not quite clear yet where we're going to sit on that. And I think there's a lot of conversation, partially because humans have so much variety in what we think, and there's not one definitive answer for all people. Some people are more inclined to treat robots with a lot of love; some are not. I say thank you to my map app whenever it says "turn right" — I'm like, thank you for telling me — and I will talk to it the whole drive, like, oh, where are we going next, little AI, and all of that. So I'm very strong on that side. And I am biased, and I know I'm biased, because I have that strong emotional attachment to robots. But it's so hard, when you spend so much time with them, not to be totally in love with them. But there are definitely people who are like, hey, look at this Spot robot — and when they first introduced Spot, there were people in the lab, in the video where they were showing it off for the first time, just kicking the robot. So many people online were very distressed at that, but the person in the video didn't look distressed. So that's an example of the other extreme. My inclination is to think that it'll be reinforcing — all of our actions towards robots will be reinforcing. For example, there's a lot of work on sexism and robots. If you think about it, all of our assistant robots — assistant AIs specifically — are named after women, because that's what the test groups say is a good name, but that's shaped by bias: women in servanthood. So is it a good choice? Yes, the test panel said, oh yeah, we love these names, they make me feel cosy, blah, blah, blah. But as researchers and people building the tech: is this how we want society to go? Do we want people to keep thinking women should be subservient — which obviously I do not think — but we're reinforcing behaviors. And now we have things like Siri or Alexa, all of these very soft-sounding feminine names. Whereas you think of Watson from IBM, a super intelligent computer that was able to win at Jeopardy — why did they use a male name instead of a female name? Why couldn't it have been named Tiffany? People have biases, and a lot of research shows these are being reinforced. So I have strong opinions on this, and I really think that behaviours towards robots will be reinforcing of bad behaviours if we don't make other choices when developing them.

Wolf: You know, I'm so glad you pointed out the Spot dog. All those videos that go around, I'm like, stop kicking the poor thing, it's just trying to walk. It is a little bit distressing. And I'm similar to you.
I talk to pretty much everything. I have been known to tell my coffee pot good morning. So maybe it does take a certain type of person who leans that way, but one of the things that is fascinating, and that I think often doesn't get enough attention, is that people are essentially a feedback loop. If we think that, for whatever reason, women should be on the reception side, and then we build an AI that has a woman's name and a woman's voice, that is going to automatically reinforce that original belief. And if we were to flip that script — and we've seen countless examples of interrupting a feedback cycle — and put in a male voice, next thing we may start to think, oh, maybe a male voice could be our system. It starts to normalize society, and the people who are interacting with it, to think the other way. It reminds me of some of the conversations, Stefani, that you, I, and others have had around erotic maps: what happens early on in your first couple of experiences with intimacy tends to be imprinted on you in a certain way, and then you look for that, and it becomes a positive reinforcing cycle.

Stefani: Yeah, we humans are the accumulation of our experiences, and every relationship and every interaction that we have helps us conceptualize the world around us and the people that populate it. The more that people have opportunities to interact with folks from diverse backgrounds, the more open-minded and receptive they are to diversity. And the more people have very rigid gender ideas, the more likely they are to see people in very rigid, binary ways. Is that a simple way to explain it? Yes. Is it an oversimplification? No. And I think it's going to be really fascinating, when I am retired and sitting in my sexology rocking chair, to look back in fifty years and be like, wow, look at how AI and robots and all these technologies — even texting — look at how all of these screens and gears and synthetics have influenced how we see ourselves and how we interact, not just with those things, but with the humans around us.

Wolf: As we have this interaction, Straithe, I know your area of interest was social engineering, and social engineering can be negative: we can convince people to give us information, or expose themselves, or do things they shouldn't. But it can also be positive: we can appeal to people's better interests, their better nature. For anyone who's listening to this who's building sex tech, or thinking about robots and creating these devices and experiences, are there ways that these devices could be used to effectively social engineer a better world?

Straithe: Yeah, for sure. One of the best things about working in human-robot interaction is pulling in specialists from other disciplines who know the social aspect better than you do, to teach you what should be done with a robot to make it act the way it should. So in this case, I'd probably call someone like Stef and say, hey, what are the things you need to support your clients? Is there a sex robot we could build to make things better? Step one would be to call in a subject matter expert and then work with them to figure out how to support a behavior that is needed. Like, maybe you want to build a sex robot to teach people consent — so have a robot that will just say no a lot and actually be able to defend itself.
Or maybe it is literally able to shut down its body in a way that a human can't. Would that help someone learn consent? There's something there that could maybe be done, and I'd love to talk to a sex therapist about that and run experiments. I think this would be great. But then there are other things, like teaching people how to have sex — some people have really, really bad sex. Maybe we can get social engineering robots that are able to say, hey, I want you to practice this thing on me, can we do it? And it'll describe things, and maybe you do it over and over in a way that a person would get really bored with and say, OK, we've done this twenty times today, can we stop? I don't want that feeling to continue — that's also important, because that's consent. But having a robot that says, OK, you did that better than last time, let's practice again, and having that positive reinforcement of experimentation in sex and trying to get more comfortable with certain actions — maybe that's a really good thing we can do with social engineering and robots: better sex for the world.

Stefani: You had me at collaboration. When I did my PhD in clinical sexology, Brad Sagarin, who heads the Science of BDSM research team at Northern Illinois University, was on my dissertation committee. And now I'm like, Straithe and Stef need to do the science of robots and sexuality research team. We could use Brad as a role model, and we should collaborate. We should build our own research lab. I am down for that.

Straithe: I don't have words. I only have hand gestures, I'm so excited.

Stefani: For those who cannot see a podcast, which should be all of you — unless we have some X-Men superpowers happening — we are just sharing hand-heart symbols at this point, because we have bonded over future sexology robot awesomeness. So stay tuned for the announcement of our request for grant funding for the new lab.

Straithe: Maybe I do have to go back and do a PhD. If anyone is looking for a PhD student, please let me know.

Stefani: Yeah, get funding that way. We'll send you back to school, they'll fund the lab, and then I'll just come in as, like, a post-doctoral researcher.

Straithe: My heart is exploding with happiness.

Stefani: I can see Wolf's wheels turning. Wolf, are we able to, like — does GoFundMe cover research labs? Can we, like, fund an entire discipline of science and sex?

Wolf: I have ideas, but not ideas that will fit in the remaining minute we have of this podcast.

Stefani: So I have a less serious question, but it's an important one, and it's been burning in my mind ever since my field trip to the cyber brothel. As we start to plot future research, I want to ask you a very serious question, which is: so many robots, even robot-adjacent technology that is built for kids, have the cute little puppy-dog eyes that make them very animatronic and very humanoid. What I want to know is, why don't the sex robots have those? Why do they have the dead, creepy corpse eyes that you will all hear me talk a lot about when we do the cyber brothel episode? Because that, to me, was the most surprising and most discomforting part of that experience. We spend so much time developing all of these ways to humanize robots.
But the minute we start creating tools that are theoretically purposed to be the most human-esque things, they have these dead plastic doll eyes that are really, really creepy. And I realize that this is not a serious question, but it is a very important question, because those dead staring eyes have been burned into my brain ever since we left, and you are my robot person. So I want to know: why are the humanoid robots not more human in the eyes? Why can we give the puppies eyes, but not the people?

Straithe: So this is actually, I think, a very serious question. There is a whole subsection of human-robot interaction that focuses only on this problem, and a lot of these people will tell you it's very serious — this is what they do all day. A lot of it is that we just don't have the right materials, the right experimentation, products, et cetera, to create realistic eyes, because one thing about realistic eyes is they're wet. You kind of need self-lubricating eyes. But that's not a big worry right now, because people, especially with sex robots, are asking "is the pussy wet?", not "are the eyes wet?" So that's something to think about: it's hard to create realistic eyes that move well, are wet, and can be part of an electrical system that requires the eyes to move. I think it is a very hard robotics problem. And maybe — I don't know — would you be more comfortable if it was just a digital visor over top with completely fake, pixelated eyes? Would that be better for you than the dead eyes? Would it be better to put in stuffed-animal eyes? I think that would creep me out. Oh my God, I'm getting goosebumps. But, you know, eyes are, as I said, windows to the soul, and making correct eyes is very difficult. Even with the puppy eyes: do you go purely physical? Do you go digital? What color do you use — red, green, black, blue, brown, purple? I don't know, whatever you want. There are so many details about eyes that people care about, especially because that's the first place you look. So we could have a whole conversation on this someday.

Stefani: No pun intended, but I feel so seen to know that that is a valid issue and that other people are wrestling with that question. The idea of the wetness of eyes had never occurred to me, but if they can make a self-lubricating doll, they can just migrate that technology north, guys, and all will be right in the world, I think. Thank you for taking my question seriously and for giving me a really fascinating answer. I appreciate that.

Straithe: Yeah, any time.

Wolf: And thanks for joining. We're definitely going to have to do some of these follow-ups we've been talking about. It's so good to have you on.

Straithe: Thank you. I have had a lot of fun, and there's still so much I wanted to get to, so I can't wait to come on again.

Wolf: Absolutely. And to the listener, thank you for tuning in to Securing Sexuality, your source for the information you need to protect yourself and your relationships.

Stefani: Securing Sexuality is brought to you by the Bound Together Foundation, a 501(c)(3) nonprofit, together with our conference sponsor Fun Factory. From the bedroom to the cloud, we're here to help you navigate safe sex in a digital age.

Wolf: Be sure to check out our website, securingsexuality.com, for links to more information about the topics we discussed here today, as well as our upcoming live conference in Detroit,

Stefani: where, if you're lucky,
maybe you'll get to hang out with Straithe and me. Who knows? We'll see. Join us again for more fascinating conversations about the intersection of sexuality and technology. Have a great week.