Securing Sexuality is the podcast and conference promoting sex-positive, science-based, and secure interpersonal relationships. We give people tips for safer sex in a digital age. We help sextech innovators and toy designers produce safer products. And we educate mental health and medical professionals on these topics so they can better advise their clients. Securing Sexuality provides sex therapists with continuing education (CEs) for AASECT, SSTAR, and SASH around cybersexuality, social media, and more. Links from this week’s episode:
Sex Positivity and Diverse Approaches to Integrating Technology into Sexuality
Technological advancements have revolutionized the way we interact with the world, including how we live our sexual lives. The internet, social media, and dating apps have made it easier for people to connect with each other and explore their sexuality.
However, these advancements have also raised ethical concerns about how technology affects sexual rights and how it can be used to navigate them. Sexual rights are human rights that ensure individuals have the freedom to make choices about their sexuality without discrimination, coercion, or violence. These rights include the right to access sexual health services, the right to consent to sexual activity, and the right to express one's sexuality without fear of persecution. Technology has the potential to either promote or hinder these rights.

One of the main ethical concerns surrounding technology and sexual rights is consent. The ease of accessing sexual content and the anonymity of online interactions have contributed to a rise in non-consensual sexual behavior, such as the sharing of intimate images without consent ("revenge porn") and sexual harassment. Technology has also made it easier for predators to groom and exploit vulnerable individuals. It is crucial that technology companies take responsibility for preventing and addressing these harms by implementing measures such as better moderation and reporting systems.

Another concern is the impact of technology on sexual health. While technology has made it easier for people to access sexual health information and services, it has also been associated with an increase in risky sexual behavior; dating apps, for example, have been linked to a rise in sexually transmitted infections. It is important for technology to be used responsibly in promoting sexual health and education, while also addressing the potential risks.

Technology also plays a role in promoting sexual diversity and inclusivity. Social media and dating apps have made it easier for marginalized communities, such as the LGBTQ+ community, to connect and express themselves. However, there is still a long way to go in creating inclusive and safe online spaces for all individuals.

In navigating these ethical implications, technology can also play a positive role. For example, virtual reality can be used to create safe spaces for individuals to explore their sexuality without fear of judgment or discrimination. Apps and websites that promote consent education and healthy sexual behavior can also help individuals navigate their sexual rights in a responsible and ethical manner.

In conclusion, technology has both positive and negative implications for sexual rights. While it has the potential to promote sexual health, diversity, and inclusivity, it also raises ethical concerns around consent, exploitation, and risky behavior. Technology companies must take responsibility for addressing these issues and promoting ethical practices. Ultimately, technology can play a valuable role in navigating sexual rights, but it must be used responsibly and with a focus on promoting the well-being and autonomy of individuals.
Stefani Goerlich: Hello and welcome to Securing Sexuality, the podcast where we discuss the intersection of intimacy-
Wolf Goerlich: -and information security. I'm Wolf Goerlich.

Stefani Goerlich: He's a hacker, and I'm Stefani Goerlich.

Wolf Goerlich: She is a sex therapist, and together we're going to discuss what safe sex looks like in a digital age. Today we're joined by Ezio Di Nucci, a philosopher and professor of bioethics based in Copenhagen. His latest book is “The Control Paradox: From AI to Populism.” But he first came to our attention when we read his writings on the emotional well-being of robots in the book “Robot Sex: Social and Ethical Implications.” Hey, welcome.

Ezio Di Nucci: Hi, good to talk to you guys.

Stefani Goerlich: Thank you so much for joining us. Your chapter in “Robot Sex” was absolutely fascinating. The minute I put it down, I picked up my phone and emailed you. So thank you so much for joining us.

Ezio Di Nucci: That's very generous of you. Let me tell you a little bit about that chapter and then how I got to it. So, my thinking there started from more of a kind of political side, about sex rights. One of the things that got me interested in sex rights was sort of disability rights. So I came to the technology, and specifically to robots, very much coming from this more political-ethical interest in sex rights and disability rights. And those are the things that sort of motivate me. So I must confess to begin with that I'm more interested in the rights of humans than in the rights of robots. That's where I come from. And if I had to briefly summarize that chapter, basically I would say, look, it's about the fundamental question, which I think I had written about a few years before, but I hadn't quite gotten to technology as a way of thinking about that question. So the fundamental question would be: “do we have a right to sex?”

Wolf Goerlich: Hm.

Ezio Di Nucci: And not just the question, yes or no, do we have a right to sex, but also, is there even such a thing as a right to sex? Is sex the kind of thing that we can talk about in terms of rights? And then both disability and technology were my two ways of thinking around sex and sex rights. So some of that work, for example, looks at the question: what does it actually mean to have a right to sex? Would it be something like a right to sexual pleasure? Or would it be a right to being informed and competent in sexual relations, independently of whether or not that leads to sexual satisfaction or sexual pleasure? So those kinds of relations: the relation between sex rights, sexual pleasure, and sexual satisfaction, and then especially with groups that might have access problems. And I'm thinking also of things like neurodegenerative diseases, like Alzheimer's disease. So that's where I was coming from maybe 15 years ago. Then I slowly started thinking that we should at least look at technological solutions, for a very simple reason: one of the reactions to my earlier work on sex rights was basically, well, we can't talk about people having the right to sexual pleasure or sexual satisfaction, because that's going to be incompatible with other people's rights. You can't make anybody do anything for your sexual satisfaction. So it's a very delicate thing, and I'm sure I don't have to explain that to you or your audience. But then I started thinking that technology could be a way of navigating that, that there could be a right to sexual pleasure that didn't infringe on anyone else's rights.
In philosophy, there is this thing that people always do at this point of the conversation, which is distinguishing between positive rights and negative rights. So negative rights are like property rights; it's the right that people shouldn't touch your stuff. While positive rights are like health care rights, our rights to getting services, like health care services. And one of the things that got me, and then the people responding to my work, worried there was, okay, so what should we think of sexual rights? Are sexual rights rights to sexual services? But that can’t be quite right, because then we are actually infringing on other people's autonomy. Right? So for example, if we're thinking sex workers, maybe even if we're thinking autonomous sexual robots, right? And so anyway, now I've given you quite a long introduction to my thinking, but basically that's how I got to the technological bit and to sex robots, by thinking about rights, which might sound a little bit funny to begin with, but I hope I gave you a bit of an impression of how I got there.

Wolf Goerlich: No, I'm glad you outlined that, because one of the things that I have struggled with is exactly what I think you hit on, right, which is this idea that there is a positive right, a right to pleasure and understanding and self-direction. And there's also a negative right, like I should be able to consent, I should be able to set boundaries, I should be able to be self-directed. I had never had a framework to compare both of those, so I think that's incredibly useful. What are some of the outcomes of that thinking, then? What does that mean for assistive or adaptive technology? What does that mean for people who are providing services or content?

Ezio Di Nucci: So we can think about it, just do a little topography of this kind of debate, right? So, I guess in the first instance, there would be people basically just saying, look, there is no such thing as sex rights in terms of entitlement to sexual satisfaction or sexual pleasure. So some people won't even enter into this debate, right? Sex rights as positive rights don't exist; there is just sexual autonomy, right? And then you take a step further and say, well, I'm at least going to have the debate about the existence of sex rights as compatible with everyone else's sexual self-determination, because obviously we don't want this talk of sex rights, even if it is for vulnerable or marginalized populations, to go back on the sexual revolution, right? We want this to be part of that movement. And that's why, as you rightly say, Wolf, it's completely clear from the beginning in this framework that it has to be subject to very clear boundaries and consent requirements, right? And that's where technology comes in, right? Technology offers the hope that we can talk about the right to sexual pleasure and sexual satisfaction without violating anyone else's sexual self-determination and autonomy. So that's the kind of theoretical framework that one hopes to navigate in. And then in terms of outcomes, I mean, first of all, let me remind you, I'm a philosopher, so we're not thinking in terms of outcomes or products, right? Ideas are everything that we do business with. So I'm not actually going to propose anything very concrete.
But in the Robot Sex book from, I think, 2017, that you were referring to at the beginning, basically the suggestion is that technology could be a way of satisfying the sexual rights of vulnerable populations. And that is where, from a philosophical point of view, and especially for someone like me working in bioethics at the medical school, where obviously everybody is talking about informed consent all the time, it gets difficult, but also interesting, right? Because a lot of people will say, well, look, some members of these vulnerable populations, maybe even the majority, again, if we're thinking mental health conditions, if we're thinking cognitive impairments, if we're thinking neurodegenerative diseases like Alzheimer's disease, those are kind of the standard cases where people say, oh, some of these individuals are just not capable of giving consent. So that's a backstop. So that's another place where you can just basically jump ship. The first place was to say, look, there's just no such thing as sexual rights; there is sexual self-determination, but there is no right to sexual pleasure or sexual satisfaction. So that we try to kind of deal with. Now the second point is to say, well, there might be sexual rights, but if what you're worried about are those vulnerable populations that I was talking about, well, those populations happen to be the ones that cannot necessarily give informed consent, especially when it comes to delicate things like sexual relations or health care services. So I'm always using this comparison to health care services, first of all because it's something that people are familiar with, but it is also true that those are the two domains in our social lives, sexual relations and health care services, where informed consent is at its most delicate and most important. And so I think that's where it gets difficult, but also that's where it gets interesting. I think sometimes we underestimate what we give up by setting very clear boundaries. So for example, saying, in care homes for the elderly, there should be no sex, or things like that. Then obviously, that's a safe space, but it's a safe space where there might also be a lot of unfulfilled desires. There might not be the same quality of life and well-being; you're paying a price in terms of well-being. So here, because you were asking me, Wolf, about conclusions and outcomes, I'm not necessarily making a policy proposal. I'm not saying, oh, let's draw up guidelines for sexual relations in care homes or in places where there are vulnerable groups. But I'm saying, let's have this conversation, because there is suffering. And normally when there is suffering, we think in terms of mitigating that suffering. And obviously, part of the sexual revolution is: let's not be afraid that just because it's sex, we're not going to have this conversation. And I know that I don't have to convince you, or maybe some members of your audience, but sometimes you're just met with this wall of, oh, but those are vulnerable groups, this is dangerous, there can be abuse and manipulation and all these kinds of things. And I completely understand that. I'm just saying, let's have the conversation and recognize that there is also harm in all this, in terms of well-being and in terms of those things.
Stefani Goerlich: And I'm really glad to hear you bring up both the consent piece and also the harm that comes from saying that the solution to the consent piece is just to say, well, there doesn't need to be any sexual contact; we'll just create a nursing home policy that nobody touches anybody else, problem solved. Because, you know, I come out of the world of sexual assault survivor advocacy. That was the first part of my career. And obviously, I am a big proponent of consent culture, but we fail to recognize that when we create safety by simply outlawing behaviors, or by saying the way we're going to protect consent is to eliminate the option to opt in as well, we're not actually creating a consent culture. We're creating a controlled culture. So I'm really happy to hear you talk about those two. And I'm also curious, you know, often in my work where I hear people talking about a right to sex, it's not necessarily coming up in the context of persons with disabilities or aging populations. Where I hear the phrase "right to sex" most often is in kind of like the incel movement, or, you know, among certain populations of men who have decided that they have a right to sex that women are withholding from them. And because of that, it can sometimes feel kind of scary to say, well, let's talk about a right to sex. So I love your immediate framework of, you know, we're not going back on the sexual revolution. We're not talking about violating consent or a coercive right to sex. We're really talking about how we can create a consent culture that starts from a place of yes/and, rather than, well, no, we can't protect everybody, so we're just making everything off limits. But it is a very delicate and nuanced conversation. And now I feel like I've been rambling at you. So let me ask: how does technology help mediate that conversation? How can technology help untangle some of these complicated issues?

Ezio Di Nucci: Before moving to technology, let me just thank you so much, Stefani, for emphasizing that. That's actually one of the reasons, what you were saying about the incel movement, why I try to talk about sexual rights and not the right to sex. And obviously the language in some of those debates has moved in such a way that I completely understand that those two things can get confused, and I'm aware of that. And one of the things that I've more recently written about is framing it in terms of sex positivity, right? The relationship between the sex-positive, as opposed to sex-negative, movement and sexual rights, exactly in the spirit of what you were saying. And I think, I mean, the irony of bringing technology into the conversation we've been having is that technology, as you two know better than me, brings up its own extra layer of ethical questions. So that's why I wanted to frame it first of all as a problem of our rights that technology can help us with, right? Rather than, to begin with, as a technological issue. But before we get into the details of what kind of technologies and what kind of solutions we're talking about, basically the simple way that technology can help with this is this: the main worry, you'll remember, as we discussed it just there, was, look, if you talk, even within sex positivity, about people having rights to sexual satisfaction, sexual pleasure, and sexual rights in the positive sense of rights, sexual rights as positive rights, the worry is that they will violate someone else's rights.
So technology there is going to basically stop that, because the idea is that those technological solutions will be able to satisfy people's sexual rights without violating other people's sexual rights. Right? So the way I normally do this, and this is easier to do on a board than in a podcast, is to think of it in these terms: someone's positive sexual rights can be satisfied through technological solutions without someone else's negative sexual rights being infringed upon. Right? So in a very simplistic sense, technology is a kind of middleman that prevents abuse, but then it obviously brings up a lot of other issues that also have to do with consent, with security, with manipulation, with privacy. Right? So looking at it from my point of view, actually the technological solution is quite hopeful, because to begin with, I'm just worried about sexual rights not being compatible with everyone else's sexual self-determination. As you said, Stefani, obviously we don't want to go back on the sexual revolution, so sexual self-determination is paramount, but saying that sexual self-determination is paramount cannot mean that we're not even going to look at sexual rights. And that's how, as I explained at the beginning, my own thinking has moved: being able to introduce technology as a way of saying, well, look, maybe we don't necessarily have to give up on the idea of sexual rights.

Wolf Goerlich: Should we still, with these technologies, model consent? And I'll tell you why I'm asking that. I think about habits leading to outcomes, and I think about the protection of the negative right as still being important when a person is interacting with a person. And that protection of that negative right comes from habits, comes from culture, right? And there was a study recently, I think it was a few years ago, I want to say it was out of Brigham Young University, I could be wrong on that, but what they were looking at was: are we becoming less polite to each other because we don't say please and thank you to our Alexas and other voice assistants, and by barking out orders to them, do we then bark out orders to our friends and our family? So the question I have for you is, with this technology that's satisfying the positive right, these people being able to have sexual gratification, should we still have that technology model some form of consent, so that we keep the habits that lead to consent culture and protect the negative rights?

Ezio Di Nucci: I think that's a really good question, thanks for that. Because I think the other thing is, we've talked about this so far as a completely sort of gender-free issue, but obviously a lot of people are rightly worried about this being a gendered issue. Maybe we're modeling the technology around some practices that are going to just replicate practices of oppression, patriarchal practices of abuse. And I think again, like with the earlier issue, that shouldn't mean that we don't even have the conversation, but we have to be very, very careful in how we program these kinds of systems. Right? Because in fact, we want technology, not just when it comes to sex, but in general in our society, to have disruptive power and potential.
So, especially when it comes to sex and gender, if technology ends up just replicating, as you were rightly saying, some of those habits, then we might have sort of dealt with some well-being, but we haven't really dealt with the big issues of our society. So I think, and I know it sounds disappointing for a philosopher to say, that's almost like a programming issue. I don't mean it isn't an ethical issue; it's an ethical issue, but it's one that has to be handled in the traditional sense of responsible innovation, of basically programming those algorithms so that we break free of some of these habits. And it's quite tempting there to think that we should basically program consent requirements into it, even if the technological system in question, basically a very simple robot-like system, doesn't actually deserve that. So that we're doing it for pedagogical reasons. And I think that's the position I would take: we would basically be doing an "as if," and we would be doing it for pedagogical reasons. I mean, I think you can probably make the point you are making, Wolf, and I know you two probably know more about this, running this podcast, than myself, but I guess people have this worry about porn consumption, right? That it lowers not just the quality of sexual relations, but also the kind of interaction between human beings. And that it has, not necessarily, but it can have, statistically, a lot of negative effects on sexual practices. So I think something like that argument probably does apply to more complex technologies. But I think the complexity there is actually something that we can use to our advantage, exactly because we can program some of these responsible habits, if I can call them that, into the systems.

Wolf Goerlich: Now, I know Stefani has a question, but I wanted to say something to follow up on that. I like your point about disruption through technology, but I also think what's embedded in what you said, and what's so vital to the conversation we're having here, is that we have folks who are talking about how to build laws around this. We have folks who, like yourself, are talking about how to think about this. And then we're going to have a conversation with someone who's building sex tech and say, oh, by the way, here are some of the ways to think about this from a philosophical perspective; please do that programming that Ezio mentioned. So I think it is absolutely spot on that we've got to address these issues across a multitude of spectrums.

Stefani Goerlich: Which just kind of makes me feel like I'm throwing a wrench in the works with my question, because what this brings up for me is, you know, I'm listening to you two. I'm listening to you, Ezio, and I'm thinking of Westworld, the whole show that was originally founded on this idea that people are going to this immersive park where robots don't say no, right? Robots will let you kill them. Robots will let you have sex with them. The fantasy is that the robot doesn't consent. And then I think about when Siri first came out and my kids were playing with Siri and trying to figure out what was possible. And one of my stepchildren goes, "Siri, talk dirty to me." And Siri giggled and went, "Dust bunnies, lint, spider webs." So clearly, in the programming, they had thought to kind of divert around that.
But it makes me wonder, from a customer service perspective, how people would react if machines were programmed to just say, no, I'm not going to do that. And then that raises the bigger question: do robots, do intelligent machines, have a right to say no? Is there a right to consent within programming, or within a tool that somebody has paid to use? Can my Roomba tell me I'm too dirty for it to want to help me?

Ezio Di Nucci: Yeah, those are really, really good and difficult questions, Stefani. I mean, I'm not going to be able to answer them, but let me try to play with them a little bit. Because I think you're absolutely right that if we think of them as gadgets that we buy and own, then we might have some of those issues of, I'm going to return it to the shop because it's not doing what I want it to do. But in a way, if we're thinking in terms of machine learning algorithms, we're really thinking beyond that. So that's what I meant, Wolf, earlier when I said the complexity there creates a lot of other problems that I'm not qualified to speak on. But actually, the complexity is quite hopeful. And I think also, I mean, the other thing that gives us better access to those questions is the fact that we're talking about sexual relations. Sexual relations are supposed to be satisfying and interesting and fulfilling. And probably, in order to have satisfying and interesting and fulfilling sexual relations, you need a system that can say no, you need a system that is not just going to do exactly [what you want]. Also because I guess one of the things about sexual relations is that you learn from each other, and so you also want a system that can teach you. And that kind of system is not the kind of system that is just going to execute. So I think of this also in a kind of educational sense. I mean, it seems to me, and I know that we're doing a podcast online and it's 2023, but actually, I think we're still stuck in a very moralizing view of sexuality. Right? I mean, the three of us have been talking about this having to be fully integrated into the sexual revolution. And I think if it has to be part of the sexual revolution, it has to be something that de-moralizes sex and sexuality. And the way in which technology de-moralizes sex and sexuality, if at all, is through complex systems that can actually teach customers responsible practices, but also practices that are fulfilling and satisfying. And again, technology is not the thing I'm mostly interested in. The thing I'm mostly interested in is sexual rights and this idea of de-moralizing sex. It's just that now, I like that you're challenging me, because you're making me think, well, it's true that there are all these problems with the technologies, but actually, maybe there is potential in that. Maybe the fact that those technologies are already so complex will deal with some of those objections. And then if nobody will buy them, so to speak, that will just speak to the fact that we still haven't managed to de-moralize sex, despite your efforts, despite my efforts. Yeah. And then maybe can I just add something about, I think, when we're thinking in terms of programming those systems… I was just thinking about, and you two being based in the States, I guess you've been following this even more closely than myself, but those Silicon Valley banks going bankrupt over the last few days. And let's not forget that community is not very diverse.
And I think diversifying the tech community will already deal with some of these issues. So in a way, actually, I'm not too worried about that. Maybe it's just because I'm just a philosopher and the tech people will have to do the real work, but I wouldn't be as worried. I think the thing I'm mostly worried about is the issue of basically shifting the consent boundary, of accepting the possibility of allowing vulnerable people to have sexual relations, and whether that trade-off is worth it in terms of our boundaries and consent practices.

Stefani Goerlich: So as somebody who works with people who experience a lot of moralizing around their sexual and relational expression, what I'm hearing really resonates with me. And I think that a lot of that, and obviously this is sort of a universal statement, but morality is culturally specific in a lot of ways. What is blasé in Europe is very much shocking in the United States. And I think it's very easy for us to say people with disabilities deserve intimate pleasure and they deserve companionship, even if that companionship is in the form of a robot or a doll. And I think it's not quite as easy, but on the easier end of the spectrum, to say people who are aging don't lose their sexual selves and their sexual identities, and even if it makes us a little squirmy to think about grandma and grandpa at the nursing home, they are still adults and they still have those memories and those drives. But I think the conversation becomes a lot more complicated when we are thinking of people who might be struggling with problematic sexual behavior, or who might have sexual desires that aren't appropriate in any context. And one of the conversations that happens a lot in my world is this idea of: if we create technology, are we mitigating harm in those populations, or are we just allowing them to practice an inevitable offense? And I don't want to call out any specific communities, because I've had this conversation around lots of different, sort of more concerning, erotic and sexual groups. I think it's a conversation that very quietly happens in the criminal justice field, in the mental health field, in tech spaces. And I'm curious how we balance that, how we balance de-moralizing sex and really encouraging people to have a sex-positive, self-affirming, self-accepting framework, while also acknowledging that there are some desires, some interests, that aren't safely expressible or that are intrinsically non-consensual. And how could technology perhaps help with that? Or are the people who say that technology only serves as a practice field for future harm right?

Ezio Di Nucci: So I think now you've brought up an even more difficult one, Stefani, thanks. And I think this is something I'm not really qualified to speak on. I completely see why the conversation leads there as well. But to be honest, all my work has only been with vulnerable groups understood as either mental health, cognitive impairment, or the elderly, specifically neurodegenerative diseases. I know that there are people, also colleagues of mine, working out there on using technology to mitigate some of these other practices, but I really don't know anything about that, and I would be quite careful going there. I mean, in fact, I'm your guest, so I'm supposed to answer the questions, but this is one of the places where I'd quite like to throw it back at you. What do you think about that?
Because there I would generally want to draw a line and, for the reasons we were talking about, not even start programming some of these practices. It's like, because I work at the medical school, sometimes we think in terms of: should we keep a virus that we've completely controlled? Should we keep it for future reference in the lab, or should we just be happy if it disappears and never comes back? And I know that's a really easy and cheap analogy, but that's one of the places where I would be tempted not even to keep it under wraps in the lab, but basically to not even start the programming. But I think, from a philosophical point of view, that throws up a lot of questions about how you can argue to be sex positive on these kinds of practices but then draw a very clear line with these other practices. Does that not then basically compromise the sex-positivity message in the first place? But again, I think you two are much more qualified to speak on that. What do you think about that?

Stefani Goerlich: Well, first of all, I want to thank you for actually saying "I don't know." I think a lot of people these days are really scared to admit the limits of their knowledge. We feel the need to pontificate on a subject whenever it's thrown at us. And I really respect somebody who says, that's outside of my wheelhouse and I'm not even going to venture. I think it's an incredibly complicated question. I think that the people who are looking for a universal "this is helpful" or "this is harmful" answer are probably misguided, because in medicine and in mental health, just because three people have the same diagnosis doesn't mean that they're going to have the same progression of their disorder, doesn't mean they're going to have the same response to treatment. Every case, whether it's cancer or compulsive behavior, is unique to the individual. So I think anybody who blanketly argues that we can provide technologies that will mitigate all harm is probably misguided. And I think the people who say that if you afford any sort of technological intervention, all you're doing is validating problematic behaviors, are also probably a little hyperbolic. I like your instinct that, until we figure this out, we don't code it, we don't build it, we leave the technology alone. Because one of my big frustration points with the world of tech, and Wolf, as the household's resident technologist, has heard me scream and yell about this, is that I feel like right now we build so many things because we can, without stopping to think about whether or not we should. And I know that sounds like, I don't know, a line in a sci-fi movie trailer. It's like the Luddite psychologist railing against technology, but I actually really like technology. I just think that we need people like you to guide the process, and to have people who are trained in critical thinking and in analysis, in turning over an idea or a piece of machinery in every possible way, and in really thinking about the diversity of ways in which it could be used and the diversity of people who are probably going to encounter it, before we just go, "it'll probably be fine," and throw something out there. So I'm with you. I think I asked a question I knew didn't have a clear answer. But how often do you get a philosopher on your podcast? You've got to ask the big questions. It's what you get paid the big bucks to answer.

Ezio Di Nucci: Absolutely, go for it, but it is Denmark, so there are no big bucks.
Can I maybe make an analogy? Because I think what you were saying, Stefani, reminded me of something else that my group works on, which is reproduction. So we work on ecto-gestation. Ecto-gestation would be the idea that human bodies would no longer need to gestate, that the whole gestation process could be outsourced to machines. And I mean, we can do another podcast on this; I don't want to throw us off track. But sometimes my group gets the criticism of, basically, why spend, you know, taxpayers' money, or private money, to research this thing that is so far in the future. But I think, as you were saying, this is the time to talk and think about it, and also to decide, you know, how much do we want to go further with the development of these practices? I mean, emancipating gestation from bodies has some potential; there are also some obvious critical points. But I bring it up because I think when we talk about sex robots, you know, independently of how clearly we define them in our conversation, we're still talking about something in the far future, especially when we're talking about, you know, sort of habits and consent practices and things like that. That's why I compare it to ecto-gestation: something that doesn't exist, but that we can absolutely conceptualize, and that would disrupt our oppressive practices just the same way in which, hopefully, sex robots can disrupt oppressive sexual practices. So yes, like you two are doing, let's have the conversation now and see what happens.

Wolf Goerlich: When we parse out the benefits and negatives of sex robots, I think there are a couple of things in there. One is: is interacting with a non-human entity going to change how I interact with humans? And the second part of that is: is interacting with a non-human entity going to allow me to perhaps dehumanize people? So in other words, does working with the technology cause me to behave differently when I see someone as a person? And conversely, does working with technology make me think of other people as other? And of course, both of those lead to some negative outcomes. The Westworld analogy is a really clever one, because a lot of the excitement of that park was going in and having consent violations and being incredibly brutal to these robots. But more recently, more realistically, we've had hologram companions, we've got Replika. There are a lot of different, very rudimentary AIs being built. And we've seen sex robots being brought to conventions and very quickly being physically abused, to the point of falling apart. So as we move to our final minutes here, I was wondering if you could give us some direction. Is there hope in this technology? Are there concerns with this technology? Generally, where do you feel we're headed?

Ezio Di Nucci: So I think you're actually right to point us, towards the end, in that direction, because I think probably the more interesting side of things today, and I don't say that lightly, because it goes in the direction of a certain company having made huge investments there recently and then laid off thousands of people, is more virtual reality than sex robots. So software rather than hardware, if I'm allowed, as just a philosopher, to use that very old-fashioned way of thinking. And I think maybe some of the conversation that we had today should be rethought through that distinction.
So if we're talking experiences, if we're talking sexual relations in a purely experiential sense and not necessarily as embodied, how does that change some of the conversations we've had? I think that could be one way of liberating us from some of the problems. If we're talking purely experiential, the programming is going to be not easier, possibly even more difficult, but we don't have to worry about some of the more technical aspects, because we're talking purely virtual reality. But I must say again, and I don't want to take the easy way out here, this sex-and-virtual-reality part is really not something that I've been thinking or writing about. I mean, some of my writings are starting to age, I would say. So I was originally just thinking in terms of these idealized sex robots of the future, and more recently I've started thinking that maybe we should actually think in terms of the disembodied, the purely experiential. And I don't want that to be misunderstood. I'm not saying that there cannot be any violations of consent or any abuse in a purely virtual setting. Absolutely not. But I'm just thinking in terms of what happens when we disembody sexual experiences. From a purely conceptual and philosophical point of view, it is not clear to me that we would necessarily lose that much with completely disembodied sexual experiences. And in fact, I think it could be quite liberating, again thinking of the technology as something that we can also learn from. By disembodying sexual experiences, we could be able to do things that our bodies cannot do. We would be able to learn things that we might not even think possible. So I think, again, this is a place where it's quite hopeful when we think in the terms set by your podcast, namely in terms of sexuality and technology, because we shouldn't see ourselves as limited by our bodies. Absolutely. And again, I'm not an expert here. I don't want to say that there are no sexual experiences that are ruled out in a fully disembodied version of them; it is empirically possible that not every sexual experience can be replicated in its disembodied version. I'm just thinking that maybe that is actually the way to go, independently of Meta's struggles.

Stefani Goerlich: This has been a really fascinating conversation. I love the chats where we get to ask each other a whole bunch of questions and everybody walks away going, "I don't know, but let's keep talking about it." And I think that is what we hope everybody who listens to this will do: keep thinking about these big questions, keep wrestling with the idea of who has a right to touch and to intimacy and to connection, and whether it's a person or a machine that has the right to say no, and what "no" means in a world where so much of how we interact with each other is mediated by devices and algorithms and programs. And I just want to say thank you for exploring some of these cool conversations today, even if we're leaving them unanswered. Thank you, Ezio.

Ezio Di Nucci: Thank you so much for having me. It's been fun.

Stefani Goerlich: Yeah, it's been great. And thank you so much for tuning in to Securing Sexuality, your source for the information you need to protect yourself and your relationships. Securing Sexuality is brought to you by the Bound Together Foundation, a 501(c)(3) nonprofit. From the bedroom to the cloud, we're here to help you navigate safe sex in a digital age.
Wolf Goerlich: Be sure to check out our website, securingsexuality.com, for links to more information about the topics we've discussed here today, as well as our live conference in Detroit. And join us again for more unresolved but intriguing conversations about the intersection of sexuality and technology. Have a great week.