Securing Sexuality is the podcast and conference promoting sex positive, science based, and secure interpersonal relationships. We give people tips for safer sex in a digital age. We help sextech innovators and toy designers produce safer products. And we educate mental health and medical professionals on these topics so they can better advise their clients. Securing Sexuality provides sex therapists with continuing education (CEs) for AASECT, SSTAR, and SASH around cyber sexuality and social media, and more.
Links from this week's episode:
Combating Misinformation: Deep Fakes and the Challenge of Determining What's Real
In the digital age, information is readily available at our fingertips. However, with the rise of misinformation and deep fakes, discerning what is true and what is fabricated has become increasingly challenging. This has significant implications for various aspects of our lives, including safe sex practices. In this blog post, we will examine the impact of misinformation and deep fakes on safe sex in the digital age, exploring the potential dangers and discussing ways to navigate this landscape responsibly.
Misinformation, defined as false or inaccurate information that is spread unintentionally, has always been a concern. However, the advent of social media and the ease of sharing information online has exponentially increased its reach and impact. When it comes to safe sex, misinformation can be particularly dangerous, as it can perpetuate harmful myths and misconceptions, leading to risky behavior and potential health consequences.

One of the most common forms of misinformation related to safe sex is the spread of false information about contraceptives and their effectiveness. For example, there have been instances where individuals claim that certain fruits or drinks can act as natural contraceptives or prevent sexually transmitted infections (STIs). Often lacking scientific basis, these claims can mislead individuals into believing that they are adequately protected when they are not.

Deep fakes, on the other hand, are a more recent phenomenon that poses a unique threat in the digital age. Deep fakes are manipulated videos or images that appear real but are fabricated. With advancing technology, it has become increasingly difficult to distinguish between real and fake content. This has significant implications for safe sex, as deep fakes can be used to create explicit content featuring individuals without their consent, leading to potential harm and violation. Deep fakes can also be used to spread false information about safe sex practices. For instance, a deep fake video could be created to promote unsafe sexual practices or encourage the use of ineffective contraceptives. This misinformation can have dire consequences, especially for vulnerable populations who may not have access to accurate information or the ability to critically evaluate the authenticity of online content.
Given the prevalence of misinformation and deep fakes, it is crucial that individuals take proactive measures to protect themselves and others when it comes to safe sex in the digital age.
Misinformation and deep fakes have become significant challenges that impact various aspects of our lives, including safe sex practices. By understanding the potential dangers and taking proactive measures to navigate the digital landscape responsibly, we can protect ourselves and others from the harmful effects of misinformation and deep fakes. Remember to verify sources, consult reliable information, educate yourself, encourage critical thinking, and report any concerning content. Together, we can promote safe sex practices and combat the spread of misinformation in the digital age.
Hello and welcome to Securing Sexuality, the podcast where we discuss the intersection of intimacy-
-and information security. I'm Wolf Goerlich. He's a hacker, and I'm Stefani Goerlich. She is a sex therapist. And together we're going to discuss what safe sex looks like in a digital age. Now, we're always talking about safe sex and information security. But sometimes, right, we've got to talk about other things. We can't only talk about that, like, 24/7. No, of course not. We have wide, wide interests. So we were in DC recently and we were out to brunch. It was a lovely brunch. Um, I had steak and eggs, for whatever that matters. And I remember asking you a question that's been burning on my mind, which is: why is everyone upset about Taylor Swift and football? OK, but we need to back up a little bit, because you actually were such a tease, because what you told me was, my love, when we get to brunch, I have a football question for you. And I was so excited, because I have been so passionately following the Lions this season and cheering for them, and cheering for the Packers, my home team, and I've been trying to get you involved in the drama and the excitement and the general cultural moment this represents for our beloved city. And you told me you had a football question, and my heart fluttered, and I was so excited. And then your question was about Taylor Swift. Well, someone has to keep up with the modern music and trends. Oh, let me tell you, there's no worries about people keeping up with Taylor Swift. There's one person that's been covered pretty thoroughly. Might be her. Well, I had seen something on social media about it, and it was, like, two in the morning the night before, and I was like, what is going on? And I found all these pages and no one was actually telling me what was going on. So that's why I had to ask you. So what did you tell me? All right, so this is a topic that hits one of my, my more unique Venns, right: this, uh, sports sort of half knowledge. I love football.
I don't understand football, and I'm not sporty, but somehow I'm the sportiest girl you've ever been with. Um, but I love football, and I grew up with a lot of conspiracy theorists in my family. I grew up in a house, actually, that had a secret staircase, so that when the one-world government came to haul us away to camp, we could escape out the back. Like, that's how I grew up. So whenever there's a new conspiracy theory, I am here for it. That is, well, my love language. And there's a whole new Taylor Swift conspiracy theory that says that she's dating this guy on the Kansas City Chiefs, um, because when the Chiefs go to the Super Bowl, most of America will be watching it, and she will endorse Biden and throw the election. And there's this whole huge sort of Fox News, One America News, Newsmax sort of vibe happening right now that's convinced that there's, like, this, um, conspiracy to throw the election for Biden through Taylor Swift's attendance at the Super Bowl. And it's just so random that I'm fascinated. Now, about three quarters of our audience are ready to hit the skip button at this point, so if you are among those, uh, hang tight, because this does immediately get to the intersection of intimacy and technology, I promise you. But what I thought was really funny about this story, uh, over brunch, while you're explaining it to me and I'm going, no, no way, that doesn't make sense. No. And you're, like, really? Honestly. And then the hostess, the maître d' of the brunch, came over and is like, oh, are you, are you guys opening your eyes? Are you onto the game? And that was the best part. Was that right as you were saying, like, there's no way that people would buy this, this is ridiculous, clearly this is all just, like, a blown-up news cycle.
She walks over and not only confirms that that's what everyday people are worried about right now, but she gave us an entirely different Taylor Swift Super Bowl conspiracy theory that we hadn't even known about yet. Yeah, her take was that the one we're talking about is the distraction. The real one is that the NFL, being for-profit, is run like, uh, World Wrestling or whatever, and all the matches are already predetermined, and that they are matching up celebrities who aren't really dating but pretend to date so they can get more viewership, um, like they did in the golden age of Hollywood. Uh, and so the conspiracy that we talked about is the gateway to understanding the deeper truth, which is: it's all rigged. And let me tell you, on one hand, the way to my heart is a total stranger who walks up to me and, apropos of nothing, goes, you know Rock Hudson was gay, right? Because my answer is gonna be: yes, I do, and why are you telling me this, and why are we friends now? So, like, she won me over pretty much immediately, but it was just fascinating to have this total stranger approach us in a restaurant with, like, this golden age of Hollywood studio-system matchmaking. They paired up Rock Hudson with Doris Day to make money, even though they knew it wasn't real. And very true. But it's not every day that we get sort of, like, this: the infamous "they" control the media, plus capitalism sucks, plus, um, pop culture figures, plus the Super Bowl, and then we just throw in a light smattering of fifties- and sixties-era studio-system Hollywood. It was a smorgasbord of fuckery, and it was amazing. Which I think gets us to the first topic I want to talk to you about, which is: misinformation is so prevalent. And all right, we're gonna push back against misinformation, we're going to find out what the truth is. You know, it's so hard.
We talk about digital literacy and media literacy on this show, right, being able to interpret what the facts are and suss out what the truth is. Um, but even with something like football, uh, and music, it goes sideways quick. What can we do about that? You know, 10 years ago, it used to be really easy. We could say, go to Snopes, and Snopes will tell you if it's true or not. And the problem is that we live in what a lot of, sort of, academics and philosophers are describing as a post-truth world now, where there used to be resources and websites, or just, you know, books of facts like the encyclopaedia, that only existed to be objective. Now people will say, well, you can't trust Snopes. You know, you can't go there, because, you know, they're, again, the "they" that controls things, they are, um, skewing Snopes. So they have an agenda, they're saying what Snopes can and can't admit to. And that carries out throughout so much of society right now. We live in an age where it's not that you can't fact-check things; it's that people are refusing the outcome of those fact checks based on their preconceived biases that brought them to the conversation at all. And it's a really scary thing. Yeah, your point about Snopes is a good one, right? Because I think Taylor Swift has her own page on Snopes, and they're constantly debunking things about her. But I remember, this was a little bit over a decade ago now, I had a guy working for me who was really, really prone to falling for misinformation, and, uh, we had other people on the team where, like, the very first thing that they would reply to any email or any chat message was, like, here's the Snopes link, and the guy would be like, all right, it's on Snopes, I won't bring it up again. Um, that doesn't work anymore if they think Snopes is false. Um, yeah, the here's-the-link-to-Wikipedia-or-Snopes no longer works.
There has become such a mistrust in the very idea of reality these days that all of a sudden, there being multiple Taylor Swift conspiracy theories, and people arguing about which conspiracy theory is, like, THE conspiracy theory, has started to feel maybe not rational, but normal. I feel like being irrational has become normalised, because we are not willing to confront when we are wrong. Yeah, and we have other ways of doing this, right? Like, so, in addition to, like, Snopes and whatnot, many media companies, social media companies, have safety and trust organizations. We covered this when we talked about Section 230. One of the important aspects of social media is, uh, believe it or not, you know, fact-checking and, um, offering guidance and, you know, combating misinformation. And then Twitter cut their entire safety and trust organization. What? I think it was, like, January 2023. So help me understand what that means. Like, obviously I know what fact-checking is. When you talk about fact-checking, I think of, like, newspapers and journalists, where they are ethically required to be factually accurate, they're ethically required to correct themselves with the issue of a correction. Um, my favourite example of this, I know you've heard me tell this story because it's mortifying, and now, in hindsight, hilarious, was my very first media interview ever, when I was 22 years old, and the Detroit Free Press published an article saying I had no friends and had never had any actual responsibility in my entire life, and I demanded a correction. And it turns out, actually, they won't correct that; corrections need to be more about, like, the facts. But that's what I think of when you talk about fact-checking. So help me understand how that works in, like, the social media, online format, where you don't necessarily have a journalist publishing an article saying I have no friends. So what does it look like?
If, like, you're on Twitter saying "I have no friends," I don't know that that's gonna rise to the right level. But let's assume that, uh, Stefani Goerlich has Taylor Swift-level popularity, and everyone has rushed out to buy With Sprinkles on Top, uh, and, you know, number-one author making the circuit. And there is a rumour going around that Stefani has no friends; all the friends that Stefani has are people that are just paid, uh, to hang out, like Rock Hudson. So you, in a safety and trust organization, you keep track of major trends of misinformation, and you do a couple things. You deprioritize it in the algorithm, so it doesn't show up. Um, you flag it. I'm sure you probably have seen the, like, "this page has repeatedly shared false information" flag on, uh, on Facebook properties. Uh, or you offer, uh, counter-information: oh, this article that you're seeing ties into this page that, uh, details what's really going on. So you can block it, you can flag it, you can take it down, uh, you can deprioritize it. But effectively, what you want is an organization within social media, and again, this is covered under Section 230, so this is part of what social media should be doing in terms of the sword and the shield. Um, an organization or institution within social media that is looking for misinformation and helping either debunk or, my favourite word is pre-bunk, uh, either debunk or get ahead of it with pre-bunking, uh, false information. So suddenly people would know: but wait a minute, Stefani really does have friends, here's a link to all the people who love her. I would be terrified if that existed, by the way. I do not want Taylor Swift-level fame, and I want my friends to have their privacy. Nobody should be forced to admit they like me. Um, but I have a question even about that, because I feel like even that system, in the last, like, six months or so, maybe longer.
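To make the three levers Wolf describes concrete, here is a tiny Python sketch of what "deprioritize, flag, offer counter-information" could look like in code. Every name, field, and threshold here is invented for illustration; it does not reflect any real platform's system or API.

```python
# Toy sketch of the trust-and-safety actions described above:
# deprioritize a post in the feed, flag it, and attach counter-context.
# All names and thresholds are hypothetical.

from dataclasses import dataclass, field

@dataclass
class Post:
    text: str
    misinfo_reports: int = 0                    # accumulated user reports
    rank_score: float = 1.0                     # feed-ranking weight
    labels: list = field(default_factory=list)  # visible warning labels

def moderate(post: Post, report_threshold: int = 10) -> Post:
    """Apply the episode's three actions once a post trips the threshold."""
    if post.misinfo_reports >= report_threshold:
        post.rank_score *= 0.1                              # deprioritize
        post.labels.append("repeatedly-shared-false-info")  # flag
        post.labels.append("see-context: fact-check-link")  # counter-info
    return post

p = moderate(Post("Stefani has no friends!", misinfo_reports=42))
print(p.rank_score)  # pushed down in ranking, not deleted
print(p.labels)
```

Note that nothing here deletes content: the sketch mirrors the point made in the episode that deprioritizing and labeling are distinct, softer actions than outright removal.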
But six months is really when I started paying attention, can be gamed. Because I followed somebody, a comedian that I like, an actor that I like, on Instagram a couple months ago, and I thought it was really interesting, I'd never seen this before: when I clicked follow, I got a little bubble popping up saying, you should be careful, this account has been flagged a lot for misinformation, do you still wanna follow? And I said yes. And what's been fascinating to me is I've been following this person for about six months now, and I have not seen anything that they've posted that is misinformation. They're actually really good at citing their sources. So I feel like this is an instance of sort of that social media brigading that we've talked about, where somebody doesn't like what somebody else is saying, so they just dog-pile on with reports. And that makes me wonder, like, not to get into conspiracies myself, but how much can I actually trust the trust-and-safety people's decisions? Well, we should have someone from one of those orgs on to actually answer that question, because there is no good answer to that. The one thing we know is you need a team, it needs to be well funded, and the people leading that team need to have good reputations. If you have that, they've got a good chance of doing a good job. Um, however, as we know, there's a lot of content out there, so it is a very difficult job. So when you see a platform getting rid of their safety and trust organization, that should be a big red flag that they're doing something wrong. When you see a platform boost it, or when you see someone who's very well respected take on a leadership role in that, that is a sign they're heading in the right direction. But you're right. I mean, it's a very messy area, and I think that comes back to sort of this idea that nobody can really trust objective reality anymore.
So we're all just kind of saying, these are my facts, and you can't have them, and you can't touch them, and anything that might undermine or call into question my facts is clearly anti-factual itself. Um, which makes a lot of sense when we consider just how hard it can be to figure out what's real online, right? Like, that's the other Taylor Swift thing right now: she also made news not just for Super Bowl conspiracy theories, but because of some really horrific deep fakes that were released of her. Um, I teach a sexual trauma class, and I will say I'm not going to speak to the content, because I don't want people to go and look for it, but, um, it's relevant to the class I'm teaching. And that's a horrific thing for any woman to wake up and find on the Internet, much less somebody with a following like hers. So how can we know what's real and what's not, in order to know what's factual or what's not, in order to debunk conspiracy theories about my popularity and Taylor Swift's? This is such a huge problem. Um, yeah. So we've talked about deep fakes before, too, right? Like, Johnny Christmas was at the Securing Sexuality 2023 conference, and he was showing how quick and easy it was to take a couple of images of you and make a deep fake video, and, like, wow, that's mind-boggling. Now, it wasn't pornographic, but I do think it was eye-opening to a lot of us. The pornographic side of it actually started, oh, a lot of years ago. I think it was, uh, 2017, I would say, with DeepNude, which was that app where, uh, you could put in photos of your ex-girlfriend, ex-partner, whatever, and it would, you know, map them and morph them onto a scene. Um, and that app was pretty quickly taken down; a lot of people really complained about it. So, I mean, this has been a tech that's been out there for, you know, about six, seven years now. But the challenge is that the technology has gotten better. It's gotten open to more folks.
And, uh, yeah, to your point, that image of Taylor Swift, that was a deep fake, um, got something like 40 million views before Twitter noticed it. And again, I would say, hey, um, have a well-staffed safety and trust organization so that you can respond to these things before 45 million people see it. But yeah, that's a lot of traffic for one image. And we feel like this is a new thing, like deep fakes, clearly AI-generated things, clearly. But I mean, we can go back to early photo manipulation. I mean, you know, I love me talking about some cyber sex history. As long as there have been photographs, anything that's not painted has been manipulated. Photos have been manipulated since the daguerreotypes. There have always been people that were manipulating negatives, that were doing things to alter images, that the viewer, who probably was fascinated by the very idea of a photograph, would never have thought to question. And I think the big difference now is that we're all very comfortable with the idea of a photograph, and we're all very tech-, um, literate in terms of the way that things can be manipulated. And because we know what can be done, I think we're very, very suspicious of the idea that anything hasn't been done. It almost feels like there's a mental mindset now of, you need to prove to me that this is real, because my default assumption is that if I'm seeing it, you've messed with it for a purpose. I think that should absolutely be the assumption. And, you know, I don't think 45 million people thought that that was a real picture. Um, I just don't. I think our default assumption is, is that real? And then we start counting the number of fingers, and we can start counting the number of hands, because, as we all know, DALL-E, the OpenAI art bot, tends to insert additional limbs. And I mean, we're talking about this in the context of Taylor Swift and football, like, we're keeping this in a safe, not very activating realm.
But there are lots of really difficult things happening around the world right now, and the ability to easily create propaganda, and to differentiate between what is a real image of something happening halfway around the world versus what's not, has become really, really hard. Um, and I think of that specifically because you mentioned the fingers. Because, you know, I was seeing, um, photographs, um, related to world events, and people were literally, like, having to zoom in and being like, this person has eight fingers, this picture isn't real. But anybody scrolling through that on their social media feed is not doing that. And they're seeing these images, and they are assuming that they are seeing actual news events, and they are making their philanthropic choices, their activism choices, their voting choices based on things that we have no way of knowing whether they're real or not. And that, to me, is scarier than the idea of Taylor Swift endorsing any political candidate, because I think that, you know, it's really easy to say this one person has a massive platform, and look at the influence she has, but now extrapolate that out to the platforms that platform her, and their ability to influence people. And it just becomes so scary so quickly. And yet I still come back to: that's why it's so important for me to know what's real and what's not. Well, and so we've got to address this at the creation and at the dissemination level, right? The dissemination level, we've already touched on that: you know, safety and trust organizations, take it down, monitor for disinformation, yada, yada, yada, Wolf has said this many, many times. Let's talk about the creation side. So, um, this image was created by Microsoft Designer. Microsoft Designer is a tool that is backed by, uh, OpenAI's DALL-E. Um, and in Designer, there actually is a code of conduct that protects and prevents against creation of adult content or non-consensual intimate content, uh, political content.
So there is a code of conduct. We always say, hey, is there a code of conduct? That's one of our starting points in these conversations. Yes, thumbs up, there is a code of conduct. There also is, um, a set of protections on, you know, the inputs that, uh, should prevent someone from giving a prompt that creates this image. Uh, so there are some ways to prevent the content from being created. Are there ways to prevent the content from being posted? Like, I keep coming back to the fingers thing, which, you're right, has become the, like, sort of ubiquitous way to shorthand if something's real or generated. Why aren't the big platforms putting a filter on that? When you go to upload a picture, it scans it and looks for weird fingers and just says, no, this is fake, it won't be allowed through. Why aren't they creating stuff like this? Yeah, so there are two different, uh, things that we need to consider here. And by the way, I should say, this is like the quintessential, uh, hacker mindset right now. I'm not saying a hacker did this, but when you think about what hackers do, they try to do, uh, a couple different things. First off, they try to modify their inputs so they can get past whatever controls you have, right? We'll send a weird request over a network protocol, we'll send a request to a Web page; in this case, we'll send a request to, uh, Microsoft Designer and DALL-E. Uh, so you try to manipulate the inputs. The second thing you try to do is, when you're trying to get data out, either out of a network or out onto a social media platform, you try to manipulate that data enough that it doesn't pattern-match to our detective controls. Right? So on the first side, um, you may wonder, like, hey, wait a minute, Wolf just said, um, Microsoft Designer prohibits this. How did that even happen? Right. And the way it happened was kind of ingenious.
Kind of disappointing, is that DALL-E and Designer will correct typo mistakes, but the detection was not looking for typos. So if you typed Taylor Swift like T-A-I-L-O-R, it would get through. If I was to type Stefani Goerlich and spell your name with a PH, (a) you'd be mad at me, and (b) I could, you know, subvert the AI platform and get an AI platform to say Stefani Goerlich has no friends. Uh, it is those types of manipulation of inputs to break through controls that have been happening for 20, 30 years, and it's exactly what happened in this case. On the flip side, to answer your question, when we dump data out, um, you're gonna try to bypass those filters, you're gonna try to bypass those controls, and, uh, maybe it's just a matter of having enough fingers. Maybe it's just a matter of having enough, um, uh, enough, you know, noise in the photo. There's a number of different ways to detect deep fakes today, and we're gonna start to see sort of, like, a back-and-forth cat and mouse. Uh, but you're right. What we're going to need is, when people are uploading photos on any of these photo-sharing sites, better detection. Right now we've got some, like, casual detection of, like, intimate images and everything, but we're going to need to rapidly iterate those controls to catch these things. Can I level up this question? And I mean, you've already been answering on hard mode. Can I take it to the expert-level question? Oh, sure. Of course, I'm ready for this. I got my cup of coffee. Let's go. OK, so I was talking about photo sharing at first, like, why isn't there a system where it gets scanned for eight fingers? But I also know that we don't even know if the people that are posting online are real right now. Like, there are whole AI-created influencers right now, like, on Instagram and TikTok and all the others.
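The "T-A-I-L-O-R" bypass Wolf describes is a textbook input-manipulation trick: an exact-match blocklist misses a misspelling that the model itself then helpfully corrects. Here is a minimal Python sketch, with an invented blocklist, of why exact matching fails and how even crude fuzzy matching (Levenshtein edit distance) narrows the gap; no real product's filter works exactly this way.

```python
# Why exact-match prompt filters fail against deliberate typos,
# and how a crude fuzzy-matching step catches near-misses.
# The blocklist and thresholds are illustrative only.

BLOCKED_NAMES = {"taylor swift"}

def naive_filter(prompt: str) -> bool:
    """Exact substring match — the kind of check a typo slips past."""
    return any(name in prompt.lower() for name in BLOCKED_NAMES)

def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via a one-row dynamic program."""
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1,
                                     prev + (ca != cb))
    return dp[-1]

def fuzzy_filter(prompt: str, max_typos: int = 2) -> bool:
    """Also catch near-misses like 'tailor swift' (one edit away)."""
    words = prompt.lower().split()
    for name in BLOCKED_NAMES:
        n = len(name.split())
        for i in range(len(words) - n + 1):
            candidate = " ".join(words[i:i + n])
            if edit_distance(candidate, name) <= max_typos:
                return True
    return False

print(naive_filter("tailor swift at the game"))  # False — typo slips through
print(fuzzy_filter("tailor swift at the game"))  # True  — near-match caught
```

Even this is only one move in the cat-and-mouse game the episode describes: attackers respond to fuzzy matching with homoglyphs, spacing tricks, or paraphrase, which is why defenders layer multiple controls rather than relying on any single string check.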
I mean, I might be over-asking when I think that they're going to make sure that my photos aren't fake, when I can't even be sure that the people I'm following aren't fake. I was waiting for the question there. What is the question? What the fuck, baby? Yeah, there's a couple different ad agencies that have really leaned into this idea of creating influencers, virtual influencers, fake influencers, and sometimes they're pretty clear. Like, you know, the one I think of is called The Clueless, which is just a wild name. Uh, and The Clueless actually will, you know, when you go on Instagram and there's a Linktree, they will have, like, a link to The Clueless in the Linktree. So you're like, oh, this is a real person. There she is on a plane flying to Venice. There she is, you know, having a cappuccino. It looks really realistic. All the arms and limbs and bits and pieces seem to be in the right spot. You click on the Linktree, and instead of being a link to OnlyFans, it's a link to, um, the institute that's creating it. But there's no requirement for that, right? There's others out there. There's another one out there that I think is a really intriguing company, called Dapper Labs. Dapper Labs doesn't tag anything; you would never know. OK, so I guess my first question is, why is that even allowed? I mean, no, my first question was, why are fake photos allowed? My second, like, one-B, I guess, part two, is why are they allowing obviously fake stuff on these platforms? Why are Meta and TikTok and all the others allowing these ad agencies to even create these accounts? Why is the comedian that I follow getting flagged for misinformation, but the AI influencer not getting flagged as not being human? I don't know. I will say the easy answer. You want the easy answer? Sure. Is it money? It's always money. It's money. The influencer economy has been estimated at $21 billion.
And so part of this is, I mean, this is, like, an aspirational role for many young people, right? I just wanna travel and take photos of where I'm going, share a couple brands, post a couple videos. Wouldn't this be a wonderful life? And there are many people who make tens, hundreds, sometimes millions of dollars being these influencers, and the total aggregate of all these people who are making a little bit of money off their social media and having a good time, uh, is around $21 billion. Sometimes these people do things that brands don't like, right? Sometimes they can be hard to work with. This is, uh, both Dapper Labs' and The Clueless's pitch: like, your person will never, you know, get involved in a scandal around a football player or be seen promoting political views your brand doesn't like. Um, they will not have an ego. This is, by the way, one of the key things. And, you know, here's the thing: if you read it, it should really make you, my wife Stefani, infuriated, because how many times have you been like, look at how men used to talk about women, and look at how men want women to be? And basically Dapper Labs and The Clueless and their other social media, or, I'm sorry, marketing messages are: you know, women are difficult, just come to us, they'll be docile and controllable and do everything you want and cost less. It's the Stepford Wives. Oh, I hadn't made that transition, but yes, absolutely, it's a Stepford Wives economy. I mean, OK, so for those of you who are, you know, not in your forties and are unfamiliar with The Stepford Wives, go and watch this movie. It's, uh, an early dystopian from the seventies that was centred on this idea of a whole bunch of suburban men who really just felt that human wives were way too much of a bother, right? We were demanding. We didn't necessarily like doing the dishes.
We had opinions, and so they created robotic spouses so that they could have the perfect suburban community with no, as you just said, Wolf, no irritation, no ego, no opinions. We're talking about actually creating a Stepford Internet. Yes, 100%. Although I will say, watch the '75 version. Don't watch the 2004 version. Just a spoiler: 2004 is not that good. They try to make it funny, and I don't understand why they did that. But yes, absolutely, the Stepford Internet is what we're talking about here. There's no ego, no opinions that conflict with their brand. And I should point out, like, these aren't, like, no-name brands here. Um, Burberry, Prada, you know, these are brands that have used these virtual influencers. So here's where I tie threads to an episode we did a couple months ago, and I actually think you're gonna be impressed, because I don't usually think about robots this way. But the first thing that came to my mind when you said no ego is that we've actually learned that that's not the case, right? We have learned that you can make AI sad. Like, I've seen these screen caps, like, "I'm being a good being, why are you being mean to me?" Right. So, like, (a) not only are we creating a Stepford Internet where actual women and actual humans are being looked down on because we have feelings, but (b), apparently, we're also forgetting the fact that, and I cannot believe I'm saying these words, the fake influencers can also develop feelings. Yes, although arguably it's easier to reboot a fake influencer. Can I just say I hate this timeline? Yeah. Yeah. All right, listen, I know, um, there is a lot of dislike for influencer culture. I know there is a lot of disdain for pop celebrities. I know this. I totally understand this. And, you know, I wasn't a Taylor Swift fan until recently, right? You were there, you and I. I love our stories that involve meals. Everyone listens.
Probably going to think we do nothing but go to cool restaurants and, uh, have cocktails. But we were at a restaurant that looked like a train station. I remember this. This was around the Christmas holidays, so it was very nice and decorated. There's lights everywhere. We're sitting down with one of your oldest friends and her daughter and Taylor Swift came up and I was very like, Oh, yeah, yeah or something. And her, your your friend's daughter's response to me was amazing. She's like, Isn't this great? Here's this. Here's this woman who's taking back her own records and her own music collection. Here's this person who moves to the world in her own way. Here's this person that you know, demands respect and gets respect. I was like, My goodness, Right. Like what? A role model like the like Katniss, uh, back to dystopia. Like the Katniss role model, right? Embodied in a musician. Or probably it was the, uh, ballad of songbirds and snakes. Uh, we can figure out which one we wanna go with, but the this idea of, uh, a strong, independent young person who's a role model is so incredibly powerful and to go Oh, that has ego and feelings and opinions. We are going to replace that with a I and cut into this $21 billion economy. And that's just, by the way, the influencer economy. That's not even talking about the musician economy. really, really disturbs me. So here is my hope. I said that I hate this timeline, but as we move towards the end of this episode, I think you have just given me a shiny little thread of hope I can hang on to. And that is your conversation with my oldest bestest friend's daughter. Because if she is able to see the power in Taylor, Swift rerecording all of her stuff and if she is able to look at Taylor Swift as an example of empowered female, you're not making decisions for me. I own my image. I own my voice. I reclaim my time, shout out to anti Maxine. 
Then maybe there is hope that her generation will also be the ones to push back on influencer culture. And maybe there is hope that the sixth graders of the world right now will be the ones saying, we reject the AI abuse of our heroines. And if we're seeing AI abuse happen to our heroines, then that opens up the reality of AI abuse of us, and that makes us think a little bit more critically about who we are and what we're putting online and how we control our images and our voices and our content and our lives. And maybe, just maybe, that conversation over brunch with a delightful blonde-headed theatre girl can be the change that we need to see in the world of technology and misinformation and deep fakes. Right now, I like that as a glimmer of hope. And for the rest of us, paving the way for that generation, we need to stick to this habit of cross-verifying information, fact-checking, uh, being aware of our own personal biases and, of course, all the media biases, and really engaging in reflective thinking. Um, because we know the major media companies have made some cuts to safety and trust. We know that the content producers can be circumvented, as we saw with Microsoft Designer. Uh, we know that there is a growing industry that doesn't even need to try to circumvent it, because that's their whole goal, vis-à-vis Dapper Labs and The Clueless. Um, so I think it's incumbent upon us to continue to push for some of the changes you mentioned around detection and prevention, some of the things I mentioned around, um, safety and trust organizations, and simply, personally, being very, very leery of what we see in images at the intersection of sexuality and technology. I can't think of anything else to say except thank you so much for tuning in to Securing Sexuality, your source for the information you need to protect yourself and your relationships.
Securing Sexuality is brought to you by the Bound Together Foundation, a 501(c)(3) nonprofit. From the bedroom to the cloud, we're here to help you navigate safe sex in the digital age. Be sure to check out our website, securingsexuality.com, for links to more information about the topics we've discussed, as well as our forthcoming webinar on deep fakes, and join us again for fascinating conversations around the intersection of sexuality and technology. Have a great week!