Securing Sexuality is the podcast and conference promoting sex-positive, science-based, and secure interpersonal relationships. We give people tips for safer sex in a digital age. We help sextech innovators and toy designers produce safer products. And we educate mental health and medical professionals on these topics so they can better advise their clients. Securing Sexuality provides sex therapists with continuing education (CEs) for AASECT, SSTAR, and SASH on cybersexuality, social media, and more.
Links from this week's episode:
Violet Blue - Wikipedia
Intimacy Coordinator - Wikipedia
Safe for work porn - Huggingface
Adobe Voco (YouTube demo)
"George Carlin: I'm Glad I'm Dead" purports to be a comedy special produced by artificial intelligence trained to mimic the legendary stand-up's style. - LA Times
Hikikomori - Wikipedia
18 U.S.C. 2257-2257A - Justice.gov
Spanish Prisoner Scam - Wikipedia

AI-Generated Adult Content and Its Implications
Artificial intelligence (AI) has revolutionized various industries, from healthcare to finance. Yet, its impact on the adult entertainment industry has raised significant ethical and legal concerns. AI-generated pornography, also known as "deepfake" pornography, refers to the use of AI algorithms to superimpose individuals' faces onto explicit content, creating realistic and often non-consensual pornographic material. Here we delve into the ethical and legal implications of AI-generated pornography, highlighting the importance of addressing these issues in order to protect individuals' privacy, consent, and mental well-being.
Understanding AI-Generated Pornography
AI-generated pornography involves the use of deep learning algorithms to manipulate and alter visual content, particularly by superimposing the face of one individual onto another's body. These algorithms analyze vast amounts of data, such as images and videos, to generate convincing and realistic pornographic material that can be shared online. While AI technology has the potential for numerous positive applications, its misuse in generating non-consensual pornography raises significant ethical concerns.
Ethical Implications
Legal Implications
Addressing the Challenges
AI-generated pornography presents significant ethical and legal challenges that must be addressed to protect individuals' privacy, consent, and mental well-being. The non-consensual use of AI algorithms to create explicit content exploits and objectifies individuals, violating their rights and causing psychological harm. Legal reforms, technological solutions, and education campaigns are necessary to combat deepfake pornography and create a safer online environment for all. By prioritizing ethical considerations and implementing robust legal frameworks, society can mitigate the negative impacts of AI-generated pornography and uphold the dignity and well-being of individuals.
Key Concepts:
Hello and welcome to Securing Sexuality. The podcast where we discuss the intersection of intimacy-
-and information security. I'm Wolf Goerlich. He's a hacker. And I'm Stefani Goerlich. She's a sex therapist. And together we're gonna discuss what safe sex looks like in a digital age. Hey, you know what? Can I admit something? Yeah. Sometimes I'm wrong. Can you say that again? Because we're recording, and I wanna capture this moment for posterity. No, I'm actually gonna beep it out when I play this. But do you remember that episode we did a bit ago about Taylor Swift and deepfakes and porn? It was like episode 71. That was one of my favourite episodes. It was so much fun, and we got some good feedback. And some of the feedback was for Wolf. You said block the edits, or block the way people are typing things in, because typos allowed people to create these images, and block how the images are extracted and uploaded. Because I'm thinking about DLP, and you mentioned social media, and people are like, oh, you sweet summer child, there are so much better ways to solve this. All right, well, you've already gotten too techie; you've lost me. So I hope there's somebody who can help me figure this out, because... I just wanted to talk about Taylor Swift, man. So where are we going with this this week? We've got Joshua Marpet on the line. Josh, how's it going? I am good, Wolfy. How are you, sir? I am doing so good. So I know you, and you and I are both faculty at IANS. I know you've had so many opportunities to help companies big and small tackle AI, but for those who don't know you, and I'll probably forget something: you are the executive director of Guarded Risk, a firm devoted to security and compliance for law firms, insurance companies, data processors, and their clients. You're faculty at IANS with me. You're a board member of global BSides and a couple of other BSides, right? Several of them? Yes, yes, absolutely. BSides Delaware, BSides DC. I'm a board member of Skytalks, which is a DEF CON mini-con. I'm a board member of half a dozen other conferences where, really, they just put me on the board. It's weird, but it doesn't make any money and I don't really do much with them, so it's not a big deal. I started a 501(c)(6) to help companies understand risk management, but that sort of fell apart; it's dormant right now. I advise multiple startups. I'm working with two right now, and we'll probably work with a few more in the near future. And with two kids, I occasionally sleep. Sleep is overrated. No, no, sleep is wonderful. I wish I got more of it, but what are you gonna do? Now I think we need to just add you onto the board of Securing Sexuality, the conference. You'll show up one day and be like, oh, I guess I agreed to that. You know, it's interesting, but I happen to be very proud of something I have done. I was on the board at BSides Las Vegas, and we brought in Violet Blue. I'm sure you don't know Violet Blue. Absolutely. She was rejected from a conference because of a talk she gave about drugs and sex, and I was very proud that I pushed to have her brought to BSides Las Vegas to give her talk there. And I managed to wrangle the job of introducing her talk, and I'm very, very proud of that. So this is actually a topic that's near and dear to my heart and has been for many, many years. I think it's something that goes right along with security, to be very honest with you. It is a measure of communication. It is a measure of consent.
It is a measure of understanding, sometimes, very interesting situations that are outside the boundaries of what we term or perceive as normal behaviour. And that's a whole conversation to have all on its own. And yet it is one of the most stigmatised things that people do, just like breaking into banks legally. So it's kind of fun. I mean, I make my career out of helping people parse out what's normal and what's not, so you've won me over. Let's do this. Let's talk. So there's this little thing called AI, or actually LLMs, large language models. We can start wherever you wanna start, but one of the things they've been abused for is creating pornographic imagery, or creating images that take celebrities and put them in positions they shouldn't be in, up to and including child material as well. Let's start with the first couple of topics, which are much more palatable. The first couple of topics are the things you're referring to: deepfakes, AI image generation, AIs that are building pornographic images. There are several AIs that are public on the Internet that are just for pornographic images. As a matter of fact, members of that AI community hold contests to see if they can get an image that's exactly to a certain standard, or the most insane image they can think of, or various other contests. There are ways to get around the security features of some of the public image-generation AIs to get pornographic imagery, and there are ways to get around the security standards of the text AIs to get pornographic stories. OK, so which of those is acceptable, and which of those isn't? Is any of that acceptable? Now, mind you, we're not talking about the child material; leave that aside for the moment. But is an image acceptable and the story isn't, or vice versa? Is neither acceptable? Are both? I mean, do you have a consenting AI? That was kind of a joke, but you get the idea. I mean, we've had philosophers on to talk about whether or not robots can consent, so, not gonna lie, I've kind of lost any sense of humour or sarcasm when it comes to what we're asking robots to consent to and when we no longer need to ask them, because these are real-life scenarios now. Honestly, until you said that, it had never occurred to me that we might need an AI's consent to tell or write a dirty story, but the idea that we would need a human being's consent in order to create fake porn of them makes perfect sense to me. So how are people drawing those lines? They're not. I mean, let's be honest. There are websites right now that look for celebrity nudity in movies, you know, when boobs are shown or ass is shown or whatever, full frontal in some cases, as part of the movie, and they extract that scene. There are people who will put it in Photoshop and raise the lighting artificially so that you can see what was tastefully hidden by darkness. Well, it's not tastefully hidden anymore. And is that legit? And the answer is, well, they put themselves on film. Metaphorically; it's digital now, but you get the idea. Is that legitimate? I don't think so. It feels wrong, but I don't know that I have a cogent or coherent argument against that kind of thing. I don't know. Wolf, what do you think? Let's ask him.
Oh, I have no idea what's right or wrong anymore, but I know that when you're on a movie set, the performers are actually negotiating that: what is tastefully gonna be hidden by darkness, what are the things that I'm wearing? And Stefani, what is that called again? There's, like, a whole coordinator. Intimacy coordinator. Yeah, I actually study this a bit. I'm doing intimacy coordinator training later this year. Oh, really? Yeah. There's a couple of them on TikTok. I'll see if I can find them and send it. The questions they raise are out of this world. Cool. You know, "you're not comfortable doing that, how do we simulate that effectively?" or "what do you wear during this?" And I'm like, wow, that's so cool; I never thought of it. It's neat. It really is. I don't actually want to do intimacy coordinating work, necessarily, but I think the negotiation skills you're talking about, and the language that profession is creating around sex, are so useful to me as a sex therapist, especially as I'm doing more expert witness work and helping people wrap their heads around what negotiation looks like in these sorts of contexts. So yeah, it's on my to-do list to train as an intimacy coordinator later this year. Nice, nice. Yeah, look, negotiation is implicit in every relationship, whether you're straight, gay, poly, queer in some form or fashion, whatever. If you're not negotiating with your partner, you're not doing it right, OK? It's part of communication. Oh, my God, communication is important in a relationship. Who knew? OK, and so do we have a duty, a right, a need to communicate with the person whose porn we're deepfaking, with the robot we're asking to deepfake it, with the recipient we're telling, "take a look at this"? Is there a negotiation or communication there? Typically in the past, in Internet-based archives of, you know, crap... now I'm having a brain fart... the old archives. Wolf, when you go to vampires dot bite dot whatever dot whatever. What was that? The binaries. alt.binaries. Usenet, Usenet. Thank you, brain. Sorry. You know, Usenet. There were subgroups on Usenet that were full of nothing but porn. OK, and this is back when people used dial-up, so it would take you 10 minutes to download that piece of... basically, effectively, to the person at that end, it's a piece of meat that you're downloading, because you're downloading image after image after image. And it's like, OK, that's a bit weird. And is it the same as buying a Playboy? Look, this is a philosophical discussion now; let's get back to the technology. So I don't even know, but this is something that's been interesting and that I've been thinking about recently, and it's an ongoing thing. Part of the problem is that I have people who come and talk to me because, for whatever reason, I'm seen as one of the father figures of infosec. And so I've, probably not very professionally, helped people understand some things about themselves and about their relationships. I'm one of the most vanilla people you'll ever meet. My wife and I have two kids, and we're living the middle-class life, you know, with our house in the suburbs and all that kind of crap.
But I've talked to, and had great conversations with, multiple polycules who are like, "why are we having this problem?" And I'm like, did you talk to each other about it? No. Maybe that's the problem. It's fascinating how similar all of these issues are in every relationship, of every stripe, for every person. It all boils down to communication and trust and honesty and honour and discussion. Well, first off, I love the idea that all these molecules want you. That's what I heard; that was my takeaway from this. But if someone is working with an intimacy coordinator and negotiating what they're comfortable with or what not to get on screen, at least they've had that negotiation. They had that consent. Now, we can talk about how some of that consent and negotiation gets removed if you've photo-manipulated the result, but at least that conversation is happening. Similarly, if models are posing for photos, I know we've had conversations with producers of that content saying that sometimes those negotiations are very clear and sometimes those boundaries are very fuzzy, but at least there is the opportunity to consent. But by the time we get to "I'm gonna ask one of these image manipulation tools to create a picture of Josh Marpet and his sixteen polycules"... No, that's gonna be the cover art of this episode. But by the time we get to that, you as the human subject do not have the opportunity to be engaged in that consent, or to have thoughtful communication back to the artist or the medium which is producing the content. True. But then, on the other hand, if somebody very carefully did a watercolour of, you know, Wolf and his sixteen molecules, are they asking for your consent to do that? And would anybody deny that it's potentially art? So there's a question of: is it art, and does art transcend that desire or need for consent? It doesn't transcend it, but there are a lot of discussions around this. And by the way, now we're gonna switch over to the nasty stuff. I saw an article very recently about... ah, good Lord, what's her name... Brooke Shields. Are you familiar with the photo set that is very controversial about her? Yeah. She was allowed by her mother to pose for... you know what, I'm not even gonna say it that way, because that implies that she had a choice in it; she didn't. Playboy was allowed to photograph Brooke Shields at 10 for a full, traditional Playboy photo shoot and spread. So Brooke Shields is very well known as a victim of child exploitation. There was a lot of her early work that her mother signed off on that, as an adult, she has said she would never have wanted to do. But yes, absolutely. And those Playboy images have been legally defined as art and not as child exploitation, right? She actually went to court to get them pulled from everywhere, and the court said no, your mom signed off, therefore there they are. And that's just horrifying. Sure. I don't understand how that makes legal sense, because if any other parent were to create exploitative materials of their child, the fact that it was allowed by the parent would not make it art. It would still be... frankly, most child exploitation happens within the family system, and it's never held up as art. It is prosecuted. Yeah. So then the question is, since in traditional media we've already defined some child-exploitative materials as art,
what do we do about stuff that's procedurally generated, code-generated, and doesn't show any specific child, just a generic child, if you will? Is that illegal? Now, my response is yes, but in reality I don't know what the law says. OK, that's a horrifying question, and one I don't ever want to... it's just nasty. Irrespective of that, how we stop it from happening on a technical level is a whole different question. But I wanted to point out, and that's why I brought this up, that there actually is a real-life example of that in the past, far before AI and LLMs ever came out. I will say, on the legal side, around the time we're recording this in 2024, and I still can't believe it's 2024, the European Union is actually working through criminalising AI-generated content of children. So there are laws coming, and as we all know, the EU does it, and then maybe a decade later the US catches up. You are so optimistic. I get that a lot. Actually, the opposite was true in the US. We actually tried to head this off in the late nineties. We passed the Child Pornography Prevention Act in 1996, and in 2002 the Supreme Court struck down the provisions that criminalised virtual child exploitation material, which included computer-generated images, because they were computer-generated. They said that because no children were harmed in the production, that production qualified as free speech. So the EU, 20 years later, is actually ahead of where we were 20 years ago, when we said, as long as it's virtual, apparently it's legal. No comment. Horrified, but no comment. Yeah. Hey, like I said, we were gonna keep it non-technical for a little while, Steph. But here's the thing. Assume that I have an AI that does image generation, and assume I'm selling time on it. You want to generate illustrations for your book? No problem, go for it. You wanna generate images for a poem you're writing for your fiancé? Great, go to town. You wanna generate child sexual abuse material? What the hell is wrong with you? But how do I stop that? Because, let's be honest, there are plenty of people who look younger, OK? There's actually a court case about that. There was a guy who travelled back from Brazil, I think, South America somewhere, and had a DVD in his luggage, and they prosecuted him for child sexual abuse material. And the star of that DVD had to show up in court and say, no, no, no, here's my ID, I'm 22, or whatever she was; she was legal. And they went, you're joking? They had a paediatrician examine the recording; they went and did the right things. So, OK, that's awkward. And then there's the fact that, yes, it's procedurally generated. And then there's the fact that, you know, one of my exes, swear to God, she looked 16 in her thirties. So anyway, it's tough. How do you deal with that? And by the way, what is child sexual abuse material? I have an 18-month-old baby. There are plenty of pictures of me holding her where you can just see the top of her butt or whatever, because she's wearing diapers and they sag. Is that CSAM, you know? But if I had a system that says, it's a small child and there's any kind of skin showing, shut it down, what about if it's just her face? You know, I'm holding her so she's facing forward, and you can see her face and she's smiling.
I mean, it's really hard to determine what is CSAM from skin colour, or the fact that there's skin showing, or whatever; from those kinds of signals, it's not as easy as it seems. And you know, Wolf, how many pictures are there of you as a tiny child where you were in a bathtub, or in a play pool out in the front yard, or whatever? Do you know what I mean? No, I grew up Gen X. We have no photos of us. I entered the scene around 12, I'm pretty sure, actually. So did I. My wife often asks me, why are there no pictures of you when you were a kid? And I'm like, it just wasn't a thing our parents really did much. But you get to the point I was alluding to when we were talking about this in the previous episode, which was: you can have good input filtering, like don't use these words, don't request these things. But the history of computer hacking is finding ways around those filters. We have a long history of subverting them, and I think what I heard was that the Taylor Swift one was made possible because they misspelt Taylor, so it got through the filter, and then the LLM went, oh, you must mean Taylor Swift. And so we've got problems on the filter side, where we can enact some guardrails, but they're not gonna be great. And we've got problems on the export side, because think about every DLP and image recognition tool ever, which, to your point, usually has a lot of false positives. And by the way, even if it's not false positives, I mean, have you ever seen safe-for-work porn? This is not a joke. If you go to YouTube and search "safe for work porn", there is a compilation of porn that somebody took and changed into safe-for-work footage. So they have, like, eating an ear of corn instead of... OK. They have pumping quarters furiously into a pinball machine rather than... OK. Riding a horse. Pinball machines are very sexy, though. I mean, those were like the original illicit technology. But wow, I did not know about this. Is it something that's very old? It's probably 20 years old at this point. Jesus, I'm old. But is it something where you could say, oh, this is totally not porn? Absolutely. But if you said, you know, Taylor Swift riding a horse Lady Godiva style, which means naked, just to be clear, down a bumpy road, could you then photo-edit it to take the horse out? Hm? Or something like that. I mean, I'm making this up on the fly, but do you get the idea? Is it possible to make things close enough? Yeah, of course. So we're not gonna stop it, and we're not gonna filter it.
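To make the input-filtering point above concrete, here is a minimal sketch of the kind of naive denylist check a prompt filter might apply, and why a trivial misspelling slips past it. The function name and word list are hypothetical illustrations, not taken from any real product discussed in the episode.

```python
# A toy denylist prompt filter, sketched to illustrate the bypass described above.
# The blocked term and the function are hypothetical examples.

DENYLIST = {"taylor swift"}  # hypothetical blocked subject

def is_blocked(prompt: str) -> bool:
    """Return True if the prompt contains a denylisted term (naive substring check)."""
    text = prompt.lower()
    return any(term in text for term in DENYLIST)

print(is_blocked("Taylor Swift riding a horse"))  # True: caught by the exact-match rule
print(is_blocked("Tayl0r Swft riding a horse"))   # False: a small misspelling slips through,
                                                  # yet a large model may still infer the subject
```

More robust guardrails normalise and fuzzy-match prompts, or classify the generated output rather than the input, but as the conversation notes, determined users keep finding new phrasings that the rules never anticipated.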
There's another item that I wanna throw into this mix, because it was brought up in a previous conversation we had, which is advertising, right? Certain content cannot be advertised on Facebook or Instagram; it's blocked. So they show a picture of a cactus instead of something else and allude to "buy these pills". What this person was alluding to is that the choices we make with the filters also create conditions where people may not see themselves recognised or reflected. They may not feel accepted. She was using the example that she could ask for a lot of male parts, but anytime she asked for a female part, even if it was for a textbook, the app would shut it down, like, nope, can't do that. And so I think we also need to acknowledge that our own biases and our own beliefs about what's normal and what's not get reflected and coded into these filters. Yeah, what if, you know, your staff is racist? Maybe you can show all Black porn, African American porn, but you can't show any white porn, or generate it, rather. There's racism, sexism, ageism, any -ism you can think of that could be reflected in those filters and embodied in those filters. Wow, that was an interesting choice of terms, but, excuse me, I mean, the problem is, it's like the old saying: whenever you make something foolproof, God makes a better fool. And if you build a filter... sorry, go ahead, Steph. I've never heard that before, and I was just saying I love it. That is a brilliant statement. It's ancient; I don't even know how old it is. My grandfather told it to me, I think, and my dad used to tell it to me all the time. So as far as I know, that's 100 years old. Feel free to steal it; it's all yours. But it's literally: if you make something that's "perfect", somebody's gonna come around and figure out a way around it. Let me give you an example of filtering gone wrong. A good friend of mine would tell this story, and if he hears this, he'll recognise himself. He was on a pen test, and his team was working around the denylist that the defence team had in place on their web application firewall. OK, this is basically a way to keep people out if they're trying to do certain things, Stefani. And they couldn't get around it; it was really good. So he got on a plane, flew out to the data centre, social-engineered his way into the data centre, pulled the WAF out of the rack, carried it out, and flew home. And he went to the report meeting, and they're like, how was the denylist, how was the WAF? And he's like, it was good; we had a problem with it. Yeah, you couldn't get past it? He goes, no, we could get past it. Here's your WAF. And they went, what? He's like, we couldn't get past it, so I stole the WAF. They're like, how? And he said, then we were able to get past it; then we had no problem getting in. Here's your domain admin, here's all your stuff. And they're like, oh, dear God. So no matter what you do, somebody's gonna figure out a way. I mean, let's be honest, you know, Hugging Face. No, what's Hugging Face? Oh, open up a browser. Go to huggingface.co. It's just huggingface dot co. It's nothing horrible. It's a forum for AI enthusiasts to trade AIs: they trade models, they trade datasets, they trade everything. At the very top there's a button marked Models. Click that, and on the left you'll see a list of tasks that AIs can do. One of them is image generation, text-to-image. You can build your own in minutes, so I don't care what you set up as filters; I'll just build my own. I'm looking at it right now, and the scariest thing for me is text-to-video. So we've reached the age where I really can't believe my own eyes if what I'm looking at is on my computer screen or on my cell phone.
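As a rough illustration of how little setup the Hugging Face route described above requires, here is a minimal text-to-image sketch using the open-source diffusers library. It assumes the diffusers, transformers, and torch packages and a CUDA GPU, and the model ID is simply one publicly hosted example; the point, as Josh makes it, is that once the weights run locally, no hosted platform's prompt filter sits in the way.

```python
# A minimal local text-to-image sketch using Hugging Face's open-source diffusers
# library; assumes `pip install diffusers transformers torch` and a CUDA GPU.
# The model ID below is one publicly hosted example, chosen only for illustration.
import torch
from diffusers import AutoPipelineForText2Image

pipe = AutoPipelineForText2Image.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
).to("cuda")

# Once the weights are local, generation is a single call; no platform-side
# prompt filter sits between the user and the model.
image = pipe(prompt="a watercolor of a lighthouse at dawn").images[0]
image.save("lighthouse.png")
```

The specific model is beside the point; what matters for the episode's argument is that any guardrail enforced only by a hosted service simply does not exist for a model someone downloads and runs themselves.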
In security, we've already started doing this: we're taking all human factors out of authentication. If your CEO loses his or her cell phone on a trip, guess what, you don't get a new cell phone until you show up at an office in person. "Yeah, but you know who I am." I don't care; you don't get a new cell phone until you show up in person. Just the other day, or a week ago, there was the one person who had a Zoom call with four or five of their C-level board members, who instructed them to make a $25 million deposit. They were all live deepfakes. And he made a $25 million deposit into an account that he'd never touched before. And guess what? They were not real. They were all deepfakes. So literally, until I see you in person, it's not real. Which means, "hey, I'm on the help desk and I recognise his voice." No, you don't. "I can see her on the Zoom call." No, you can't. Yes, you're absolutely correct. As a matter of fact, if you want to go back to YouTube, look up Adobe Voco, V-O-C-O. You will see a product launch video about a system that Adobe built where you could feed in about 20 minutes of audio material, and then you could type whatever you wanted and that speaker would say it. So I could feed in 20 minutes of Wolf speaking, and then I could type, you know, "Hi, Steph, how are you?" and it would say, in Wolf's voice, perfect inflexion and everything, "Hey, Steph, how are you?" They never released that product because the backlash was so bad. And that, by the way, is from 2015 or 2016, so now you see how long this has been going on. Wow. So how does this impact... we've talked about the icky things from my world, and I don't want to spend too much time on those because, frankly, I love our listeners, and none of them want to spend an hour listening to us talk about CSAM. I'm OK with that. But I can think of so many other problematic areas, right? Like, we're heading into an election year. How does anybody know what news is real anymore? How does anybody know what information they should believe? I mean, short of going back to the town crier, where the people who saw it with their own two eyes told the guy to go and yell up and down the block that the goats are running loose, how do we actually communicate news now and trust it? We don't. It's quite literally that bad. It is literally to the point where, if it doesn't come from a reputable source who is double-checking and fact-checking and checking with the appropriate people, you can't trust it. And the problem is that that kind of news, that kind of investigative journalism, that kind of proper journalism, costs this magic stuff called money. And you know, you two are doing this as a passion project, as something that's relevant to your interests and loves and cares, and it's useful. But realistically, this is a passion project, and eventually you may get sponsorship; you may have it already, I don't know, I apologise. You should call Bad Dragon. But you don't need much money to do this, yet it still costs you money and time, and time is money. For somebody to do investigative journalism in a proper fashion, they need quite a lot of money. And so who's going to do that?
Because the only people we're getting news from, other than comedians, who are better newscasters than most newscasters these days and are getting their money from jokes and ads and everything else, are people like, you know, Fox, who are getting their money from other places. OK, who may be politically biased? No, never. But you're getting me started. Back to the LLMs for a minute, this is another thing that's happening, right? So many of these news stories now are just being cranked out by one of these text chatbots. And we both know friends who were reporters who got out of that industry and into our industry because there was no money; everyone was getting laid off, everyone was getting let go. So not only do you not have the money to do investigative journalism, but a lot of the good journalists have already left, and a lot of the content out there is being generated off of these models. I mean, somebody did a YouTube video of it not long ago where they showed 50 different news channels all parroting the exact same lines, because they're fed their stories from a consolidated news source and they read them word for word. And they've been buying up news stations, and they've been buying up radio stations, and they've been buying up all these stations. And coincidentally, maybe they have a political agenda. Maybe not. Of course, they're totally perfect. But, you know, I'm just saying it's possible. They've gutted the field of journalism. There was a joke on Reddit, literally I think today: a picture of Walter Cronkite, and underneath it said, "Do you believe this, kids? This was a guy that read the news. He didn't tell us how to think. He didn't tell us what to think. He just read the news as it happened, and then he shut up and got off the air." And I'm like, well, yeah, that's Cronkite. And there was Jennings, and there was Rather, and... it was like, oh wow, that doesn't happen anymore. Yeah, but with Adobe Voco, we could bring him back. I got you so hard. It's interesting, and I promise this is gonna sound like I'm derailing, but I'm not. It's interesting that you bring up Bad Dragon, Josh, because we don't have advertisers, and we don't do that for a couple of reasons, partly because we talk about, you know, sex and tech and porn, and I don't ever want people to think that what we're saying is informed by money. So we have self-funded this whole thing. It is a hobby for us. It is a passion for us. But it's funny that you specifically referenced Bad Dragon, because my kid, my child, the fruit of my limbs... no, that's not how that goes... of my loins. Well, it's good with them. My child was like, Bad Dragon should sponsor your conference, and my child was reaching out to Bad Dragon. I don't think they ever responded to him. So, you know, Bad Dragon, if you're out there: hi. But you saying that made me think about my kid, because my kid's in his early twenties and completely cynical and jaded. He's just beyond the idea of truth and reality. He and his friends are completely nihilistic, because what's the point? You can't believe anything. Nothing's true. Nothing's real. There's no fixed point of fact or agreed-upon objective reality these days. You can't believe what your eyes and ears are telling you.
You can't know that your best friend on Instagram is even real and not a bot in North Korea or something. Yeah. So there's this whole generation of kids that have just given up on the idea of caring, because how can you care about what you can't trust? I take it one step farther. How many of them will be able to afford a house when they come of age? How many of them will be able to afford a starter house or a family house? How many of them will be able to afford a car, a house, a job, a yearly vacation to Europe, whatever, or even to Disney World? I mean, God, Disney World is like $8,000 to $10,000 a trip now for a family. That's crazy. And so now you see, and I know we've gone far afield and I apologise, but you have to understand one other thing about me. I'm very liberal, except I'm also an ex-cop, and I'm a pro-2A person, a Second Amendment person. And I'm not pro-2A unilaterally or universally, let's be clear. But the point is, now you wonder why people go on wild shooting rampages. It's because there is literally nothing left. And I'm not just parroting "mental health care, mental health care"; I mean, that's bloody important, don't get me wrong. But it's literally because there's no career path, there's no American dream, there's nothing for them to do. When you just don't care, because you don't know what's real, what does it matter? So we watched The Stepford Wives, the 1970s film, and we were just having a conversation about that, and he was like, you guys like this? The acting is terrible. But at one point he was talking about the houses. He's like, you know, they don't seem to have very fancy jobs to get those houses. I said, well, pick one of the houses, and he pointed at one of them. I said, all right, at the time that was a $25,000 house. He's like, that was a $25,000 house? That house was the price of my car. This is not fair. He started ranting about what happened to housing and everything. Yeah, well, you used to be able to get a house on a middle-class income. No, no, no, remember that minimum wage back then was $3, $4, whatever it was. But on the other hand, minimum wage has gone up maybe 50 percent, and even if you say it's gone up 100 percent, housing prices and tuition prices and student loans and car prices have gone up a hell of a lot more than that. A friend of mine got an F-150 pickup and paid close to $100,000 for it, and I just went, hell, that's the price of a house. He looked at me funny. He's like, no, it's not. Oh God, you're right. You know, my parents bought a house when I was 12 or 13, somewhere in there, and they paid significantly less than I paid for my house here in Pennsylvania, and they got a lot more for the money. We have a lovely house, don't get me wrong; we have almost three quarters of an acre of land. OK, they have six. There's a difference, you know what I mean? And in New Jersey as well, which is, you know... anyway. Wow, we have gone far afield; I'm terribly sorry. No, you're fine. It seems really unfair that you could create fake porn of me tomorrow, and Wolf could AI-generate my voice to fake out, I don't know, my clients, and somebody else could create fake videos of Wolf and me giving a talk, but we can't just, like... why can't AI make us a house yet?
Like, why are none of these technologies being put to practical use? Do you want the honest answer? Yeah. Because you're not rich enough to matter. And I don't mean any offence by that, please; I'm not either. But you're not rich enough to matter to the people who control significant amounts of these technologies. Now, can we do it as hobbyists? Absolutely. Can I build a 3D printer that will build me a house? Yes. Will it pass code? Maybe. OK, well, your point about code I wanna jump on, because, and I'm not gonna bore the audience here, but if anyone wants to hear it, pull me aside at a conference, it'll only cost you some drinks, and I will tell you all about it. I was looking at earthquakes and building codes and how they developed until we finally had a consistent code where a building could survive an earthquake. And I was plotting that timeline against software bills of materials and all the problems that have happened along the lines of bad code, bad software code now, not building codes, that we continue to write again and again. And so, per my timeline, we're clearly pretty far away from having any building-code-type instruction for "here's how you manage an AI model, here's how you build AI models so they're safe." But I know you are brought in all the time to talk to Fortune 500 companies who are very concerned about AI and ML. What can we do to help nudge things to be safer? You know, it's interesting you say that about building codes, because I need to give you a tiny bit of background. My father is an expert witness on building codes and slips, trips and falls. He's one of the top tribologists in the world, which is the study of friction. So if you want to talk friction, he knows more than just about anybody else in the world. It's scary. I spent much of my childhood getting paid a pittance to run a friction-testing machine, but that was my allowance, damn it. So I have literally tested more flooring types than I can count. I've thrown dummies into the paths of subway trains. I've done all sorts of weird stuff. But building codes, I grew up surrounded by sets of building codes: BOCA Basic Building Code, 1950s, 1960s, et cetera, back to the 1800s. He has, I guess, some very rare books. So if you ever need to talk about that, yeah, I'll put you in touch with my dad. You guys could chat for hours. I am taking you up on that. Not a problem, not a problem. You just have to come up for the barbecue, OK? You both have to come out. You're killing me. We have already had Wolf give talks on food production, on toasters, on Universal movie monsters. He is gonna end up doing an entire friction-themed cybersecurity talk, and I'm gonna be the one that has to listen to him practise. But there'll be barbecue. But in all seriousness, when you talk about plotting it against SBOMs and everything, you should also talk to Sergey Bratus. Do you know Sergey? Off the top of my head, I don't; it's not ringing a bell. So I just interviewed him the other night on Paul's Security Weekly, the podcast I'm on, and by "I" I mean all of us who were on. It's fascinating, because he says that malware is on the verge of developing a language that is consistent and coherent. In other words, instead of finding vulnerabilities as an "oh, look, this is interesting, and look, there's a hole in it" kind of vulnerability,
it's "I wish to design a vulnerability for this class of events, and I can create vulnerabilities where vulnerabilities should be, based on my understanding of the entire class." So instead of a bespoke vulnerability because of a casually found encounter, it is: I know where there are going to be vulnerabilities because I understand everything about this class. There's going to be a language of vulnerabilities, a language of malware, and it's fascinating talking to him about that, and terrifying. Absolutely terrifying, because the obverse of that is that we have nothing resembling that for cybersecurity. Nothing. Hell, you talk to four dozen pen testers and ask them the definition of a pen test, and you'll get four dozen different answers. You talk to incident responders and ask them what they do to stop the bleeding, and you get 20 different checklists. We're screwed, OK? Which is why one of the things I'm doing is trying to define "pentest" as a beginning sort of vernacular terminology. I'm honestly thinking about starting to write some articles about terminology so that we can start getting definitions, because if we don't have apples-to-apples definitions so that we can talk in a similar language, even if it is English or Hebrew or whatever, how the bloody hell could we ever expect to get anywhere? OK, sorry, one of my rants, I apologise. But you've got a situation where we have LLMs doing porn. You've got a situation where we have bad actors doing all sorts of God-awful things. They're already using deepfakes right now, with AI assistance, to steal money from parents and grandparents. Steph, you're getting the phone call at three in the morning: "Mom, Mom, I'm in jail in Spain. Send $1,000 to this wire address." And it's your kid's voice, and their voice is accurate, believable, understandable. It's them. You know it's them. But it's not. By the way, the other variant of that is, "Mom, there's a gun to my head. Send money to him now or he's gonna kill me." Holy God. OK, those are deepfakes. Now, what do I mean by AI-assisted? Deepfakes and AI are not the same thing, by the way, just to be clear. AI can produce video deepfakes or image deepfakes or even audio deepfakes, but it is not the same thing. So what am I talking about with AI-assisted? Well, AI just scanned all of social media, found your kid's Instagram posts, harvested all of the audio and video out of them, and stuck it into the deepfake generator, because it also found out that your kid stupidly said, "I'm going to Spain tomorrow." Three days later, boom, scam initiated. By the way, why did they pick Spain? Trivia question. Wolf, do you know? It's the Spanish scam, is what it was originally, right? The Spanish Prisoner? Yes, the Spanish Prisoner scam, the classic scam that led to all this. Do you know how classic it is, Steph? It started in the Crusades. Oh, wow. Not a joke. During the Crusades, there would be a knight who would be captured, or their letters, which were kept close to their heart, would be pulled off a dead body and read, and then somebody would write a letter trying to imitate their style, saying, "I'm in prison in Spain. Send £1,000 by courier." It's still going on. Sorry, I love history lessons like that. No, so do I. I'm usually the one rambling about sex history.
But I think that's actually a really important thing to share, because in my world a lot of people get super overwhelmed by the technology piece. I mean, I always say my job here is to be the tech-adjacent one and to be the voice of the person that's like, "can you please define LLM for me?" That's my job here. But that's also because that's my world and those are my people: we think about technology as very, very new, and we think about these processes and these experiences as being very, very new, and a lot of people get really overwhelmed by trying to keep up on all the newness. So I think that even though it might be happening in a different form, hearing that this is something that's been going on since the Crusades actually makes it feel a little bit more learnable for a lot of people. All of a sudden it isn't the scary, unknown thing; it's just a new version of something that's been happening for a long time, and all of a sudden that feels more manageable. Well, hell, do you know what people said Bitcoin was, and all of the cryptocurrency? What's that? I'm sorry, a Chuck E. Cheese token? No, no. They said Bitcoin is a great way for us to speed-run through all of the scams of the 1800s and 1900s. Fair. I can see that. And it's literally the way these things work. With AI, we're gonna speed-run through all of the sex scandals of the 1950s all the way through the 2010s or whatever. We're gonna see AI porn that's generated of specific people. We're gonna see AI porn that's generated of specific racial groups. We're gonna see AI porn that's generated of specific everything. Just the same way that we had Girls Gone Wild in the nineties and two thousands, right? Remember that? I used to bounce on Bourbon Street in New Orleans, so I remember seeing the photographers down there in the early two thousands. It was horrifyingly weird, let me tell you. But we're gonna see AI-generated porn for specific request groups. We're gonna see a group that's gonna pay 20 bucks a month so they can get brand-new AI-generated porn every day of XYZ, whatever their fancy is. We're also going to see groups that are much more hidden, if they're smart, which they're not, which are going to pay a lot more money for CSAM. And just the same way that, on Usenet, thank you for the word again, Wolf, I couldn't remember it for the life of me, it was segregated by what's interesting, you know, like blondes, brunettes, male-on-male, ahegao, yaoi, I can never pronounce these things right, forgive me, whatever, you'll see the exact same thing. And to be honest, because the other problem these days is that the dating scene is so God-awful, according to all of my friends who are dating, they can't find a decent partner for love or money, basically, ha ha, we're gonna see a lot more of these things happening in such ways. And honestly, I've been wondering when we're going to start seeing the Japanese thing, hikikomori. You familiar with that? Those are the guys who are just kind of opting out of relationships entirely. They are only having sort of virtual relationships, if at all. And some of them are the ones that are withdrawn, I think it's the same term, but they withdraw from society entirely.
And then we have the ones... like, we're gonna repeat all of these things from the 1960s, seventies, eighties, nineties, two thousands, everything from free love all the way up to who-the-hell-knows-what's-real-anymore, but it's going to speed-run through the next two or three years. And it's interesting, kind of bringing things back to the Taylor Swift question that instigated this conversation, the idea that AI is speed-running through the fifties, because some of the most famous photos of Marilyn Monroe, coming back to Playboy, Marilyn Monroe's Playboy spread was not consensual. Hugh Hefner bought those photographs from a photographer who had hired her as an artist's model before she became a name. Oh, I didn't know that. And when she made it big, he put her on the cover without her consent. Wow. So we are really and truly just replicating old problems in new ways. It's a lot of problems, and I don't see any solutions in sight. If we try to stop all porn, it gets ugly really fast, and it doesn't work. If we try to stop child porn, we can stop child porn, but there's a cost to that: anybody who looks young can't have their pornographic pictures anywhere. That's awkward. And then we have, what is it, 2257? All porn actors and actresses have to have their age registered, and it has to be available to law enforcement should they want to come by and see it, blah, blah, blah. And that's cool. I'm good with that. That makes total sense to me. And there are a lot of really good porn houses that actually film the consent sessions: these are the things we're gonna be doing, these are the things that are gonna happen to you and to everybody else, are you OK with that, any problems with that, tell us now. And there's actually a court case on that. There was a porn actress who went through a filmed consent session, absolutely no problem, absolutely no problem, and then afterwards said they did things to her without her consent. And it's literally a filmed consent session. Which is part of why, circling all the way back to the beginning, intimacy coordinators are so important, not just in Hollywood but in pornography. Right. It's also why consent education is so important, because that makes perfect sense to me. You can say up front, "I am totally down for this," and midstream realise you're not feeling it, or that the fantasy does not align with the reality. There are any number of reasons why somebody might revoke consent in the middle of something. Yeah. And to then point at a video and go, "oh, well, you said up front it was fine"... now, that might meet a legal threshold, but we need to be teaching ongoing, enthusiastic, affirmative consent. Yeah, absolutely agreed. So I get to be the bad guy who does the time check. We're at time. I'm gonna ask you one final question, Josh, and Stefani, I'll let you have the final word, then we'll do the outro. Josh, don't go anywhere, because you mentioned Hebrew and I want to talk to you about something before we wrap. But I know, Stefani, you and I have got that thing at 5:30 Eastern. So, right, what's my final question? Well, I had one, but I was in the moment. What I will say is, it is good to hear that a lot of the things we talk about, consent, communication, awareness, have a place in this, even when we're talking about LLMs,
even when we're talking about emerging technology around the AI space. Is there any other guidance, like, "hey, you should think about this, you should be aware of this, you should be looking for this," that you can give people, Josh, to protect themselves as we speed-run through all the crimes of the next couple of years? For your family: have a code word. Have a couple of them. Have a code word for "everything is fine, Mom," and a code word for "everything is not fine, Mom." For business: don't use any human factors in authentication resets. For news: check out your news sources, find out who owns them, and figure out who you believe, because it's gonna be really important in the time to come. And I have several, and weirdly enough, many of them are comedians these days. That's the scary thing, but they do better news than most news stations these days. So those are my minuscule pieces of advice for various aspects of this. I love that you say have a code word, because we keep throwing back to examples from history, and I remember sitting in my kindergarten class and telling my friend that my mom told me that if anybody ever came to pick me up and said, "your mom told me to come and get you," and they didn't say the word watermelon, I shouldn't go with them. So a lot of these basic safety things that we used to teach kids back in the early days of the Internet, or even before the Internet, are just as useful as they were then. And I think that is really my biggest takeaway from this: thinking about things like stolen letters in the Crusades, or stolen pictures in Playboy, or safe words in kindergarten... oh wait, safe words, that's probably not the right term to use... passwords in kindergarten. All of those things make this really overwhelming, kind of nihilistic, no-real-solutions-yet conversation, if not more hopeful, at least a little bit more manageable for me, and I hope for the others, the people listening to this. And so I wanna thank you for that. This is a hard conversation to have, because there aren't any good answers yet, and that can feel really scary and really overwhelming. I will be the therapist in the room and just say that those feelings are valid. But we can look to history to teach us how to navigate the future, and that is something I hadn't really thought about before having a chat with you today. So thank you for that, Josh. Pleasure. Absolute pleasure. Thank you for having me. And thank you so much for tuning in to Securing Sexuality, your source for the information you need to protect yourself, your relationships, and anything going on in Spain. Securing Sexuality is brought to you by the Bound Together Foundation, a 501(c)(3) nonprofit. From the bedroom to the cloud, we're here to help you navigate what safe sex looks like in a digital age. Be sure to check out our website, SecuringSexuality.com, for links to more information about the topics we discussed here today. I think we might have set a record for the most links captured in a conversation, so there will be a lot of references there; go check that out, as well as information about our live events, all of which are both safe for work and safe for life. None of the links are gonna go to any of the scary stuff that we talked about today,
because we want you to be able to join us again next week for more fascinating conversations about the intersection of sexuality and technology. Be safe, come up with a password, and have a great week.