Securing Sexuality is the podcast and conference promoting sex positive, science based, and secure interpersonal relationships. We give people tips for safer sex in a digital age. We help sextech innovators and toy designers produce safer products. And we educate mental health and medical professionals on these topics so they can better advise their clients. Securing Sexuality provides sex therapists with continuing education (CEs) for AASECT, SSTAR, and SASH on cyber sexuality, social media, and more.
Links from this week's episode:
The Pitfalls of Relying on Technology for Personal Issues: Covenant Eyes
Privacy and security have become paramount concerns for individuals and organizations in the digital age. With the increasing prevalence of monitoring software such as Covenant Eyes, it is crucial to critically examine the potential risks associated with these tools. While Covenant Eyes aims to promote accountability and protect against harmful online content, it is essential to assess whether its benefits outweigh the privacy and security risks it presents. Here, we undertake a comprehensive analysis of Covenant Eyes, shedding light on the potential consequences of using such monitoring software.
Understanding Covenant Eyes: Covenant Eyes is a popular monitoring tool that enables users to track and monitor internet usage on various devices, including computers, smartphones, and tablets. It is primarily marketed as a way to promote accountability and help individuals overcome pornography addiction. The software captures screenshots and logs internet activity, which is then sent to an accountability partner or sponsor for review. While Covenant Eyes may effectively achieve its intended purpose, it is crucial to consider the potential privacy and security implications of using such software.
Privacy Concerns: One of the primary concerns surrounding monitoring software like Covenant Eyes is the invasion of privacy it may entail. Users may feel uncomfortable knowing that another individual constantly monitors and reviews their internet activity. This can lead to a loss of trust and a sense of surveillance, potentially straining relationships between users and their accountability partners. Additionally, the data collected by Covenant Eyes may be susceptible to unauthorized access or hacking. If the software's servers are compromised, sensitive information about users' internet activity could be exposed, leading to potential embarrassment or even blackmail. Furthermore, the screenshots captured by the software may inadvertently capture personal or sensitive information, such as passwords or private conversations, which could be misused if they fall into the wrong hands.
Security Risks: Besides privacy concerns, monitoring software like Covenant Eyes also poses security risks. By granting such software access to monitor internet activity, users provide a backdoor for potential attackers to exploit. If the software contains vulnerabilities, hackers may be able to gain unauthorized access to users' devices or networks. This could result in the theft of personal information, financial data, or even the installation of malware or ransomware. Moreover, Covenant Eyes' constant monitoring and logging of internet activity can slow down devices or networks, impacting overall performance. This is particularly problematic for organizations or individuals relying on high-speed internet connections for work or daily activities.
Mitigating Risks and Ensuring Privacy: While Covenant Eyes and other monitoring software may present privacy and security risks, some steps can be taken to mitigate these concerns. First, it is essential to thoroughly research and understand the software provider's privacy policies and data handling practices. This includes assessing how user data is stored, who has access to it, and what measures are in place to protect it from unauthorized access. Users should also consider using additional security measures, such as strong passwords, two-factor authentication, and regularly updating their devices and software. By implementing these best practices, individuals can reduce the likelihood of unauthorized access to their personal information. Furthermore, open and honest communication between users and their accountability partners is crucial. Establishing clear boundaries and expectations regarding the use of monitoring software can help alleviate concerns and foster trust. Users should have the autonomy to decide what aspects of their internet activity they are comfortable sharing and what should remain private.
In conclusion, while Covenant Eyes and similar monitoring software may provide accountability and protection against harmful online content, it is essential to critically examine the privacy and security risks associated with their use. Users must carefully weigh the potential benefits against the consequences, ensuring their privacy and security are adequately protected. By understanding and mitigating these risks, individuals can make informed decisions about whether or not to utilize monitoring software like Covenant Eyes.
Stefani Goerlich: Hello and welcome to Securing Sexuality. The podcast where we discuss the intersection of intimacy-
Wolf Goerlich: -and information security. I'm Wolf Goerlich.
Stefani: He's a hacker. And I'm Stefani Goerlich.
Wolf: She's a sex therapist. And together we're going to discuss what safe sex looks like in a digital age.
Stefani: But before we do that, love, have you updated your iPhone yet?
Wolf: Yeah, of course I did. I mean, updates and multi-factor. Those are the things I talk about all the time. I say as I'm stalling to double-check.
Stefani: Yeah, do you have any, like, shiny red push notifications up there in your settings?
Wolf: I am on iOS 17.1.2, so I did update the other night.
Stefani: Okay, so have you heard about NameDrop yet?
Wolf: No. What's that?
Stefani: NameDrop is a super convenient feature if you are, you know, you or me at a conference, and a really problematic feature if you're basically anywhere else. It's a new thing that they added to the iPhone that, if your phones are proximate to each other, will automatically share contact information. So I suspect this is Apple's response to Popl and Linktree and all of the cool QR contact-sharing things that are happening right now, but it's slightly less cool because the default setting is on. So if you don't know to turn it off, your phone could share your contact information with any other iPhone you happen to be close to. So I think you should do that while we're thinking about it.
Wolf: All right, So what am I doing? Where am I going?
Stefani: All right. I want you to go into your settings app; tap on General.
Stefani: Go to the AirDrop tab.
Wolf: Oh, it's an airdrop thing, okay.
Stefani: And then you should see NameDrop.
Stefani: Then you'll see 'Bringing Devices Together.'
Stefani: Turn it off.
Wolf: Turn it off. Turn it off. Turn it off. All right.
Stefani: And just like that, we have restored your ability to have informed consent about who gets your information and when. And that, my darling, is what we are all about here, right?
Wolf: Well, thank you. I feel much more locked down now. That's good. It's good to know.
Stefani: And it's so fun when I get to be the one to nag you about updates instead of you being like, 'Baby, it's been like four days. Update, update, update.'
Wolf: The shoe is definitely on the other foot, that's for sure. We know that sharing information that you don't realize is being shared is a problem.
Stefani: I mean, we talk about that all the time in a million different ways. I think elements of that come into every single episode that we do. But the flip side is that, you know, there are some tools, some apps, some devices where the information sharing isn't a problem. It's the feature, and one that comes up a lot in my world, especially with therapists that talk about or work with problematic sexual behavior, are sort of like consensual spyware programs that clients will install and use to serve as like a behavioral reinforcement to keep them from doing things that they don't want to be doing. And one of those made the news recently, which I thought was really interesting. Have you ever heard of Covenant Eyes?
Wolf: I have. So, first, you used the word spyware. One of the things I love about infosec is we just tack -ware onto everything. Any software that is spying on you: spyware. And of course, any software that's malicious: malware. I love that about our industry. We're very good at naming things.
Stefani: Which is weird because you don't call sex tech sex wear.
Wolf: Well, I mean, there certainly is pleasure wear.
Stefani: Yeah, sex wear is, you know, leather and lace and latex and vinyl. Probably not bits and bytes and GitHub code.
Wolf: So I looked into Covenant Eyes for you. As an application, what it does is screenshot what the person's doing. It tracks their browser history. There are certain conditions where it can record what you're typing, your keystrokes. And it shares all of that: first, it stores it and sends it back to Covenant Eyes proper, then it shares all that with your quote-unquote "accountability buddy," apparently under the guise of stopping you from using and viewing pornography.
Stefani: Yeah. And that's why this made the news. Because our new speaker of the house, Mike Johnson, has acknowledged that he uses Covenant Eyes with his son to monitor each other's–I guess we'll say digital purity–which I have feelings about as a sex therapist. I'm fine with my child having privacy, and he's an adult. I don't necessarily want to know what he's doing, but I think that I also have problems with this as a citizen because the dude’s our Speaker of the House, like I'm assuming if we're talking about screen capping things on his phone and sending them back to this company, I mean, that's a problem when we're talking about an elected official with access to a lot of sensitive information, because it's not just noticing when he's on a porn website, right? Does it turn off/on based on what you're doing? Or is it a perpetual thing?
Wolf: So that's a really good question. Basically, the application is doing screenshots and recordings all the time and sending that information back to Covenant Eyes' cloud environment, and then the accountability buddy gets a version of that. Now, there's some logic in there, like if it's at this site or that site, your accountability buddy will be notified, and the screenshots are redacted at that point in time. As you might imagine, like everyone else, they're talking about how AI will help figure out what should go and what shouldn't go. So not everything will get to your accountability buddy. And I don't like that phrase, accountability buddy, but we'll table that for another time. But certainly everything gets to Covenant Eyes' data centers.
Stefani: Obviously, you know, as a citizen who wants my government to be safe, I have concerns about this. So let me ask this: where is this going? Like, what is this company that's getting all of this information from our elected officials and every other user that happens to want an accountability buddy?
Wolf: So first off, let me just say that everything I'm about to tell you the rest of this episode is available off of Google. I'm gonna walk you through basically the process I use to evaluate any sort of software, even if I didn't know what the software was. Here's the process I use to figure out: should I work with this company? And I think that's important. Your friends, I guess our friends in many cases these days, the line blurs as we go to events and do things together, regularly reach out to me and say, hey, what do you think of this company? Right? And so I wanna walk through the process of answering your question. But I want to be very clear that everything I'm doing is basically just using searches. There's no internal information, no magic, nothing up my sleeves; for those of you who can't see, I'm actually demonstrating to my wife.
Stefani: So this isn't some like Super Cool Investigative Report, where you've gone undercover and talked to staff members and interviewed prior employees and, like, dug into records. This is Wolf on a Saturday afternoon sitting with Google.
Wolf: Yes, exactly.
Stefani: OK, good to know, because that means my people could theoretically do that too.
Wolf: 100%, everything I'm talking about. So the first thing I want to know is where they are located. They're located in Owosso. Do you know where Owosso is?
Stefani: It's in Michigan, right?
Wolf: It is, good call.
Stefani: That's really weird. I thought for sure that they were based, like somewhere, like either deep in the Bible Belt or like Utah. I thought for sure this would be like a super conservative community that would host a company like Covenant Eyes.
Wolf: Yeah, liberal, progressive, chill Michigan is not where you'd think you'd find them, but they are about a 90-minute drive away from us, in the middle of the state just west of Flint. It's a very small town. One of the things that I did once I got their address is I used Google Street View, and this kind of cracked me up. Google Street View opens to a trailer, like an old-school trailer, just sitting in an empty parking lot. And I went, oh my goodness, where is this business?
Stefani: Wait, like a mobile home, or an RV, or like a pop-up camper?
Wolf: Like a mobile home trailer.
Stefani: Is that actually their headquarters?
Wolf: It is, but you have to, like, turn the camera around. So for whatever reason, the address in Street View takes you to this trailer. If you turn the camera around, first you see a thing called Maya Ministries. I'm not sure what that is, but that is a placard; maybe they sublet or whatnot. If you walk around to the front of the building, you see Covenant Eyes. It's the headquarters. It is next to some train tracks, which is good. You know, I like trains. Honestly, the only thing I knew about Owosso was that you can get steam train rides, because they have the Steam Railroading Institute there in Michigan. So hint: if we want to go up there, that would be fun. That would be good.
Stefani: Note to self.
Wolf: So they've got a reasonably decent-size company. They are not public. So one of the things you do is check public records; it's not a publicly traded company, so it took me a while to find that they make around 27 million a year. That's not a lot. We'll come back to why that could be a problem. It also kind of gives an idea of how many customers they have; we'll come back to that too. Next I went over to LinkedIn and looked at the size of the company. It's a relatively small company: it's got 165 employees as of this recording. So yeah, small little company. Probably about 75 to 100 people in that headquarters; a lot of the folks, as I was looking, are working remotely, I would guess.
Stefani: OK, of those 165, I'm curious: how many of those are security people? I feel like, just for the nature of the product, the nature of the service, we're talking about people's sensitive information. That's the nature of what they do. So I'd like to think they're putting a decent amount of effort into securing their users.
Wolf: Yeah, when we were talking about this, you had a phrase which I just loved, which is, uh, you buy the service so that everyone can get their own personal blackmail file. And that just cracked me up.
Stefani: I mean, that's kind of what it feels like, right? If somebody is screen capping what I'm doing on my phone or on my laptop, especially because you said it's not able to differentiate where I'm at, that means it's getting my banking information. That means it's getting my social media. It's getting, sure, my porn use or non-use, which is what the advertised service is. But there's a lot of stuff that I do online that I wouldn't necessarily want publicized, even though I'm very comfortable sharing most things I do. And hiring somebody to create a dossier on me? I don't feel comfortable about that.
Wolf: When I was researching this, there were even people talking about Minecraft, because I guess little kids will be asked by their parents to put this on their computers. And there was an entire world of them talking about Minecraft and sharing some of the snapshots that were taken of their Minecraft universe, which is just weird. So, 165 people. How many people do you think are responsible for securing this most sensitive data, including, as we mentioned earlier, government officials? If you had to guess.
Stefani: I mean, considering the nature of what they do, I really want that to be like 10% or better. Please, for the love of God, tell me it's 10% or better. Sixteen people, 20 people.
Wolf: That would be great, right? A good security function.
Stefani: That's what I'm used to hearing from your world, like when you and I go to things and I'm listening to you and our friends talk about companies. That seems about right.
Wolf: So the answer would be 3, there are three people with security in their title. There is no chief information security officer. So there's no manager, or director, or executive responsible for security. There are three people with security in their titles. One of them is a pen tester. And you know, I love myself a pen tester.
Stefani: I know you do, and I know I do, too. I also know that that is not a term that everybody is necessarily familiar with. Do you want to explain it, or do you want me to explain it?
Wolf: Go for it.
Stefani: All right. So a pen tester is short for penetration tester, which, you know, if you're a sexologist or sex educator, penetration makes you giggle. But in this case, what it means is that these are people who get hired by companies to penetrate their security. If you've ever seen Sneakers, the movie with Robert Redford, they are the people that do that. Their job is to professionally break in and then write a report up for the company that hired them, explaining how they broke in and how the company can fix those vulnerabilities so that they are not break-inable anymore. Did I do a good job?
Wolf: You did. And so, by looking at some of the job descriptions and whatnot, I have a sense of how this is built. The mobile platform is built with languages like Java and Swift and Objective-C. The website is, of course, built with Vue.js and Node.js and TypeScript. There's some Python code in there, and there is a database layer in the back: MySQL, MariaDB, and regular SQL. And so those are the technologies. Now most people listening just, like, glazed over, so come back with me. Come back, come back.
Stefani: I think the most important thing is: are those complicated technologies? Are those fancy technologies? Like, I want to know that if they don't have a huge team, they're at least using fancy technology that the average ordinary person could not un-fancy and breach.
Wolf: Yeah. I mean, I don't know that they're fancy; they're pretty common. But each one of those technologies has had a team of penetration testers who break into it and have, you know, figured out different vulnerabilities. And those vulnerabilities get patched, much like the patch we just applied to our phones at the beginning of this conversation. So each one of those technologies gets hammered on. And it's that hammering of technologies by penetration testers and by offensive security people (and I mean offensive in terms of offense and defense, not in terms of your uncle who said something terrible) that makes them good. Now, the key to a good penetration tester is he knows his technologies. He's been around for a while. He's seasoned. So I looked at this penetration tester, and again, if you're listening to this, mad props to you. I'm so glad that you're with us in security. Please take nothing I'm about to say as anything but full support.
Stefani: For the pentester or for the company?
Wolf: For the pen tester. He runs a blog, which is all about how excited he is to start as a pen tester, because he has no formal penetration testing experience.
Stefani: Oh, honey.
Wolf: He's listing out his journey, and I want him to succeed so bad. But he's taking those same certs that I would have in the apprentice program I used to run in my academy: bare-level, I'm-gonna-understand-what-a-computer-is certifications. And this, from what I can tell, is the only penetration tester, the only person, who is looking at Covenant Eyes' software with a critical eye.
Stefani: OK, so I mean, just to be fair, you know, we have done an episode in the past where you finally got me to admit that there is a lot that makes sense about the apprenticeship model that your world uses, that it is not only entirely reasonable but often really smart to hire people who are self-taught and who have kind of pieced together the knowledge they need to work in your world. So the fact that that's been his path is not really the criticism. It's the fact that there's nobody else sort of supporting that learning effort or making sure that any knowledge gaps he does have are being addressed.
Wolf: Right, because who's the senior penetration tester? Who's his mentor? Who's his coach? Yeah, exactly. There are only three people there.
Stefani: Got you.
Wolf: So that's disheartening. Itty-bitty security team.
Stefani: On the off chance you're listening, pen tester buddy, reach out to us. Wolf is fond of connections and resources and loves mentoring people, and, as he said, this is not a slight toward you. This is just an analysis of the company.
Wolf: Yeah, no, I went and looked at some of his code. He's got some source code he's posted on GitHub. He did, like, a my-first-keylogger. It's some cool stuff, but it's clearly first-couple-of-years-into-the-field stuff. And as we said earlier, this seems to be some very sensitive information that's being stored.
Stefani: OK, so I can see how that would be a problem. But let me ask this. The other thing that we talk about a lot is compliance, right? Like, people can be new. You ran an entire apprenticeship program, you just mentioned, and one of the first things that you taught them was compliance: how to evaluate compliance and how to be compliant. So first of all, can you explain to people what I mean when I ramble about compliance? And then can you tell me whether or not Covenant Eyes meets that standard?
Wolf: So the bare minimum of due care is being in compliance with security standards and federal regulations. Now, in the case of Covenant Eyes, there isn't necessarily a standard that would apply to what they do. There are some standards in terms of how they should be running their business; we'll get to that. But, for example, you wouldn't necessarily say they would need to be in compliance with the rules around protecting your grades, right? Universities have to be compliant with protecting grades; that's FERPA. And you wouldn't necessarily say they need to be compliant with healthcare records, because they're not a hospital system, so you wouldn't imagine that they fall under HIPAA.
Stefani: True, but this is kind of where I come back to that whole Mike Johnson thing. Because if I'm using it, or if my, you know, elected official is using it: if I'm looking at my patient portal, that should be HIPAA-protected. If he's looking at, like, a national security briefing, that should be protected. So the fact that the product isn't beholden to, like, HIPAA or FERPA or anything doesn't necessarily reassure me, because you said that they're capturing everything that comes across the user's screen.
Wolf: You're spot-on correct. So at a bare minimum, my point to this is a couple of things. First off, compliance is usually around specific products or specific types of data. They've got a page that says, 'We're not HIPAA or FERPA compliant,' and that in and of itself raised a big red flag for me, because any compliance person knows that they wouldn't be. They're not storing student records. They're not storing healthcare records. They say, we're not compliant, and if you have a problem with that, that's fine; here's our assurance that we're doing the right thing for you. Secondly, even by saying what they're not compliant with, they're not indicating what they should be compliant with, which would be PCI DSS for their credit cards, and probably a SOC 2 or another standard. I know I'm getting into a lot of acronyms; I'm sorry, you're getting that glazed-over look. They should at least be doing some of the appropriate things. However, as near as I can tell, they are not doing any of the bare minimums. They misunderstand what HIPAA and FERPA mean. And back to employees: they have no employees with privacy or compliance in the title. They don't have a data privacy officer. So I would want this type of company to have a chief information security officer. I'd also want them to have a data privacy officer, someone who understands the rules, regulations, and the due diligence necessary to protect this information. They do not have that.
Stefani: So what are they doing? Do we have any way of knowing what they're doing to secure the information that they're gathering from all of their users?
Stefani: What do you mean, it's not right?
Wolf: Well, I mean, they don't really spell out like the types of privacy that they're offering.
Stefani: So, it's there forever?
Wolf: We can infer their security from a privacy perspective. It's pretty high level.
Stefani: OK, I think I heard you say that. They said that they will share data out.
Stefani: Do they explain under what circumstances they'll do that? Is it just the accountability buddies that are selected by the users? Or is it other people, too?
Wolf: Let me just pull it back up here. They're collecting information that you provide, pages possibly visited, any one-to-one communications. "We may monitor screen accountability," so they say that they can monitor it. That doesn't tell us who; who among the 165 people can do it? We don't know. They say they have steps for privacy, which is great, but it doesn't really list what those are. They say that you can request correction or deletion, but it doesn't have, like, the GDPR steps for it; they want you to email or mail a physical letter. And who can they share information with? With your accountability users, with your filter guardians, with strategic partners. Any metadata around multiple user scores can be shared in any way they like. By court order. And, of course, the CSAM exception, which they would need to have.
Stefani: That's the question I have for you. Talk to me about the CSAM exception, because what I didn't hear... I heard strategic partners. I heard court order. Do they have any sort of mandatory reporting policy, or anything that says proactively they will report CSAM images or access of problematic websites?
Wolf: They say they're in compliance with the requirements. Was it Title 18, Chapter 110? So they say they're in compliance with it, but further than that, we don't really know.
Stefani: So here's part of why I asked that question. Many of our friends, probably slightly fewer of our listeners, but those of y'all who have known me for a minute know that I pay a lot of attention to the world of sexual assault survivor advocacy and the sort of forensic part of my world. And one of the things that I stumbled across when I was doing research for this conversation is that Covenant Eyes was installed and actively working on Josh Duggar's computer when he was arrested for having copious amounts of CSAM. And I'm curious what, if anything, they knew, and what, if anything, they did about that. Because, theoretically, they were getting screen caps of his computer usage, and yet I could not find a single article that said this came to light because this web usage monitoring software reported illegal usage.
Wolf: Was that mandated by the court?
Stefani: No, it was not. He was not a mandated user.
Wolf: So that is one of the problems that I have with this: courts are mandating people have the software on their computers, which, of course, with all the security issues that I'm concerned about, raises all sorts of red flags. Our friend Eva Galperin over at the EFF has come out and spoken on this issue, because you can imagine, here's a court saying, we are now going to spy on everything you do. If he was not a mandated user, I can't speak to why the CSAM material was not identified. Maybe he was on a different computer. Maybe he knows how to get around it. Again, it does not seem to be software that's highly secure, which means to me that regular users with advanced computer knowledge probably can disable it or bypass it.
Stefani: So that did come up in my reading about the Duggar case. Specifically, his wife was his accountability buddy, and it looks like he got around it by using the Tor browser. Which confuses me, because if what Covenant Eyes is doing is screen capping, can you help me understand how using Tor would bypass the screen-capping software?
Wolf: It wouldn't bypass the screen-capping software, but it would bypass the web browser monitoring. And remember, what they're doing is looking at multiple signals to determine what to expose and what not to. So if they didn't see anyone browsing the web, there's no way for them to know that any particular screenshot correlates with something.
Stefani: OK, so this comes back to that, I'm going to call it the naivete of the programming: this idea that some of the ways that people access the Internet weren't recognized by the software, which then lets the people that we would perhaps most want to be using such a terrible piece of technology subvert it anyway. So the only people that Covenant Eyes is really monitoring are the people least likely to be engaged in problematic behavior online?
Wolf: Uh, certainly people who are purposely going out of their way to engage in problematic behavior are likely to know how to bypass this type of software.
Stefani: And I kind of want to reframe my last statement, because obviously problematic is going to vary from person to person, and for most, if not all, of the users of Covenant Eyes, they think that viewing porn is problematic for them, and that's why they're using it. So I don't want to minimize or negate their feelings. What I meant was, the people most likely to be engaging in illegal behavior are the ones least likely to benefit from using something like Covenant Eyes. Oh, this is a lot.
Wolf: Well, it gets worse.
Stefani: Does it? Tell me more. I had mentioned naive coding. How secure is this? We talked about pen testing. Are there vulnerabilities here?
Wolf: That's a hard question to answer. So are there vulnerabilities? First off, whenever I'm trying to figure out what a company is doing and how secure it is, I look at the revenue, and if they're public, I look at how much you know is going into cybersecurity. Do they have cybersecurity insurance? Uh, are they funding it? If you don't have that, there's like a rule of thumb. It's called The 1010 Rule, which is about 10% of a company's revenue is going to be spent in IT, and about 10% of the IT budget is going to go into security. It's usually around 5 to 15. You know, we can argue we can debate. I'm going to say 1010 rule and all the security people are going to go. But wait, the stats are old. I know I get it. This is just back of a napkin math. So we take what I say 27 million annual revenue. Let's round that up to 30 to make it easy. They're probably spending around 3 million on technology annually, and they're probably spending $300,000 on security. So if you take that $300,000 right there is your three people, right? That's probably three people's salary. That doesn't leave you money for a 24-by-seven Security Operations Centre. It doesn't leave you money for, you know, any sort of managed services. Maybe they got a firewall. Maybe, but all those great products I mentioned earlier that doesn't leave you any money for doing code level scans. We are talking what our good friend Wendy neither calls the security poverty line, which is the line below which you cannot properly secure your environment. Uh, and that's where we are at with this organization. So we need to start there. We need to realize that this is by any measure an underfunded security function. Now, what about the code itself? So there's a couple of different areas that you can go online and again, I want to stress this is all public information. I didn't do anything here. I didn't download it. I didn't I didn't run any reversing tools. Uh, I did great things. 
Like going to NIST and going to MITRE.
Stefani: What are those things? You can't drop an acronym and assume I know it, baby. Come on. That's your whole thing. You speak in acronyms.
Wolf: NIST is the National Institute of Standards and Technology, and they maintain a vulnerability database. MITRE…
Stefani: The only thing I know about "mitre" is that that's what they call the fancy hats that popes and cardinals wear. Please tell me there's a fancy hat involved.
Wolf: Um, yes, there absolutely should be. If there isn't, we really need to change this.
Stefani: Hacker community, hit me up in the DMs. If you want to design a fancy hat, I will happily take point on that task.
Wolf: When we were in the Caymans, while you were hanging out and relaxing, I was spending some time with the guy who heads up the ATT&CK framework for MITRE. So we can make this happen. We can reach back out and say, hey. OK, we're getting sidetracked. These are public databases for vulnerabilities: you plug in the product name and it lists known vulnerabilities. Now, there is good news, and there is bad news. The good news is, when I went to the public databases and I put in Covenant Eyes, I found no published vulnerabilities.
Stefani: So that's a good thing, right?
Wolf: Yes. However, the bad news is, when I went to these databases and I put in Covenant Eyes, I found no vulnerabilities.
Stefani: I'm confused.
Wolf: Well, remember, I'm always saying hackers are the Internet immune system.
Stefani: Yeah, I mean, you could have stopped right there. "Hackers are the Internet's immune system" I will buy, sure.
Wolf: Hackers are the Internet's immune system, so they're constantly poking around and looking for that way in. So no vulnerabilities doesn't mean it's immune. No vulnerabilities means it hasn't been immunized. No one has poked at this, found anything, and disclosed it, which means any one of our offensive friends, any one of our red teamers, any one of our penetration testers could probably rip this thing to shreds, because there's undoubtedly a large number of vulnerabilities given that none have been published and disclosed.
Stefani: And for the record, we are not. This is not a call for people to do that. This is not an "independently verify Covenant Eyes" episode. We are acknowledging that this is a reality. We are not issuing a call to action.
Wolf: That's an important disclosure. Thank you. That raised my eyebrow again. Back to an earlier point: one of your friends reached out. They run an agency, and they said, "We just adopted product X. What do you think of product X?" And I went, "Oh, product X had a breach recently." And they're like, "Oh, no. Does that mean I shouldn't use it?" I go, "No, no, no. Here's the thing. They actually handled the breach very well." Their breach reporting indicates that they had a good response. They knew what to say. They notified everyone in a timely fashion. Also, the controls that were mentioned were in place and stopped the adversaries. That breach actually filled me with more confidence. If I had seen nothing from this company, I would be like, what are they hiding? Or worse, what don't they know? In much the same way, with no PVs, no published vulnerabilities, it makes me wonder. It makes me uneasy.
Stefani: So this makes me think of something from my world that, once I say it, you might call an unfair comparison, but in my head, as a social worker, it feels rather apt. Back in the day, when I was a student in my undergrad, I had to take a bachelor's-level substance use class, and my professor was herself an addict in recovery. One day she asked us, "What do you think I did back in the day when I heard that one of my friends had overdosed?" And we said, "Well, clearly you would avoid that dealer. Clearly, you would never go back to that source." And she said, "No, no. Au contraire, my friends. That was my next stop, because I knew that person had the really good stuff. That person had the really pure drugs." And that's kind of what's coming up for me as I listen to you: if the vulnerabilities have been found, that means they've probably been fixed. If we know that there's a problem, it's probably the right place to go back to, as opposed to the people we've never heard from. You're probably not going to want me to compare safe sites to drugs, I realize now as I'm saying this, but that's the comparison that comes to my mind.
Wolf: It's a terrible comparison and a wonderful lesson.
Stefani: It kind of makes sense, though, if you know me, right. Like, you can see how I got there in my head.
Wolf: Yes. Although it does say a lot about you. I've got a question for you, though. Can I ask a question of you? Because I've been answering all the questions this episode.
Stefani: Oh, heaven help me. Sure. Go for it.
Wolf: OK, back-of-the-napkin math. $27 million a year. This service is $184 a year per person. I'm assuming most of their revenue comes from subscriptions. Again, I don't have any public data; a publicly traded company would have to disclose this. But assuming that, it's around 146,000 people using Covenant Eyes this year. Now, the website claims 1.5 million people have used Covenant Eyes to "experience victory over porn." So maybe that's like 146,000 a year, every year, over the last decade or so; the company was founded in 2000, so maybe that math works out. But anyways, be it 1.5 million or the more realistic 146,000 people: does this work? To throw yet another "-ware" into the mix, from spyware to malware, is this idea of "shameware" effective? Just shame people so that they won't watch porn? And anyways, why are people so wound up about watching porn?
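[Editor's note: Wolf's subscriber estimate is simple division; a sketch of it, using the episode's own assumed figures ($27M revenue, $184/year, all revenue from subscriptions), is below.]

```python
annual_revenue = 27_000_000   # rough annual revenue figure cited in the episode
price_per_year = 184          # subscription cost per person per year

# If essentially all revenue comes from subscriptions, the current
# user base is roughly revenue divided by price:
estimated_subscribers = annual_revenue / price_per_year
print(f"~{estimated_subscribers:,.0f} subscribers")  # ~146,739

# Sanity check on the site's "1.5 million people" claim: at that run rate,
# how many years of turnover would it take to reach that total?
claimed_total = 1_500_000
years_needed = claimed_total / estimated_subscribers
print(f"~{years_needed:.1f} years at that rate")  # roughly a decade
```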
Stefani: To be fair, that's two questions, and you told me you had one, but I've asked you a lot more than two this episode, so I will take my turn. First of all, shame doesn't work, right? Shame isn't even necessarily the right word. There is a difference between guilt and shame. Guilt is when we feel bad about something we've done. Shame is when we feel bad about who we are or what we are. And I think that language is really, really important, because very rarely in the communities that leverage Covenant Eyes do they use guilt; they talk about shame around their usage. And what that means to me as a therapist is that this is expanding beyond "I feel bad that I watched something that I wouldn't want my kid to know that I watched," à la Mike Johnson, or "I feel bad that my wife is upset that I watched porn." It's not about the behavior. If they're talking about shame, it's about who they are. It is about the fact that they consider watching porn to make them dirty or bad or less than. And that, to me, is incredibly problematic, because none of us are perfect humans. Humanity is fallible. That is part of what makes us wonderful, and that is part of why the world is, you know, every day a new adventure. If everything were ideal all the time, it would be so boring. We want fallibility. We want mistakes. We want the opportunity to learn and grow. But what that means is that we need to be talking about our behavior, not our identities. And so when people say they think someone should feel ashamed because they use porn, they're not actually saying, "I feel like your behavior has missed the mark and you can do a little bit better." They're really saying, "You are not enough. You are a problem. You are bad, you are dirty, you are sinful." And those are messages that, when they get internalized, become incredibly, incredibly difficult to unlearn.
So, no, shame isn't an effective learning tool, unless what you want the other person to learn is that they aren't good enough and probably never will be. Guilt can be a healthy emotion. If your relationship agreement with your spouse says, "We agree we are not going to watch erotic content," fine. That is between you and your spouse. But I would want my clients to feel guilty about the fact that they violated the relationship agreement, not ashamed of who they are. And that's a really, really important distinction. Good people can make mistakes. Good people can break agreements. Good people can fail to live up to the ideals and values they espouse for themselves, and that is part of what it means to be human. But the minute we bring shame into the equation, we're no longer talking about good people who made a mistake. We're talking about labeling people as bad, and I will always draw a hard line there. Now that I have stepped off that soapbox: you asked me, what's the issue with porn? And that could be an entire episode in and of itself.
Wolf: So maybe it needs to be an entire episode of itself. I know we are getting long, but my very broad question for you as the therapist on the line. The question for me as a technologist is: would I trust this company? No, I would not. They seem to be small. They seem to be underfunded. They have a limited security and privacy team. Oh, I didn't tell you this. In the past two years, they've lost about 20% of their workforce. So one in five people with knowledge of how to run this has left.
Stefani: So what is that? Well, first of all, have they been replaced? And second of all, what does this mean?
Wolf: That I don't know. I found a number from 2021 and then cross-referenced it and found that they had lost a significant number of employees. I don't know if it was that the company wasn't making enough money; I have no idea what was happening. But one in five people with knowledge of how to run the system are gone. So that's another factor I always look at: has the company lost a whole bunch of people? Then you're probably at risk. So the question for the technologist is: do I trust this company? Nope. Nope. And I hope, if you are listening and trying to figure out how to evaluate things, we've left enough crumbs for you to know how to do it. The question for you, my love, as a therapist, is: would you recommend tools like this for people who are grappling with whether or not to watch pornography, or whether or not to have erotic content in their relationships?
Stefani: I can understand the thought process behind these products, right? To come back to my terrible, terrible drug-use metaphor: if somebody is struggling with substance use in, you know, a 12-step model, they have a sponsor. They have somebody that, when they are in the throes of their worst struggle, they can call 24/7 and say, "I really, really want to use right now," and that person will respond. The idea of an accountability partner makes a lot of intuitive sense, and I think that for people who really struggle with any problematic behavior–be it porn or anything else, I don't think we need to limit it to porn–having somebody they trust who can be that sort of support person for them in the moment can be really helpful for certain personalities.
Wolf: We oftentimes talk about tech mediating between people, and in this case the technology, this company we're talking about, is like a third person in the room, right? So I hear you about accountability buddies and accountability partners. But do you need to have a third person in that room?
Stefani: No. And that was where I was going with that: I don't think technology is the best way to do this. I talk a lot about authentic relationships, and a lot about how technology can get in the way of being our best, most authentic selves; how technology can provide the illusion of connection rather than real, genuine connection. And I think products like this do exactly that, because at no point are Mike Johnson and his kids sitting down and having a conversation about their actual struggles. They're not saying to each other what I might say to my 12-step sponsor, if I had one, which is, "Last night was really hard, and I really wanted to call you," and then having somebody sit with me and go, "Well, what got you through that? How did you navigate that? Where did you find your strength?" Instead, we have this app that's just sending screenshots into the void, assuming that the other person is paying attention, assuming that the other person will reach out. It's another sort of artificial intimacy that I think opens up the potential for way more problems for the individual users than it actually builds relationships between them or solves the problems they purchased the software to address.
Wolf: All right, problem answered.
Stefani: I am nothing if not opinionated.
Wolf: Well, on that passionate point, well said, thanks for tuning in to Securing Sexuality. Your source of information you need to protect yourselves and your relationships.
Stefani: Securing Sexuality is brought to you by the Bound Together Foundation, a 501(c)(3) nonprofit. From the bedroom to the cloud, we're here to help you navigate safe sex in a digital age.
Wolf: Be sure to check out our website, SecuringSexuality.com, for links to more information about the topics we've discussed here today, as well as information on our live events.
Stefani: And join us again for more fascinating conversations about the intersection of sexuality and technology. Have a great week.