Securing Sexuality is the podcast and conference promoting sex-positive, science-based, and secure interpersonal relationships. We give people tips for safer sex in a digital age. We help sextech innovators and toy designers produce safer products. And we educate mental health and medical professionals on these topics so they can better advise their clients. Securing Sexuality provides sex therapists with continuing education credits (CEs) for AASECT, SSTAR, and SASH around cybersexuality, social media, and more.
Links from this week's episode: Redefining Online Communities: A Focus on Privacy and Security
In the digital age, online sexual communities have become increasingly prevalent, offering individuals a platform to explore their sexuality, seek support, and connect with like-minded individuals. However, with the rise in online interactions, concerns regarding privacy and security have also emerged. This blog post aims to highlight the importance of privacy and security within online sexual communities, emphasizing their significance in fostering a safe and inclusive environment for all participants.
1. Confidentiality and Anonymity: One of the fundamental aspects of privacy in online sexual communities is ensuring confidentiality and anonymity for users. Many individuals hesitate to discuss their sexual preferences or experiences openly, whether out of fear of societal judgment or for personal reasons. It is therefore crucial to establish a strict policy that protects community members' personal information and identities. Users should be able to remain anonymous and control how much they disclose about themselves, allowing them to engage in open and honest conversations without fear of repercussions.

2. Consent and Boundaries: Privacy and security within online sexual communities also encompass respecting individuals' boundaries and consent. Consent is the cornerstone of any healthy sexual interaction, whether online or offline. Community guidelines should explicitly state the importance of obtaining consent and respecting others' boundaries. This ensures that members feel safe and empowered to express their desires, knowing their boundaries will be respected and honored.

3. Data Protection and Encryption: Online sexual communities handle a significant amount of personal data, including user profiles, messages, and sensitive information such as sexual preferences or health-related topics. Robust data protection measures are therefore essential to safeguard this information from unauthorized access or breaches. Encryption protocols and secure server infrastructure can significantly enhance the protection of users' personal data, providing peace of mind and fostering trust within the community.

4. Moderation and Content Filtering: Maintaining a safe environment within online sexual communities requires effective moderation to prevent the dissemination of harmful or non-consensual content. Moderators play a crucial role in ensuring that community guidelines are followed and that inappropriate or abusive behavior is promptly addressed. Content filtering mechanisms can also help identify and remove explicit or offensive content, creating a positive and respectful community space.

5. Support and Resources: Privacy and security in online sexual communities should extend beyond the digital realm. These communities need to provide access to valuable resources, such as educational materials, helplines, and support groups, which can assist individuals in navigating their sexual experiences, addressing concerns, and seeking help when needed. By offering a comprehensive support network, online sexual communities can foster a sense of belonging and encourage healthy, informed discussions.

In today's interconnected world, online sexual communities offer a valuable platform for individuals to explore their sexuality, seek support, and connect with others. Ensuring privacy and security within these communities is of the utmost importance. By prioritizing confidentiality, consent, data protection, moderation, and support, online sexual communities can create a safe and inclusive space for all participants to engage in meaningful discussions and experiences. Let us work together to build a digital landscape that respects and safeguards the privacy and security of everyone involved.
[00:00:00] : Hello and welcome to Securing Sexuality, the podcast where we discuss the intersection
of intimacy and information security. I'm Wolf Goerlich. He's a hacker and I'm Stefani [00:00:10] : Goerlich. She is a sex therapist. And together we're going to discuss what safe sex [00:00:14] : looks like in a digital age. Today we're joined by Apollyon, the founder of Submit. Uh, one [00:00:21] : of the few websites where I went and read their description and thought, Yes, like, [00:00:27] : from a security perspective, from a privacy perspective, I'm like, Oh, you guys are [00:00:32] : speaking my language. But as I say that, I'm sure everyone else is wondering, and [00:00:37] : I'll ask the question to you, Apollyon: What is Submit? Yeah, thank you. Thank you guys [00:00:41] : for having me on the show. So Submit is a new social network for BDSM and kink [00:00:47] : focused people. And our goal really was that we looked at what was there today. We [00:00:51] : said, these aren't really doing the job from a safety, security, privacy perspective. So [00:00:56] : why don't we go and build something new, taking those core tenets into account for [00:01:01] : what we want to build? And so Submit is a privacy-first and sort of extremely inclusive [00:01:07] : social network that's being built from the ground up for those communities. Also pretty [00:01:13] : damn secure. Yes, and I've got all the security questions, but I will pause [00:01:21] : and yield the floor to my lovely wife, who probably has all the kink questions. I [00:01:26] : am gonna tell you a story that our listeners have heard a couple of times, but I [00:01:30] : think it really emphasises why we were so excited to hear about Submit. Last summer, [00:01:35] : before the devastation of the joy that was Twitter, I got a message there. And [00:01:41] : somebody said, you know, Hey, I know your books. I know your work. You are [00:01:45] : an expert in BDSM and kink. I know that you work with kinky clients. I'm building [00:01:49] : a new kinky dating website.
I would love for you to share it with your people. And [00:01:53] : most people in my world, I mean, I'm a social worker by training, would be like, yes, new [00:01:58] : resource, so excited, can't wait to share. But I'm married to a hacker. And as [00:02:04] : many of our listeners have heard, I was suspicious, or as the children these days [00:02:09] : would say, sus. And I said, You know, let me think about it. Let me look at it, and [00:02:13] : let me get back to you. And I immediately signalled a friend of ours who's [00:02:17] : a penetration tester, and I said, You know, obviously you can't do anything you're [00:02:22] : not allowed to do. You don't have a contract with them. They're not giving you permission [00:02:25] : to do anything. But just poke at the stuff that's open to the public and let me know [00:02:30] : if this is something I can share with my clients, because my clients do desperately [00:02:34] : want and need a reliable kinky dating site. And it took less than 30 minutes for [00:02:40] : our friend to come back and be like, Well, here are their users' Pinterest profiles. And [00:02:44] : here are their users' Facebook pages. And here's their kids' soccer team information. And [00:02:50] : that's just what he was able to get, in faster than a pizza, only looking at what [00:02:56] : was publicly available, what he could do without getting into any no-no areas. And [00:03:01] : so I wrote back to the person and I said, You know, here are my concerns. And [00:03:06] : I just never heard from them again, ever. There was no response, and I suspect [00:03:12] : that site is still out there, and I suspect that there are other people that are [00:03:15] : propping up those sites. And that is why I think what you're doing is so important: because [00:03:20] : most of the time people don't think about it. They don't even know what questions [00:03:23] : to ask. And then people like our friends can find their kids' soccer teams.
I think [00:03:29] : that brings up a huge problem in the space, and one of the things that I saw, especially [00:03:34] : when I first did my research early on to say, Hey, is this something that we should [00:03:38] : spend the time to build, and should I go and risk my professional reputation? [00:03:43] : You know, what we looked at and saw was, there's one place that exists [00:03:46] : today and we all know what it is, for the most part, right? OK, I've trained myself [00:03:51] : not to say it because we don't mention it on our platform, but it's FetLife. We [00:03:54] : all know what that is, and it's there. And we looked at that and said, OK, what are [00:03:59] : the competitors that have come by since then? And they're all just one-off people [00:04:04] : that had this idea in their basement. And they're like, Oh, I really want to do this [00:04:07] : thing, but I don't know how to do it. And they go online, they search for a WordPress [00:04:10] : plug-in that does social networking or dating, and they toss that up, like, OK, I [00:04:15] : have this great website, you guys should all go to it, not taking into account what [00:04:19] : security is, what data privacy is, or any of the things that are important to the [00:04:24] : people in this space and that they need from that website. They don't have that understanding. They [00:04:28] : think that they're going to toss this thing up in a week or a weekend, they're going [00:04:31] : to make some money, and it's going to work and it's going to be great, and they're [00:04:33] : going to be the new king of their social network. And as we've seen, the road [00:04:37] : to where we are today is littered with, I don't know, probably a couple of hundred attempts [00:04:40] : to do this, but no one's ever actually really managed to dethrone what exists today. So [00:04:45] : our approach was quite different.
And we started two years ago from: let's start [00:04:50] : with privacy first, security first, and actually, to be perfectly fair, legal first. We [00:04:57] : started with a very large legal framework before we even started writing code. And [00:05:02] : then from there, moved on to say, Hey, look, let's build something that's actually [00:05:05] : going to do what people want, and in a way that people want it. So when we're talking [00:05:10] : about BDSM and kink and, uh, the tech spaces that bring us all together, you [00:05:16] : know, one of the things, of course, I've learned from Stefani and her work is this [00:05:19] : concept of RACK, right? Risk-aware consensual kink. One of the things I wanted to [00:05:25] : ask you was, how do you see what Submit is building in that context, in terms of being [00:05:31] : risk-aware, in terms of being consent-first? Yeah, absolutely. That's [00:05:37] : a great question. Um, so one of the things that we did as we started to build out some [00:05:41] : of the technology that we have is, RACK actually kind of came in early on as a core [00:05:45] : concept of what we wanted to promote from an educational perspective, right? Helping [00:05:49] : people understand that there is risk inherent in just being on a social network [00:05:54] : that's associated with kink and BDSM, and trying to educate folks on that, trying to educate [00:05:59] : them: if you're sharing these things, you need to understand [00:06:03] : the risks that are associated with that. And then further to that, we see today, and [00:06:08] : I know we all probably see it, that no matter what, we get all sorts of unwanted, [00:06:13] : non-consensual sexual advances online in all sorts of different ways.
And so we looked [00:06:17] : at saying, Hey, how do we start this out by building on the right framework to prevent [00:06:21] : that from happening, so that when you are online and on our platform, A, you understand [00:06:25] : the risks that are associated with it, and B, you are fundamentally giving your consent [00:06:30] : to every single action and step that you take. And we have a system actually built [00:06:34] : in. Um, we have two systems that are kind of core to this, right? One is Vault, which [00:06:39] : is where we store all of your data, and then two, we call it Assent, and that's the [00:06:43] : consent system. And so every single time that you do something, you're [00:06:47] : actually giving consent to do that on the platform. We're training you to understand [00:06:50] : what that means and what that looks like for each step that you take. And we also [00:06:55] : explain, Hey, you're joining this group. Here's the data the group's going to see. Do [00:06:59] : you consent to join this group? You can revoke your consent anytime, and the second [00:07:02] : that you do, everything you've shared in that group disappears. It's gone. It's removed [00:07:06] : from the group. No one can see it. And so we're starting to work on bringing that [00:07:09] : process in. Same with messaging, same with, you know, every step that you take on [00:07:13] : the platform, um, as well as the education components that exist, or that we're currently [00:07:17] : building so that they will exist. So what does that look like in practise? What [00:07:23] : can users do on Submit? You have your standard social networking functions, so you [00:07:28] : can post, you can share photos, media, video. You can write educational programmes, you [00:07:34] : can share tasks. And some of these features, they aren't quite out yet, but they're [00:07:38] : things that we either have ready or are launching shortly. You can join groups.
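The consent-grant behaviour described here, joining a group is itself a consent action, and revoking consent removes everything you shared, can be sketched roughly as follows. This is a hypothetical illustration, not Submit's actual code; every class and method name is invented.

```python
from dataclasses import dataclass, field

# Illustrative sketch of a revocable consent-grant model like the
# "Assent" system described in the episode: every action is backed by
# an explicit grant, and revoking the grant removes the shared content.

@dataclass
class Grant:
    member: str
    scope: str          # e.g. "group:rope-basics"
    active: bool = True

@dataclass
class Group:
    name: str
    posts: dict = field(default_factory=dict)   # member -> list of posts
    grants: dict = field(default_factory=dict)  # member -> Grant

    def join(self, member: str) -> None:
        # Joining is itself a consent action: a grant is recorded.
        self.grants[member] = Grant(member, f"group:{self.name}")
        self.posts.setdefault(member, [])

    def post(self, member: str, content: str) -> None:
        grant = self.grants.get(member)
        if grant is None or not grant.active:
            raise PermissionError("no active consent grant")
        self.posts[member].append(content)

    def revoke(self, member: str) -> None:
        # Revoking consent removes everything the member shared here.
        self.grants[member].active = False
        self.posts[member] = []

group = Group("rope-basics")
group.join("alice")
group.post("alice", "hello")
group.revoke("alice")
print(group.posts["alice"])  # -> []
```

The key property is that content visibility is tied to a live grant rather than to the act of posting, so revocation is immediate and total.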
There [00:07:42] : are events, and all the standard sort of social networking features. Where we kind of [00:07:47] : go further is we introduce something called circles. And so let's say that you join a social [00:07:52] : network. Let's say you join Twitter today and you have 20 million followers. Well, that's [00:07:56] : it. You have 20 million followers, and the only way that you can share content is [00:07:59] : to those 20 million followers, or everyone. We take that a step further and say, OK, you [00:08:04] : can create a circle of people that you trust, or people that you only want to share [00:08:07] : with, and you can create up to, I believe it's six or 20 depending on the type of [00:08:11] : account that you have, circles that let you control exactly what you share. Because [00:08:15] : the reality is, I think, you know, there's an influencer culture that exists [00:08:21] : where people really want to have a lot of followers. But the reality of social networking [00:08:24] : for most people is it's moving smaller, to groups that we trust, to individuals that [00:08:28] : we trust, and smaller sets of groups. And so we wanted to give you the ability to [00:08:32] : have that control, to say, I only want to share this content with these people and [00:08:36] : nothing else. And so that's what our circles do. And that's sort of the [00:08:40] : key driver to building up the social privacy and social consent framework that exists [00:08:44] : within the platform. I am getting ready to give a talk this coming week. And, yeah, they [00:08:51] : asked me to talk about our work around securing sexuality. But usually when Wolf [00:08:56] : and I do that, it's something we do together. And, you know, I've given talks on [00:09:00] : 10,000 years of cybersex history, and I've given talks on how to be a more effective [00:09:05] : leader.
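The circles model described above, posts visible only to named circles rather than to all followers, boils down to a simple visibility check. A rough sketch, with all names and the circle limit invented for illustration:

```python
from dataclasses import dataclass

# Illustrative sketch of "circles": instead of broadcasting to every
# follower, each post is visible only to the circles it was shared with.

@dataclass
class Post:
    author: str
    text: str
    circles: set  # names of circles this post is shared with

class Account:
    MAX_CIRCLES = 6  # e.g. six on one account tier, more on another

    def __init__(self, name: str):
        self.name = name
        self.circles = {}  # circle name -> set of member usernames
        self.posts = []

    def create_circle(self, circle_name: str, members: set) -> None:
        if len(self.circles) >= self.MAX_CIRCLES:
            raise ValueError("circle limit reached")
        self.circles[circle_name] = set(members)

    def share(self, text: str, circle_names: list) -> None:
        self.posts.append(Post(self.name, text, set(circle_names)))

    def feed_for(self, viewer: str) -> list:
        # A viewer sees a post only if they belong to one of its circles.
        return [p.text for p in self.posts
                if any(viewer in self.circles.get(c, set())
                       for c in p.circles)]
```

For example, `Account("alice")` sharing a post to a "close friends" circle containing only "bob" means `feed_for("bob")` returns the post while `feed_for("eve")` returns nothing.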
But this is actually, weirdly, my first ever mental health talk for technologists, [00:09:12] : and it left me a little hung up. And so I have been thinking about this [00:09:16] : idea of loneliness as a vulnerability, and of patching for loneliness, and of coding [00:09:21] : for authenticity. And one of the things that Wolf and I were talking about earlier [00:09:26] : today was, you know, social media is designed to keep people engaged. It's designed [00:09:30] : to keep people there, you know, it's clicks and views and scrolls, and that's how [00:09:34] : they monetize us. And, um, we were jokingly saying, you know, one [00:09:40] : of my messages needs to be that people need to be coding to make things slippery, not [00:09:44] : sticky; code to get people offline and engaging in the real world. That's really funny [00:09:50] : that you brought that up. So we have a huge Discord server [00:09:53] : where a lot of our community spends their time, and we open up everything that we [00:09:56] : do to feedback and say, Hey, we want your feedback. Tell us what you think is working. Tell [00:09:59] : us what you don't think is working. And one of the goals that we had was, I didn't [00:10:02] : want to build something that created, like, extreme dopamine loops. I don't want [00:10:07] : you to get addicted to doing certain types of behaviours, right? Looking at, Oh, this [00:10:10] : is how many likes I have, or this is what's happening, and I really need this [00:10:14] : next hit of dopamine from the social interaction that I'm getting on the platform. So [00:10:18] : we saw pushback when we started to design systems that didn't do that, that [00:10:22] : would let you not build that addictive loop into it.
And the second that we [00:10:26] : took away some of those things that normally they're expecting to see, um, you know, targeted [00:10:30] : notifications, the ability for you to really get in-depth details as to your engagement, [00:10:35] : driving that dopamine loop, as soon as we took that away, we saw people start [00:10:40] : to revolt. First of all, because they'd be like, Wait, where are all my things that [00:10:43] : I'm used to getting? And why aren't I addicted to this? I want to be addicted to [00:10:45] : this. But as we started to push past that, people then actually started to see, [00:10:51] : Hey, I can have real interactions again, and I can start to actually talk to people [00:10:54] : in a way where I feel comfortable and safe. And I'm not just chasing that next like, [00:10:58] : and I'm starting to post content that's actually relevant [00:11:01] : and means something to me. And so that's sort of where we were at with that, right? We [00:11:06] : don't want to build those loops just for the sake of building them. Yeah, that [00:11:10] : authenticity is so hard to find, and I say that as a person who's still a recovering [00:11:15] : Twitter addict. I get that for sure. And I'm seeing some of the same things [00:11:22] : on Instagram. So we've got this meme account, and Stefani and I [00:11:27] : and a couple other people share posting on it, and I drive her nuts, like, Oh, my [00:11:31] : post got, like, three more followers. She's like, What? Put your phone down, [00:11:35] : just back away. But one of the things you said earlier reminds me of this account, [00:11:42] : because I am clueless. I am, uh, a guy who moves through the world like a guy.
And [00:11:49] : so someone messages me and I'm like, Oh, they're being friendly, and they send me [00:11:52] : something and I'm like, Oh, that's kind of cool. And I say this to her and she goes, And [00:11:56] : then they sent you the dick. And then that's where it went. I said, I thought [00:12:01] : he was just being nice, just out of the blue, like, I'm doing fine. How [00:12:05] : are you? Because there's no gender, you know, they don't know. This is why you need [00:12:10] : their bot, Dick Detective. Is that what you built? Yeah, so we have a bot, and [00:12:15] : to this day, I still might regret naming it, but I named it, [00:12:20] : and I'll tell you a funny story about that in a moment. But what the bot [00:12:25] : does is it, uh, scans all media and messages prior to them being sent, uh, for genitals. And [00:12:31] : so if you try to send someone an unsolicited dick pic, you can't. If [00:12:35] : you try and go into a group thread and post your dick pic, you can't. We're going [00:12:38] : to catch that and say no. We might even give you, like, a little bot rating, being [00:12:43] : like, you know, you might wanna maybe try, like, penis enlargement, I don't know, it [00:12:46] : could work for you. We're just going to start to shame them a little bit, [00:12:49] : just to start to discourage this type of behaviour from happening. But the key [00:12:53] : goal is that we don't want a platform full of people trying to send around unsolicited [00:12:58] : genital photos. It doesn't necessarily just have to be dick pics. It happens with other [00:13:01] : things, too. Obviously not as much as with men. I don't know why, but we just seem to [00:13:06] : be proficient at really sending dick pics. Um, so I'm sorry for that, but we basically [00:13:12] : have worked really hard to stop this process from happening.
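The bot's essential design point, scanning media before delivery so flagged images never reach the recipient, can be sketched as a pre-send hook. Everything here is hypothetical: `classify_image` is a stand-in for whatever real nudity-detection model a platform would actually use, and the byte-prefix check is only a toy placeholder.

```python
# Illustrative sketch of a pre-send moderation hook like the
# "Dick Detective" bot described above: media is classified *before*
# delivery, and flagged images are blocked rather than delivered.

def classify_image(image_bytes: bytes) -> str:
    # Toy placeholder; a real system would call an ML classifier here.
    return "genitals" if image_bytes.startswith(b"NSFW") else "ok"

def send_media(sender: str, recipient: str, image_bytes: bytes,
               outbox: list) -> dict:
    label = classify_image(image_bytes)
    if label == "genitals":
        # Blocked before delivery; the recipient never sees it.
        return {"delivered": False,
                "reason": "unsolicited explicit media blocked"}
    outbox.append((sender, recipient, image_bytes))
    return {"delivered": True}
```

The design choice worth noting is that the check sits in the send path, not the receive path: the burden falls on the sender, and the recipient never has to see, report, or filter the image after the fact.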
We also, unfortunately, [00:13:18] : named our bot in our Discord Dick Detective, because that's our bot. And so when [00:13:21] : you started to get your early access invite to Submit, it was from Dick Detective. And [00:13:26] : so some people didn't know what that was, and it was really fun to watch a couple [00:13:31] : of thousand people be like, Why is this bot named Dick Detective, or Dick Doctor, or whatever? And [00:13:35] : we used to get these hilarious names, like, The dick guy messaged me. I'm like, Oh, no, here [00:13:39] : we go. Yeah, and I know that there are some other places that are starting to do that, too. Dating [00:13:46] : sites specifically have been working pretty hard on that. So I have to tell you, I [00:13:51] : have a friend who, in our single days, when she got unsolicited dick pics, would send [00:13:57] : back pictures of, like, cheese cubes and almonds. And when she got the inevitably confused [00:14:03] : response from the sender, she would just say, Oh, well, clearly you love Vienna sausages, and [00:14:09] : I thought we were sharing our favourite hors d'oeuvres, so I sent you cheese and [00:14:12] : almonds. And it was just, like, such a weird, what's the term? I don't know, it's [00:14:17] : just, like, out of left field sort of thing. But it slides in that subtle little, you've [00:14:22] : got a penis like a Vienna sausage. I admired [00:14:26] : the finesse of that response. The cheese move, it always gets them. Are there [00:14:33] : other things that you are putting into the platform? You know, as you mentioned, like, [00:14:37] : maybe I pop up a message. I'm reminded of, uh, behaviour science. I'm reminded of [00:14:42] : things like, uh, nudges, right? We want to give people lots of freedom within the [00:14:47] : platform.
But we also want the platform to, um, display affordances in terms of what [00:14:52] : people should do, as well as, you know, gently, and sometimes not so gently, [00:14:58] : push back when they should do things. Are there other things [00:15:02] : that you're doing within the same realm as your Dick Detective bot to, uh, to [00:15:07] : help the community? There are. Um, we have a whole bunch of different things, and [00:15:10] : we work with a couple of really great partners to help us in this. I don't know if [00:15:13] : you've heard of Thorn? They run Safer, and they're an anti-child-exploitation organisation. Uh, they [00:15:18] : build, like, really amazing technology to help with child safety online. And [00:15:23] : so one of the things that we worked with them on was, first, of course, um, you know, our [00:15:27] : automated scanning system: any media that you put into Submit, we obviously [00:15:30] : scan to make sure that it's not CSAM or any of that type of content. So we [00:15:35] : flag for that immediately. But then we also do a couple of other things. So one, we [00:15:39] : have a new tool that we've been working on pretty hard, um, for grooming detection. So [00:15:44] : it helps us figure out when someone's trying to groom, specifically groom, um, [00:15:49] : mostly in a sex trafficking type of way. So we look for that type of content that's [00:15:53] : coming across the platform. We also scan for, um, all manner of hate speech. I mean, [00:15:59] : the engines that we use for that are just really complex at this point, because, [00:16:03] : unfortunately, hate speech is really complex. So we, you know, we will give you warnings, too. So [00:16:09] : if your text is starting to kind of skew towards that, we'll say, Hey, this [00:16:12] : kind of looks like it might be hate speech, and maybe we shouldn't be doing this.
Um, and [00:16:17] : we then basically kind of gently nudge you. And then if you start to really [00:16:21] : get down that road, we will time you out, or worse, we'll ban you. And one of the things [00:16:26] : that we've kind of made clear to everyone that's in our community today: being on [00:16:31] : Submit is a privilege. And if it's something that you choose to abuse, you don't [00:16:35] : need to be on Submit anymore. We're not shy to throw you off the platform, and we [00:16:39] : are transparent about it when that happens. We have a transparency reporting section. You [00:16:43] : can go check it out. You can see who we banned. We'll tell you why they were banned, [00:16:46] : what happened, what was going on. Sometimes we don't identify them, because [00:16:51] : either there's law enforcement involved or there's an active investigation going on, and [00:16:56] : that's already happened a few times. But for the most part, you know, we'll [00:17:00] : call out why you got thrown off the platform. So I'm on the Fedi now, because [00:17:06] : I've recovered from Twitter by moving on to Mastodon. Of course, as one does, one [00:17:10] : does. And, uh, a gentleman, a friend of mine who's running some of the servers, [00:17:16] : said something, and I want to run this by you. He said that the product that social [00:17:21] : media provides the users, the product is the moderation. I think if we looked at it [00:17:28] : five or six years ago, we would say that the product was the dopamine hit and the [00:17:31] : addiction. But now we've moved past that; we start to realise that we want safer places [00:17:35] : to be and to talk, and places to feel comfortable. So, yeah, I do think now, in this [00:17:39] : day and age, it is moderation for sure, and that's a huge focus for us.
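The escalation ladder described here, warn first, then time out, then ban, is essentially a small per-user state machine. A minimal sketch; the thresholds and names are invented for illustration and are not Submit's actual policy:

```python
# Illustrative sketch of a warn -> timeout -> ban escalation ladder.
# Each flagged violation moves the member one step up the ladder,
# capping at a permanent ban.

ACTIONS = ["warn", "warn", "timeout", "ban"]  # escalating responses

class ModerationRecord:
    def __init__(self):
        self.strikes = 0

    def flag(self) -> str:
        # Pick the response for the current strike count, then escalate.
        action = ACTIONS[min(self.strikes, len(ACTIONS) - 1)]
        self.strikes += 1
        return action

rec = ModerationRecord()
print([rec.flag() for _ in range(5)])
# -> ['warn', 'warn', 'timeout', 'ban', 'ban']
```

Keeping the ladder as explicit data (the `ACTIONS` list) rather than branching logic also makes it easy to publish, which fits the transparency-reporting approach described in the episode.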
How do we [00:17:45] : do that in a way that makes people feel safe and also included? When I teach [00:17:53] : about, um, negotiating different desires with my couples, I talk about building [00:17:59] : fences, not walls; about defining what the area of play is instead of creating barriers [00:18:05] : that are off limits. And that can be really difficult to do online, because we want [00:18:11] : our users to be safe. Hell, if I'm the user, I would like to be safe. Please, somebody, somewhere, [00:18:15] : make me safe. Um, sorry. Woman on the internet. It came out. Um, but at the same [00:18:23] : time, you know, we look at things like Instagram, where moderation happens in such [00:18:30] : a sometimes heavy-handed way. The actual dick pics get through and Oscar Wilde quotes [00:18:35] : get censored. And I'm wondering how you do that work of building fences without putting [00:18:41] : up barriers to engagement. That's a fantastic question. And it's one of the [00:18:45] : reasons why we actually started early on. Um, when we first, so we actually launched [00:18:51] : accidentally, kind of, in February. Um, not intentionally. We were like, Oh, we're [00:18:56] : only going to have, like, 500 people, and it'll be no big deal. We have over 100,000 [00:18:59] : on our waitlist now, and it's insane. But one of the things that we started early [00:19:03] : on was that we said, Hey, we need to collect community feedback and get community [00:19:07] : engagement in our policy development and in some of our moderation development. And [00:19:11] : so what we started out with early on was a large set of discussion forums [00:19:15] : around the different types of guidelines and fences that we want to set up, and then [00:19:19] : collect community feedback on that.
And so one of the reasons why we built the transparency [00:19:22] : reporting system was so that when we do take moderation actions, we can collect feedback [00:19:27] : on them, to say, Hey, this feels like it might be word policing. Like early on, uh, [00:19:31] : if you went back to, like, February, we were really, really great at word policing. Like, [00:19:35] : we could have been word police Olympic champions for a couple of weeks there. And [00:19:38] : as we learned and got better at it, we said, OK, we're not building fences at this [00:19:43] : point; we're building monster-size walls, and we're not doing a good job of it. And [00:19:46] : we stepped back and scaled back from there. And so, as we kind of continue to [00:19:50] : build that out and build those processes out, we continue to go back to the community [00:19:53] : and say, Hey, this is what we're doing, this is how we think it should work, and [00:19:56] : then you guys give us your feedback and make sure that we're doing this in a way [00:20:00] : that makes you feel safe. One of the big pieces that, of course, is important to [00:20:04] : us is safety for women on the platform, right? It's one of the key reasons why we [00:20:08] : started building Dick Detective and some of the other types of tools that we have [00:20:11] : to help prevent, you know, people stalking you, unsolicited pictures, um, all [00:20:16] : the types of horrible behaviour and shaming behaviour that happen from just, you [00:20:20] : know, assholes that are on the Internet. And so we work really hard to help prevent [00:20:24] : that and create those experiences where you still feel safe, but you don't feel like [00:20:28] : you have some sort of overburdened level of policy and guidelines that is sort of [00:20:33] : limiting your ability to share. There are some exceptions to that, mostly [00:20:38] : around legal things.
So things that are basically illegal, or things that will, like, absolutely [00:20:43] : get us kicked off of, um, our infrastructure providers, certain types of content [00:20:48] : we just can't support. We have to put a wall up. But for the most part, we're pretty [00:20:52] : good about making sure that we go back to the community and say, Hey, this is the [00:20:56] : new thing that we did. Tell us if we did it right, tell us if we did it wrong, and [00:20:59] : help us sort of shape that. And it's worked out really well so far. I love that. I [00:21:05] : feel like I'm gonna have to join the site. Yeah, I mean, we would love to have [00:21:11] : you. So jumping back to something you were saying earlier, you were talking about Assent, this [00:21:16] : idea of consent grants and, um, the ability to say, I wanna share this [00:21:23] : here, or when I join that, that's gonna open me up over there, and those sorts of [00:21:28] : things. The other thing you mentioned was vaults. Can you talk to me a little bit about [00:21:32] : what a vault is? Yes, absolutely. So traditionally, today, when you join a [00:21:37] : social network, if you step back and look at what the social network itself [00:21:42] : does, all of your data is stored with everybody else's data. It's mixed together [00:21:45] : in a giant table in a database, and all your comments, essentially, you know, everybody's [00:21:50] : comments are all stored together. Everybody's photos are all stored together. They [00:21:53] : mix and match everything, and then they kind of just pull it out, right? That's traditionally [00:21:56] : how we've built software for the last 30 years or more, because it's efficient. It's [00:22:01] : what we know how to do. But what that doesn't do is, it makes it really difficult [00:22:04] : for you to feel like you have a sense of ownership around the things that you're [00:22:08] : actually posting on the platform as well.
It can also be really difficult, as we see in other places, for platforms to go back and show you everything you've actually posted, and then remove everything you've posted if you want them to. So we took this and flipped it on its head, and we said, all right, you know what? Everything that you post on the platform is going to exist in one place — your vault — and that's it. It doesn't exist in a bunch of shared tables; it's just in your vault. So when you create an account, you set up a new vault that exists in our system. When you set up your account, you also create a set of keys that let you access that vault. It's a public-key system: a public and a private key get generated, they're associated with the passphrase you use on the platform, and we sign everything that you post in that vault, as well as generate a grant for every type of content you create. So any time you post a comment, a video, a photo, whatever it might be, we generate an underlying cryptographic grant that validates that the content hasn't been modified and that you've given us permission to post it there. You can rip those grants away, and the content will disappear from the platform entirely, but it'll still live in your vault until you choose to remove it. You can also revoke the platform grant, and if you do that, everything you have in your vault disappears from Submit, but it stays in your vault until you choose to remove it. You can then reuse it on the platform at a later point in time if you want.
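The grant mechanics described here — content living only in a personal vault, each item signed, with a revocable permission for the platform to display it — can be sketched as a toy model. This is purely illustrative: Submit's actual implementation isn't public, the `Vault`/`post`/`revoke` names are hypothetical, and a real system would use asymmetric signatures (e.g. Ed25519) where this sketch substitutes a stdlib HMAC:

```python
import hashlib
import hmac
import secrets

class Vault:
    """Toy model of a per-user vault with revocable platform grants.
    An HMAC over the content hash stands in for the real signature."""

    def __init__(self):
        self.signing_key = secrets.token_bytes(32)  # stands in for the user's private key
        self.items = {}    # content stays here even when grants are revoked
        self.grants = {}   # item_id -> grant tag authorizing the platform to show it

    def post(self, item_id: str, content: bytes) -> None:
        """Store content in the vault and issue a grant for the platform."""
        self.items[item_id] = content
        digest = hashlib.sha256(content).digest()
        self.grants[item_id] = hmac.new(self.signing_key, digest, "sha256").hexdigest()

    def revoke(self, item_id: str) -> None:
        """Rip the grant away: the platform loses permission, the vault keeps the content."""
        self.grants.pop(item_id, None)

    def platform_can_show(self, item_id: str) -> bool:
        """A valid grant both authorizes display and detects content tampering."""
        if item_id not in self.grants:
            return False
        digest = hashlib.sha256(self.items[item_id]).digest()
        expected = hmac.new(self.signing_key, digest, "sha256").hexdigest()
        return hmac.compare_digest(self.grants[item_id], expected)

v = Vault()
v.post("photo-1", b"...jpeg bytes...")
assert v.platform_can_show("photo-1")
v.revoke("photo-1")
assert not v.platform_can_show("photo-1")  # gone from the platform...
assert "photo-1" in v.items                # ...but still in the vault
```

The design point the sketch captures is the separation of storage from permission: deleting the grant is cheap and instant, while the content itself never leaves the user's vault until they remove it.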
Everything on the vault — or sorry, in the vault — is stored encrypted by default. We leverage the keys that we generated for you to do that; every vault has its own individual key, along with your own passphrase. We don't backdoor them, so we can't see what you do.

I love this part, because when people ask, what happens if you take my encrypted backup, I say: nothing, because I can't access it. I don't have your private key; I don't know what's in there.

Now, we do scan everything before it goes into the vault. We're making sure that you're not creating a vault full of CSAM or something like that, which we don't want. That will never change; we will always do that. We have to. I think it's about the only reason law enforcement still lets us exist as a platform. We've had some fun meetings with the DOJ. Originally, we were going to build Submit as a part of the FTA, and that quickly came back as, no, no, you should not do this unless you want to spend a long time in jail. And I was like, OK, so centralised moderation is important. That was a key driver. It's also probably one of the reasons why, right now, we don't have end-to-end encrypted messaging. We may or may not do it in the future — we could trivially support it in about three seconds. It's something we want to get to, but we're still working with some of the — what's the word?
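The "encrypted backup is useless to us" claim above rests on the vault key being derived from the user's passphrase rather than stored by the operator. A minimal sketch of that idea, assuming a standard password-based KDF (PBKDF2 here; the platform's actual KDF and parameters aren't stated in the episode, and `derive_vault_key` is a hypothetical name):

```python
import hashlib
import secrets

def derive_vault_key(passphrase: str, salt: bytes, iterations: int = 200_000) -> bytes:
    """Derive a 32-byte per-vault key from the user's passphrase.

    The salt is unique per vault and can be stored in the clear alongside it;
    without the passphrase, the operator cannot reconstruct the key.
    """
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, iterations)

salt = secrets.token_bytes(16)  # generated once, when the vault is created
key = derive_vault_key("correct horse battery staple", salt)
assert len(key) == 32
# A different passphrase yields an unrelated key, which is why a stolen
# encrypted backup tells an attacker (or the operator) nothing:
assert derive_vault_key("wrong guess", salt) != key
```

The same derivation run on the user's next login reproduces the key deterministically, so nothing secret ever needs to live server-side.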
I'm looking for — some of the realities of running an adult-focused platform, along with the current government views of platforms like this. So we're working through that.

So, as we're talking through all this, I'm sure my wife is going, uh, OK —

Don't underestimate me. I have been hanging out with hackers for a long time now. Just because I can't do it doesn't mean I don't understand it. I am very adjacent at this point. As you lay all that out, I'm very excited, because, given the hacker mindset, that is what I would want to build. I would want to build a lot of exactly what you're saying: that public/private key pair, having it only in the hands of the users, having the ability to revoke consent from the platform and from others — everything you just laid out. Which makes me wonder, along this journey, how are you making sure it continues on the right trajectory, and that you're avoiding all the common mistakes that can happen when you're writing code?

Oh, 100%. So internally — well, now we have quarterly audits from an external security provider. We go back and forth between Mandiant and a few others that we use, as well as a couple of smaller but more niche firms that specialise in cryptography and specific peer-to-peer types of communication, because some of the underlying stuff was originally built for decentralisation. Some of that tech still exists, so we also test against that. We have an internal DNS resolution system for the ability to decode usernames back into vault addresses if we need to. So one of the things that we do is, each quarter, we have a security review.
We have a penetration review that comes in from those experts; they give us a report and we work through it. One of the goals we have, starting in — 2020… what year is it? 2024 — is that we'll be releasing transparency reports together with those security reports, as well as an audit report that we'll be doing annually. We're still in the middle of launching, so we're not quite ready to do this yet, but the initial goal is that we'll move into security transparency reporting in about early 2024, with strong details about exactly how each quarterly review went, what we had to fix, what we had to change, and what kinds of vulnerabilities might have existed for a period of time. And we will be opening up a penetration testing programme and a reward programme — a bug bounty. I'm trying to remember what legal said, but I think it's the beginning of Q4 this year when it turns on. So it's a huge priority for us. I mean, we're not perfect. I'm an engineer; I don't write perfect code all the time. I'm perfectly aware of that. Our team today, they're all great experts in their own fields, but none of us are perfect. And so it's important that we get external validation as often and as regularly as we can.

I think perfection is an unreasonable standard. I don't know that anybody moving in the world of online everything today expects perfection. That's why Target leaked my bank information. Everywhere we go, there are vulnerabilities, and I think that people are looking for exactly what you're describing.
They're looking for conscientiousness. They're looking for genuine concern. They're looking for people who don't consider safety and privacy to be boxes to tick off in order to get the next round of funding, but things that are really core elements of what they want for what they're building. And that, I think, is a really powerful thing to hear you describe. I don't hear that very often.

Well, I think it all stems from the intent that someone has for what they're building, right? If I had set out with the intent to build Submit as something where I was going to go raise money, where I was going to make hundreds of millions of dollars as a social network, then my intent would be to build it as fast as I can and treat security and privacy as those checkboxes. That's not our intent. We know a platform like this is niche. It's never going to make tonnes of money; it's not designed to make tonnes of money. It's designed to hopefully break even one day and support itself, and that's the extent of what it's designed to do. And because we understand that, because we set out with that goal, we get to avoid the trap of saying, oh, I need to move as many features forward as fast as I can to get that next fundraiser, or to get that next investor on board. We don't want that; we're not taking money. We want to build things right. We took two years to get a fraction of what we actually want out of the door, because we're building it correctly — at least as best as we can — and then validating that we're doing it. And you're absolutely right:
Perfection is impossible to attain, but in the case of Target, there were also things they could have just done that would have prevented a lot of that, and they're pretty basic. So there's some stuff where I think we need to have a basic standard of "this is OK and we should be doing this." And then there's some stuff where, yeah, a zero-day comes out tomorrow, I don't know anything about it, and it affects a whole bunch of my infrastructure — sure, there's absolutely nothing I can do about that. Nobody knew it was coming, and that's going to be the reality of what happens. That's why we have cyber insurance and E&O insurance. But the core things that we know we can stop, we can do that, and we should do that. Those should be our focuses, those should be our priorities, and in our case they always are — which is why we also slow features down.

The emphasis on the community and doing the right thing is something you don't often hear. And even when you do hear it, sometimes it runs smack into the realities of business, right? And one of the things that I think was a mistake — I think it was a mistake — was that the early web and early social media, trying to figure out how to monetize, immediately went to advertising, which immediately went to "how much data can we scrape," which immediately led to a lot of downstream issues. How are you working to monetize the platform and still achieve your values?

Yeah, for sure. This is a great question, and we did —
Early on, I did a feasibility analysis to basically say, hey, what's the reality of us being able to actually hit break-even and afford what we want to do? It took a lot of research and a lot of time — probably six months to really figure out. But the goal was to build something that wasn't ad-supported. We weren't collecting data or selling ads in any way, shape, or form; we weren't selling any type of data. So, one: when you join Submit, there are no ads, period. Two: we don't sell any data. We don't even collect any data that would be helpful to sell, because we don't care about that kind of information. Period. Anything that we do infer, you can see about yourself. So if we've inferred data about you on the platform to say, hey, we think you like these things, you'll be able to see that, and you can just remove it if you don't think we're right. And then how we monetize the platform is through what we call backing, which is basically just a subscription. The base cost is $6 a month, and the lifetime cost right now is $240 to just buy it out forever. We basically use that as our revenue driver. We also have some benefactors that help — folks that are just high-net-worth individuals. They have absolutely no influence, no guidance over what we do, but they've chosen to give us a larger amount of money to help. And then I personally put in enough to keep us through two years with just a little virtual staff going. And so we set specific goals each month, and we're getting about 50 to 60% of the way there most of the time.
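As a quick sanity check on the economics quoted here — the $6/month and $240 lifetime figures are the only numbers stated in the episode; everything else below is just arithmetic:

```python
# Backing prices quoted in the episode.
monthly_price = 6      # USD per month
lifetime_price = 240   # USD, one-time buyout

# A lifetime backer breaks even against the subscription after:
crossover_months = lifetime_price / monthly_price
assert crossover_months == 40  # i.e. 3 years and 4 months of backing
```

So the lifetime option is priced as a bet that a backer sticks around for more than about three and a third years, which fits the stated goal of break-even sustainability rather than growth-at-all-costs revenue.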
And our research showed that after about a year and a half of operation, we should be at the inflexion point where we have enough community support to drive the platform. Now, the cool thing is, 10 years ago this stuff was super expensive. Ten years ago, if I had wanted to go build a social network that held hundreds of thousands or millions of photos, it would have been super expensive. But today it's not nearly as bad as it used to be; it's pretty damn good. I mean, we spend probably $12–15,000 a month on CDN, and even just five years ago that would have been 10x that. So it's come down a huge amount. We use modern technologies to help us do that, and we continuously move — like, Cloudflare just released R2, which has free egress. And so, where we were originally saying, hey, video is probably going to be something that you have to pay for, now, with free egress, it doesn't need to be. Video can be something that's free. So we're looking at how we explore that and all those other pieces, because infrastructure providers keep making it easier and cheaper for us, and that gives us more freedom. We already have a fantastic group of backers — I think a little over six or seven hundred folks now that are either monthly subscribers or have backed lifetime — who have been a huge help at this early stage in proving out the viability of the model we wanted to prove. And so we've done a pretty good job so far.
We feel confident that that will be the right model for us moving forward, and, yeah, that's where we're looking.

You know, we are set up as a nonprofit, so we completely understand what it takes to try and build something not even a tenth of the scale of what you're doing. And the fact that you have found this community and built this support is really remarkable. It speaks to the power and the need of what you're building. And that, I think, is my ultimate point in my silly question: when you have people that believe in what you're doing, it really makes a world of difference, not only in your sustainability when things are difficult, but also in proving to the world that what you're doing is necessary and what you're doing matters.

A hundred percent. I've always known — many, many years ago now, just over 10 years ago, I used to work at FetLife, so I've seen it from the other side, and I've seen what those platforms do and how they operate. And when I left, I knew that I wanted to build something that people actually wanted to use, because even 10 years ago people were unhappy with what existed. It took me a little bit of time to get there, but I was blown away — even today, still — by how much demand there was, by how much people really wanted something that just took the core concepts of human decency, privacy, some semblance of security, and a level of transparency where we tell you what's going on: hey, this is what we're working on, this is why we're doing it, and this is what's happening.
And as soon as we started to do that, more and more people just started to join and get excited about it. I mean, we have over 100,000 people on our waitlist in less than six months — not even six months, four and a half to five months. And we expected to have maybe 1,000. We kind of ran out of the door going, yeah, maybe we get 1,000 people, that would be great, no problem. Now, had we had 1,000 people, it would also have been a lot easier. But that's a different story.

So you just said something that perked my ears up, but I want to be careful in how I ask it — not thinking of any specific platform or company, and certainly not naming any names, because there are lots and lots of kinky websites and dating services and apps and things out there now. Can you talk a little bit about what some of the unexpected or unknown risks to users are? Some of the things that, in your market research, in your analysis, in your own user experience, you've seen that other people might not know they're vulnerable to on some of these other platforms?

The biggest one that you're going to see, probably more than anything else, is going to be around your data security. The large majority of the platforms that exist today have used off-the-shelf things that are just insecure by default. They don't know how to set them up; they don't configure them properly. And then, to your point earlier, we can go find your kid's soccer event from your Facebook, we can go find your Pinterest accounts. Those types of platforms generally exist to make money.
And so the biggest problem that most of these platforms have today is that they're just trying to make a quick buck at the expense of your data, your security, and your privacy. The things that you really want to look for, if you're going to look for a social network or a dating site to join — and dating sites are almost entirely different — are, first, that they have real privacy policies and terms of use. What I mean by that is, you can tell, when you look at a privacy policy, whether it was written by a bot or a service, or whether time was actually taken to put it together correctly and truly explain what's happening with your data. The second you see that, you know the company is probably doing something at least a little bit right. And second to that is the terms of use: do they have one? Does it make sense? Is it something that was actually put together, or is it just something they stole from some cookie cutter? You'll be able to tell really easily, because it'll talk about the platform's features or it won't. And then, third, be careful what you share anywhere — even on our platform. There are bots that scrape every single dating site and platform known to man. We stop them as best as we can, but we can't stop all of them. No one can; anybody that tells you different is lying. It just isn't possible. So be comfortable and be OK with whatever you share, at least publicly, on those platforms ending up somewhere else.
At some point someone might steal it, a bot might steal it, it might get categorised. Now, on our platform, if you share something privately — within one of your circles, with only your followers — then that data can't be scraped, because those people aren't following you, they aren't in your circles. You're not going to add a bot to your circle — at least we hope not — and we have pretty good active technology to stop that from happening. So there are risks inherent in anything that you join, including us, and we try to make sure people are aware of that and are mindful of what they're sharing online. And then, on the dating sites: every woman that you talk to is basically a bot. I'm sorry, but, like, 99% of them are all bots.

I always go back to the Ashley Madison leak, where it was like 2 million cheating husbands and, like, three real women.

Yeah. I'm convinced that all dating websites are pretty much just a pool of real men and then, like, three real women and 4,000 bots.

Well, I've really enjoyed this conversation. We're running out of time, though, and I know you said you have a waitlist.

We do.

Can people get on that?

You can go to submit.gg, and you can get onto the waitlist and read more about the features of the platform.

Say we happen to have a connection — you know, if we know somebody that might have an insight on Submit — how long is that waiting list?

For those people, it can be a little shorter. We've got some buttons we can push.

All right. Thanks so much for coming out and talking to us about this.
Thank you, guys, for having me on. It's been great.

And thank you for tuning in to Securing Sexuality, your source for the information you need to protect yourself and your relationships. Securing Sexuality is brought to you by the Bound Together Foundation, a 501(c)(3) nonprofit, together with our conference sponsor, My Pleasure Plus. From the bedroom to the cloud, we're here to help you navigate safe sex in a digital age. Be sure to check out our website, securingsexuality.com, for links to more information about the topics we discussed today, and join us again for more fascinating conversations about the intersection of sexuality and technology. Have a great week.