Securing Sexuality is the podcast and conference promoting sex-positive, science-based, and secure interpersonal relationships. We give people tips for safer sex in a digital age. We help sextech innovators and toy designers produce safer products. And we educate mental health and medical professionals on these topics so they can better advise their clients. Securing Sexuality also provides sex therapists with continuing education credits (CEs) for AASECT, SSTAR, and SASH on cybersexuality, social media, and more.
Links from this week's episode: Empowering Users: The Role of Blockchain in Privacy and Consent
In recent years, the digital landscape has undergone a significant transformation with the rise of blockchain technology. Blockchain, originally developed as the underlying technology for cryptocurrencies like Bitcoin, has emerged as a powerful tool with vast potential to revolutionize various industries. However, as this technology becomes more integrated into our daily lives, it is essential to examine its implications for consent and privacy critically. Consent and privacy have become increasingly important topics in the digital age. With the proliferation of data breaches and privacy scandals, individuals are rightfully concerned about the security of their personal information. The concept of consent has taken on new dimensions, particularly with the advent of social media platforms and targeted advertising. As technology evolves, it is crucial to ensure that individuals have control over their personal information and can provide informed consent.
To understand the relationship between consent, privacy, and blockchain technology, it helps to start with the fundamentals. At its core, a blockchain is a decentralized and distributed ledger that records transactions across multiple computers. This decentralized design provides transparency and immutability, making it extremely difficult to tamper with the data stored on the chain.

Blockchain and Consent
One of the key advantages of blockchain technology is its potential to enhance consent mechanisms. Traditional consent processes often lack transparency and accountability. Blockchain-based systems, by contrast, can give individuals greater control and visibility over their data. By storing consent-related information on the blockchain, individuals can see who has access to their data and for what purposes. This transparency empowers people to make informed decisions about their data and to exercise their consent rights more effectively.

Privacy Implications of Blockchain
While blockchain offers promising solutions for consent, it also raises privacy concerns. Blockchain's inherent transparency can conflict with the protection of sensitive personal information. Public blockchains, which are open to anyone, can expose personal data to unintended audiences and compromise privacy. In addition, blockchain's immutability makes it difficult to rectify data breaches or errors. These concerns must be addressed to ensure the responsible and ethical use of the technology.

Emerging Solutions
Several initiatives are underway at the intersection of consent, privacy, and blockchain. Privacy-focused techniques such as zero-knowledge proofs and secure multi-party computation aim to provide privacy-preserving mechanisms while retaining the benefits of a shared ledger: they allow data to be verified without revealing the underlying sensitive information. By leveraging these techniques, individuals can retain control over their data while still benefiting from blockchain's transparency and security.

Regulatory Considerations
As blockchain technology evolves, regulatory frameworks must adapt to the challenges it poses for consent and privacy. Governments and regulatory bodies play a crucial role in establishing guidelines and standards that protect individuals' rights while fostering innovation. Striking the right balance between privacy and transparency in blockchain applications requires a solid understanding of the technology and its implications.

The intersection of consent, privacy, and blockchain technology presents both opportunities and challenges. Blockchain has the potential to enhance consent mechanisms and give individuals greater control over their data, but its privacy risks must be addressed through privacy-preserving designs and robust regulatory frameworks. By navigating this intersection thoughtfully, we can harness the power of blockchain while safeguarding consent and privacy rights in the digital age.
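To make the idea of consent records on a shared ledger more concrete, here is a minimal Python sketch of an append-only log that stores only salted hashes (commitments) of each consent grant, so the log can later prove a grant was recorded without ever exposing its contents. The class and field names are illustrative assumptions, not part of any particular blockchain platform.

```python
import hashlib
import json
import time

def commit(record: dict, salt: str) -> str:
    """Salted SHA-256 commitment: hides the record while allowing later verification."""
    payload = json.dumps(record, sort_keys=True) + salt
    return hashlib.sha256(payload.encode()).hexdigest()

class ConsentLog:
    """Toy append-only, hash-chained log; a real deployment would replicate it across many nodes."""
    def __init__(self):
        self.entries = []  # (commitment, chain_hash, timestamp)

    def record_consent(self, record: dict, salt: str) -> str:
        prev = self.entries[-1][1] if self.entries else "genesis"
        c = commit(record, salt)
        link = hashlib.sha256((prev + c).encode()).hexdigest()
        self.entries.append((c, link, time.time()))
        return c

    def proves(self, record: dict, salt: str) -> bool:
        """Anyone holding the original record and its salt can show it was logged."""
        return commit(record, salt) in (c for c, _, _ in self.entries)

log = ConsentLog()
grant = {"user": "alice", "recipient": "example-app", "scope": "profile-photo", "purpose": "display"}
log.record_consent(grant, salt="per-record-random-salt")
print(log.proves(grant, salt="per-record-random-salt"))  # True, yet the log never stores the grant itself
```

The sketch only shows the data-minimisation pattern: the ledger holds commitments and their chain, while the readable consent record stays with the person who granted it.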
[00:00:00] : Hello and welcome to Securing Sexuality, the podcast where we discuss the intersection
[00:00:04] : of intimacy and information security. I'm Wolf Goerlich. He's a hacker and I'm Stefani [00:00:10] : Goerlich. She is a sex therapist and together we're going to discuss what safe sex [00:00:15] : looks like in a digital age. And today we are joined by Maya Brooks. Maya. I have [00:00:20] : been super excited to talk to you, Wolf. Tell me a little bit more about Maya. So [00:00:25] : Maya is an award winning product manager, educator, speaker, entrepreneur which I? I [00:00:32] : love having entrepreneurs in this space on this podcast to talk with. Based in New [00:00:36] : York City. Uh, X financial analyst turned Web developer turned product manager, and [00:00:42] : I very rarely seen that path. So I love that, uh, focusing on the intersection of [00:00:49] : Fintech and Ed Tech. And then here's, uh, another part from a tech perspective that [00:00:54] : just gets me excited, focused in on privy photos and Privy labs, a Blockchain enabled [00:01:00] : platform to protect content leaks and abuse, which is, of course, exactly where we're [00:01:05] : going today. But one last thing. Co organiser of Blockchain New York City, the largest [00:01:11] : meet up of Blockchain enthusiasts, 100,000 members. I thought I was doing good. Like [00:01:16] : 1000 hackers. 100,000 members. So good to have you with us, Maya. Stephanie Wolf. Thank [00:01:23] : you so much for having me. And, uh, we're we're 10,000, actually at Blockchain NYC. So [00:01:28] : I don't wanna take extra credit for, uh, for 100,000. That would be an insane amount [00:01:35] : of people that we need to corral. Um, but thank you all both for having me. I'm really [00:01:40] : excited. So let's start with the privacy side. Let's start with the consent based [00:01:47] : Internet, right? That sort of like, in my understanding that the beginning point, how [00:01:51] : do we unwind what we've built? And I I've looked at the Internet for so long, and [00:01:57] : I loved the openness and the the frictionless way of sharing data. But of course, as [00:02:03] : we've all know and we've realised in the past couple decades, that has some real [00:02:07] : downsides when it comes to consent and privacy. Definitely. And that I love starting [00:02:13] : off with that question. So to speak, just a little bit about, you know, the way that [00:02:18] : I've been thinking about this for the last few years. Um I come from the B DS M world. I've [00:02:25] : been involved in the BDSM communities in New York for over eight years, and consent [00:02:30] : is a main topic not only just in that community, but now starting to become, thankfully, a [00:02:37] : broader topic in all sexual interactions. And I always felt that the types of things [00:02:43] : you learn from the BDSM community about consent really should be broadly applied [00:02:49] : to almost every facet of our life. Um, you know everything from how we talk to each [00:02:57] : other to what we how we interact with each other in relationships, what we share [00:03:02] : on the Internet and what we don't. How a company should have access to us and how [00:03:06] : they don't, um and those are just, you know, tenets of of kind of my core belief [00:03:11] : system. 
So when it came time to build what we're now calling Privy Labs, um, one [00:03:18] : of the main things we really wanted to focus on was kind of a categorization of all [00:03:23] : of us, every single one of us who uses the Internet every single day, basically, as [00:03:28] : a contributor to the Internet and because we are all contributors to this Internet [00:03:33] : and we make up kind of this broad fabric of things that exist, right websites and [00:03:38] : pages and clicks and photos and media links and all of the things that we put out [00:03:44] : on the Internet every day that we should have some say on exactly what happens to [00:03:50] : that information. What happens to that piece of media that you share with some company [00:03:56] : who then decides to put it on its server and then mine it for information and then [00:04:00] : sell that information to another company? There are so many kind of additional steps [00:04:05] : that we, as contributors to the Internet, are often not privy to, um or not really, uh, upfront. Opt [00:04:15] : in consenting to as we use the Internet. And so the way we define kind of a consent [00:04:20] : based Internet is really being able to have a 100% user owned, a 100% contributor [00:04:26] : owned Internet, in which you have full end to end ownership of every single piece [00:04:33] : of information that you share about yourself on the Internet. And I think, um, us [00:04:37] : and and a few other companies are really starting to make make waves make headways, um, in [00:04:43] : in that type of thinking, you know my my, um, area of professional expertise is B, DS, M [00:04:51] : kin and mental health. And I'm actually in nine days. Now it's officially a single [00:04:56] : digit countdown teaching a workshop here in Detroit called Becoming a Boundaries [00:05:00] : Bitch. That is basically it's consent and boundary setting. So everything you're [00:05:05] : saying, I love this and one of the things that I find myself talking about a lot [00:05:10] : in my own practise is this idea of informed consent and not just informed consent [00:05:16] : around. You know, what are the risks of, I don't know, mummification play, But what [00:05:22] : are the risks in your dating profiles? What are your risks on your kinky social media? What [00:05:27] : are the agreements that you're forming in your relationships before you co parent [00:05:32] : or buy a house together or get married, you know, informed consent at all layers [00:05:36] : and levels of the relationship and the the language of B DS M, I think is really [00:05:43] : lovely for that. And I'm curious how your experience in the community has informed [00:05:46] : your work in technology. Definitely. And that's a really great, um, jumping off point. I [00:05:52] : love that you said that Stephanie, um you know my experience in the B DS M community. And [00:05:59] : I haven't shared much on a public podcast about this yet. So I guess you'll you'll [00:06:02] : all be my first. Um, But this, uh, when I first got into bdsm, I didn't realise how [00:06:10] : often I actually never really said no. In a relationship or no, even in a in like [00:06:19] : a work setting in personal relationships, I just never really said the word No, like, almost, ever. 
Um [00:06:24] : And it became really, really a great exercise for me, extremely empowering to me, using [00:06:33] : and being kind of armed with the language that you get from being active in the B [00:06:38] : DS M community around, setting your own boundaries, the boundaries that are personal [00:06:42] : to you, agreeing upfront, like I would like to do these things. I would not like [00:06:47] : to do these things. I'm open to these things. I'm interested in these things. I'm [00:06:52] : into that or I'm not into that. Those are words I'd never used in any encounter in [00:06:57] : my life prior to that period. And when I reflected on that, like, I felt a little [00:07:02] : bit sad for myself, like, that's kind of that's kind of sad. I just been kind of [00:07:06] : going with the flow in my own life for the last. You know, however many, however [00:07:11] : many years, and to find a place where I felt like it was not only the encouragement [00:07:19] : but the expectation that everyone would be communicating on that same level was so [00:07:26] : freeing, so empowering. And I really think that the broader public has so much to [00:07:31] : learn from the B DS M community in that regard. And I think as as technologists, we [00:07:36] : have a lot to learn, especially as privacy tech a lot to learn from that type of [00:07:40] : language. And I think organisations that are leading the way in privacy are very [00:07:45] : much aligned with, uh, you know, user informed consent, not just opt out stuff, but [00:07:52] : like full opt in. Um, and I think that that setting up those tenants is is really [00:07:59] : important to making technology. Um, that is not exploitative of of its users. And [00:08:04] : I don't think that technology needs to be exploitative of users in order to generate [00:08:09] : value or to generate profit or generate returns. And I think that there's a little [00:08:13] : bit of like a of a mismatch that's kind of been happening around. Um, you know the [00:08:18] : language we use to describe, like, oh, selling user data or mining user data. And [00:08:23] : I'm like, I think there's a lot. There's a lot of ways to get information out of [00:08:26] : people, and sometimes you'll find if you just ask and allow them to tell you what [00:08:31] : they want to tell you, they they might actually, they might actually speak. But you [00:08:35] : have to make space for them to do that. Um, and I think, uh, I think we just have [00:08:39] : a lot to learn there. That has been one of the things that I have said for years, that [00:08:44] : I think the the mainstream or vanilla community could learn so much about just how [00:08:51] : kinky people talk about consent and I. I love that you say, You know, if you just [00:08:57] : ask you, sometimes you'll get more than you even expected. And it's true. And I think [00:09:02] : in technology, you know, we have these incredibly dense T OS or these just little, um, formative. Click [00:09:09] : here to accept all, and you don't even know what you're accepting. When if it were [00:09:14] : a no seriously, we want you to click through and tell us what you want us to have. I [00:09:20] : suspect more people would click more boxes than if it were somehow just the simple [00:09:26] : Yes, no, binary that it is right now. I agree. I agree. 
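One way to picture the gap between a single "accept all" click and the fully opt-in model discussed here is a consent object in which everything defaults to off and each use of data has to be enabled explicitly. This is only a sketch of the idea; the field names are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class ConsentChoices:
    # Everything defaults to "no"; each use of the data must be switched on explicitly.
    store_profile_photo: bool = False
    personalised_ads: bool = False
    share_with_partners: bool = False
    analytics: bool = False

    def allows(self, use: str) -> bool:
        return getattr(self, use, False)

# Rather than one "accept all" boolean, the user ticks only what they actually want.
choices = ConsentChoices(store_profile_photo=True, analytics=True)
print(choices.allows("personalised_ads"))  # False: never requested, never granted
```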
And you know, the first inkling [00:09:30] : that I really got of people utilising this language properly or starting to at least [00:09:35] : have conversations in their daily lives that involved consent practises or consent [00:09:39] : frameworks was around the pandemic. Um, and people really vocalising like, Hey, our [00:09:45] : home has these rules. If you do not have, you know these things, please join us at [00:09:51] : the next event. But don't come to my home. You know, if you don't meet these requirements [00:09:55] : or like we, I need to know, like I've been testing on this day. Have you been testing [00:09:59] : on this day? And that was the first time I've really seen the broader public actually [00:10:03] : adopt a lot of that language. And it's, um it was It was really nice to see, and [00:10:09] : I hope it's a habit that sticks. It's good to see something good come out of that [00:10:16] : shared experience. We just went through. Yeah, in in the tech space, it oftentimes [00:10:21] : seems, you know, Stephanie brought in terms of service, Uh, of course, Clink wrapped [00:10:27] : or, uh, shrink wrapped licencing. It often seems like there's a big disconnect between [00:10:33] : how consent conversations go and how the technology is actually instrumented in. In [00:10:41] : what ways is technology, like letting us down or prohibiting us from having these [00:10:47] : consent conversations with how our data and our information and our content is used? That's [00:10:53] : a great question. Wolf, I think, Um, I think in a couple of ways, one challenge in [00:10:59] : making sure that users are fully informed of what they're signing up for when they're [00:11:03] : using your technology is working on the actual U I and UX of that process. So how [00:11:12] : easy is it for the user to read and digest this information are the buttons bright [00:11:17] : and big are the is the text and the type large and big, so that people actually understand [00:11:24] : that this a core part of the process, not some small print, some small font that [00:11:28] : you're just supposed to skip past, um, and really building it into something that [00:11:34] : users must complete before they use the product, I think is gonna be extremely necessary [00:11:38] : as we continue to to build, um, the other portions. And And I'm also curious to know, you [00:11:46] : know, what you all are are both seeing, uh, is kind of the mishandling or misuse [00:11:52] : or, uh, you know, lower level of security That actually happens on the back end server [00:12:01] : side for a lot of information. Um, and then the ability or the, um the, uh, chance [00:12:12] : the opportunity for the government to overreach into into those systems. Um, I'll [00:12:19] : give a great example from a company that I was actually, you know, I. I follow, um, for [00:12:25] : a while, and I have been listening to their CEO speak for a little bit and and their [00:12:29] : members of their trust and safety team. Um, but grinder right grinder is, like a [00:12:33] : really great example of this, and so are all kind of dating companies, uh, modern, you [00:12:38] : know, dating companies, people have given their information. They've attached a a [00:12:42] : profile a face, right? Some identifying information which might include their their [00:12:47] : real name. Their, uh, you know, age, their date of birth. All of this kind of, like, really [00:12:53] : heavy material. 
Um, you know, P, I and those profiles while other users on the internet [00:13:02] : may not be farming that information. Um, if you have a government, that is, uh, against, uh, queer [00:13:11] : rights against the LGBT Q community and wants to reach in by saying, Hey, this person [00:13:17] : has violated something. Uh, a lot of companies are kind of allowing this back door [00:13:23] : for the government to come in and say like, actually, we can subpoena this information [00:13:27] : from said company. Um, there are very few companies that have turned down that back [00:13:34] : door from the government, um, and built full end to end encryption on every single [00:13:38] : piece of their technology. Um, you know, Apple prides itself on on saying no to that [00:13:46] : kind of like backdoor from the government. But there are several other companies [00:13:49] : who have made it clear that they'll comply if asked. Um, and depending on the, you [00:13:56] : know, nation state that you live in that may or may not be very, very dangerous for [00:14:00] : you as a user. Um, I think you know, even now in the US, we're seeing with the overturn [00:14:05] : of Roe V Wade, the amount of health information that could potentially be leaked [00:14:10] : from some health app that might be later used against, uh, women and and people who [00:14:16] : have the ability to get pregnant, um, against them in a case of a miscarriage or [00:14:22] : an abortion Or, um, anything else that could happen in the complication of birth, Um, and [00:14:27] : having that information used against them. So there's a There's a really broad swath [00:14:32] : of education that that needs to happen on the consumer side, but also a lot of shoring [00:14:37] : up of end to end encryption, um, encrypted servers, encrypted technology. And I think, personally, um, zero [00:14:43] : knowledge technology, Uh, which has some, you know, Blockchain underpinnings there, um, in [00:14:49] : order to really fortify a lot of those applications moving forward. So it's funny [00:14:56] : that you mentioned, um, the the back doors in the systems and health apps. We, uh, months [00:15:03] : ago. I think last fall interviewed Albert Fox Khan, who's the founder of the Surveillance [00:15:09] : Technology Oversight Project, which is also based in New York by you. And just today [00:15:14] : they released a report on um, state surveillance of gender affirming care and reproductive [00:15:20] : health care and how that data collection process is being weaponized and used by [00:15:25] : law enforcement. And I think that that is something that people have become unfortunately [00:15:31] : and necessarily more aware of since Dobbs fall last year. But I'm curious. Besides, like [00:15:38] : the social media backdoors besides the the ways in which information is subpoenable [00:15:43] : or discoverable, what are some other ways that people's privacy, um, get compromised [00:15:49] : that they might not think about? Are there things that people are doing themselves [00:15:52] : that are exposing themselves or putting them at risk? A large percentage, And it [00:15:58] : depends on you know which industry you're operating in. But I, I think that there's [00:16:01] : kind of like 33 main like ways that people's privacy really gets compromised in in [00:16:06] : any technology information. One is, you know, user error, mistrust, clicking on a [00:16:13] : phishing email, clicking on a spam link. 
Um, you know, clicking something that has [00:16:18] : made it into your inbox or made it into your text messages. And someone you know [00:16:22] : has falsely included some harmful information in there, um, and and kind of granting [00:16:27] : an exploit that way. Um, I think that that is, uh, still a pretty wide majority of [00:16:33] : a lot of the, um a lot of the things that we see in, uh, cybersecurity in terms of [00:16:40] : exploits. Um, and Wolf, I'm happy to. If you have any stats there, I'm happy to take [00:16:44] : you throwing them out. Um, but I, I think that's I think that's a pretty large bucket. The [00:16:49] : second bucket is company exploits. Server side hacks. Uh, cloud exploits people. Um, you [00:16:55] : know, an actual hacker, a bad guy. Uh, you know, writing code that is designed to [00:17:00] : to find a a hole in the security system and then exploit it. Um, and then the last [00:17:06] : is kind of this like, uh, we're in a murky grey situation. The government thinks [00:17:10] : something is illegal. Wants to kind of, like reach in, um, and and find something [00:17:14] : that they should That technically, the company has said is private between the between [00:17:19] : the user, um, and the company. Um, those are kind of the the main ways that I see [00:17:25] : people's privacy get compromised every single day. And I think often we do ourselves [00:17:31] : a disservice by talking about, uh, cybersecurity as if it's all like a bad guy, like [00:17:38] : some bad hacker from a movie in, like, you know, dark glasses and like a hat. Um, and [00:17:43] : not often, like sometimes other people, right. Other people can present a a really [00:17:47] : big risk to us, especially, um you know, for the user group that I work with, uh, in [00:17:52] : terms of, um, you know, creators people who sell and monetize content, people who [00:17:57] : are in the sex work field, Um, and are, you know, tasked with unfortunately vetting [00:18:02] : their own, you know, clients vetting, uh, you know, people who may or may not pose [00:18:08] : harm to them and trying to determine whether or not this person is sending them real [00:18:13] : information, real links, et cetera, and trying to rely on the company to help. Um [00:18:16] : and that is a It's a really difficult place to be in. You know, from a statistics [00:18:21] : perspective, you'll see numbers. 80 to 90% of security incidents are caused by some [00:18:29] : of the items you just highlighted. And a while back I was reading the design of everyday [00:18:34] : things. I think Stephanie gave me this book by Di Norman. She's like III. I want [00:18:39] : you to talk about something other than technology. So of course I immediately applied [00:18:43] : it to all of them. My technology thanks. But there's a quote in there that said that [00:18:47] : 80 to 90% of industrial accidents are caused by human beings, and, uh, and then he [00:18:56] : ends the page. Why is that? And I flipped the page. I'm ready. I'm like I saw the [00:19:00] : statistic on, you know, this website recently. I've seen this presentation. We know [00:19:04] : this is what happens, and I flipped the page and do Norman goes. It isn't a human [00:19:09] : problem. It's a design problem. Mm. And I'm like, flip back. I'm like, Wait a minute. No, you [00:19:14] : can't tell me it's me. Don't don't don't tell me it's It's my fault. Uh, and that [00:19:19] : was really eye opening. 
And I and I think about how often people make mistakes these [00:19:25] : days. And I asked myself, How is the design of technology? Let them down. And I know [00:19:31] : this is something you are doing a lot with with Privy Labs, right? So can you Can [00:19:36] : you talk us through how technology could better support consent and privacy? What [00:19:41] : we're building at Privy Labs is really about changing kind of the fabric of the Internet [00:19:46] : and definitely consider myself a a Blockchain girly. Um, and I. I know that crypto [00:19:54] : gets a bad rap and you know, Blockchain gets a bad rap, but I beg the beg the audience [00:19:58] : here to to go with me on this journey for just a second, which is that Blockchain [00:20:04] : is one of those technologies Distributed ledger technology is one of those technologies [00:20:10] : we really have at our disposal to provide two really key things a self custody of [00:20:18] : your own information. So you, as the user wolf of some application, get to decide [00:20:25] : the entire way through what? What your information does, Um, and having the ability [00:20:33] : to self custody your own information through some wallet infrastructure, uh, is extremely [00:20:39] : important. It is a major technological breakthrough, and one of the things that I [00:20:44] : think will underpin the Internet for, you know, the years to come And then the second, um, is, uh, zero [00:20:53] : knowledge activity and zero knowledge authentication, um, being able to provide your [00:21:00] : identity in a safe and secure way to a company without the company seeing that information, all [00:21:09] : the company knows is yes, Wolf is over the age of 18, but I don't and I and that [00:21:15] : information has been verified by an agency. But the company doesn't need your government. ID, they [00:21:20] : don't need your licence and they don't need all this other information about right. They [00:21:24] : just need to know that like the check turned out true. And that type of technology, I [00:21:29] : think, is going to be critically important. Um, infrastructure wise to making sure [00:21:35] : that people, especially people in marginalised, um, or underrepresented groups can [00:21:40] : continue to access the Internet in a way that's safe for them. If there is anything [00:21:44] : that you feel is, uh, you know, open to exploit by another person and also by our [00:21:57] : government, you want to support the, um the development and the use of underlying [00:22:03] : Blockchain infrastructure in every single major tech company that we move forward [00:22:08] : with. Um, and we've built privy personally in a really privacy preserving manner [00:22:14] : in order to make sure that we can offer those promises to people who get on our platform. And [00:22:20] : there's a There's a lot that we built to to do that, and I'm happy to go into it. OK, so [00:22:26] : I, I wanna jump in as the resident non technical person, and I think it's been a [00:22:31] : couple episodes since I've gotten to say that when I hear Blockchain, I think of [00:22:36] : people buying digital art. Oh, art OK, the NFT SI think of I think of cartoon monkeys [00:22:45] : being sold for millions that it seems like a right click Makes this way less cool. But [00:22:51] : I don't know, So help me understand as much as you can talk to me as if I were a [00:22:58] : third grader. Yeah. What the hell is Blockchain? Maya? Because everybody talks about [00:23:02] : it. 
It's just like it's just, like burst into the collective consciousness. And everybody [00:23:06] : acts as a We know what it is. And I think maybe five or 10 people do, But most of [00:23:10] : us do not. And I am one of those that uses the word Blockchain have no idea what [00:23:14] : I'm talking about. What is the Blockchain? I love that question, Stephanie, and I'm [00:23:18] : gonna I'm gonna do my best. Let me know if if this if this is not helping, but, um, for [00:23:24] : for you and everyone listening, I'm gonna start off with the umbrella term Web three. I [00:23:30] : do think that Andries and Horowitz, you know, for all of the all the things we cri [00:23:33] : critiqued them on, did a great job rebranding the next version of kind of the internet. Um, if [00:23:39] : you think about the current version of the Internet that we're in now, Um, or I'll [00:23:45] : even go back to the version before this, when the Internet first launched into, like, public [00:23:49] : public consciousness, let's call it generally the nineties, Um, mostly what people [00:23:54] : were doing with the Internet, then was just reading information. You know, people [00:23:57] : weren't like writing information on the Internet. Maybe there were like a few blogs [00:24:02] : back in the day, but they really hadn't like, taken off. Mostly you went to the Internet [00:24:06] : because you wanted to read something someone wrote or like access some website of [00:24:10] : like the store that you wanna go to down the street. You want to see if they were [00:24:14] : open, right? Like a lot of the information that you're doing was just reading, reading, reading, consuming [00:24:18] : information. Web two is kind of, uh, where we are now in the Internet landscape and [00:24:26] : Internet history where, you know, starting from the early two thousands, uh, all [00:24:31] : the way till I, I would say, like around 2018 or so, What you have is kind of a real [00:24:37] : fabric of the Internet, starting to form where we, as you know, people who access [00:24:43] : the Internet actually put stuff on the Internet, like content and media and images [00:24:48] : and videos. And we make streaming services and we make Web apps and we make social [00:24:53] : media sites like Facebook. Um, and we write blogs and we contribute with each other [00:24:58] : and the Internet kind of like exponentially gets more and more complex, uh, broader, larger, deeper. Um, and [00:25:07] : we really focus on kind of like how each of us uses the Internet and also gets to [00:25:11] : experience the Internet in a really unique way. Right, Stephanie, whatever you see [00:25:15] : on your feed every day in Google is probably super different than what I see on my [00:25:19] : feed every day. Um, because it's tailored to you, right? So that's kind of where [00:25:24] : we are right now, Web two. And that's where we've been for like, the last 15 or so [00:25:27] : years. And Web three is really our ability to not only read stuff from the Internet [00:25:35] : and write stuff, publish stuff on the Internet, um, broadly, but also to own our [00:25:42] : own information on the Internet. And so when we say like a user owned Internet, that's [00:25:48] : really where we are with Web three, where you own your own data think about it. Kind [00:25:53] : of like a little tunnel where you're like travelling through the Internet. 
But all [00:25:58] : your information is in your tunnel and you get to take your tunnel like wherever [00:26:02] : you wanna take it, and nobody should get in your tunnel or see what's in your tunnel [00:26:05] : unless you grant them access. And so there's a a version of kind of of the Internet [00:26:10] : that's really starting to to take shape in the Web. Three. Space Now under the Web. Three [00:26:16] : space. I think there are lots of applications of Blockchain technology that enable [00:26:21] : us to engage with a couple of different products. NFTS The digital art is one of [00:26:26] : them, right? Like OK, we can generate this kind of like digital art from some algorithm [00:26:31] : and we can put cool traits in it, and then people can decide its value, and maybe [00:26:36] : they can trade it or use it or licence it or whatever. That's one use of, uh, an [00:26:42] : application that is available because of Blockchain Technology Number two is like [00:26:48] : cryptocurrencies, right? OK, well, if we have this user owned Internet where we all [00:26:53] : can do our own thing and then put out whatever we wanna put out. We could have kind [00:26:58] : of a decentralised currency, right? One that isn't, uh, centralised in, um, some [00:27:05] : central bank, but actually, uh, you know, mind and created and, uh, and driven by [00:27:13] : an algorithm. And if we have this algorithmic coin thing, then maybe we can We can [00:27:18] : do stuff with that. We can trade it, We can hold it, We can invest it, whatever we [00:27:22] : wanna do. Right. So these applications kind of of different things that have now [00:27:27] : become like crypto and NFTS and lots of stuff. All, I think, kind of sit under the [00:27:31] : big umbrella of Web three. I feel like that was longer than it was supposed to be. But [00:27:36] : is that helpful? It is helpful. Although now in my head, the mental image is not [00:27:43] : like a a chain link. Now it's a tunnel. Yeah, like a tunnel. So my metaphor has changed. Yeah. I'm [00:27:51] : wrapping my head around it. It get there by the time Web four or Web five hits. Definitely. My [00:27:57] : I'll have wrapped my head around Web three letter tunnel. Kind of like make it something. I [00:28:01] : mean, I think the chain link the chain link metaphor is an interesting one, but it [00:28:07] : speaks more about like what a smart contract is than like what Web three, I think [00:28:14] : broadly is right. Like a smart contract allows you to, like have two pieces of information [00:28:21] : in a contract. And when some criteria is met, you automatically execute the contract. Nobody [00:28:26] : gets asked. It's not a it's not a um it's not a It's not an operation that takes [00:28:34] : humans to decide on. Should we execute the thing? The thing gets executed because [00:28:40] : the criteria was agreed to, and because of that, we feel like it's more decentralised. And [00:28:45] : when that smart contract criteria is sometimes agreed to, depending on where you're [00:28:50] : applying it, that's where you get the kind of like, OK, add this new chain to the [00:28:55] : block or add this new block to the chain, um, and generate some other blank. But [00:29:00] : I, I really feel like that. That whole metaphor does a disservice to most people [00:29:05] : who are like I don't need to know how smart contracts necessarily work like, I just [00:29:08] : need to know why. Why, what am I getting by using Blockchain technology? 
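As an editorial aside on the smart-contract description above ("when some criteria is met, you automatically execute the contract"): the toy Python below mimics that behaviour locally just to show the shape of it. Real smart contracts run on a blockchain virtual machine such as the EVM, where no single party can block or alter the agreed execution.

```python
class ToyEscrow:
    """Illustrative only: once the agreed condition is met, settlement happens automatically."""
    def __init__(self, buyer: str, seller: str, amount: int, condition):
        self.buyer, self.seller, self.amount = buyer, seller, amount
        self.condition = condition   # the rule both parties agreed to up front
        self.released = False

    def settle(self, evidence: dict) -> str:
        # Nobody is asked for permission at settlement time; the agreed rule decides.
        if not self.released and self.condition(evidence):
            self.released = True
            return f"{self.amount} released to {self.seller}"
        return "condition not met; funds stay in escrow"

contract = ToyEscrow("alice", "bob", 100, condition=lambda e: e.get("delivered") is True)
print(contract.settle({"delivered": False}))  # funds stay put
print(contract.settle({"delivered": True}))   # executes automatically once the criterion holds
```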
And I think [00:29:14] : for most users what they're getting by utilising Blockchain technology or engaging [00:29:21] : with companies that properly use Blockchain technology. Is a little bit more ownership [00:29:26] : a little bit more self custody, a little bit more safety. Because if you have, uh, you [00:29:33] : know a company that never sees your information in the first place, no one can exploit [00:29:38] : it right? No one can exploit. You know, let's say I have a company called Kitchen [00:29:44] : and the kitchen wants to know if you are over the age of 18. Stephanie. The kitchen [00:29:50] : doesn't need your ID and a bunch of other information in order to know that to to [00:29:54] : figure out that you are a real person who's over the age of 18 and should be eligible [00:29:59] : to use this service right? They're gonna use kind of like a complex, uh, rhythm and [00:30:05] : algorithm of a bunch of different functions cryptographic functions to prove that [00:30:10] : that information is true, but they don't need your actual information. And that's [00:30:15] : really, really powerful, because if some government one day says, Hey, kitchen, I [00:30:20] : need you to turn me turn over all Stephanie's information, then they have nothing [00:30:25] : to turn over. There's just nothing there, you know, there's just proofs that say [00:30:29] : yes. Yes, yes, yes, yes, that's it. So it becomes really, really important for any [00:30:34] : application or any use case any person who wants to protect their their individual [00:30:40] : information. And I think that's the the benefit of a a zero knowledge proof right [00:30:46] : to tying it back. But you you said something interesting there. There's there's two [00:30:50] : things that I think technologists always get wrong. And the first is we get so excited [00:30:55] : about the technology we forget why anyone cares. So you're you're spot on. Like if [00:31:01] : if I was to talk to someone, I'd be like, Hey, do you want a more private, uh, more [00:31:06] : you centric? And that people say, Sure, I was to say, Let me tell you about smart [00:31:10] : contracts. They will give that look. That's definitely often gets when I try to explain [00:31:14] : to our toaster technology. Uh, but the other thing that we oftentimes get wrong is [00:31:21] : one use case of technology will happen that we don't like, and suddenly we don't [00:31:26] : like the technology at all, right? So clearly a lot of exploration in some of these [00:31:32] : Web three technologies have taken us down paths that we're not happy with, Um, But [00:31:37] : that doesn't necessarily mean we quit. And we like, as if we said, Well, MySpace, I [00:31:42] : didn't like that. So let's let's forget all about lead to do all I think there's [00:31:46] : a lot of good tech to build on. Um, So I'm glad you you surfaced that so if someone [00:31:53] : wants to, um, to to partner with you or someone's building an application or building [00:32:00] : a set of services, I mean, how do they get involved with what you got going on at [00:32:04] : her lapse? Privy Labs is an open source community open source project. We're really [00:32:11] : proud of a lot of the technology we've built thus far. We actually released our very [00:32:16] : first consumer facing APP built on top of the Privy Labs Protocol, which is a decentralised, um, and [00:32:23] : sophisticated anti piracy protocol that helps creators protect their content from [00:32:27] : leaks, abuse, misuse. 
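The "Kitchen" age-check example from earlier in the conversation can be pinned down with a small sketch. This is not an actual zero-knowledge proof; it stands in for one with a signed attestation, so the service only ever receives a yes/no claim it can verify, never the underlying ID. The names and key are invented for illustration.

```python
import hmac
import hashlib

VERIFIER_KEY = b"verifier-signing-secret"  # stand-in for the attestation agency's key

def issue_attestation(pseudonym: str, over_18: bool) -> tuple[str, bytes]:
    """The verifier checks the real ID privately, then signs only the yes/no claim."""
    claim = f"{pseudonym}:over_18={over_18}"
    tag = hmac.new(VERIFIER_KEY, claim.encode(), hashlib.sha256).digest()
    return claim, tag

def kitchen_accepts(claim: str, tag: bytes) -> bool:
    """'Kitchen' verifies the claim; it never sees a government ID or a date of birth."""
    expected = hmac.new(VERIFIER_KEY, claim.encode(), hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected) and claim.endswith("over_18=True")

claim, tag = issue_attestation("pseudonymous-user-42", over_18=True)
print(kitchen_accepts(claim, tag))  # True: the service learns only that the check passed
```

In a real deployment the attestation would be an asymmetric signature or a genuine zero-knowledge proof, so the service could not mint claims itself; the point of the sketch is the data minimisation Maya describes: a true/false answer travels, the ID document never does.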
And and we have a few systems that that we've we've built top [00:32:33] : of, uh, on top of our protocol. Uh, we are gonna be open sourcing and continuing [00:32:38] : to put out projects. Uh, that fit the criteria of a user owned Internet where creators [00:32:45] : are really empowered to own their own content. Um, and if you're interested in that, um, visit [00:32:51] : us at privy labs dot XYZ That's privy Labs PR IVYL a BS dot XYZ um dot uh, yeah dot [00:33:00] : XYZ and, uh, come visit us there. And, um, there's lots of information on how to [00:33:06] : get involved with us with our team and and with our project. And one of the things [00:33:10] : that we're really proud of, uh, is our commitment to the developer community and [00:33:16] : to making as much as possible available, um, to our fellow developers so that they [00:33:22] : can, you know, participate with us in making this next version of the and we'll throw [00:33:27] : that. We'll throw that link in there. I'm so glad it's open source because, yeah, I've [00:33:31] : when it comes to community and when it comes to trying to keep things safe, there's [00:33:37] : so much benefits that can be gleaned by sharing what we know, um, by by providing [00:33:43] : sunlight into our code, Um, which has me ask you one quick question. Stephanie, if [00:33:49] : I can just sneak one question in one more tech question, what do you do to make sure [00:33:54] : that privy itself is secure? That's a great question. So it really started with our [00:33:59] : architecture, Um, and the way that we've specifically, um built, privy, uh, and then [00:34:07] : built consumer apps, Uh, on top of it. So we have three kind of core systems. We [00:34:13] : have an account abstraction system which allows users to access the privy our consumer [00:34:21] : app, um, app without sharing a personal information. So all it requires is a user [00:34:30] : name and password. We don't need your email, just a user name and password. You then, you [00:34:36] : know, keep that secure information. We do not store that information on behalf of [00:34:41] : you. So there's a little bit of a, um of an implication there as well, Which means [00:34:46] : like, we can't reset it if you happen to lose it. So, um, write it down and we've [00:34:52] : built into, you know, our our on boarding process. Like write down this information, don't [00:34:56] : lose it. But, um, there's no long seed phrase or like, phrase for people to memorise [00:35:02] : when they use privy cam. Um, they just use their their user name and their password, um, and [00:35:09] : and log in, uh, second, we have a content certificate and tracking system, and we [00:35:15] : also explained quite a bit about each of our systems in our white paper. So if anyone [00:35:18] : is interested in reading that information and learning a little bit more about how [00:35:22] : our systems work more more than welcome, Um, but basically, we developed the second [00:35:26] : system to, uh, secure media content like an image and encrypt it, but also add an [00:35:35] : irremovable watermark onto that particular photo so that it can be traced across [00:35:40] : the Internet, Um, solely by the information that we've embedded inside of it without [00:35:45] : revealing what the actual image itself is. Um, it's really powerful, especially if [00:35:50] : you're creating image that might be explicit or sensitive. And you don't want to [00:35:55] : reveal exactly what's in that photo. 
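For readers who want the shape of the content-tracing idea described here, the toy registry below records only a cryptographic fingerprint of an image plus a pseudonymous creator entry, so copies can be matched later without the registry ever storing the image. This is an illustrative sketch, not Privy Labs' actual protocol; their white paper describes the real system, which also involves encryption and irremovable watermarking.

```python
import hashlib
import time

class ContentRegistry:
    """Toy registry: holds only fingerprints and pseudonymous creator records, never the media."""
    def __init__(self):
        self.records = {}

    def register(self, image_bytes: bytes, creator_pseudonym: str) -> str:
        fingerprint = hashlib.sha256(image_bytes).hexdigest()
        self.records[fingerprint] = {"creator": creator_pseudonym, "registered_at": time.time()}
        return fingerprint

    def trace(self, image_bytes: bytes):
        """Match a copy someone holds against the registry by fingerprint alone."""
        return self.records.get(hashlib.sha256(image_bytes).hexdigest())

registry = ContentRegistry()
photo = b"...raw image bytes..."
registry.register(photo, creator_pseudonym="creator-7f3a")
print(registry.trace(photo))  # identifies the registered creator without the registry storing the photo
```

A plain hash only matches byte-identical copies; surviving re-encoding, resizing, or cropping is exactly what the robust watermarking described in the conversation is for.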
You'll have full, um, you know, comfort and [00:36:00] : knowledge. Knowing that privy, uh, not only encrypts but we transform. And do you [00:36:06] : know all of these different methods on that image to make sure, um, that the contents [00:36:10] : of the image are not revealed. The contents of the image are also not stored on chain. So, um, there's [00:36:18] : no real way to, like, decrypt or see the content of that image unless you're the [00:36:22] : owner of that image. Um, and then we also issue an ownership certificate to each [00:36:29] : of our creators. So if you're a creator, let's say on Instagram and you have a photo [00:36:35] : that you want to prove is 100% yours. You want a store that, like, this is my photo? Um, and [00:36:43] : maybe not. Put your name on it and not have that revealed. You'll be able to do both [00:36:47] : of them. Those things, um, in both the decentralised and encrypted manner, um, to [00:36:52] : make that happen. So, um, kind of knocking out two of the two of the legs there, Um [00:36:57] : and then our last system is still under construction, But essentially, it is a verification [00:37:01] : network. Um, where our, You know, our goal and what we see for the future is kind [00:37:08] : of that every creator starts contributing content to this verification network. So [00:37:14] : now you have, you know, photos verified by real people in real time, saying, like, these [00:37:19] : are these are real originals of of pieces of content floating out here on the Internet, maybe [00:37:25] : for images, but also for videos, text, audio, et cetera. Then platforms can plug [00:37:31] : in, and you can make sure that when you want to share your, uh, you know, photo forward [00:37:36] : to Instagram that you've only shared it to Instagram instagram talks back to our, uh, verification [00:37:41] : network and you'll be able to share directly in that network. Right? And that piece [00:37:46] : of content will only be permission for the place that you've permission it versus [00:37:50] : every piece of content being permission everywhere just by, you know, nature of putting [00:37:55] : it on on some server somewhere. So that's where we're going in the future. And, uh, if [00:38:01] : anyone is interested in joining us on that boat, please come, uh, to privy privy [00:38:06] : labs dot XYZ. And I hope everybody does, because I'm all about, um safe, safe, safe [00:38:13] : lock everything down, informed consent practises in all life domains. But question [00:38:19] : as as we move towards the end of an amazing conversation, um, until we do have everyone [00:38:26] : using privy or something, like privy. What is one thing that our audience can do [00:38:32] : to kind of have a safer, more secure sex in in our digital age? I love this question. Um, we [00:38:40] : have a reddit community, actually, where people ask, like questions about, um, sex [00:38:43] : and sex work and and privacy. Um, and this question, we get all the time. And our [00:38:49] : number one answer is always like, get a VPN. Um, I know that a VPN Yeah, like I know [00:38:55] : that I know that, like VPN S, they're actually having a moment right now. I feel [00:38:59] : like they're sponsoring tonnes of podcasts. And, um, you know, people are getting [00:39:02] : a little bit more curious as to like, what a VPN a virtual private network is what [00:39:07] : it does for you and why it's important. 
Um, but I would say if you're doing anything [00:39:12] : on the Internet that you might not want other people to know about, um or be able [00:39:16] : to prove, you know, was you later, uh, in the future, get a VPN. Um, it will help [00:39:23] : protect your your traffic. Um, and the, you know, things that you're looking at, uh, from [00:39:29] : being connected directly to your IP address in your computer. So, um, get get one [00:39:34] : of those. I love it. Maya. Every time we talk, I feel like we don't have enough time [00:39:40] : together. I just want to get ready about all the things I could have chatted for [00:39:45] : hours. Thanks for coming on today. I really appreciate it. Thank you. Both for having [00:39:49] : me. And the next thing is anybody that has more questions. If it wants to chat with [00:39:54] : Maya, can hang out with her this fall at the securing sexuality conference. So if [00:39:59] : you want more Wolfgang, um, you know where to find her. I can't wait. I'm so excited. It's [00:40:04] : gonna be good. Although I will have to let you go. I mean, not only at this, uh, podcast, but [00:40:10] : also at the event. Because I'm sure a lot of people will wanna learn more about privy. So [00:40:15] : thanks again. And thank you so much, dear listener, for tuning into securing sexuality, you're [00:40:21] : a source of information you need to protect yourself and your relationships. Securing [00:40:25] : sexuality is brought to you by the Bound Together Foundation, a 501 C three nonprofit [00:40:29] : from the bedroom to the cloud. We're here to help you navigate safe sex in a digital [00:40:33] : age. Be sure to check out our website, securing sexuality dot com for more links [00:40:38] : to the information we talked about here today. It's in the show notes as well as [00:40:43] : our live conference in Detroit. And, uh, if you're coming to Detroit, use the code [00:40:48] : pod 15 to save 15% off your tickets and join us again for more fascinating conversations Comments are closed.