The World Is Watching: An Interview with Albert Fox Cahn - Securing Sexuality Podcast Episode 20
Securing Sexuality is the podcast and conference promoting sex positive, science based, and secure interpersonal relationships. We give people tips for safer sex in a digital age. We help sextech innovators and toy designers produce safer products. And we educate mental health and medical professionals on these topics so they can better advise their clients. Securing Sexuality provides sex therapists with continuing education (CEUs) for AASECT, SSTAR, and SASH around cyber sexuality, social media, and more.
Links from this week’s episode:
Wrongful Convictions, Predictive Policing Algorithms, Police Body Cameras, and Facial Recognition
In today's digital age, protecting our own privacy and the privacy of those around us matters more than ever. Unfortunately, invasions of privacy are becoming increasingly common in our local communities: from government surveillance to data breaches, there are many ways our personal information can be compromised. Fortunately, there are steps we can take to protect ourselves and others.
In this article, we discuss how to take action against invasions of privacy in your local community to protect your own freedoms and those of others. The first step is to stay informed about the laws governing data protection in your area.
It is important to understand which types of data collection are allowed by law and which are not, so you can recognize when your rights are being violated. It is also important to know what rights you have to access or delete any personal information that has been collected about you without your consent.
Knowing the laws surrounding data protection ensures you have the tools to take action if needed. The second step is to be aware of the surveillance technologies used in your local community, such as surveillance cameras or facial recognition software.
It is important to understand how these technologies work and how they may be used against individuals without their knowledge or consent, so that appropriate countermeasures can be taken if needed.
Additionally, people who spend time in heavily surveilled spaces, such as airports or public transportation hubs, should familiarize themselves with their rights when monitored by these technologies, as well as the potential consequences of refusing to comply.
The third step is to take proactive measures against these threats: use encryption software on all devices connected to the internet, and opt out of data collection activities such as targeted advertising campaigns run by companies online.
Additionally, consider joining organizations dedicated to protecting civil liberties, such as the Electronic Frontier Foundation (EFF), which advocates for digital freedom through legal action and public education initiatives across the United States and beyond.
Finally, speak out against any violations you witness in your local community, whether through social media posts highlighting injustices or by attending protests organized around surveillance and privacy issues.
By doing so, citizens help create a culture in which everyone feels safe and secure, knowing that their basic human rights are respected and protected no matter where they live.
In short: understand the data protection laws in your area, take proactive measures like using encryption and opting out of targeted data collection, and speak up when you see violations. Each of these steps helps build a community where privacy is the norm rather than the exception.
Hello and welcome to Securing Sexuality, the podcast where we discuss the intersection of intimacy and information security.
I'm Wolf Goerlich.
He's a hacker and I'm Stefani Goerlich.
She's a sex therapist and together we're going to discuss what safe sex looks like in a digital age. And today we're joined by Albert Fox Cahn, the founder and executive director of the Surveillance Technology Oversight Project (STOP).
As a lawyer, a technologist, and an activist, Albert has become a leading voice on how to govern and build the technologies of the future. If you haven't seen his TED Talk yet, check it out; it has now been viewed hundreds of thousands of times, and it focuses on exactly what we're talking about today: privacy and avoiding surveillance.
Thank you so much, Albert, for joining us. Welcome. Thank you so much for having me.
So what inspired you to start STOP?
I'm a giant nerd. So basically growing up, my biggest hobbies were protesting the NYPD and building computers in the basement. And so I was one of these people who back a long time ago started to get really concerned about the way that surveillance was being used by police departments. At the time it was camcorders that were tracking me and my friends.
It was undercover officers, but as I continued through my legal career and briefly dipped into corporate law and then briefly escaped it, I saw that this issue kept growing. And so that's sort of why I created STOP as a way to push back against the surveillance technology. Not the NSA stuff, not the stuff that the federal government's doing, but the stuff in our own backyard, the everyday parts of surveillance. Yeah.
And that's one of the things that I think we've overlooked in technology. I know when I was first building networks and applications in the nineties when we were talking about the hacker manifesto and information must be free, there was a lot of thought of the individual owning that data, the individual having access to information. Probably not enough thought given to what about everyone else and everything else that's being tracked and collected.
Now you say in your website that your vision is to turn New York into a model of surveillance oversight for the rest of the country.
What does that look like?
So the US when it comes to police governance is technically speaking a dumpster fire. We basically don't have national rules about how we police. We have 18,000 police agencies across the country and the NYPD, because it's the largest and because it's the most prominent, has taken on this outsized influence as setting the bar for how we police in America in a lot of ways.
And so our goal is to take on the NYPD, start to push back against the way they use surveillance technologies, outlaw things like facial recognition, outlaw drones, outlaw predictive policing and fake social media accounts, and start outlawing the most abusive tactics wholesale, not just to transform New York, but to then put pressure on police departments around the country.
You know, I want to just click one level deeper on that.
So a couple of things you said there were predictive policing and can you tell us like just a couple of what these tactics are that we're concerned about?
Yeah, so one of the most common is facial recognition. This is used tens of thousands of times a year in low-level cases. And it's not just that it's a creepy technology, it is, but it's also error prone. We've seen people falsely arrested because of it. It's pseudoscientific.
We see police officers photoshopping images before they run a facial recognition search, literally, you know, photoshopping the eyes open if they're closed, photoshopping the mouth closed if it's open, even taking parts of people's faces from models online, pasting them in, and then putting this art collage through facial recognition software and claiming the answer is scientific. And this is how the technology is being used.
And then with predictive policing, you know, when you first hear about it, you think of Minority Report, that thriller movie where you're losing your free will because suddenly the software can predict everything. But the technology is really dumb. Like, we've seen police departments using earthquake prediction algorithms to predict where crime will happen.
Yes, this is not an exaggeration. LA did this. A number of police departments have done this. They used a predictive policing algorithm where they trained a model on earthquakes and then said, well, crime, earthquakes, it's similar enough, then plugged in historical crime data and just got garbage in, garbage out. And that's part of what's so infuriating about the surveillance nightmare that we're in right now.
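As a toy illustration of the garbage-in, garbage-out feedback loop described above (district names and numbers here are entirely invented, not drawn from any real deployment), consider a "hotspot predictor" trained only on past recorded incidents:

```python
# Hypothetical sketch of a predictive-policing feedback loop.
# Assumption: the "model" predicts hotspots purely from past recorded
# incidents, and extra patrols in a predicted hotspot record more
# incidents there, regardless of the true underlying crime rate.
from collections import Counter

true_crime_rate = {"A": 10, "B": 10, "C": 10}        # identical everywhere
recorded = Counter({"A": 30, "B": 10, "C": 10})      # district A historically over-patrolled

for year in range(5):
    # "Prediction": the district with the most past records is the hotspot.
    hotspot = recorded.most_common(1)[0][0]
    # Patrols sent to the hotspot record extra incidents there,
    # even though the real rate is the same in every district.
    for district, rate in true_crime_rate.items():
        recorded[district] += rate + (10 if district == hotspot else 0)

print(recorded.most_common())  # district A stays "hottest" forever
```

The historical bias in the training data is amplified year after year, which is the sense in which the output is garbage regardless of how sophisticated the model looks.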
Sure, it's eroding the Constitution.
Sure, it's biased as all hell.
Sure, it's doing all these things that we find really disturbing. But the technology just doesn't work; the vast majority of it is junk and pseudoscience. But you don't have people within government who are able to actually push back against the narrative that somehow this is the technological silver bullet to all that we're afraid of. I am not a technologist.
That's my role, actually, in this podcast: to be the not-a-technologist. I am a true crime buff, and listening to you just now talking about photoshopping things, I can't imagine a scenario where, like, a crime scene investigator would walk into a crime scene and say, oh, well, this blood splatter is incomplete, so clearly I should add some more here.
It's baffling to me to hear that they would take any other potential evidence and just kind of patchwork it together. I am literally shocked right now.
I mean, here's the problem. We've had decades of Law & Order and all these police dramas. We had Dragnet setting the mold for this back in the 50s, where police are using these, you know, serious scientific methods. And when you actually have the scientists look at how this is working out in practice, it's very different.
Take, for example, how there was this big report back in 2008, I believe, from the National Academy of Sciences. They looked at all the forensic sciences, every forensic science used in the United States to investigate crime. They looked at arson investigation and bite mark analysis. They looked at fiber analysis. And they found that the only one with an actual peer-reviewed scientific basis for the way it was used in court was DNA evidence.
That was the only forensic science in America, because you had, for example, with fingerprints, people just assuming that fingerprints are unique. But then you saw a man in Portland, I believe, being connected to the Madrid train bombings because an identical fingerprint was found. And he was facing prosecution until they realized: oh, no, the guy who built this bomb just happens to have a fingerprint we can't tell apart from his.
With arson investigation for years, arson investigators were completely making up how fires operated. People have gone to death row because of this. And with the newer technologies, they're taking that same approach of basically making the science up as they go along and selling it to police departments.
But because they have even more artificial intelligence and more big data, the effect can be even more pronounced, because now, instead of just monitoring one person or impacting one case, it can impact thousands. And I'm familiar with the CSI effect, right, this idea that juries now expect a high degree of science-y sounding stuff and that that's impacting conviction rates and things like that.
But it doesn't sound like there is science-y sounding stuff. It sounds like a lot of this is incredibly performative.
Well, that's the thing. It is science-y sounding. It's just not science.
And that's the problem: in America, our legal standard for admissibility of science in court is batshit crazy, because we have this thing called the Daubert standard, which asks: is this type of evidence generally reliable?
Is it generally acceptable?
So basically, once one set of courts accept it, a bunch of other courts will accept it. And they'll look to the forensic scientists, who are not actual scientists in many cases, but just experts who are working in this field on behalf of police departments.
And so you have this feedback loop where forensic scientists keep vouching for each other and create this impression that this information is accurate, when in reality a lot of it just has no peer-reviewed, independently verified evidence that it works.
So even where we have evidence-based science that's not just science-y sounding, like DNA, we still come back to trusting the powers that be in the collection, the maintenance, and the examination of that DNA evidence.
And that, if we extrapolate from Photoshopping pictures, sounds like we can't necessarily overly rely on that either. Yeah. And everyone who's supposed to be a check on poor decisions here is failing. So the police departments are bad at figuring out when this technology works. There's a revolving door between retired cops who go to work for these surveillance vendors, for these other vendors saying, hey, this thing is amazing.
Believe me, you have political pressure to adopt it from elected officials. Then the court system fails because, well, judges aren't very good at understanding how the science works. State-level trial judges have no idea how to evaluate STR amplification of DNA or the accuracy of GPS data from a geofence warrant.
There are all these new areas where they're being essentially taken for a ride by the supposed experts. And so what really just keeps happening is we see all of this money being invested in this technology, and it's a complete boondoggle. One example is ShotSpotter, where cities are installing arrays of directional microphones to supposedly identify where gunshots happen. The problem is, gunshots sound a whole lot like fireworks.
They sound a whole lot like cars backfiring. In some reports, over 90% of the "gunshots," in air quotes, that ShotSpotter records prove to be wrong. And yet cities are spending millions and millions of dollars on this technology because they want to be able to say, hey, we're taking gun crime seriously.
We even invested in cutting edge technology to identify gunshots around the city, leaving out the fact that, well, in reality, the technology keeps getting disproved. I am not a fan of ML-based recognition at all, and the false positives are very relevant. We're recording a day after I flew home, and I made a mistake.
There's a certain pair of pants that if I wear, reliably trips off the sensors at TSA and I get a very friendly pat down. And so I was flying back and Stefani goes through, no problem.
I get stopped and she's like, what happened?
I'm like, I messed up. I wore the wrong pants.
She's like, what?
And that's just one example. Between that, the gunshots, and facial recognition and fingerprints, where there are only so many zones and so only so many permutations available, there are a lot of different ways these things can give a false positive. Oh yeah. And one of the ones that's been really pissing me off the most, and I swear to your audience, I'm not someone going around in a tinfoil hat.
I'm not someone who, like, you know, keeps my phone in the bottom of a safe. I am someone who wants to have technology when it works. But you see systems like Evolv. This is what they call a gun detection technology. It's really just a metal detector.
It costs more than 10 times as much per year to lease the thing as it would cost to buy an old fashioned metal detector because it's supposedly using artificial intelligence.
And yet the thing is just looking for cylinders because gun barrels are cylinders, but guess what?
Umbrellas are cylinders. The hinges in Chromebooks are cylinders. And so this thing can't tell the difference between a handgun and a Chromebook. And it's being, you know, again, people are wasting thousands and thousands of dollars a year on it. And it's just because of the artificial intelligence myth that somehow AI can magically tell what things are.
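The false-positive problem in both ShotSpotter-style and Evolv-style systems comes down to base rates. Here is a back-of-the-envelope sketch; every number is illustrative, not taken from any report or vendor claim:

```python
# Illustrative base-rate arithmetic with made-up numbers: even a fairly
# accurate detector produces mostly false alerts when the thing it is
# looking for (a gunshot, a gun) is rare among the events it scans.
sensitivity = 0.95        # P(alert | real gunshot)
false_alarm_rate = 0.05   # P(alert | fireworks, backfire, Chromebook...)
prevalence = 0.01         # 1 in 100 scanned events is the real thing

# Total probability of an alert, then Bayes' rule for precision.
p_alert = sensitivity * prevalence + false_alarm_rate * (1 - prevalence)
precision = (sensitivity * prevalence) / p_alert

print(f"Share of alerts that are real: {precision:.1%}")  # ~16%
```

Under these assumptions, roughly five out of six alerts are false, even though the classifier is "95% accurate" on each individual sound or object, which is one way a system can be marketed as accurate while most of its alerts are wrong.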
On your website, you mentioned several different communities that you tend to focus on in terms of advocacy and protection because we live in a world where not all communities are equally impacted by biased technology and biased people. And one that you mentioned that I actually didn't, I don't know why I was surprised to see it on the list, but I was, was the LGBTQ community.
Talk to us a little bit about how these surveillance technologies impact erotic minorities.
Look, bias isn't just an add-on to policing in America, and particularly within the NYPD. It has been baked in from the inception. Like, we have never had unbiased policing in the United States.
And, you know, with, you know, queer communities in particular, we see a lot of ways that surveillance technology is aimed at them.
Well, first off, here in New York we've had a long history of trans sex workers in particular being targeted by the police, far disproportionately to any other sex worker population, not just on the streets but online, through active surveillance of dating apps and of websites where people can more safely connect with clients.
And so there has been that discriminatory approach to how, you know, so-called vice laws, anti-sex laws, have been targeted at sex workers here in New York. But we also have more recent developments, like the ways that surveillance is used to target trans youth in states where access to gender affirming care is increasingly criminalized.
You know, there's been a lot of concern, rightfully so, about the ways that in a post-Roe America the surveillance data from all of our devices and platforms will be used to target abortion care, and to target, you know, the ability to travel out of state for abortion or to get medication shipped to you.
But those exact same tools are going to be used by child welfare officials and police to target trans youth who need, you know, life-saving medical care and either have to travel out of state for it or, you know, are trying to receive telemedicine within their home community.
You know, surveillance is one of the dominant ways that we police in America.
And so, of course, the use of surveillance will be biased, just like every other aspect of policing. Yeah. Actually, part of what inspired not this podcast itself, but the timing of when we started, was Dobbs. We had been planning and thinking and getting things lined up. And then the Dobbs ruling came down, and Wolf and I looked at each other and went, well, we're starting right now.
So thank you for tying those threads. I do a lot of trans affirming therapy here. I work with a lot of gender nonconforming clients. And one of the things that I did recently was actually getting, not licensed, but Florida has a process to become an approved telehealth provider.
So it's not a full licensure, but it'll let you practice from out of state, specifically because my state will let me do things that Florida is trying hard not to. And I wanted to be as accessible as possible for families needing gender affirming care in states that are trying to restrict it. All right. I realize I'm not a bundle of sunshine and rainbows when we're talking about all this.
And I do want to get to some of the reasons I'm hopeful. But one of the things I'm fearful of with telemedicine is that we'll see states try to enforce their laws out of state on people providing care to people in places like Florida.
And we're going to face a constitutional crisis about whether the state of Texas can prosecute someone who is operating in New York or Connecticut to provide abortion care to a resident of that state. We're going to see a constitutional crisis about the limits of what the courts call extra territoriality.
And so I think that same dilemma, it's probably not as quick to play out, but it will play out to some degree with every aspect of medical care, including gender affirming care for trans individuals. We're already seeing that with some states trying to make it illegal to travel out of state to receive care that is no longer legal in your home state. Yeah.
I personally think it is blatantly unconstitutional as a violation of the right to travel. The Constitution gives all of us the right to travel from state to state. That's been the law for centuries, literally.
But this is the sort of thing where, as we see the erosion of the courts, of all the checks on abuse of government, as we see all of these really chilling things unfolding in our democracy, you don't know which lines will hold. I didn't think that Dobbs was going to happen. I didn't think we would be in a post-Roe America, at least not in 2022.
And for all of you who are thoroughly depressed by listening to all of this, may I suggest our recommended antidote at STOP, which is a montage of cat and puppy images. It is what empowers activism around the world: cat and laughing-baby videos.
Oh, yeah, yeah, yeah. There was one case, and this is veering a little bit, but I was really fascinated to see it on your list of cases. I feel silly saying that it wasn't something I expected to see STOP doing: the advocacy on behalf of religious women so that they don't have to remove their hijabs or head scarves when they're being arrested.
And I know it goes beyond women, but as somebody who used to cover her hair religiously and as somebody who works with religious couples, and I often have to remind my clients that sex positivity and being affirming isn't just about what do you choose to do, it's often about what do you choose not to do.
And intentional modesty needs to be affirmed by sex positive people in the same way that sex work or stripping or non-monogamy is. I was really curious to see STOP taking on an issue like that, and I was wondering if you could tell us more about your involvement there. Yeah. And I completely agree with you. I think my values are the organization's values.
We want to protect autonomy, whether that means using your autonomy to engage in sex work or to strip or whether it means your autonomy to be religiously observant. It's really having that choice as an individual to live your life the way you want to. Part of the reason we took on that case is because...
What happened in New York is the NYPD was arresting individuals, and if they wore a religious head covering, such as a yarmulke, a turban, or a hijab, they would have to take it off for their mugshot. And for those of you who don't know people who cover out of religious observance, this is not a minor thing. For my clients, it's like being asked to strip naked in front of another person.
It is that fundamental a violation. But then what happened next was even creepier because they weren't just using this mugshot as part of the court file. It appears they were using it to build up an even broader facial recognition database.
So you have people being assaulted after they're arrested, having their clothes removed against their will, being photographed, and then having that photograph used as a way to track them and police them in the future. Having their own body used as a tracking device against their will.
And to me, that was just repugnant. And thankfully, along with our co-counsel at Emery Celli, we brought the lawsuit and we were able to stop this practice. That is no longer the practice in New York if you're arrested. We're still litigating this to get justice for the people who were subjected to this policy in the past.
But I think that's been, it should never have been a fight we had to fight, but it's one that I'm very, very proud that we won. I want to get to some of the things that were making you hopeful. But before we do that, I have one additional question on the technology side.
When we look at some of the work you're doing, tell me about the sentiment meter and how it's being used in your area. Yeah.
So this was one of these bizarre technologies where you hear about it and you're like, huh, how did anyone actually ever approve this?
So the NYPD hired a private firm to conduct sentiment analysis block by block of what New Yorkers thought of the NYPD. And when we then asked for information about how this product operated, which is required under government transparency laws, they fought us. They're still fighting us in court to this day.
And so you have the paradox that the police are using this technology to track what we think of them, but they refuse to show us any of the data as a result, which flies in the face of how a democracy should work.
But on top of that, there are so many dangers with this sort of technology because you could easily see people getting preferential police response or facing discrimination depending on the sentiment score. You could see a lot of potential for abuse. And it's just not something that I think should be part of how we police in America.
With this sentiment analysis, one of the questions I had for you around that was back to this idea of modesty, back to this idea of privacy, was this public data they were scraping?
I've heard of folks or organizations doing sentiment analysis off of public Twitter and other channels.
Or was this like going door to door doing a survey, or an email survey?
Well, that's part of what we wanted to figure out because it wasn't clear exactly how they were getting this data.
Was it some sort of natural language processing model where you're analyzing geo-located social media posts that were public?
Was it purely survey data?
So how was that survey being submitted?
There were a lot of questions. And part of what's so frustrating when you're fighting surveillance is that it's such a hidden harm. And all too often, we're fighting just to understand how these technologies are hurting people as a first step to then fighting to end them.
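For what it's worth, the simplest version of the first possibility Albert raises, scoring geo-located public posts with a word lexicon, might look something like the sketch below. The block names, posts, and lexicon are all invented for illustration; the NYPD vendor's actual method remains undisclosed, which is the whole problem.

```python
# Toy block-by-block sentiment scoring: count positive minus negative
# lexicon words in each (hypothetical) geo-located post, then sum per block.
POSITIVE = {"helpful", "safe", "thanks"}
NEGATIVE = {"harass", "unsafe", "afraid"}

posts = [
    ("block_12", "officers were helpful and kept us safe"),
    ("block_12", "thanks for the quick response"),
    ("block_47", "i feel unsafe and afraid walking here"),
]

scores = {}
for block, text in posts:
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    scores[block] = scores.get(block, 0) + score

print(scores)  # {'block_12': 3, 'block_47': -2}
```

Even this crude version makes the danger concrete: a per-block score like this could quietly shape police response without residents ever knowing they were being rated.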
Yeah, a hidden harm is a great way to frame it. I think that's so true. And oftentimes when you do risk management from a cybersecurity perspective, the visibility, the transparency drives how much we put emphasis on whether or not we're going to protect something or not. Because if it's hidden, you don't care about it.
The other thing that concerns me, and I wanted to ask you if you're seeing this, is I oftentimes see surveillance. I look at what's going on overseas right now with surveillance, and it starts off in one of two areas. You're like, think of the children. We're doing this to protect these vulnerable groups. Or it starts off as, well, this other group is really dangerous.
So it's okay if we apply extra enforcement to them because they're transgressive or they're dangerous or any number of reasons. And then very quickly, once that toehold has been established, it's like, well, we've been doing that forever. And so surely it's okay if we expand it.
Is that similar pattern playing out?
And how can we, back to being hopefuls, how can we short circuit those patterns?
Yeah. So a lot of the surveillance vendors will come up with a technology first and then figure out how to scare us most into feeling we need to use it. So one of the clearest examples of this are the companies that pivoted during the pandemic to turn military technology and policing technology into contact tracing technology.
So for example, there was one firm that was selling Bluetooth enabled badges that were supposed to help in the event that someone was attacking a high school or middle school. It wasn't ever clear how this would actually be useful in the middle of a crisis, but that was the product.
But then suddenly they rebranded the product as soon as COVID came as a way to use that same Bluetooth beacon badge as a way to track how far away people were and to do contact tracing. Even though the technology was actually pretty bad at getting the sort of data you need to actually do meaningful contact tracing.
And so we saw so many technologies justified in the name of fighting crime in the 90s and combating terrorism in the 2000s; and now preventing child sexual abuse material and other child endangerment has become one of the major justifications. And we keep seeing people urging us to make this one exception, or to enable this one type of change, just to fight this one harm, but it never stays there.
We routinely see people being targeted with tools that were purchased after 9-11 here in New York as a way to combat terrorism. They're being used now to prosecute graffiti and shoplifting. We have a homelessness command center where unhoused New Yorkers are tracked using surveillance equipment that again was purchased after 9-11 in the name of combating terrorism, just tracked for being unhoused.
And certainly with a lot of online technologies, with messaging, with a lot of other platforms, we see attacks on the right to remain encrypted all predicated on child protection.
But here's the thing, if you break encryption and privacy to allow wholesale monitoring for child sexual abuse material or any other type of material, you're giving the government a tool to track religious beliefs, to track political beliefs, to track every other aspect of life, to track abortion, to track trans health care.
And so I think that is something that the proponents of these aggressive measures, such as breaking iMessage encryption or putting a back door in iPhones to scan all photos, have never been willing to admit.
Yeah, I agree. And that's always one of my big concerns.
Oh, just break the encryption right here. It's only for this one thing.
Well, who is ensuring that we stand by that?
Who's ensuring that it goes away when it's no longer needed?
And we have such a clear track record of not respecting those limits. We have such a clear track record of doing the exact opposite. We have such a clear track record for sure. And you said something earlier about you never know what line is going to stand. And one of the things that we were watching after Roe v.
Wade was undermined was suddenly data that was used for advertising was immediately, immediately being used to track people who went to abortion clinics. Certainly no one ever opted into that, but the data was there. And suddenly that's the use case. Yeah. And it costs $190 to buy the information on everyone who visits the Planned Parenthood website.
You know, these threats have always been there. People have always been hurt by the fact that our privacy protections are so weak. It's just that in moments like after Dobbs, when whole new communities face new threats, we realize the level of harm that was always there.
So Wolf and I talk a lot about harm and it almost makes me feel guilty sometimes, right?
Because my whole career is about helping people maximize their safety and pleasure. And then we come on here once a week and use this podcast to rain on everybody's parade.
So Albert, talk to us about hope. You've alluded a couple of times to the things that give you hope.
What makes you hopeful right now?
The first thing is that everyone's talking about it.
You know, like we said, this is a hidden harm, and the indispensable predicate to any meaningful social change is a public that is mobilized, that is taking notice of what's happening. For much of my life, I felt like a Cassandra warning about this danger only to be ignored. But now we see, all across the country, people outraged about the sorts of surveillance that are taking place.
But more than that, we see people taking action. We see bans on facial recognition in more than a dozen cities and states. We see new surveillance oversight bills, which say that it's the public, not the police, that gets to decide what tools are used in our communities. We see more legislation moving forward than ever.
In New York, I helped to write legislation that would outlaw some of the most invasive location-based surveillance being used today. And that model bill is being looked at across the country. So we do see people taking action. We do see protections coming forward. And I think it's just one of these things where, you know, change comes so slowly at the federal level.
But when we act locally, when we act in our own communities, when we act with our neighbors, there is this real pathway to protect ourselves, to protect our neighbors, and to protect the sort of country we want to have.
Are there actions or specific things that our listeners could be doing to advocate for stronger privacy protections, especially sort of at the local police level?
Yes. So wherever you live, there are people on the ground fighting to push back against these sorts of invasions of our lives. In New York, you know, please check out STOP. We have a number of bills where we need your support. We need you to call your local legislator, your state senator, your assembly member. And you can find all of that at StopSpying.org.
You can find partner groups all around the country through our partners at the Electronic Frontier Foundation.
They host an amazing network of organizations called the Electronic Frontier Alliance: community groups big and small in nearly every state. And you can connect with people that way.
You know, these bills are being pushed in conservative states as well as liberal ones.
You know, it's not just a left-right issue.
You know, people all across the spectrum get freaked out that anyone could have this sort of power to invade our lives.
And so, you know, I really do see hope for more and more of these bans in the coming years, and I think it's about continuing to educate your neighbors and highlighting the unseen harm. Because as long as this remains invisible to people, people won't take action.
It's by educating our friends and family that we're able to turn this from something invisible into something at the front of everyone's consciousness. Absolutely. Thank you for that.
One thing I wanted to ask you about, jumping back to that case that Stefani mentioned: it feels to me like advocating on behalf of others for their freedoms is one of the earliest ways to protect your own. I was wondering if you felt similarly, and if so, what advice you'd have to ensure we are doing just that.
Look, that's been a huge part of what I've done my whole career. I am an agnostic Jewish man, but I, you know, helped to lead a Muslim civil rights organization for nearly three years at the dawn of the Trump administration, leading some of the earliest, you know, rallies in the country against the Muslim ban, against, you know, other anti-Muslim programs.
I went on TV and told, you know, President Trump that if he wanted to register Muslim citizens, like he was threatening at the time, that he would have to register me first.
To me, that was just partly this intergenerational debt owed to those who stood with my ancestors in our moment of need. And for me, it was also a way to protect myself, my community, my kids and grandkids one day in the far-off distant future, because the powers we give the state to attack one disfavored group today can so easily be wielded against us tomorrow.
And really, no matter who you are, no matter where you are, no matter what your background, the more we empower this sort of abuse, the more risk we all are at.
Now, it's not a risk borne equally.
You know, we know that marginalized and multi-marginalized communities are always going to bear the disproportionate share of that harm. But when we erode the fundamental safeguards of a liberal democracy, we create a risk far broader than almost anyone realizes.
If there could be one key idea or one key fact that you would want folks to take away from our conversation, because we live in a social media short attention span world, what would be the one thing you would want them to remember and to walk away with?
Data is power. Data can be dangerous. And the way we need to protect ourselves is by creating new barriers to how government can collect our information. I talk a lot about legal firewalls, this idea that you create a barrier between the data that companies like Google and Uber and Lyft have about us and our lives and almost every intimate moment, and the police departments that want to use it.
It's a very different danger when the government and when companies have that information. So by taking action in our local communities to wall off the data companies have about us from the government agencies that would misuse it, we can have our technology in our constitution too. We can have a framework for democracy that is compatible with the technologies we all depend on.
And it also means that instead of going down the path of fighting government and corporate giants, we can actually have the corporate giants on our side for once, even pushing back against that sort of dystopian threat.
Yeah, thank you so much for joining us today, Albert. I really appreciate this conversation.
Thank you so much for having me today. And as gloomy as all of this has been, I truly believe we've never been better poised to take the actions we need to protect our future and to protect our democracy. And I really just hope your listeners join that fight.
Yes, I do too. And I think this is ending on a good note from that perspective, because I feel like there's something I can do. There's something I can contribute to. So that is fantastic. And to you, the listener, thank you so much for tuning in to Securing Sexuality, your source of information needed to protect yourself and your relationships.
From the bedroom to the cloud, we are here to help you navigate safe sex in a digital age. Be sure to check out our website, securingsexuality.com, for links to STOP, to Albert's TED Talk, and for more information about the topics we've discussed here today.
And also, as always, information about next year's conference. And join us again for more fascinating conversations about the intersection of sexuality and technology. Have a great week.
Join us in Detroit! October 19 & 20, 2023
Proudly Sponsored by The Bound Together Foundation
An IRS approved 501(c)3 nonprofit organization
Michigan Charitable Solicitation Registration# 64801