Securing Sexuality is the podcast and conference promoting sex positive, science based, and secure interpersonal relationships. We give people tips for safer sex in a digital age. We help sextech innovators and toy designers produce safer products. And we educate mental health and medical professionals on these topics so they can better advise their clients. Securing Sexuality provides sex therapists with continuing education (CEs) for AASECT, SSTAR, and SASH around cyber sexuality and social media, and more.
Links from this week’s episode:
Protecting Your Data in a Digital Age, Taking Responsibility for Data, General Data Protection Regulation (GDPR), the Implications of Algorithms on Social Media, and Manipulating Algorithms to Gain Privacy Control
Technology is continuously evolving, bringing with it new threats to our data and privacy. The rise of cybercrime makes it more crucial than ever to take steps to protect ourselves in this digital age. Fortunately, cybersecurity expert Cat Coode has shared some useful tips to help us stay safe online.
One of the essential steps is to use strong passwords for all online accounts. A strong password should contain a minimum of eight characters, with a combination of uppercase and lowercase letters, numbers, and symbols. Using the same password for multiple accounts increases the risk of being hacked, so it's crucial to avoid doing so.
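As a rough illustration of that guideline, here is a small Python sketch that checks a candidate password against those criteria. The function name and thresholds are our own illustration, not a standard API, and a password manager's generator is still a better bet than any hand-rolled check:

```python
import re

def is_strong_password(password: str) -> bool:
    """Rough check against the guidelines above: at least eight
    characters, with uppercase and lowercase letters, digits,
    and at least one symbol."""
    checks = [
        len(password) >= 8,
        re.search(r"[A-Z]", password),       # uppercase letter
        re.search(r"[a-z]", password),       # lowercase letter
        re.search(r"[0-9]", password),       # digit
        re.search(r"[^A-Za-z0-9]", password) # symbol
    ]
    return all(bool(c) for c in checks)

print(is_strong_password("password123"))    # False: no uppercase, no symbol
print(is_strong_password("C0rrect-H0rse!")) # True: meets all five criteria
```

Remember that length and uniqueness matter more than any single rule; a check like this only catches the weakest candidates.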
Another recommended practice is to enable two-factor authentication (2FA) whenever possible. This adds an extra layer of security by requiring an additional code or token when logging in from a new device or location. It helps ensure that only authorized users can access your account, even if someone else knows your password.
Phishing scams and other malicious emails or links sent via email or text message are common. It's vital to avoid clicking on any suspicious links, which may lead to malicious websites designed to steal personal information such as usernames and passwords. If you receive an email from an unknown sender asking for personal information, it's best to delete it immediately without clicking any links contained within the message body.
Lastly, using a virtual private network (VPN) while accessing public Wi-Fi networks, such as those found in coffee shops or airports, is recommended. A VPN encrypts all data sent over the network, making it difficult for hackers to intercept while you are connected online. It's also essential to ensure that any website you visit is secure by looking for "https" at the beginning of their URL address bar instead of just "http."
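The "look for https" habit can also be applied programmatically, for example when auditing a list of saved links. This is a minimal sketch using Python's standard library; `uses_https` is a hypothetical helper named for illustration:

```python
from urllib.parse import urlparse

def uses_https(url: str) -> bool:
    """Return True only when the URL's scheme is https,
    mirroring the 'look for https in the address bar' advice."""
    return urlparse(url).scheme == "https"

print(uses_https("https://example.com/login"))  # True
print(uses_https("http://example.com/login"))   # False
```

Note that https only tells you the connection is encrypted in transit; it says nothing about whether the site itself is trustworthy.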
By following these tips from cybersecurity expert Cat Coode, you can take measures to protect yourself against cybercrime and enjoy the benefits technology offers in our digital age!
Hello and welcome to Securing Sexuality, the podcast where we discuss the intersection of intimacy and information security.
I am Wolf Goerlich.
He's a hacker. And I'm Stefani Goerlich.
She's a sex therapist. And together we're going to discuss what safe sex looks like in a digital age.
Today we're talking with Cat Coode about how all things social media, but a particular site and a particular case has come to our attention.
So Cat, welcome. Thanks for joining us and tell our audience a little bit about yourself. Great. Thank you so much. I am so excited to be here. It is such an important topic that the two of you are discussing, so I love it. So as Wolf said, I'm Cat Coode. I am a data privacy consultant. I actually come from the software world.
I worked at BlackBerry for over a decade as a software architect, designing the software before smartphones were super popular, the software everyone actually uses and the base for most of the social media. And my superhero origin story is that I was at BlackBerry when the iPhone took off, and I had to watch everyone using the iPhone because it was cooler and sexier, but it was also using all your data and connecting your data.
And that was my aha moment that nobody actually understood privacy and where the data was being shared. And that is how I moved from there to being in data privacy.
Oh, I remember that moment. And I used to say that you'd have to pry the BlackBerry out of my cold, dead hands. And I still have my last couple of BlackBerrys. I am gesturing towards my shelf of well-loved, well-worn devices. I remember that moment.
And yeah, it really has shifted quite a lot because back then there weren't these platforms, right?
Back then when we were messaging, I mean, I would love to hear your perspective, but I know the BlackBerry messaging was much more secure than SMS or Facebook Messenger or any of these other ones.
Yeah, it was. And I think one of the big things is we had yet to discover big data and analytics. Messaging was about providing you a way to communicate with people. It wasn't about a way to assess your data and analyze your data and find out what more we could do with your data. So the original BlackBerry Messenger, the BBM that everyone loved on BlackBerry, was encrypted and protected.
And so it was widely used by the government, by financial institutions for that reason because it was created for the purpose of protecting that pipeline.
So what does that look like when we talk about data analytics, when we talk about that sort of gathering process that now exists?
Who is doing that and why?
What's the goal behind it?
Yeah, for sure. So with social media, we have the adage in the tech world too. If you're not paying for the product, you are the product. And of course, we know with Facebook and Google and all of our social media now, we are giving them data as payment for the ability to use their services.
So they, at the beginning, would use your data and give it to the advertisers. They would actually sell your data.
And then, of course, people came in and said, you can't just sell the data; there are regulations and laws around doing that. So what happened is Facebook would say, OK, we're not going to specifically give your data to that manufacturer. But what we will do is allow the manufacturer to target, let's say, women between 35 and 45 who live in this city who like these things.
And then those people would get that ad. So they were still using the data to do this targeted marketing, but they weren't actually passing the data back to the advertiser.
You know, and marketing is creepy in and of itself in terms of how it's evolved and whatnot.
However, there's also another layer to this that I'd like to hear from you about.
In that case that we'll get to, if you're listening, you're like, what is this case?
Don't worry, we'll get there. We'll get there when we get there.
The EFF, the Electronic Frontier Foundation, had a quote that said: every day across the country (and of course, this is United States specific), police get access to private messages between people on Facebook, on Instagram, on any social media or messaging service you can think of.
What is your perspective on that side of things?
Because not only is it being used for marketing purposes, it's also being used in ways that I think a lot of us don't anticipate.
Yeah, absolutely. And I think one of the really telling things about that is that Europe created a privacy law called GDPR, and that is the General Data Protection Regulation. And it is intended to protect the individual. It is all about the individual. So as a company, you have to put the individual first and what are their rights and what are their abilities.
And when they created this, they also said that you could not transfer European data to the US. And the reason why was because US laws are so open in the way they allow authorities to access that data.
So in order to transfer European data out, you now have to sign all sorts of agreements that say you will secure this data and that it isn't available for legal authorities to come in and access. And you need a data protection officer. And you do. When we were first talking about doing this podcast, my original goal was actually not to start until 2023.
And then the Dobbs ruling happened, Roe was overturned, and we decided we wanted to start a lot sooner because we wanted to put out some information about how people who are pregnant, or could become pregnant, could protect themselves and their data. And that's one of the things that has been a real driver for this, and for the case that Wolf alluded to.
There was a situation in Nebraska where the police sent a warrant to Facebook, and Facebook then gave them all of the messages on Messenger between a mother and a daughter. The daughter had found herself unwantedly pregnant, and she and her mother were trying to figure out how to navigate the situation. And the police got a warrant to seize their devices.
They are now charged with felonies for concealing a death and disposing of human remains after a self-managed abortion in the aftermath of Roe being overturned. And that's exactly the kind of nightmare scenario that led us to speed up the introduction of this podcast, and I would love to hear your thoughts about that case and that situation. Absolutely.
So with this specific case, the police had gone to Meta and presented a warrant that gave them the ability to access the messages that belonged to these two women. Now Meta has since released a statement of their own saying they had no idea what the background was for the warrant. It's just a warrant.
Now, unfortunately in the US, when there is a warrant of this type, they don't have to give a reason or an explanation. They just have to say, we have legal authority or legal rights to these messages. So despite the fact that the reasoning behind it was around abortion and not, say, someone who killed their husband and buried him, there's nothing to distinguish those two things from Meta's side.
So Meta, again, for anyone that doesn't know, is Facebook and Instagram and WhatsApp. So it's unfortunate because Meta as a company, well, at least they claim to be pro-abortion and pro-human rights and choice, but they still feel like their hands were tied in this instance. So my messaging really, and I cannot say this enough, is that unfortunately you are responsible for your own data.
We cannot rely on systems and services to protect us. Google took a big step forward, and I don't know if you know about this, where they said they would wipe searches. So they will wipe search data because authorities can come in and say, I want to know if you've searched for abortion clinics. But they didn't take it a step further and also wipe location data on Android devices.
So there's too much out there; the geolocation especially, I think, frightens me the most. And then health apps too. There is health legislation in the US called HIPAA, but commercial health apps are not protected under HIPAA. They're not run by doctors.
So if you decide to download a period tracker and you have tracked your period and then you don't have a period for four months and it comes back again, they can actually use that data against you and come back and say, we think you had an abortion at this point.
And then that could be enough information for them to say, this woman had this record where there were four months without her period, and now I want to make sure she did not search abortion clinics or have location data that tracks to abortion clinics, so that we can verify that in fact she did have an abortion and not, say, a miscarriage or just some missed periods. So there's a lot of that.
There's a lot of, I would love to say that this Facebook case is going to be an isolated incident, but I feel like all of the tech right now is tracking you in ways that we don't really appreciate and understand. It seems so kind of trite and cliched to talk about something being dystopian, but this really genuinely is dystopian.
The idea that somebody, a sheriff, a district attorney would say, well, you didn't use an app for four months, so now prove to us that you didn't is terrifying.
How does one prove a negative for one?
What can people do to protect themselves?
So a lot of it is not using the health apps; that would be my number one advice. Let us not use these, even things like Fitbits that track heart rate, and, for women of childbearing age, anything from this perspective that tracks temperature, that tracks, again, periods.
I grew up in a generation where we didn't have an app for that, right?
So if you need to track it, grab a piece of paper and a pen.
But if there is that sense of needing to seek out help and you're in a state where it is illegal, then using browsers that are anonymous, using other ways to do these searches, but not relying on the big tech giants like Google, like Facebook to do those searches because that information is fully accessible, as you had said, fully accessible to the authorities if they think there is a reason to go looking for it.
I will say in my practice, in my conversations with clients, in my own period tracking as a menstruating human, there is one tracker that has impressed me. The Clue app is based in Germany.
And even before Dobbs came down, I had reached out to them and said, if this happens, what's your plan?
And they responded to me beforehand, which I appreciated. And then the minute the ruling came out, they released the statement that was like, look, we are based in Germany. All of our servers are in Europe. It's not going to happen. They're not going to get it. We've got your backs. And I've really appreciated that.
So that is the only health tracking app at this point that I have seen take a stance like that and be legally and geographically in a position to actually do that. I love that. I hadn't heard of them. That's great. But I think another thing we can do, I mean, not just as women, as advocates, as allies, is continue to support apps like that, is to repost on them and advertise them.
And because in advertising, here's a great app you should use, you're also explaining why you shouldn't use the others. But I'm all about solutions, not problems. So giving someone the answer and saying, here's a viable solution you can use, is fantastic. In full transparency, they don't sponsor us. That's not advertising. It's just Wolf and me. We've got no money in the game.
But to your point, Cat, I did actually go from 10 plus years as a free user to paying for a subscription just because I wanted to support that for them and to support the continuation of that service. So no horse in the race personally, other than I like having my privacy protected. Speaking of which, that leads me to another question.
I had heard somewhere, and I don't know if this was a rumor I saw on Twitter, if this was a conversation, but somewhere I had read that even if I type a search into the Facebook search bar, and then I'm like, oh no, never mind, I don't need that, even if I delete it, if I never actually hit enter and complete the search, that data is still captured.
Is that accurate?
That data is still captured.
Again, so the argument from a marketing perspective is we want to improve your service. So we want to know why you tried to search something but didn't search it.
We want to know if you wanted to type something but didn't type it, what stopped you from typing it?
There was a while, from a mental health perspective, and I don't know if they're still doing it, when Facebook said that they were analyzing messages and posts to see if people might be depressed or have other conditions that might require someone to intervene and help, or if they were suicidal. So they said, for the good of the person who was using it, that they were reading and assessing these messages for that purpose.
So if you had typed out a message that said, I'm done with life, goodbye, and then you erased it, that's something they would want to capture in order to be able to say this person's at risk, and then flag a friend and say, this issue is happening, please check in on your friend.
Again, they come off as, this is our altruistic reason for all things; clearly not. So they do have reasons to capture. What is interesting actually about some of the tech giants, just as a fun fact: one of Google's principles for employees is to never touch customer data. That's an instant firing.
If you go into a user's email account, their search history, whatever, unauthorized without reason, that is not allowed at all at Google. Whereas Facebook, that is actually what they do. They go into the user data because they're looking for different things that they can do with that data in order to quote, unquote, again, improve experience.
So yes, that is keystroke logging, as we call it; logging what's actually being typed in is something that they might look at. They again could argue and say, oh, people are always spelling this word wrong and then correcting it, so we just want to autocorrect it for them. Lots of arguments on why they might be doing it, and not a lot of good ones.
So that is one thing we oftentimes hear. I'm going to ask you about another thing I want to know if it's a myth or real.
The first one is, are they capturing things even if you don't search?
Yes, yes they are. The second thing is, and I'll tell you the story first and then I'll ask the question: my wife and I, Stefani and I, were traveling recently. We watched a movie, and I'm assuming she Googled the movie.
I'm assuming she searched to find it somehow, but we watched a movie together, and within 12 hours I started seeing clips of that movie on Twitter, advertising clickbait, like, oh, click here, top paused clips from videos, that sort of thing.
And so one of the things that oftentimes gets suggested is, are these apps like Facebook, are they listening in on our mics?
Because some of the advertisements that they put through is incredibly timely and incredibly creepy.
Yeah, they are. That would be the easy answer. So if you have a microphone turned on for any application, you've essentially given that application permission to listen at any point. The same goes for any of the devices in your home that might be smart devices that listen.
And your TV, your smart TV, essentially, like I always joke, it's like having, you know, when you have a kid in the room and two parents are arguing and the kid is pretending not to listen, but then you say their name and then they turn around. Like that's all of these things. They are listening at all times waiting for you to call their name.
They are; all these devices are listening. There was a great expression I saw the other day. So IoT is the Internet of Things; that's all things that are connected. And it said the P in IoT stands for privacy and the S stands for security, and of course there is no P or S in IoT. There are no regulations in IoT, and very few companies have instituted privacy or security into these devices. So most of these devices are listening.
The other thing that they're doing is sending signals to each other. I have an iPhone, admittedly; I know, like everyone, that they're better on privacy than some, but they are still using their own data. So they have started cutting off the connection between apps and their data, but they are certainly collecting all the data they want to collect. So I have tried in my privacy settings to turn everything off.
But then I walked into a store, similar to Wolf's story, and all of a sudden I was getting tons of ads for that store. So clearly there's some beacon in there. There's some identifier.
There's something that says even if it's as simple as device one, two, three, four walked into the store and then they tell Apple and then Apple says, okay, well then I'm going to ping device one, two, three, four with ads for your store.
So whether it was listening, or whether it pulled a beacon from a physical location, I don't know how you saw the movie or where it was, but there are a lot of different ways, unfortunately, that devices are getting that information. We actually covered a case that the FTC has brought on a previous episode, because it was very intriguing. The company positions themselves as marketing.
And as you walk around, they collect a whole bunch of information about your location and your device. Of course it's anonymized, but it's still by device. And what was really surprising about that case was that I, Wolf, could set up an AWS instance and get free access to a trial version of that data and start pulling down a whole bunch of things I don't want to know about.
So it does beg the question, right?
How are these social media and search data companies able to make these predictions about specific people and regional trends?
Obviously by collecting a lot of data, but is there additional algorithms or additional concerns behind the scenes that we should also be thinking about?
Absolutely. And there's a lot of fundamental problems here. The algorithms that were developed, and I know there's several movies, like there's The Social Dilemma, there's Coded Bias is also an excellent movie. Both of them are on Netflix. Coded Bias was largely about this amazing PhD student who's a woman of color, who was tired of the fact that facial recognition kept messing up faces of color.
And so a lot of these algorithms are created by people who sometimes have a very myopic view of the world and it's exclusive, it excludes groups of people. It is very focused on what they think the answer should be. And an algorithm is only as good as the people who are training it. And I think that's largely the problem.
So sometimes they say, you know, the social media sites know you better than you know yourself. I always laugh because a lot of the ads I get now, and I am still on Facebook, but a lot of the ads that I get, they do make sense.
But then sometimes I'll get an ad and I'm like, what in the world did I do or who do I know that merited that ad that is not something I would ever buy?
One interesting story with Facebook was a gentleman, I don't know if anyone remembers, in the US who was on Facebook. He was gay, but he had not come out.
And then somebody at work was standing beside him, and a Facebook ad popped up for a gay cruise, because Facebook knew he was gay: because of what he had been clicking on, because of who he followed, because of what he looked at. Or they assumed, and they assumed correctly. So he was getting that information.
And he was obviously quite livid about that because, you know, the person behind him was like, hey, why are you getting ads for gay cruises?
And he's like, oh, I don't know. But you know, at the time people thought Facebook knew. So that was unfair to him, that it pegged him on something that he wasn't ready to tell other people.
There's also the story about the girl who got coupons in the mail from a retailer, and her father opened the envelope and it was all pregnancy coupons. She had not told her father she was pregnant, and he went back and he was livid and he said, what is this?
Why are you sending these?
And it was based on other purchases and other things she had done. So it was the right algorithm. It was the right assumption, but it was the disclosure of the information that was wrong. So we have both things at play. We are basing this on algorithms that may or may not be right.
And then the disclosure of that information and how it's being distributed is now another problem. So the idea that algorithms are only as good as the people who train them or maintain them really resonates with me.
Years ago, I was told about a dating website where, even if you answered all their questions and there was somebody else in their system who was your mathematical soulmate, 100 percent compatible, they would not connect you if you were of different races or were the same sex.
They were making sort of eugenicist, definitely heteronormative, choices around matching people, based on the things that were important to them as opposed to the things that might be relevant to the user. And we talked about social media; we talked about IoT devices listening in.
I'm curious how, if at all, this plays out in the world of dating websites. There are so many sites and apps now where you answer dozens, if not hundreds, of questions about yourself in order to find your matches. And until this conversation, it never occurred to me to wonder what they're doing with that beyond matching you up with people.
It's funny you say that, because the CEO of one of the sites, I want to say OkCupid, wrote a book specifically on data analytics. And it is utterly fascinating because that is what they did.
He's like, here's a table of this kind of thing and here's a table of all of this and here's what we learned from this. So what was interesting is that what people answered didn't match the actions of what they did. And so very often, especially men would say that they would date someone between the ages of, let's say, 25 and 40, but they would only pick matches between 25 and 30.
So they would create these profiles to make themselves look more well-rounded and open minded. But they were really in their selection process, very focused on one type of person. So I found that part interesting.
So are they over-collecting, gathering more data about you than you should be sharing?
Yes. But I always call it there's the curated data and the uncurated data. So curated data is like pictures that you choose to put up on Instagram, whereas uncurated data are shows you select to watch on Netflix. So one of them is I'm going to be very careful because I know the world is looking at this. And the other one is I don't care what you think of me.
I like teen flicks. So that's what I'm going to watch. I always make that joke. My husband came home once, years ago, and he's like, I think the babysitter was watching our Netflix, because it's all Degrassi. And I'm like, that was me. My teenage daughter will watch fewer teen flicks than I do.
But Netflix knows that, right?
And every time it releases a new one of these stupid things, it says, hey, this is what you're interested in watching. I appreciate that algorithm. I did find with that one, like you're saying, it was giving me all white people. It would be like, here's comedies, and I would go through and everyone in all the little thumbnails was white.
And so I started watching exclusively movies starring people of color. And then all of a sudden, all my recommendations changed. And it was unnerving, for sure, that they assumed that because I liked these big Hollywood movies, I would continue to like these same ones. And I was like, but you didn't even offer me the other one. You didn't even give me that option.
So we have the data analytics that dating websites are doing. We have Netflix kind of refining their predictions based on what you watch.
Is that happening at a more sort of global level?
Are companies using data information in order to make predictions about communities or about society?
How is it being used not to figure out you and me, but to figure out us collectively?
Yeah. And that's a lot of the principle behind smart cities and what they are. And the argument will be, you know, we want to see the traffic patterns. Like smart traffic lights: they don't go by these two-minute timers. They go by, you know, at 3 p.m. we see this pattern.
And so we're going to make these green lights longer because this makes it more efficient. There's a lot of that happening. And in some ways you could say, yeah, that is beneficial.
But then in other ways, it's like, what other data is that gleaning about you?
So one argument I had about a smart condo that I had worked with, where they were putting in, you know, a smart fridge, which would tell you how often you went to the fridge and then it would control the power. And I'm like, but if you're determining how often someone goes to a refrigerator, that also talks about their nutritional habits. That's not your business.
Or if they're frequently going to the bathroom. Or there's a smart sensor that tells how many people are in the room, so that you can alter the temperature to accommodate the number of people instead of just having a preset temperature.
And I'm like, but then you know if someone typically lives alone but always has company in the evening. You're just pulling extrapolated data out of this stuff that isn't what you want. And so again, like I said, all of these products are not regulated. So that's what worries me: the intention is often a good one. Let's save energy. Let's improve traffic patterns.
Let's see how many people go into this building every day. Let's see how many people are using the gym at the condo so we can tell if we should give it longer hours. There's good intention behind it.
But very rarely do I find companies are looking at: what are the consequences of the data?
What are other ways that this data could be used against someone?
That's really the issue. We could probably have an entire episode on IoT. And if we do, we should bring you back, because early on I did look into that space and I was really surprised, because most of the companies were very small and most of them had no teams to speak of when it came to security and privacy.
Most of the products were built on platforms that were rife with vulnerabilities, and very rarely was there security threat modeling. Very rarely was there privacy threat modeling.
So yeah, that introduces all sorts of risks. That's a whole other problem. And I agree, there are small companies, and a small company will come up with a clever product and a building will deploy it in every apartment.
And not one person has bothered to say, but what happens if you hack it?
You've got temperature sensors and you can control the temperature. That's great. But then what happens if someone hacks that and turns everyone's apartment up to 100 degrees, and then hacks the locks and locks everyone in? Yeah, that's a whole other issue. But it is also an issue.
One of the things we've often said from a privacy perspective is, if I don't have control of my data, I'm going to lie, right?
I'm going to put in a fake age. I'm going to put in fake cars. I'm going to, you know, whatever the questions are, so that I can have some control. Do you see emerging more of a codification of what you were saying earlier, which is: I'm going to mislead these algorithms so I do get different responses, so I do have a degree of privacy?
Or are we still too early on for that to emerge?
Well, you and I are in the tech industry, and I am seeing a very small subset of technical people doing it. And I think that's it.
Like, there are emails you can use for 10 minutes, right?
There are secure emails you can use so that if you need to sign up, you need an email to log into something just to access it, you can use one of these emails. That way you can do the verification on the email. Then the email disappears and you don't have a trace on that one.
But the fact that I know a low percentage of people who are doing it, and I'm in that space, tells you in the grand scheme of things how few people would actually be doing it. I think in the long run, fooling an algorithm would be very difficult, depending on what platform you're on, because you have a network. So on Facebook, I can do nothing, which is what I do.
Basically, I do follow local groups, and it's the only way for me to get that information, like sports teams and things that my family's part of. It's the only way for me to get that info, which is why I'm still on Facebook. I haven't posted on my profile in years, but I am still connected to all these people that I went to high school or university with.
And every time something happens, like when Trump was elected and everyone was up in arms about how horrible it was, because my network has its opinions and is the way it is, the assumption is I am too, or I wouldn't be connected to them. And I have removed people who didn't agree. So it knows.
So even if I don't post myself, the assumptions made about the connections that you have and unfortunately, and this is the real kicker, even if you were on no social media, you are in other people's contact books. So your friend A goes on Meta and Meta now has your email address. And then your friend B goes on Meta and now Meta has your email address from there.
So Meta knows, even if you don't have a Facebook account, they know the 50 people that you're connected to because your email address is in all of their contact books. So they have a file on you anyway, even though you've never signed up.
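The contact-book mechanism Cat describes can be sketched in a few lines. This is an illustrative model, not Meta's actual system: each new user uploads an address book, and the platform indexes every email it sees, including emails of people who never signed up.

```python
from collections import defaultdict

# Hypothetical sketch of a "shadow profile" index: maps an email address
# to the set of users who have it in their uploaded contact books.
shadow_profiles = defaultdict(set)

def upload_contacts(uploader, contact_emails):
    """Record that `uploader` has each of these emails in their address book."""
    for email in contact_emails:
        shadow_profiles[email].add(uploader)

# Friend A and Friend B both join the platform and upload their contacts.
upload_contacts("friend_a", ["you@example.com", "mom@example.com"])
upload_contacts("friend_b", ["you@example.com", "coworker@example.com"])

# The platform now knows who "you" are connected to, with no account of yours.
print(sorted(shadow_profiles["you@example.com"]))  # ['friend_a', 'friend_b']
```

The point of the sketch is that the person being profiled never takes any action: the file on "you@example.com" is assembled entirely from other people's uploads.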
Yeah, that is a very good point. Stefani and I were listening to an OSINT talk on how to disappear. And one of the things the speaker mentioned was you can do a great job disappearing, but your shadow still exists, because everyone who's connected with you has a view into you. So some very intriguing stuff.
I wanted to ask you, Stefani, on the algorithm hacking side, I remember when Roe v. Wade was overturned, you were getting questions like, hey, should guys sign up for period trackers and mess with them?
Lots of people. It wasn't just guys. Lots of people were saying that they were going to just start inputting random data in order to try and mess up the algorithms in order to sort of, I suppose, discourage some of the surveillance.
And my position at the time was please don't do that, because for the people using it in good faith, you're going to mess up the math and put them in exactly the situations you're trying to prevent. So that was a well-intentioned but, I believe, poorly executed short-term burst of activism that happened this summer.
I have not heard much in the way of people doing that since then, which makes me grateful. So I've got two questions, one for people building apps and building IoT devices, and then one for us individually. So I'm going to start with the builders. I'm going to start with the entrepreneurs and the product folks.
What guidance would you give to them if they don't have a data privacy officer, if they're not to that point yet?
How can they make sure that their products aren't going to be used against people in their efforts to help people?
This is like my favorite question. So we do have new data regulations coming out all over the place and they are coming into effect. And if you're not compliant today, you will have to be compliant at some point in the future anyway, so it makes sense to get your ducks in a row. But the number one thing is to limit the data to only what is required.
Limit, limit, limit. Spotify is the example I throw under the bus every time. When you log into Spotify, it asks you for your gender and it asks you for your birth date. And it doesn't require either. In order to listen to music, you do not need my gender and you do not need my birth date.
And if the algorithms were clever enough, if I was listening to 90s grunge, you would know 90s grunge. You don't need to know my gender or age to make other predictions and recommendations based on the music that I start with. Gender, I think, is the hill I will die on right now, because almost everyone is collecting it and I think I can count on one hand where it's required.
The other issue with gender that we have right now is that it used to be non-identifiable data. You used to be able to say, hey, we've got a thousand people, so I'm going to divide them as male and female. And then it's all anonymized, and I can have a stat like 60% are men.
But now when you have a subset and if you have non-binary individuals or you have a no answer, you have identified the individual in a lot of cases. So we had a high school in Canada that this happened to where they had marks published by gender. So they would be like the average of girls in grade 12 physics was 95 and the average of guys in grade 12 physics was 87.
Oh, and the average of non-binary individuals in grade 12 physics was this mark with one non-binary student. So nobody thought down the road is this a piece of data we actually need or not.
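The high school example is a classic small-group re-identification problem. One common mitigation, sketched here as an assumption rather than anything the school actually did, is to suppress any published aggregate whose group falls below a minimum size.

```python
# Illustrative sketch: averages re-identify people when a subgroup is tiny.
# A common mitigation is to suppress groups smaller than a threshold k.
MIN_GROUP_SIZE = 5  # assumed threshold for this sketch

def averages_by_group(records, k=MIN_GROUP_SIZE):
    """Average mark per group; groups with fewer than k members come back as None."""
    groups = {}
    for group, mark in records:
        groups.setdefault(group, []).append(mark)
    return {
        g: (sum(marks) / len(marks) if len(marks) >= k else None)  # None = suppressed
        for g, marks in groups.items()
    }

marks = [
    ("female", 95), ("female", 94), ("female", 96), ("female", 95), ("female", 95),
    ("male", 87), ("male", 88), ("male", 86), ("male", 87), ("male", 87),
    ("non-binary", 91),  # one student: publishing this average identifies them
]

print(averages_by_group(marks))
```

With the threshold in place, the large groups still get their averages while the single-student group is withheld rather than published as an identifying data point.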
So I mean, gender fits into this topic in general, but all of that data.
Do you need a birth date?
Could you ask for an age range?
Do you need an address?
Could you ask for a zip code or a state?
Are there ways you can say, I'm trying to get a general understanding of you to better serve you from a product perspective?
Like if you want to give someone a shipping quote and they haven't signed in yet or given their address, can you just ask them for their zip code instead of their specific address?
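Cat's "limit, limit, limit" advice can be made concrete in code. This is a hedged sketch, with field names invented for illustration: the signup handler declares the minimum fields it needs, collects coarse values (age range, zip code) instead of precise ones (birth date, street address), and silently drops everything else.

```python
# Data minimization at signup: only declared, coarse fields survive.
# Field names and buckets here are assumptions for illustration.
ALLOWED_FIELDS = {"email", "age_range", "zip_code"}  # the minimum this product needs
AGE_RANGES = {"under 18", "18-24", "25-34", "35-54", "55+"}

def minimized_signup(form):
    """Keep only declared fields and validate the coarse age bucket."""
    data = {k: v for k, v in form.items() if k in ALLOWED_FIELDS}
    if data.get("age_range") not in AGE_RANGES:
        raise ValueError("age_range must be one of the declared coarse buckets")
    return data

# Precise fields submitted by an over-eager client are simply discarded.
print(minimized_signup({
    "email": "user@example.com",
    "age_range": "25-34",
    "zip_code": "48226",
    "birth_date": "1990-04-01",  # not needed to serve the user, so dropped
    "gender": "female",          # not needed, so dropped
}))
```

The design choice is that minimization happens at the boundary: data you never store is data you can never leak, sell, or be compelled to hand over.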
So as developers of products and services, figure out the minimum data that you actually require and only collect that data. I love it. I love everything about that. And then personally, short of going back to BlackBerrys, which I would love, by the way; if we headed in that direction, I would be a very happy person. Give me a keyboard.
But personally, what is your top tip for protecting yourself online?
Privacy settings, nobody uses them or people set and forget them. Every time there's an update to your OS, to your operating system on your device, on your computer, there's potentially new privacy settings. And we all have moments where we're waiting in life. And I know this because people find those moments to spend on social media.
So instead of taking your 20 minutes to peruse Reddit or Instagram or whatever you're on, take that 20 minutes to open up your privacy settings because they tend to be embedded too. Like you'll have a privacy setting for location.
I know on an iPhone, you can set all of the location permissions to only be used while you're using the app, which means apps that are actually reliant on location, like a weather app or a driving app, can access your location, but they're not tracking you 24/7 on all your business trips wherever you go.
So going through those privacy settings, especially with, as you had alluded to, the microphone. I'm very big, again, on location, turning that off.
Camera: only the apps that need the camera should really have the camera. Otherwise you're allowing it to turn on as well. And contacts, again, only the apps that actually require them. You're not doing yourself or your friends any favors by going into an app and having the app say, hey, we're going to improve your experience. That's the key sentence every time.
We're going to improve your experience by getting your contacts for you. That's not a favor for you or them. So these are tangible, actionable choices we can make, steps we can take. I love the new knowledge you gave a few minutes ago. Now I know that 10-minute email addresses exist, and I'm going to tell my clients to use those when they're signing up for dating websites.
Otherwise what else do, what else should our listeners know about how they're tracked, how they're monitored?
What final bits of wisdom, insight, guidance do you have for us today, Cat?
I mean, I will say it again because I already said it because it's so key, but your data and your privacy, it is your responsibility. And when people say I have nothing to hide, it's not about what you want to hide or what you want to share. It's about the fact that you still have that control. And so you own it.
Every decision you make, every app you choose to download, that is in your control. And as soon as you allow something else onto your device, or allow something else to turn on, make sure you control it and you're okay with what it's doing. Like I said earlier, I'm on Facebook because I need to access certain groups.
And in doing so, I know what I'm trading off, but I am still in control in the sense that I have the knowledge of what is being taken versus what I am getting out of it. There is always a return on this.
So if somebody said, would you pay a hundred dollars a month for Facebook?
I probably wouldn't. And yet here you're giving maybe that much worth of your own data.
So do you need a crossword app that's free? Because that might be taking your data.
And if it did, it would tell you.
But yeah, your data is your responsibility and you cannot rely on these companies to look after it on your behalf.
Cat, where can they go to find out more about you?
I am at catcoode.com and that's C-O-O-D-E or my company is Binary Tattoo. So you can find me on either website. You can find me on LinkedIn. You can find me on Twitter. I am in the social media world. Fantastic. Thank you so much for joining us and thank you for tuning in to Securing Sexuality, your source for the information you need to protect yourself and your relationships.
From the bedroom to the cloud, we're here to help you navigate safe sex in a digital age. Be sure to check out our website, Securing Sexuality, for links to Cat, to links to more information about the topics we've discussed today and of course, links to next year's conference event. And join us again for more fascinating conversations about the intersection of sexuality and technology. Have a great week.