Securing Sexuality is the podcast and conference promoting sex-positive, science-based, and secure interpersonal relationships. We give people tips for safer sex in a digital age. We help sextech innovators and toy designers produce safer products. And we educate mental health and medical professionals on these topics so they can better advise their clients. Securing Sexuality provides sex therapists with continuing education credits (CEs) for AASECT, SSTAR, and SASH on cybersexuality, social media, and more.
Links from this week's episode:
Understanding the Risks of Location-Based Dating Apps and How to Stay Safe
In the digital age, dating apps have become a primary tool for meeting new people, whether for casual hookups or serious relationships. However, the convenience of these apps comes with inherent risks, particularly concerning the privacy and safety of users. Recent research highlights a significant vulnerability in several popular dating apps, which could potentially expose users to stalking, intimate partner violence, or other malicious activities. Here we'll explore these risks, the concept of "oracle trilateration," and what both users and app developers can do to mitigate these dangers.
The Vulnerability: Oracle Trilateration
Researchers from a Belgian university recently uncovered a vulnerability in several popular dating apps, including Bumble, Badoo, Grindr, Happn, Hinge, and Hily. The vulnerability, termed "oracle trilateration," works like this: an attacker spoofs their own GPS location, records the distance the app reports to a target user, and repeats the process from three different positions. With those three distances, the attacker can pinpoint the target's location down to a few meters (a brief code sketch of the math appears at the end of this section). This poses a severe risk, particularly for individuals who may be targets of stalking, domestic violence, or other forms of harassment. While the concept of trilateration might seem like something out of a crime drama, its real-world implications are frighteningly tangible.
The Importance of Location Privacy
Location-based dating apps rely on geographical proximity to connect users: the closer someone is, the more likely you are to meet up quickly. While this feature is crucial to the apps' functionality, it can also be exploited if proper safeguards are not in place. Knowing someone's exact location is particularly dangerous in cases of domestic violence or stalking, where an abuser could use it to find a victim. In a dense city, a few meters of precision may still leave many candidate locations (dozens of floors in a single high-rise, for instance). In suburban or rural areas, however, the same precision can point directly to someone's home or workplace, putting them at significant risk.
How to Protect Yourself
For users concerned about their safety, several steps can mitigate the risks posed by these vulnerabilities:
- Interact with these apps from public places rather than from home or work.
- Use any in-app option to hide or limit your location until you are ready to meet someone.
- If you are comfortable with the technical side, offset your reported position using Android's "select mock location" developer setting, a GPS-spoofing app, or similar tools, and verify that the app actually honors the spoofed location.
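To make the attack described above concrete, here is a minimal Python sketch of distance-based trilateration, assuming a flat local plane and purely illustrative coordinates and distances. This is not code from the researchers or from any app; it only shows the geometry behind the headline.

```python
# A minimal sketch of the oracle-trilateration math described above, assuming a
# flat local plane and purely illustrative coordinates/distances.
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def to_local_xy(lat, lon, ref_lat, ref_lon):
    """Project lat/lon onto a flat local plane (meters) around a reference point."""
    x = math.radians(lon - ref_lon) * EARTH_RADIUS_M * math.cos(math.radians(ref_lat))
    y = math.radians(lat - ref_lat) * EARTH_RADIUS_M
    return x, y

def trilaterate(points, distances):
    """Given three (lat, lon) spoofed observation points and the distance (meters)
    an app reported from each, estimate the target's (lat, lon)."""
    ref_lat, ref_lon = points[0]
    (x1, y1), (x2, y2), (x3, y3) = (to_local_xy(lat, lon, ref_lat, ref_lon)
                                    for lat, lon in points)
    d1, d2, d3 = distances
    # Subtracting the three circle equations pairwise leaves a 2x2 linear system.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # nonzero as long as the three points are not collinear
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    # Convert the local solution back to latitude/longitude.
    lat = ref_lat + math.degrees(y / EARTH_RADIUS_M)
    lon = ref_lon + math.degrees(x / (EARTH_RADIUS_M * math.cos(math.radians(ref_lat))))
    return lat, lon

# Three spoofed observation points and the distances an app might report (made up).
observations = [(42.3314, -83.0458), (42.3400, -83.0300), (42.3200, -83.0200)]
reported_m = [1250.0, 980.0, 2100.0]
print(trilaterate(observations, reported_m))
```

The key point is that once an app or its API returns a precise distance, three spoofed observation points and a little algebra are all it takes to turn those distances into a street address.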
What Should Companies Do?
Dating app developers also have a responsibility to protect their users. That starts with threat modeling during development: explicitly considering scenarios like stalking and intimate partner violence. Here are some steps companies can take (a code sketch of the snapping, rounding, and anomaly-check ideas follows this list):
- Return coarse or binary proximity, such as "within 10 miles," instead of an exact distance.
- Snap every user to the center of a grid cell, as Tinder now does, so reported positions are never more precise than roughly a city block.
- Round coordinates and distances to fewer decimal places before returning them through the app or its API.
- Watch for accounts that jump between locations every few minutes; that pattern looks far more like a trilateration attempt than a good-faith user.
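As a rough illustration of the list above, here is a minimal Python sketch of grid snapping, coordinate rounding, coarse distance buckets, and an impossible-travel check. The cell size, rounding precision, bucket width, and speed threshold are assumptions chosen for the example, not values used by Tinder or any other app.

```python
# A minimal sketch of the server-side mitigations listed above. All thresholds
# are illustrative assumptions, not values from any real dating app.
import math

def round_coords(lat, lon, decimals=2):
    """Drop precision: two decimal places is roughly a ~1 km cell at mid-latitudes."""
    return round(lat, decimals), round(lon, decimals)

def snap_to_grid(lat, lon, cell_size_deg=0.01):
    """Snap a coordinate to the center of its grid cell before it is ever compared."""
    snapped_lat = (math.floor(lat / cell_size_deg) + 0.5) * cell_size_deg
    snapped_lon = (math.floor(lon / cell_size_deg) + 0.5) * cell_size_deg
    return snapped_lat, snapped_lon

def haversine_km(a, b):
    """Great-circle distance in kilometers between two (lat, lon) pairs."""
    lat1, lon1 = map(math.radians, a)
    lat2, lon2 = map(math.radians, b)
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

def coarse_distance_km(a, b, bucket_km=1.0):
    """Report distance only in coarse buckets ("about 3 km"), never raw meters."""
    return math.ceil(haversine_km(a, b) / bucket_km) * bucket_km

def suspicious_location_hopping(events, max_speed_kmh=300.0):
    """Flag accounts whose successive reported positions imply impossible travel,
    a simple tell for someone spoofing GPS to run a trilateration attack.
    `events` is a list of (timestamp_seconds, (lat, lon)) tuples."""
    for (t1, p1), (t2, p2) in zip(events, events[1:]):
        hours = (t2 - t1) / 3600.0
        if hours <= 0 or haversine_km(p1, p2) / hours > max_speed_kmh:
            return True
    return False

# Example: two users compared only through snapped positions and a bucketed distance.
alice = snap_to_grid(42.331427, -83.045754)
bob = snap_to_grid(42.348900, -83.060410)
print(round_coords(42.331427, -83.045754))
print(alice, bob, coarse_distance_km(alice, bob))
print(suspicious_location_hopping([(0, (42.33, -83.04)),
                                   (120, (42.33, -83.10)),
                                   (240, (41.90, -84.50))]))
```

The common design idea is that the server never hands back anything more precise than a cell or a bucket, so repeated queries from spoofed positions cannot be combined into a precise fix, and accounts that teleport around the map get flagged instead of answered.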
Balancing Usability and Security
While these measures are essential, there is a fine balance between usability and security. The very nature of hookup apps relies on users being able to find each other quickly and easily, so users must be aware of the trade-offs they are making and take steps to protect their privacy where necessary. Dating apps have revolutionized how we connect with others, but they also introduce new risks that users and developers must be aware of. By understanding these risks and taking proactive steps, we can keep our digital interactions safe and secure.
Key Concepts:
- Oracle trilateration: repeatedly asking an app for the distance to a user from spoofed locations, then solving for that user's position.
- Threat modeling: identifying what could go wrong, including stalking and intimate partner violence, before a feature ships.
- Grid snapping and coordinate rounding: server-side ways to coarsen reported locations so they cannot be trilaterated precisely.
- Spatial cloaking: hiding, offsetting, or pinning your reported location.
- Personal risk framework: deciding how much location and profile information you are willing to expose, app by app, before you ever match with someone.
Hello and welcome to Securing Sexuality, the podcast where we discuss the intersection of intimacy and information security. I'm Wolf Goerlich. He's a hacker and I'm Stefani Goerlich.
She is a sex therapist and together we're going to discuss what safe sex looks like in a digital age. And I think today is the I Get to Scare Stefani episode. I mean most of our episodes are the You Get to Scare Stefani episodes. Sometimes you bring along a friend to help you do it. I'm looking at you, Josh Marpet. Ah, Josh, shout out to you. So, you want to know what's new in the news? Um, besides, you know, like, impending global disasters, election shenanigans, um, polarization and climate change? Oh, and the Olympics. Oh yeah, and the Olympics. Shout out to Pommel Horse Guy. Josh Marpet and Pommel Horse Guy. And the Turkish sharpshooter. He is, he is my new favorite person. But, no, what I actually want to talk about is something that hit the news cycle around our type of things, right? So there are some researchers at this Belgian university, K.U. Leuven, and I'm probably pronouncing it wrong. To our Belgian listeners, I apologize. If it makes you feel any better, I pronounce my own name wrong. So these researchers at this Belgian university, they have analyzed 15 popular dating apps, and found a vulnerability, specifically in Bumble, and Badoo, and Grindr, and Happn, and Hinge, and Hilly. They, they have this scary vulnerability. What is that vulnerability? Because I don't like it when I hear dating website vulnerabilities. That makes my clients less safe. I mean, not to just be super, like, obvious about it, but... That sounds like a good idea. "That makes my clients less safe." For the, for the listener, I did not write that in the show notes. She is not reading that. All right. So, it is my new favorite phrase, oracle trilateration. Okay, so this makes me think of two things. Either one, triangulation, which is a therapeutic term, or two, the Trilateral Commission, which is a conspiracy theorist term. So which, which of these does oracle trilateralization fall into? I think it's, tell me more about triangulation. What does that mean? So, triangulation in therapy is when you're working with a couple, or with a family, and there's this perception of sort of, like, pitting one against the other, or where the therapist has formed an alliance with one, and they're teaming up on the other. No, no, that's nothing like it. Although, it does take three points. So we are at the tri, meaning three, so I think we're on the same page there. Do you remember when we met, and you and I were on Tinder, it's how we met, we talked about this on the show many times, it used to be like, this person is nearby, they're this far away from you, right? It would give you, like, a distance. Yeah, I mean, that was the original point of Tinder and Grindr, was to hook up, right? They weren't originally, I mean, Grindr is still hookup. Tinder has evolved, a lot of people use it more for, you know, meeting people they want to date. But originally, it was a DTF, within five minutes of me, sort of website. So yeah, I remember that, because that was how you knew who was nearby, and DTF. So, I'm going to come back to that, because it's interesting. I want to ask you a question about hookup sites, but I'll get back to that. What you do with Oracle Trilateration, is you set, let's say I'm trying to find you, and I'm a bad guy. And I know you're on the dating site, so I find you on the dating site. I set my GPS location to be in one point, and I see how far away you are. And then I set it in another point, and I see how far away you are. 
And then I set it in a third point, I'll bet you guessed that, Trilateration 3. And now I've got three different locations, right? I've got these three different positions, and each time I've done that, and looked at the app, or more realistically, queried the API, I'm going to get a specific number for the distance between you and me, and then we can do some fancy math, and I can figure out where you are down to a few meters. Okay. Oracle Trilateration. Where does the Oracle part come into it, though? Because that's the, like, is it like Oracle the company? First of all, does Oracle the company still exist? Yeah, Oracle the company still exists. We have friends who work there. You know the folks who invite us over for cheese and wine parties every year? They're over, yeah, they're over at Oracle. Oh, well, I'll have to apologize for them, for questioning the existence of their employer. But does it have to do with that, or, like, where does, like, I get the triangulation, trilateralization, now I realize that we're not talking either therapy or conspiracy theories, we're talking Criminal Minds. Yes. Oh, yes. Criminal Minds. There's always, there's always, you know, once or twice a season where they need to find the serial killer by mapping out all of their victims on a map and using that to triangulate their location. Yes. Oh, that's so cool. Yeah. Now, Criminal Minds. This is a Criminal Minds attack, but in Criminal Minds, right, when they're asking this of some source, that source is providing an answer. And I think these researchers use the phrase Oracle to, you know, be like the mythical Oracle, right? I have asked and I've received an answer. And so it's really more just like, you know, hackers like their cool terms. Oracle, seek the Oracle and ask the distance. She just sighed. I just sighed. So tell me how this makes my clients less safe. Well, here's the thing. For people who are on a dating site and looking for a partner, you might not necessarily care. Okay, the partner now knows where I am. Now, it is a little bit creepy because you may be having some messages with someone and now they know like really where you are down to a few meters. There is a certain creep factor to that. But it might not necessarily be a big deal. Like you say, if this person is DTF, which every time I say that I think of like phone tones. But if this person is in that mindset, maybe knowing where you are to a couple meters doesn't necessarily make a big difference. But think about it in terms of partner violence. Think about it in terms of domestic violence. Think about it in terms of stalkers. If you have someone who you are no longer in a relationship with who is trying to find you and you're on one of these apps and they create an account on one of these apps and they use that account to triangulate your location, that could create risk for you as the potential victim here. So again, I'm stuck on serial killers. I mean, you've met me. Forensic social work is my jam. In Criminal Minds, they would call this geographical profiling. And I'm kind of picturing that, but like instead of the good guys, the behavioral analysis unit using geographic profiling to find the serial killer, it sounds like, let's just go with the hyperbolic: the serial killer using the dating website could use geographic profiling to find potentially vulnerable people? Yes. Yeah. Within a couple of meters. Now, if the person is in a city, they may be, you know, let's say they're in a skyscraper, right? 
That could be anywhere from floor 1 to 36 or however high the skyscraper is. So yeah, remember GPS is more or less flat. So it's not going to tell you exact location up or down. But suppose you're in a more rural area or suburb, you could narrow it down to someone's house or someone's place of work. Well, that's terrifying. Yeah. Geo-profiling of victims. I mean, on one hand, I guess we should tell people that, you know, just whenever you're going to be on any sort of dating website, be in a very public place when you do it. But we used to, I remember last summer, 50 episodes ago, we were like, go to the mall, go somewhere public when you're going to be on these things. I don't think it was dating websites then, but we were talking about tracking and we talked about being in busy public places where lots of people would have lots of phones and devices. But the mall near us just closed. So unless you have a friend that lives in like an apartment tower that you can be like, hey, can I come sit on your couch and swipe? What do people do about this? How can they mitigate their exposure here? Well, do we want to talk about what people can do or do we want to talk about what companies should do? You know what? I am interested in both. So I will let you, my cybersecurity consultant expert husband, board game compatriot, you decide the order you want to answer that in. I, as the therapist, am concerned about the people. You, as the cybersecurity expert, are concerned about the companies. So I will trust you to rank the importance of how we talk about this. Oh, see, that's mean because now I got to rank your preference over my preference. Oh, this is a terrible choice. All right. I will answer your question first. What can you do? There's a couple things you can do. One is, obviously, like you're saying, swipe or interact with these apps from a public place. Also, some of the apps have the ability to not share your location. So maybe that's an option until you're ready to meet up. And there is also, by the way, if you want to get into it, other data that these researchers surfaced that could be found. But I think the location was the one that was really interesting to me. If you are on the more technical side, there are things you can do in the Android and iPhone. On the Android, you can go into developer mode, and there's a feature called select mock location. So you can use select mock location to actually set your location wherever you want. You could just always be swiping from, I don't know, the closed mall. Why is this person always in the closed mall? Maybe they're a serial killer. So you could do that. There's also Android apps like fake GPS and GPS emulator. On the iPhone, there isn't necessarily a developer mode, right? The way you do a developer mode is you jailbreak your iPhone. I am not recommending you jailbreak your iPhone, but if you did jailbreak your iPhone, which I'm not recommending, you could also tamper with the location similar to Android's select mock location in the developer mode. And of course, oh, this is kind of cool. You may like this one. This comes from the Pokemon Go community. There actually are address or GPS changing apps that don't require jailbreaking. Like one of them is Tenorshare iAnyGo. And these apps are actually not made for privacy. They're made so that Pokemon Go players could change their location and act with things without having to drive around. So again, security through the gaming community comes through. 
You could use one of these types of features to shift your location a little bit to the left, a little bit to the right when you're on these apps. Important note, though, I'm not, obviously as we say on so many of these podcasts, I'm not endorsing any applications. I'm certainly not endorsing jailbreaking your phone, which I think I've said three times now. But there are ways that people can change the location on their devices, either by apps or getting closer into the hardware. Wow. So I appreciate that. Also, it doesn't feel intuitive to me as somebody that's not super non-techie. Wait, I said that wrong. As somebody that is super non-techie. I heard you say, I'm not encouraging you to jailbreak your phones, but I don't know that many of my people would know how to do that if they wanted to. But they could. Someone could, if they so chose. Google, how do I change my location when playing Pokemon Go? And there's like a dozen pages and apps and suggestions on how to do that. And that same thing should work once it's changed, should work for the dating app. You'd want to try it. Your mileage may vary. I've not tested all of these options. That makes sense. So what should companies be doing to protect us? This is where I get excited. As you might imagine, one of the things that really irks me... we've talked about threat modeling before, right? We have, indeed. And so what's the definition of threat modeling again? Threat modeling is kind of like risk assessment in therapy. We are looking at all of the potential things that could go wrong, all of the potential trouble sources or danger points, and then we are conceptualizing how to protect ourselves from those or respond to them. Absolutely. You nailed it. A gold star for my wife. I know how to define risk assessment. Wait, no. Threat modeling. Sometimes my world does overlap a little bit. Absolutely it does. And that's what makes this podcast fun. So with that in mind, if you're building dating sites, for goodness sake, threat model stalkers, domestic violence, intimate partner violence in your apps. This should have been, I would argue, this type of attack should have been caught in the threat model. Okay. Now, historically, location-based dating apps have been vulnerable to this, but some of them have put in measures to protect against, I could say again, Oracle trilateration. And there's a couple different ways to do this. One is you could say, this person is close or not close. And instead of giving a distance between you and the other person, you could just give a binary. Yeah, if you set it within 10 miles, yeah, they're within your 10 miles. And so that would address it. You could snap people to grids. You could do grid snapping. Maybe imagine a grid across the city that was a city block, and everyone was snapped to the center of that block, and the dating app would tell you roughly what block they're in. Now, you could still argue, well, they could still maybe find my place at work or whatever, but in that case, you're not getting down to a couple square meters. You're getting a much coarser area. Interestingly enough, Tinder now uses grid snapping. So Tinder is one of the apps that I did not mention as susceptible. They used to be. They fixed this a long time ago. They used grid snapping. That's how they solved it. You can also do, and this is what I like: the more decimal points that you give in terms of distance, the more accurate the trilateration is. Okay? More decimal points marker. 
Do you remember when we were talking to RenderMan and Nicole, and they're like, yeah, well, you should just, like, instead of returning exact geolocations, return an aggregate or return a shortened form of that, right? Don't return all the data. This other way of doing it is very similar to that. Round the coordinates off by a couple decimal points, which makes it less precise, makes it less accurate. Now you can't identify someone within a couple meters. You're identifying someone within a hundred square meters. So it's all about how you return that data. I would also say, back to threat modeling, maybe if someone is jumping locations every couple minutes, that might be a problem, right? If you think about the path an adversary takes, and you look for weird things they're doing, I would encourage anyone who's doing geolocation apps to look for someone who keeps changing the location. That should be an obvious fraud signal, right? Like, if my GPS is changing on you every time I talk to you, probably I'm not a valid user behaving in good faith. That reminds me of the section of our book that I was just working on, where I was learning about Tor, the Tor browser, and how it works. And I was so excited to tell you that I had learned that one of the things that it does is that it makes it look like you're doing something else. If you are a North Korean dissident, and you're on the Freedom House website, but you're using Tor, Tor will make it look like you're working on a spreadsheet, or you're sending email. It disguises what you're doing as you're doing it. And that I thought was fascinating. Yeah, absolutely. Absolutely it is. And I think there's a lot of technical nuance in that, that I won't get caught up on, that I'm fighting not to get caught up on. But yes, this idea that you're cloaking some of the things. And some of the apps do have this idea of spatial cloaking. Where you can disguise where you're at, or you can hide where you're at, or you can snap it to another city or another location, you can pin it. So there's certainly this idea of spatial cloaking that is applicable to that. Also, from an app perspective, I wonder why all the dating apps look like dating apps. Wasn't there like, there were sexting apps that could store photos that would look like a calendar, look like something innocuous. Wasn't there? Yeah, there are several. And they're something that sex educators will warn parents about, because the kids will use them to sext with each other, or to hide problematic images that they've saved, and, you know, when the parent looks at the phone, it just looks like a calculator app. There are some others that aren't quite as hidden. One that I've recommended to my clients a lot over the years is KeepSafe. And KeepSafe, you know, their icon for the app says KeepSafe, but it doesn't really explain what that is. And one of the things that I've always liked about it is that if you have it open and your boss walks into the room, if you put your phone face down, KeepSafe changes to your browser. So, you know, there's a quick way to disguise what you're doing in the moment, but the app itself is not necessarily disguised in your phone. Yeah, yeah. I wonder why these dating sites, or dating apps, don't have something like that. I mean, they probably don't want to give off Ashley Madison vibes, right? 
Where the idea would be, oh, if you have to hide the fact that you're on, I don't know, Bumble, should you even be on Bumble at all? Oh, that's a good point. That's a good point. Which gets me back to the question I wanted to ask you earlier. What is the difference between dating apps and hookup apps? So, a hookup app is specifically to meet for casual sex. The people on a hookup app might chat a little bit, but they're not interested in meeting your parents or learning your favorite color. They are interested in your favorite position, and how quickly you are interested in connecting for casual sex. How quickly you're interested in getting into your favorite position? Yeah, that too. Dating apps, on the other hand, are focused more broadly. I mean, people still will use them to find friends with benefits or casual partners, but that's not the goal. The goal is to meet people that you might actually form a longer-term relationship with. Let's play dating app or hookup app. Can we do that? Sure. Alright, you ready? You excited? I'm here for it. Let's play the game. Alright, Hilly. Never heard of it. Hinge. Can I say hybrid? Is hybrid a pick? I mean, sure. Hinge is a hybrid. Grindr. Oh, hookup. Bumble. Dating. Badoo. Never heard of it. So, Grindr is definitely a hookup website. Hinge could go either way because hinge is used a lot for people who are poly or ethically non-monogamous. So, you'll get a kind of a mix of threesome or group sex hookups combined with people that are in open relationship dynamics, might have other partners, but are looking for relationships. So, Hinge kind of can go either way depending on the user goal. So, here is something that we always argue about in security, and I hate the, I feel it's a false dichotomy, but I think it is warranted in this conversation. Oftentimes we'll say there's a conflict between usability and security, and again, I think that's a false dichotomy, but in this situation, isn't the entire point of a hookup app to find the people to hookup with? Yes. So, whereas this is a vulnerability, and this is something that I do think people should be aware of, and I do think companies should take some time to improve, one of the things I was thinking about when I was reading this was like, if I was using this to meet people for whatever reason, I might want them to know where I am. I mean, yes. It certainly is easier to meet somebody for casual sex if you know that you're in the same general proximity to one another. There's nothing worse than spending a half hour flirting, sharing dick pics, and discussing favorite positions, only to find out that the app might say you're based in Metro Detroit, but you're currently on a business trip to, I don't know, Idaho at the moment. The geolocation can be important at times. That is such a specific example. Shout out to our people who are flirting and sharing images on their business trips to Idaho. Hooking up is a time-sensitive endeavor. It is one of the few times when I would argue that geographical information is crucial to the decision-making process. Ooh! Have you ever hooked up? Have I ever hooked up on a dating app? No. Or anything. I mean, I'm going to go with the dating app, because if you're in person, then the geography problem's already solved. Keep in mind, when I was doing these sort of things, you were just like, oh, I'm at the club? You're at the club. Let's be at the club together. Or let's exit the club together. Even better. I know where to go. 
No, I've never used any of these apps, which is why I'm always asking you these questions. I always feel very much a babe in the woods when you're telling me about how people do these things. But it is interesting, right? I do think we want to give people protection and agency and control, and I do think that people deserve to know what these apps may be opening them up to. And again, there's other information, by the way, in this research paper. For anyone who's interested, see the show notes. This is not just the locations. This is just my favorite part of the paper. However, I also think if the idea is to be like, hey, who's within 20 meters of me? You probably want to be able to figure out if someone's within 20 meters of you. There seems to need to be a balance here between this is what I'm trying to do, and this is what I need to protect myself from. Yeah. That's one of the things, you know, I was talking to a friend of ours who does threat intelligence in your world, but not like targets data on customers getting breached. She does like, are terrorists going to bomb New York kind of threat intelligence? And she was talking about how fascinating it is for her to talk to her friends who use hookup apps versus dating apps and how their sort of personal risk framework varies and how it varies from app to app where, you know, people that are on Grindr might be a little bit more open to less information and faster connection versus somebody on Hinge, for example. And I thought that was so interesting because I don't know how often we consciously think about what is my personal risk framework for dating and hooking up, and how is that reflected in the apps that I use. I think so often people assume that their risk choices happen after they've connected with somebody. And really what I'm hearing you say is that these sort of internal decisions need to start happening when you're deciding which apps to download. Like, well before you've met a person and are deciding whether or not they're safe, we need to actually be thinking about which apps we want to use and what does that mean for our safety. Yeah, I like that framing, right? As we think about our personal risk framework, we do need to think about what information we're putting out there, how that can be tracked, how that can be used, do we want to mask it until we're ready to meet the person? And it also gets to one of the things that's so incredibly tricky, which is this idea of privacy. How do I make sure that I'm enforcing privacy? Because in some cases, in some scenarios, I may want to be public, right? I may want my location public so I can hook up with people, or at the same time, while I'm making that public, I definitely don't want it public to my ex, or I definitely don't want it public to a stalker, or I definitely don't want it public to someone who may have criminal intent against people like me. And that is really difficult for cybersecurity people and privacy people to encode in the system, because we don't know who is doing what, and we don't know who wants what. So really, it gets back to personal agency. I do think app owners obviously should do a better job and correct these things. I do think we need to do threat modeling. We're building apps, but I do think one of the ongoing themes of our conversations is personal agency and personal awareness, so you can make those risk-informed decisions. Or, you know, just meet people in real life, like I keep telling everyone to do. But then what will we talk about on podcasts? 
This is a podcast about technology and sexology. Come on now, we'd have nothing to talk about. Anyways, thank you so much for tuning in to Securing Sexuality, your source for the information you need to protect yourself and your relationships, and all the technology and apps you download. Securing Sexuality is brought to you by the Bound Together Foundation, a 501c3 nonprofit. From the bedroom to the cloud, we're here to help you navigate safe sex in a digital age. Be sure to check out our website, securingsexuality.com, for links to that research paper and some popular press coverage on it, as well as information on our live events. And join us again here for more fascinating conversations about the intersection of sexuality and technology. Have a great week!