Securing Sexuality is the podcast and conference promoting sex positive, science based, and secure interpersonal relationships. We give people tips for safer sex in a digital age. We help sextech innovators and toy designers produce safer products. And we educate mental health and medical professionals on these topics so they can better advise their clients. Securing Sexuality provides sex therapists with continuing education (CEs) for AASECT, SSTAR, and SASH around cyber sexuality and social media, and more.
Links from this week's episode:

Emerging Threats: The Dark Side of Digital Voyeurism
In the modern digital age, technology has undoubtedly revolutionized the way we live, work, and interact with the world around us. With the advent of social media platforms, instant messaging apps, and advanced surveillance technology, our lives have become more interconnected than ever before. However, this increased connectivity has also brought forth a new set of ethical challenges, one of which is digital voyeurism. Digital voyeurism refers to the act of invading someone's privacy by accessing and observing their personal information and activities without their consent. This article delves into the ethical implications of digital voyeurism, highlighting the importance of protecting privacy and well-being in this digital era.
The Rise of Digital Voyeurism: The rise of digital voyeurism can be attributed to various factors, including the widespread use of social media platforms, the ease of access to personal information through online databases, and the advancements in surveillance technology. People are now more inclined to share personal details of their lives on social media, inadvertently making themselves vulnerable to digital voyeurism. Additionally, the increasing availability of hacking tools and techniques has facilitated unauthorized access to personal data, resulting in a surge in privacy breaches.

Ethical Considerations: Digital voyeurism raises several ethical concerns that need to be addressed to protect individuals' privacy and well-being. Firstly, it violates the fundamental principle of autonomy, which emphasizes an individual's right to control their personal information. By invading someone's privacy without their consent, digital voyeurs undermine this principle and infringe upon the rights of others. Secondly, digital voyeurism can have severe psychological and emotional consequences for the victims. Constant surveillance and invasion of privacy can lead to feelings of anxiety, paranoia, and loss of trust. The fear of being constantly observed can significantly impact an individual's mental health and well-being. Furthermore, digital voyeurism can lead to social and reputational harm. Personal information obtained and misused by digital voyeurs can be shared with others, leading to potential embarrassment, stigmatization, and even harassment. This can have long-lasting effects on an individual's personal and professional life.

Legal Frameworks and Measures: In response to the growing concerns surrounding digital voyeurism, various legal frameworks and measures have been established to protect individuals' privacy.
Data protection laws, such as the General Data Protection Regulation (GDPR) in the European Union, aim to regulate the collection, storage, and use of personal data. These laws emphasize the importance of obtaining informed consent from individuals before accessing and using their personal information. Additionally, governments and organizations are implementing stricter cybersecurity measures to prevent unauthorized access to personal data. Encryption technologies, two-factor authentication, and regular software updates are some of the measures being employed to enhance data security and protect individuals from digital voyeurism.

Promoting Digital Literacy and Privacy Awareness: While legal frameworks and cybersecurity measures play a crucial role in combating digital voyeurism, promoting digital literacy and privacy awareness among individuals is equally vital. Educating individuals about the potential risks of oversharing personal information on social media, and about the importance of strong passwords and privacy settings, can significantly reduce their vulnerability to digital voyeurism. Furthermore, fostering a culture of respect for privacy and consent is essential in addressing the ethical implications of digital voyeurism. By promoting open discussions on privacy rights and encouraging responsible digital behavior, we can create a society that values and protects privacy in the digital age.

Digital voyeurism poses significant ethical challenges in the digital age, necessitating the implementation of legal frameworks, cybersecurity measures, and privacy awareness programs. Protecting privacy and well-being should be a collective responsibility, and individuals, governments, and organizations must work together to address the ethical implications of digital voyeurism. By respecting privacy rights, promoting digital literacy, and implementing robust security measures, we can ensure a safer and more ethical digital landscape for all.
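To make one of those security measures concrete: time-based two-factor authentication is commonly implemented as a TOTP code (RFC 6238), where the server and the user's authenticator app each derive the same short-lived code from a shared secret and the current time, so a stolen password alone is not enough to log in. The following is a minimal sketch in Python using only the standard library; the function name and parameters are illustrative, not taken from any particular product.

```python
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32, at=None, step=30, digits=6):
    """Generate an RFC 6238 time-based one-time password.

    secret_b32: the shared secret, base32-encoded (as in most QR provisioning URIs)
    at:         Unix timestamp to compute the code for (defaults to now)
    step:       time-step size in seconds (30 is the common default)
    digits:     number of digits in the resulting code
    """
    key = base64.b32decode(secret_b32.upper())
    # The moving factor is the number of time steps since the Unix epoch.
    counter = int((time.time() if at is None else at) // step)
    msg = struct.pack(">Q", counter)  # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    # Dynamic truncation per RFC 4226: low nibble of the last byte picks an offset.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)
```

A login flow would compare the code the user types against `totp(secret)` for the current time window (and usually the adjacent windows, to tolerate clock drift between the server and the user's device).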
Hello and welcome to Securing Sexuality. The podcast where we discuss the intersection of intimacy-
-and information security. I'm Wolf Goerlich. He's a hacker. And I'm Stefani Goerlich. She is a sex therapist. And together we're going to discuss what safe sex looks like in the digital age. So, you know, I have been teaching a couple of classes this semester, right? Yep. Do I ever. And one of the things that I teach about is the paraphilias, right, the DSM paraphilias. So we have exhibitionism, we have sadism and masochism, and we have voyeurism, and we have all the others that are in there. And often I kind of gloss over exhibitionism and voyeurism, because I say, you know, these are my yellow-flag paraphilias, because there are lots of ways in which people can engage in this sort of play safely and consensually. And so, historically, when I have been teaching, I have not been putting a lot of time or energy into talking about voyeurism. But a friend of ours in Milwaukee sent me a couple of news articles this week that freaked me out and made me rethink, um, what voyeurism looks like today. And so, um, I wanted to tell you about it. Well, I mean, I'm always down for hearing about paraphilias. You make them sound so exciting. I mean, to be fair, this is less sexy, but it falls under the umbrella of voyeurism. I still had fun that one time when you were first putting together a paraphilia class. All day, I came up with different words that start with "para." That was my para-philia. Oh, you drove me crazy. Frustrating. You'd just say something like, "It's a paragraph, because I have a paragraph paraphilia." The first couple of times you laughed, but by the end of the day, I think you were ready to stab me. Look, usually you're not a dad-joke person, but you really drove that dad joke into the ground. Oh, I hit bedrock. I hit bedrock. So, yeah, I mean, voyeurism is a thing. And as a more broad concept and a more broad concern, you know, I always wonder, like, is anyone spying on us? Right?
Like, we always know, when we're out in public, there are certain things that can happen. But you know there are cameras everywhere, and everyone has phones now, and that, in my mind, opens up a different level of these things, right? How many times have we seen on social media something that someone thought was a private conversation get recorded and shared? Yeah, and that was what piqued my interest. Because when I have talked about voyeurism, historically, I have talked about, you know, the sexy kind. And what came to my attention this week was way less sexy and way more, um, concerning to me. Apparently, there are several different apps now that are being used to effectively spy on neighbourhoods and on individual people. Everyday people can upload information about their neighbours, and other people can access that. And we're not talking about Nextdoor or the neighbourhood apps that people opt to join and share on. These are apps that have been specifically developed to figure out which people in your neighbourhood share your religion or don't. These are apps that will let people create lists of, like, who was at, um, I don't know, the last police brutality protest. And it becomes this open-source, completely unregulated way of spying on each other. And I gotta say, I've been hanging out with you and the hacker people long enough that I've become really jaded about the idea of businesses spying on me, right? Like, I totally get that anything I post on Facebook, Zuckerberg is selling to anybody that will give him a quarter for it. But I had not considered the idea that there were platforms out there that would let our neighbours spy on us. And that, to me, is much more disturbing. And I think what's interesting about this is it goes beyond the everyday apps, to your point, right? Like, you know, there is that quote by William Gibson that I just love and use way too often, which is: the street will find its own use for things.
Now, whenever there's a technology out, people will figure out what they want to do with it. And you're right, like, Nextdoor was all in the news a few years ago. Um, why? Because people were using that app. You know, the street finds its own use for things, even if they're next door to each other, to spite each other, to report about things and whatnot. And, uh, I remember the Surveillance Technology Oversight Project, STOP. At the time, it put out a thing highlighting Nextdoor and saying, like, look, we gotta make sure that people aren't doing racial profiling or gender profiling, that this could turn into vigilantism. And so that has been a problem for a little bit. But it sounds like what you're describing is a purpose-built app for this. Yeah, there are a couple, and one is specifically looking at religion. It's an app that's been developed so that, um, churchgoing people can figure out who in their neighbourhood are not churchgoing people, so that they can specifically approach them for, like, um, missionary and evangelistic outreach. Which, if you're a part of those communities, probably sounds lovely. But if you are not a part of those communities... it would be very uncomfortable for me to be on a list of people that are or are not religiously affiliated. Like, bad things have happened, historically, when we've categorised people based on their faith. And yet, churches all around the country are using this app to share lists of nonbelievers with their congregants. And that's not Meta. That's not Google. That's a very sort of grassroots intrusion that caught me off guard when I first heard about it. Well, why go door to door marking every door of someone not like you when you can just drop a pin? And I mean, this is a perfect example of something that probably was really benign, or even benevolent, in intention that could go sideways very quickly. And, um, that's not the only instance out there.
There was another app that a friend of ours brought to my attention, called Citizen, that is being used in much the same way right now: to, you know, create localised sex offender registries, for example; to be able to communicate with other app users, um, groups of people; or to identify a neighbour as being a part of a specific community or demographic. And then other people can see that. And that, to me, is horrifying. Because sure, you know, you have explained to me and to others in the past that the "but what about the children?" sort of rationale is used to justify a lot of privacy incursions. But this feels even more intrusive to me, because we're not even talking about, like, the state sex offender registry. We're talking about any person downloading this app and slapping a label on any other person, which other users are then able to see and follow and track. And that is incredibly disturbing to me. It is disturbing. But before we get on to Citizen, let's go back to this other app you're talking about. What app is that? So that one is called Bless Every Home, I believe. And it is being rolled out by church groups ahead of the Easter holidays. And, um, again: well intentioned, perhaps poorly executed. What concerns me about that application is who's behind it. Who makes that application? I genuinely don't know. Does anybody? I mean, if anybody's gonna know, you would know. I ask this because, you know, like from our Covenant Eyes episode, some of the things I look at are: who's the company? How much money do they have? How many employees do they have? I can estimate, based on that, how much security they have and how much they've invested in these things. These are the kinds of conversations that I have when I'm trying to do, like, an assessment of a security, uh, programme. And what's weird about this particular app is it's just run by a nonprofit. And there's nothing wrong with nonprofits.
But there's not a lot of, like, information on the nonprofit in terms of who's behind it or what have you. Um, there are things like how much money they bring in. They bring in, like, a half million in revenue. How are they spending that money? We don't know. Their expenses are higher than the revenue. Obviously, they're a nonprofit, so I guess that makes sense, but we really don't have a sense of who those folks are and how secure they are. Um, which always, always, always concerns me. Yeah, that actually surprises me, because, you know, when I hear it's a nonprofit, my first thought as a social worker is: oh, good, that means there must be more information available. Because, you know, our nonprofits have to do, like, their 990s every year. Like, they are required to be a little bit more transparent than your average corporation. So when you first started talking, I was optimistic. And now I'm not. I was hoping that if I brought you these apps, you would reassure me, and I don't feel like that's the path we're going down. Well, probably not. The top donor is also the person who runs the nonprofit. Is that normal? I mean, it can be, but that's unusual. Uh, that would tell me, you know, as somebody that comes out of the nonprofit world, that this is probably a very small nonprofit. It's probably very new, very grassroots. It's not unusual for a new nonprofit to be kind of self-funded for the first couple of years, because many grant funders want to see a history, a track record of work and good fiscal management, before they start awarding grants or making donations. What if I was to tell you it was founded in 1997? Oh, that changes everything. Because anything founded in 1997... to my mind, a nonprofit on the up-and-up should by now have a lengthy paper trail with their 990s and GuideStar and Charity Navigator, and they should not at that point be self-funded. That actually raises all kinds of red flags for me.
What if I was to tell you it was based out of Orlando, Florida? I mean, Florida is creepy for a number of reasons, but I don't know if the Orlando piece necessarily raises any concerns, other than the fact that we know Florida is having a particularly socially conservative moment right now. OK, so this nonprofit built Bless Every Home for tracking non-religious people, or non-evangelical people, I suppose. And then I'm sure they're gathering all that data together. We don't necessarily know where that data is being stored or anything, because I don't have any insight into this application. Again, I'm not pen testing it for this episode. But, you know, the very fact that the data is out there and made available to everyone is a concern. The very fact that the data could potentially be broken into is a concern. I mean, between now and when this episode airs, I'm going to go on to my favourite resources, Charity Navigator and GuideStar, and see what I can find. But you know, when this first came to my attention, I expected you to tell me that it was one of the myriad poorly-thought-out tech-startup, venture-capital-bro ideas that we're seeing everywhere. I was not expecting you to tell me that this was almost 30 years old. Well, the nonprofit is; the application is relatively new. Well, of course. The application's only been out for about a year. But, I mean, these are the things that always concern me, right? Because, you know, we've looked at this in the past. If you take a look at the national database that tracks hate crimes, one of the things they always say is, you know, those who target others because of their religion or nationality usually have higher rates of previous criminal activity. They usually belong to hate groups. Now, I'm not saying a hate group is gonna be on Bless Every Home.
But I'm also not entirely sure how they're making sure that only the "right people" have access to this data. And "right people" is in air quotes, for those listening. Yeah, that's really the concerning thing for me. You know, absent other evidence, I would never assume negative intent from a lot of the people that develop these kinds of apps. I mean, often this happens because they think it's a great idea, and they have this very benevolent vision for how it's going to be used. What always concerns me is: well, once it's on the app store, anybody can use it. Like, what systems are in place, to your point, to make sure that this information, this data, these lists that are being created for altruistic, if perhaps misguided, purposes aren't being misused by people that are not nearly as altruistic? Which I think gets us back to Citizen. So I didn't mean to cut you off there. But before we jump from the app that's directly made to track people "not like me" — you go to Citizen, which is like Nextdoor on steroids. You can build lists and you can share things. What are the concerns about Citizen that you're seeing? Oh, I mean, for one, you can build lists and you can share things. That can be great if you're trying to track who's the best dog walker in your neighbourhood, but we are seeing it being used for other reasons already. You know, we are living in a very, um, socially active, very politically, um, motivated time, and people on both sides of the political spectrum have access to this tool to create what effectively becomes an enemies list. One of the concerns that our friend who first brought this to my attention was sharing was that in the same way that Bless Every Home is being used to create a list of, you know, people in your neighbourhood that you can visit and share the gospel with, Citizen is being used to create lists of, um, religious minorities and other minority groups that you don't want in your neighbourhood.
You know, uh, one of the screen caps that was shared with me had somebody literally posting, you know, "a Jew was spotted at this location." And as anybody who practises a minority faith will attest, nobody likes to be known as "the person at that location." That doesn't feel safe, and there's really no good reason for that to be happening. So expand that outward, right? That can become the trans person at that location. The undocumented person at that location. The woman walking alone at that location. It becomes a way to broadcast the identity, the, um, perhaps risk factors, and the location of people, in real time, without them knowing that it's happening. And that is terrifying to me in a way that the other forms of data collection and data sharing we've talked about have never landed. Yeah, and what you're talking about was in the feed, right? So people can take a video or take a photo and post it to the Citizen feed in your area to let you know, like, what's going on. And in context, the one before it was, um, "oh, there's police activity." And the one below it was, like, oh, you know, here's a video that, uh, a Citizen user took of a police arrest. And then, yeah, "Jew spotted over here." I was like, ah. And that exactly is the fear, right? Like, it was brought to my attention because I'm a part of the community that was showing up in that particular screen cap. But it can be used to target and share information about all kinds of vulnerable people. And I was not able, in my digging, to find any sort of mechanism that the developers had put in place to limit the kind of information that's being shared. It's not like they're using, you know, AI moderation to make sure that if certain keywords like "Jew" or "trans" come up, they're not being posted. There's not that level of moderation. So this has become — to come back to, you know, my earlier comments about voyeurism — this, to me, feels like a brand new form of Peeping Tom, right?
Like, back in the day, the biggest concern that — I'm gonna speak from the "I" here — so, a woman might have would be her creepy neighbour, like, watching through the window as she undressed at night. But now our creepy neighbours can broadcast to everybody else that I'm undressing, and that if they want to see it, this is where I am right now. And that is horrifying. Well, not only can they broadcast it, they can also take videos and clips and upload it and everything else. This does get back to certainly a strong need for, uh, moderation. However, one of the problems with a lot of this is there isn't... in the US, mind you — I know we have listeners in Europe, and, you know, congratulations on your privacy policies, we're very excited for you, please bring them over here. Um, but in the US, you know, we talk regularly about how terrible the lack of privacy is at the corporate level. At the individual level, uh, there's even less protection. So I don't know that there's, like, a lot you can do from a legal perspective right now, other than to say, hey, no, please don't post that, please remove it. It's not like there are strong policies in place. Are there things that — and I can't believe I'm gonna say this — the big corporations that I usually think of as the ones exploiting my information can do to protect us in this case? Like, I know that not every app is automatically approved to be put on the App Store. So are there ways that perhaps Apple could say, these apps are intrusive, we're not hosting them? Like, what mechanisms do exist if, um, American law specifically does not protect people? Hm. First off, again, I think the right answer is STOP and the EFF. The right answer is, uh, using, you know, the nonprofit organisations who are advocating for safer technology and for freer technology. I'm not sure I want to use, like, censorship to combat censorship, or gatekeepers to check gatekeepers. That gets very, very tricky very quickly.
Yes, the app stores and other folks have the ability to delist and remove applications. Certain Internet providers, depending on how you are set up for hosting, have the ability to remove access to those, uh, sites. Uh, but again, whenever we have something like a blotter, right — it's a free-text form that anyone could put anything in — we're gonna have risk. That's why we have certain laws protecting app companies from what people put into those fields. So whenever you've got, like, free-form text, someone is going to be using or abusing it. I think part of it is going to be the fight fought by the EFF and STOP. And I think part of it, too, is going to be, um, what we need to do to ensure that the users on these systems, um, are educated not to target people. Now, maybe that's hard to do. But back to that database: if you look at, like, attacks on sexual orientation, attacks on gender, attacks on gender identity — when you look at those stats in the hate crimes database, what does it find? It finds it's younger people, it's unemployed people, it's unmarried people. Uh, this is oftentimes their first foray into a hate crime. Right. Um, they oftentimes do this under the influence of drugs and alcohol. Um, so, I mean, in addition to "can we look at big companies," I would say we need to also look at and work with our nonprofits who are advocating on these issues. And part of that solution has to be reaching out to these groups who are very prone to doing exactly this, right? Getting home, taking some drugs, you know, drinking some alcohol, talking to their friends about how terrible it is that "these people" are out there, then opening up an app like this, scrolling to the top, seeing "so-and-so was spotted over here," and then going after them. We need to apply pressure at all of those levels. I'm not gonna ask the question of how somebody can know if they've been added to one of these apps.
Because, I don't know — especially with Citizen, because it's all happening in real time — that you can know. I don't know, if I'm standing in a coffee shop, whether the person behind me has posted that. Um, but there are other ones, like Bless Every Home, that are, um, less, uh, fluid, right? It's not a live stream; it's a list that's generated. How can somebody who finds out that they're on a list like this, or on an app like that, um, go about trying to get their name off? Is there a mechanism for that? I know in Europe there's the right to be forgotten. I don't know if that would even impact Bless Every Home, for example. But are there steps that an individual could take if they find out? Let's say my neighbour shows up at my house with an Easter basket for me, and they say, "Oh, we had your name on Bless Every Home. We thought we'd stop by and introduce ourselves." What are my options? Take the candy and close the door very quickly. I was always told not to take candy from strangers, baby. Yeah, yeah, they told us that growing up. But I gotta tell you, I ignored a lot of the stuff they told us growing up, and it always leads to a much better story. OK, so what I'm hearing is my options are limited. Well, yes. I mean, the right to be forgotten and those sorts of things would also apply to, uh, people in California, or people with, you know, data privacy laws — which in the US, currently, is pretty much just California. So there's not a lot out there. It would be interesting if we see a citizens' group form to watch Citizen. When Nextdoor was going sideways, there were, uh, some folks who organised grassroots neighbourhood watch programmes that watched Nextdoor to make sure that people weren't using it for, you know, racial profiling and whatnot. So there may be some interesting grassroots things that you could do at that level.
Um, but again, how are you gonna force a nonprofit to remove you from their list, off an app that doesn't seem to have any sort of, uh, data governance policies on it? Hmm. So it sounds like this is one of those episodes we do where we don't necessarily have a good answer to the "well, how do I prevent or stop that from happening?" so much as we're just letting our listeners know that this is a thing that's out there that they need to have on their radar. I think so. And again, you know, if you are, uh, in an area that's using Citizen, if you are in an area that's using, uh, Nextdoor, any of these apps, and you see someone doing something inappropriate with it — you know, we as a community are equally powerful in terms of getting content taken down, reporting, and taking action as the people who are putting it up. So I do think there are some, uh, some things we can do there. You know, as we had this conversation, I'd love to reach out to the people at STOP and see if they'll come on, and we could do a follow-up and deep dive on some of their recommendations. Uh, that's right — Albert Fox Cahn, we miss you! Yeah, absolutely. Maybe we have Albert back on, or we have someone, you know, from his team. Uh, I know that they have absolutely done stuff with Nextdoor, and there's, you know, a lot of work being done on the advocacy side that we can talk about. Uh, but when it comes to Bless Every Home, I don't see, like, where you get removed from this, uh, which is not good. They do have a privacy policy, but it doesn't necessarily say, hey, how do I get off your list? I am going to do an experiment, and I'm going to reach out to them and ask. If it makes you feel any better, their privacy policy does say that your password is not accessible to anybody. Oh, so my religious affiliation, my home address, my neighbourhood, my everything else is — but as long as I have a password there, I'm golden, right?
Exactly, huh? It's almost as if their priorities are not aligned with best practises or, um, decent ethics. I wouldn't say decent ethics, because ethics — you know, as we've learned, ethical conversations get tricky quick. I can't point to an ethical guide that says this is unethical or not. I just simply say that this is inadvisable. Somebody's been eavesdropping on my continuing education trainings! This is the question I have for you, because you're the one who brought me this topic. So the question I have for you is: as a therapist, what is the impact? Right? Because as a technologist, I'm like, oh yeah, someone else has your data. As a therapist, what is the impact to someone who is being monitored by these tools? That's just it — how does it impact them? There is this, um, concept, this term, that activists and mental health professionals talk about quite a bit, called minority stress. And what it means is that when somebody is a member of a minority group within a larger system — and that can mean anything; that could mean people of colour in America, that could mean, um, I don't know, Buddhists in Saudi Arabia, that could mean, uh, a white man in mainland China — anything that makes you different from the wider community that you're in carries with it a certain degree of minority stress. Meaning people are constantly aware of the fact that they are different, and they experience small differences, small differentiators, constantly, on an ongoing basis. And that has been shown to be linked to higher stress levels overall, to negative health outcomes, to negative mental health outcomes. Um, in America, we talk about minority stress a lot in the context of LGBTQIA+ people and people of colour and people of minority religious traditions. But really, you know, who's a minority depends on where they're living and when, and this is a great example of the way that minority stress can wear somebody down.
It can take a huge psychological and emotional toll to know that somebody out there is making an app to track you and your home address and to share it with others, just because you don't belong to the same church that they do, or just because you don't support the same political cause that they do, or any number of things. And this is — you know, we're living in an age where people are pushing back on the idea of political correctness and the idea of wokeness, and minority stress can sound like kind of a buzzword in that respect. But it is very real. The stress of everyday life, living as a minority person in a majority-anything culture, is ongoing and pervasive, and it takes a toll emotionally, relationally, and physically. That is evidence-based; there's lots of peer-reviewed literature on that. And these apps are another example of how minority stress manifests — how marginalised people are further shoved into the margins and further isolated and further othered and further made to feel unwelcome and unsafe. And a lot of the people that are using these apps might not realise that or make that connection, because, again, I'm sure that the developers have these really wonderful, lovely, aspirational, benevolent goals in mind when they come up with the idea, and the majority of users are aligned with those goals when they use it. But the fact that they have good intentions does not change the impact that being on a list like this — I'm gonna say Bless Every Home — that being on a list like that has for their Sikh neighbours, their Jewish neighbours, their Muslim neighbours, their pagan neighbours. Regardless of what the intentions are, the consequence is othering and marginalisation and fear and stress. And that is never OK. And especially when we have things like Citizen, where it is being overtly used to target people for unkind and unsafe reasons, that fear is even worse.
And I think that the creators of these apps — the developers, the moderators, the people that are coding them, the people that are keeping them up and running — do have a moral and ethical obligation to think about how these technologies can be used, what their intended purpose is, and what the outcomes of those purposes are for the people that end up being targeted on these lists. I know you had said that, you know, we're not going to get into ethical or unethical — we'll say advisable or not. But we have done previous episodes with tech ethicists, and I know that tech ethics is a thing, and I think this is a prime example of how app developers, app maintainers, and app users need to be thinking about the ethics of what they're building and the message that it sends to the people that these apps focus on and target. A really good call to action, if not for all of us who are just living our day-to-day lives, then for those of us who are building and maintaining applications. Well, thank you, love, for bringing this scary story of the week. Usually I get to be the scary one, so it's nice to have it, uh, reversed a little bit. And thank you all for tuning in to Securing Sexuality, your source for the information you need to protect yourself and your relationships. Securing Sexuality is brought to you by the Bound Together Foundation, a 501(c)(3) nonprofit. From the bedroom to the cloud, we're here to help you navigate safe sex in the digital age. Be sure to check out our website, securingsexuality.com, for links to more information about the topics we've discussed here today, as well as our upcoming live events. And join us again for more fascinating conversations about the intersection of sexuality and technology. Have a great week!