The Buddy System: App-Based Support Groups with Chandler Rogers - Securing Sexuality Podcast Episode 22
Securing Sexuality is the podcast and conference promoting sex positive, science based, and secure interpersonal relationships. We give people tips for safer sex in a digital age. We help sextech innovators and toy designers produce safer products. And we educate mental health and medical professionals on these topics so they can better advise their clients. Securing Sexuality provides sex therapists with continuing education (CEUs) for AASECT, SSTAR, and SASH around cyber sexuality and social media, and more.
Links from this week’s episode:
Reducing Friction and Anxiety with Relay App; self-improvement, community-based mental health, peer-to-peer support, and anonymity
As the world continues to move towards a more digital age, it is becoming increasingly important for mental health care providers to find ways to keep up with the times. Fortunately, there is a new technology-forward platform that is revolutionizing mental health care: Relay App. This platform provides users with access to self-improvement and wellness tools that can help them better manage their mental health.
In this article, we will explore the benefits of Relay and how it is changing the way people approach mental health care in a digital age. Relay was created by a team of experts in psychology, technology, and design who wanted to make mental health care more accessible and effective for everyone. The platform offers users an array of tools designed to help them improve their overall wellbeing.
These include guided meditations, mindfulness exercises, cognitive behavioral therapy (CBT) activities, and even virtual reality experiences that can be used as part of a comprehensive treatment plan. One of the most beneficial aspects of Relay is its ability to provide personalized recommendations based on user data. The platform uses machine learning algorithms to analyze user data such as moods, behaviors, sleep patterns, physical activity levels, etc., in order to provide tailored recommendations for each individual user’s needs. This allows users to get tailored advice on how best they can manage their own mental health without having to solely rely on outside sources or professionals for guidance. Another great feature of Relay is its ability to connect users with other people who are going through similar experiences or have similar goals when it comes to improving their wellbeing.
Through its online community-based mental health care feature, users can connect with others who are also using the platform and share stories about their own journeys towards better mental health or offer support and advice when needed.
This helps create an environment where individuals feel comfortable discussing sensitive topics related to their own personal struggles without feeling judged or stigmatized by those around them – something which traditional forms of therapy may not always be able to provide, due to the lack of anonymity in face-to-face interactions between therapist and patient/client. Finally, Relay also offers an array of resources such as articles, videos, podcasts, and webinars designed specifically to help individuals learn more about managing their own mental health. These resources are often created by experts in the field, so they offer reliable information that can be used as part of any comprehensive treatment plan. Additionally, these resources are easily accessible from anywhere at any time – making them ideal for those who may not have access to traditional forms of therapy due to geographical constraints or financial limitations.
Overall, Relay has revolutionized how people approach mental health care in a digital age by providing an easy-to-use platform full of helpful tools designed specifically for self-improvement and wellness.
By offering personalized recommendations based on user data, connecting individuals with others going through similar experiences, and providing access to reliable resources from anywhere at any time, this technology-forward and community-based platform has made it easier than ever before for anyone looking to take charge of their own wellbeing.
Hello, and welcome to Securing Sexuality, the podcast where we discuss the intersection of intimacy and information security. I'm Wolf Goerlich.
He's a hacker, and I'm Stefani Goerlich.
She is a sex therapist, and together we're going to discuss what safe sex looks like in a digital age.
Today we're joined by Chandler Rogers, the founder and CEO of Relay, an app designed to facilitate secure, ongoing peer-to-peer support.
When someone is struggling in isolation, Relay matches them with three to seven other peers with the same challenge, makes it easy to reach out for help when vulnerable, and enables change through shame-free accountability and facilitated connection. We do love a shame-free world.
Hey, guys. Thanks for having me on.
Tell us the story here. How did you come up with this idea? How did you launch this service?
Yeah, so growing up, I think I was always cognizant of the type of person that I wanted to be, but I felt like in my family, like you kind of mentioned, Stefani, the shame-free aspect, I felt like there was kind of a lot of pressure to perform or be a certain kind of person.
That if I made a mistake or was trying to, as a teenager, figure out what Chandler looked like and that didn't meet up to expectations, I definitely remember feeling these instances of shame.
Really, I think some of it was self-induced where it didn't need to be. So specifically, like many people, I kind of got exposed to pornography at a young age. For me personally, that was something that I noticed how it affected my mental health and how it made me feel in my relationships.
And so I kind of personally made the decision that that was something that I wanted to kind of cut back on or reduce. And I just remember feeling so much shame, like I couldn't talk about that openly for fear of how people would judge that decision and really just feeling like ultimately, I wanted to take control of my life.
So it wasn't as much about pornography itself as it was like I wanted to take control of my decisions and feeling like when I wasn't in control of that, I was putting so much shame on myself.
I had a close family member who struggled with an eating disorder and had conversations with her about what that felt like and just feeling kind of out of control and feeling so much shame and pressure to kind of fix it. Over the years and going into college, I was open with a lot of people about kind of my story and trying to figure out how to overcome compulsive behavior.
And I noticed a lot of other people and close friends and family had very similar stories of various types of addictions, if you want to call it that, or compulsive behaviors that they're working on overcoming. But the common thread was that almost everyone was struggling alone and felt like there was a stigma that they couldn't talk or get support with other people who were in the same boat.
And so I just continue to think about how funny that was that so many people were struggling with similar problems or challenges that they were working to overcome in isolation and how that felt kind of dumb because I think for me it was key to my journey to start to open up and find peers that I could kind of bring into my support system to help support me and rally around me.
And so that's kind of the backstory and where I became really passionate about, you know, whatever someone's goal is in trying to become their best self, enabling them to find close tight knit support without shame is something that I feel like tools and technology don't enable very well today as I look to things like Reddit or Facebook.
I can maybe get the sense that I'm not alone, but how can I get a group of, you know, five, seven other people in my corner that we can work together as a team towards self improvement and becoming healthier versions of ourselves?
You know, I was talking to Stefani recently and I said, it feels like many people out there will have 10,000 Facebook friends or Twitter followers, but very few people have 10 good friends that they can open up with and share with. Oh yeah. Yeah.
And like the question that I kept thinking of is like, which do I care more about?
And maybe, you know, for an image perspective, it is about, you know, having quantity, but I think deep down we crave quality and that often is that handful of close people that we can rely on.
And my hypothesis early on with Relay was that that doesn't need to be existing friends or family because sometimes there's sticky relationships there at play that when it comes to needing support around a topic that can be sometimes stigmatized, whatever that is, you know, what if we can match you with other people in a way that is safe, but it is potentially even more effective than your current circle of friends or family that may be kind of hard to tap into with some of these topics that are less talked about.
I love that because so often what brings people to therapy with me is this idea that they're alone, that they are the only ones dealing with the issue, the only ones having a specific desire, the only people wanting a kind of relationship, and that idea that whatever it is that makes them feel isolated, they are the only people on the planet is incredibly harmful to them, to their mental health, to their emotional lives, to their relationships.
And I love the fact that you are helping people find one another, whether it's a shared struggle or, you know, something on the more positive side of my world, the first step to leaving shame behind, the first step to being your authentic, integrated, healthy self is community. Totally.
The way that we've kind of gone about designing the program, it is catered and certainly early on, our audiences have been largely people that are trying to overcome unwanted behaviors of various kinds, but it is more about this forward looking direction of becoming your healthy self that exactly what you just said enables people to have the kind of life they want to have, which is really empowering in a way.
And I think it's been really cool to find a model that is community based because I feel like that is so much more fulfilling to achieve that journey, that healing journey really with other people as opposed to doing it alone, which is often impossible.
You know, I always like the phrase, you are the average of the four people you spend the most time with and pick those people carefully. So that does raise a question I had, which is I understand that Relay will match you with peers with the same challenge.
What does that matching process look like?
How does that work in the back end?
Yeah, so it's definitely evolved a lot. And we've learned, I think, a lot about what is important in that process. And one of the things we learned from the get go is it needs to feel so much less scary than the idea of walking into an in person room.
I don't know, picture a typical AA meeting and the fear of going from maybe I know I need to change this thing to, oh gosh, should I actually drive over to the local building and walk in person and see a bunch of people who I might know?
And that just feels so anxiety inducing to me just thinking about that.
And so the initial hypothesis was how can we create a radically different feeling for someone getting matched with the group in Relay?
And I think the first step to that was helping them control the anonymity and being able at the same time to opt into real human connection and kind of playing this line between full anonymity versus it feeling authentic enough, if that makes sense.
I think there's a dual paradigm here of like on Reddit, right, which I think is usually the ultimate kind of pinnacle of anonymity, where most handles on Reddit or usernames are totally random, I guess. They have nothing to do with my name and it's very abstracted from my personality.
We found out that people actually want to feel more like them when they get matched with a group for support purposes, but it still needs to feel ultra safe. And so essentially what that looks like is we help users choose the name that they want to go by. A lot of them choose first names, but they don't have to. So that's kind of step one.
Step two is it's not actually social media structured in that you can't browse all users in your community. So we'll take them through like an intake questionnaire, similar to what you might fill out if you go and get matched with a therapist.
So we help try to identify like what type of group does this person need in terms of the specific goals that they have and what they're working on, age, gender, preferences, such as like, am I religious?
Do I care about being with other people who are religious or not?
Do I care about being in a group that is engaged daily versus maybe I just want a group that's like once a week type engagement?
And so all those factors allow our algorithm on the backend to identify the best group that will be a good fit for them. And then we essentially place them in that group once they've committed to join the program.
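The matching flow Chandler describes, intake answers feeding a backend algorithm that picks one best-fit group, could be sketched roughly like this in Python. Every field name, weight, and size limit below is an illustrative assumption on our part, not Relay's actual implementation:

```python
# Rough sketch of preference-based group matching.
# All field names, weights, and size limits are illustrative assumptions.

def match_score(user: dict, group: dict) -> int:
    """Score how well a candidate group fits a user's intake answers."""
    score = 0
    if user["goal"] == group["goal"]:
        score += 10  # working on the same challenge is the hard requirement
    if user["cadence"] == group["cadence"]:
        score += 3   # daily vs. weekly engagement preference
    if user.get("religious_pref") in (None, group["religious"]):
        score += 2   # no preference stated, or preference satisfied
    if abs(user["age"] - group["avg_age"]) <= 5:
        score += 1   # roughly similar age range
    return score

def best_group(user: dict, groups: list, max_size: int = 7) -> dict:
    """Place the user in the highest-scoring group that still has room."""
    open_groups = [g for g in groups if len(g["members"]) < max_size]
    return max(open_groups, key=lambda g: match_score(user, g))
```

In practice the real system would also need to handle the case where no open group scores well enough, likely by holding the user until enough similar people have signed up to form a fresh group.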
And so what that means is we actually have them pay upfront for a few months of the program, because we found that essentially assuring quality and making sure that the group stays safe and secure meant making sure that it's hard for trolls to access and get into. You can't browse, like I mentioned, and see anyone that's in the community.
Like we're only going to match you with one group and you can only get in if you're paying for the program.
Anyways, I can go into any of that more if you want me to, but we're helping identify who is this person?
What are their goals?
Curating one group that matches those criteria and then allowing them to join. I am way too obsessive a Reddit user myself. And I love this comment you made about like trying to balance anonymity and authenticity.
There are parts of Reddit specifically, and I've seen the same thing on Twitter in the before times when it was still good, where some people leveraged anonymity to really be horrible human beings and to let their worst selves out. But on the other hand, sometimes that anonymity allowed people to feel safe enough to be their authentic selves in ways or in words that might not have felt safe in their everyday life.
And I'm curious about that dichotomy in your mind. Talk to me a little bit about balancing anonymity and authenticity.
Yeah, I think what we've seen is there's like a progression of when someone first comes into our app and they're completing the intake survey and they haven't even joined a group yet.
So like, I've been in my group now, call it a month and I'm getting to know them and there's a foundation of trust and I'm oriented to the program. And we found that basically like as you go along that spectrum of your journey within Relay, people in general of our user base want more and more authenticity and you're usually willing to give up more and more anonymity.
I guess to be clear, we still control things in terms of like, there's no way that you can see someone's email, phone number, or other account-identifiable information. The only thing that I can still see is the name that they've indicated to go by. And of course we have community guidelines that we, you know, ask users to follow to help enforce that.
But we found that like, you know, asking someone to set their profile picture as the second step in setting up their account is not effective. Like, it makes them feel unsafe, because it's like, well, shoot, I don't even know who's going to see this and, you know, how's it going to impact how safe I feel.
But they do want to set real profile pictures more likely on average, you know, at week four. And so it's helping them feel like this is really safe and approachable by letting them have control over that anonymity from step one. And then as they get integrated to their group, letting them kind of unveil, maybe is the right word, additional layers to get more authentic and less anonymous.
But at the same time, it still is preserving that baseline anonymity in terms of like, someone couldn't really deduce who, you know, who Chandler is theoretically, unless I shared a ton of personal details in chat messages. It's actually really fascinating to me.
I mean, it makes perfect sense, right?
As, as I get to know you, I'm going to trust you more. People are like that across all domains. So if I get to know the app and people in the app, I'm going to trust it more. And as it happens from what you're saying, there's this shift towards authenticity over anonymity.
And what I find interesting about that is in security, we're often like, how often should we challenge them to log in?
Or how often should we add this part of friction or that part of friction?
I hadn't thought about how often should we ask them to self disclose. And this idea that, Hey, no, no pressure, put up your profile picture when you're ready. If that's four weeks in, that's four weeks in is, is really fascinating because I could see how that would reduce the anxiety and reduce the friction of people getting involved and engaged. Right. Yeah.
And it's a similar concept that we've tried to map to other features in the app. So there is kind of a core place for group interaction. One of the ways that we've helped maintain safety and quality in these groups is we actually don't have a DM capability. So it's all group centric communication. And so the baseline for that is just chat based communication.
However, we have a lot of people saying, Hey, like, I feel like if I'd be able to send voice recorded messages or, you know, have a video call with my group, that would allow me to connect in a deeper, more authentic way.
And so we're still working on kind of, how do we layer in some of those features?
Like today, we still haven't enabled the capability to have a live group call in the app. So this is mostly asynchronous chat based communication.
However, we're seeing this interesting dichotomy continue to play out of people wanting richer forms of communication with having the baseline be this most approachable. Yeah. I guess like most approachable version of chat based messaging that doesn't require me to disclose like my voice intonation or my face. But as I go along, I might start to prefer a blend of those things to help have more authentic connection with my group.
Have you had or experienced any negative interactions on relay?
Have you had people that were able to sort of subvert it to be less kind or a little bit more trolley?
And how do you handle those situations?
Yeah, when we first launched the app, I mean, from my personal background and story, hopefully it was clear that like the reason I'm doing Relay is because I care a whole ton personally about helping people find community in their healing journeys. And so we wanted to make it free to make it as accessible as possible to the whole world.
And we had some incidents early on because it was so low barrier to get into groups because it was free. We had a couple instances of trolls who were there with malicious intent that yeah, it was bad. This was like the first three weeks of the app and we had no idea what we were doing.
And luckily, a lot of the users who were on the app we knew personally, and they were helping kind of beta test this. We were able to resolve the issue by identifying that and removing them within minutes actually. But we learned from that, like we need to actually have some friction to make sure that people who join upfront are kind of vetted in some way. Payment is still a part of that.
And we've had actually zero instances of trolls reported to us since we've instituted a paid program model.
Of course, still balancing the affordability to reach the people that we want to get help. But we've also tried to think about with these groups, imagine someone is being harmful even if they're not trolling.
How would I feel if like on Reddit, there's the concept of moderators?
How would I feel if there's a moderator that I don't know that's not really in my group but is kind of like seeing our messages?
How does it affect my day to day interaction when there isn't something bad happening?
So we had a big debate for a long time.
I'm like, should we have moderators in these groups or should we keep it truly peer to peer and have these other kind of lock mechanisms to help ensure that we can detect if something unhelpful happens to deal with it without needing to like oversee each group?
And I think this is an important piece of the conversation because I think it really impacts how someone in the group feels in terms of how safe it is and how useful it is because of that. And so we ended up not placing like moderators in each group.
So they're not like actually reading all the messages but instead we went with an approach of making it easy for peers to kind of rate how they feel about the quality of their group. So it's not like I'm reading like, you know, Wolf is really helpful or like Stefani is really harmful and has been mean. But just overall, we're kind of collecting pulse data on how people feel about their group.
And then on an individual message, we have community safety features to hide or report as like spam or inappropriate content that then that does get triaged to our team that we can deal with. And so that's how we've been able to help mitigate that so far.
And one of the new things we're doing to kind of take it a step further is introducing like an AI developed model that can detect like the sentiment of a group.
So I think the more common thing we see rather than inappropriate behavior is like someone feeling a lot of shame after, you know, potentially having a relapse or slipping up or just feeling kind of like down in the dumps and depressed and they're bringing that negativity in a way that is maybe crossing the line between the type of authentic support that we want to create in the groups versus like, okay, this is just counterproductive and unhelpful.
And with Relay, we're intentionally not trying to be the therapist as the app. How can we detect, though, if something crosses the line, to help coach or steer a user back into healthy territory? And so that algorithm is intended for us to detect, okay, it looks like there's a user in group 72 that has been kind of on a negative spiral.
We can have our community manager maybe reach out and send an email without us even having to read the content of the message and just kind of check in on them and see, does this person need some additional support or maybe route them to more, I guess, intense resources like to partner therapists or something. I like the application of sentiment analysis to that.
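The negative-spiral detection Chandler outlines could, in its very simplest form, be a lexicon-based sentiment pass over a user's recent messages. The word list and threshold below are purely illustrative assumptions, and a production model would be far more sophisticated, but the sketch shows the shape of the idea:

```python
# Toy sketch of flagging a negative spiral in a support-group chat.
# The word list and threshold are illustrative assumptions only.

NEGATIVE_WORDS = {"relapse", "relapsed", "hopeless", "ashamed", "shame",
                  "worthless", "failed"}

def message_score(text: str) -> int:
    """Count negative-lexicon hits in one message (more hits = lower score)."""
    return -sum(1 for word in text.lower().split()
                if word.strip(".,!?") in NEGATIVE_WORDS)

def flag_negative_spiral(messages: list, window: int = 5,
                         threshold: int = -3) -> bool:
    """Flag when a user's recent messages trend sharply negative, so a
    community manager can check in without reading the content itself."""
    recent = messages[-window:]
    return sum(message_score(m) for m in recent) <= threshold
```

The point of the sketch is the privacy property Chandler mentions: a check like this can run server-side and surface only a boolean flag per user, so no human ever has to read the underlying messages.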
Part of what you said there, which is perhaps, don't expose the message, right?
Yeah. I think the first step as I approached this, I guess product development and our product strategy at Relay, I really try to think about me as a consumer and being a part of a digital, I guess, support group environment and what would help me feel like my privacy is being taken seriously.
And the first one is really ensuring that we're only collecting data that's actually needful to create an effective experience for users. So I know a lot of people, when they think about a community based product, the comparable is our social media, which we associate with collecting any and all data and tracking us across any websites and selling that, utilizing that for ad purposes. So we're actually extremely transparent on our payment page.
When we ask people to pay, we say, this is actually strictly because of privacy, or at least one of the main reasons. I mentioned another one is it ensures the quality of our communities and prevents you from having a negative experience with a group that either sucks and isn't invested or has trolls.
But B, it's about like, we don't want to have to put ads at you, which means exposing your privacy data. That's something we will never do. And I'm comfortable saying that on the record, because that's, I think, completely inappropriate for a sensitive use case and audience like this. And so us charging a recurring subscription is explicitly about keeping data private.
And another way that we think about that then is when it comes to how we help ensure quality and safety, we try to do as much as possible with the app helping ensure quality versus humans, if that makes sense.
So I think where things can kind of break down or where it's maybe harder to enforce policies is like if I did want to hire 10 people to go oversee all of our groups, I guess one way to do that would be to recruit users to say, if you're currently a user in your group, let's promote you to almost kind of be like, you've already been a group member, and now you're just kind of a trained facilitator, and you can go through this training program.
And that's one way to do it.
But another way to do it is with the app itself: if we can detect without having to expose some of the data to a human, like humans will never be able to see that, and then take steps off of that, I think we're able to still, I guess, respect privacy by minimizing the need to even deal with or handle data in a non-anonymized sense.
So a couple things jumped out at me, right?
Collect only the data that's useful. Don't sell to ad networks. Be very transparent. I think those are key points that anyone who's building a tool really should take away.
Another question I had for you is on the back end, can you talk about how that data is protected?
We hear all the time about WhatsApp, for example, being end-to-end encrypted.
Yeah, yeah, we've had people ask us, hey, like, I know WhatsApp is end to end encrypted.
Have you guys done that?
Being a one year old company, we haven't built end to end encryption yet. That is a goal that we have that we want to do. We have built our platform on top of Google Cloud Platform, which comes baked in with a lot of kind of standard security protocols.
And so on the back end, like without getting super technical, like we have rules, for example, of like, if someone's trying to make a request to the database to access data, like there's all sorts of parameters that that has to go through. And so there's a roadmap like anything of continuing to get better.
And I think one of the ways I've talked with our CTO about privacy is, hey, let's always have a roadmap of like, where we want to keep improving as opposed to like, from the get go, we just set up, you know, some bare minimum practices, and then that's just is what it is.
So it's kind of this balance of continuing to find ways and resources and talent to do those things, because the top security, I guess, practices often require a great level of engineering focus, like WhatsApp has hundreds and hundreds of engineers, we have two right now. And one of them was me for a while. So I don't have time for that anymore.
But we we've been lucky enough that we have some great advisors, both on the clinical side of things, as well as the technology side of things that from a privacy perspective, too, we try to incorporate their perspectives and advice as we think about, hey, what what should we do the next three months to keep beefing up all of these categories?
Are all of your user groups peer to peer?
Or do you have clinicians that are maybe saying, Chandler, I'm running, you know, an IRL support group, and I would love to be able to offer this to them outside of session? Do you have ways, instead of having that sort of algorithm-based connection, to have an existing group join together through Relay?
I'm glad you asked that. So that's actually about half of our entire user base, exactly what you just said. So I attended a few different kind of clinician-led group-based programs over the years. And so originally, that was actually purely our focus, and we weren't as focused on this matching component. And we've now kind of essentially developed the product for both use cases.
So the short answer is yes. We have a number of different organizations, anywhere from a clinician with a single group that they run on Thursday nights. And a lot of clinicians are not able to, for HIPAA reasons, say, hey, share your phone numbers and create a WhatsApp group. And most of them are self-organizing anyways, which I can say with confidence after talking to hundreds, almost thousands of people.
Now therapists have a way to say you don't need to exchange your contact info, there is a compliance safe platform that allows us to stay connected outside of group.
And yes, we essentially help set up a private code that bypasses that randomized matching. I guess randomized is the wrong word, but instead of putting you with a random group, we translate the group that you already have into a private group.
And the majority of groups that you're running right now, what issues are they primarily focused on?
I know on the website, I saw substance use recovery, I saw a mention of eating disorder recovery. Who are your biggest sort of user populations right now?
Being an early stage company, we've focused our marketing largely on process addictions. So eating disorders, gambling, sex addiction, and pornography have been some of our biggest groups so far. But recently, from the product perspective, we're now able to form groups for substance use, alcohol, and smoking. And in addition to all of those, we have tested some groups with kind of general mental health. So people working on depression, anxiety.
And I think we're still trying to decide like, where we mold the product and who it's best for in terms of the feature set and what we're trying to help people do in the app. So generally, like our vision, though, is to be the platform that enables tight knit peer support for really any type of self improvement use case.
I want to circle back to something you said, because with such a small team, A, I hope you get some sleep. It's been better the last couple months.
Question B is, what are your steps for cybersecurity, right? Because you don't necessarily have a security architect on staff. I'm assuming you don't have a penetration tester on staff.
Yeah, yeah. So I think that's also a similar answer to what we talked about before when we think about roadmap. One of the things that we had to decide as a company was actually like, hey, like if we're going to make this world class tool, there's all these things that are important, including cybersecurity, including privacy, including making this an evidence based tool that eventually we can have studies that prove its efficacy.
And so we made the decision instead of kind of going a more nonprofit or even just kind of a bootstrapped is the word we use. So just kind of growing revenue slowly to grow the team. We decided to actually go and take investment from various investors to allow us to have the capital to bring on expertise earlier.
And so with that, we actually did just hire an engineer who has a background in cybersecurity and privacy. And so, I mean, we still don't have hundreds of engineers to be able to boil the whole ocean in a week. But we're being strategic, and we were able to hire someone like that who has experience working for some of these bigger companies and seeing how those things work from an engineering perspective.
And so he is, I guess, kind of our in-house right now cybersecurity expert.
But as we think about the next six months where we place the investment dollars that we've received is like, how do we bring on the specialties that we don't have?
And I guess the three of us that started this company together, we did feel like we had a pretty robust, well-rounded skill set to be able to be the right type of people to go and build this. But we know that we're not, you know, can't cover everything. So we'll continue to find the right people to round us out. Absolutely.
Have you been able to have the app pen tested?
Not yet. We're working on our first pen test; Q1 is what we're targeting right now. Good deal.
Well, good luck. Before I ask my next question, for the non-technical people in the room: I know pen testing because I hang out with the hackers.
For our sex therapy and general mental health listeners, Wolfgang, would you please explain what pen testing is?
It's the good guys acting like the bad guys to keep the bad guys from affecting the good guys.
That is a great definition. That is a great definition.
Can you explain what that looks like?
Yeah, it's effectively trained hackers pulling apart applications and looking at backend websites to find any vulnerabilities that criminals, or the merely curious, may use, right?
It's not always a criminal, though it often is. And then providing recommendations back to the team in terms of configurations and changes that would enhance and improve the security of the product. Thank you for that. It is my job, as the default tech-adjacent person, to sometimes poke him for definitions.
I've learned a lot by absorption, but not everybody is as lucky as me to hang out with both the sex therapy crowd and the hacker crowd.
My, I guess, last question for you, Chandler, at least on my end, is I'm really curious to know sort of what your vision for the future is.
What do you want to build Relay into? And more specifically, what is the impact that you're hoping Relay will have? What is your sort of vision for the way you're going to change the world?
Yeah. Yeah. When I think about my vision for Relay, I think about all of behavioral health as a category, and about building the best platform, from a clinical efficacy perspective and from a user experience perspective, that anyone and everyone uses almost by default.
Because I think as the world continues shifting where there's more and more acceptance and awareness of why mental health and behavioral health are so important for all of us to proactively care for, our vision is for Relay to be the enablement platform at the center of that.
And right now, while we've kind of homed in on some initial niche use cases, we really want to see Relay integrating a lot of the best practices that providers out there have already developed and bringing them together in a very technology-forward way. We think that's kind of our superpower.
And as people have, I guess, higher and higher expectations for technology, for the smartphones we use and the apps we interact with, we think it's long overdue that someone overhauls what that looks like for mental health and behavioral health. And we want to be at the center of that, to help people have healthier and happier lives.
And I just want to say on a personal note, as you are building technology to facilitate and enable mental health, thank you for being ethical in how you're going about it.
And thank you for not aspiring to be like some of the better-known talk-therapy apps out there. Because one of the challenges, as sort of a boots-on-the-ground, I would say brick-and-mortar, but my entire practice has been telehealth since COVID, human-based mental health provider, is that so many of the startups and mental health apps and websites out there really aren't centering the best needs and best interests of their clients. They are incredibly profit-driven in ways that can be very challenging.
And it's really refreshing to hear you talk about committing to not advertising, committing to privacy, and really wanting to build an ethical mental health platform. And I don't know how often you get to hear that, so I wanted to say thank you. Thank you so much. I really appreciate it, because sometimes it feels like we're trying to do the impossible here.
But it's always a reminder to me that there are so many other awesome people out there like you guys, and I think it will take a myriad of different angles and tools to help reach the people that need it. So thank you guys too. It certainly will.
And as we wrap to a conclusion, Chandler, are there any final thoughts you would leave our listeners with?
I think I'd be remiss if I didn't conclude by really trying to instill the idea that you are not alone in whatever it is that you're going through. I really, truly believe the world is shifting toward more and more acceptance of "let's do this together, not alone." And I think that's really comforting as I think about where we're headed as a society, even when things sometimes seem doomy and gloomy.
I really know that if we take the step, the courageous step sometimes, to put ourselves out there, and hopefully a platform like Relay makes that a little less scary, you can find a deeper support system with peers. And if that's something that has been lacking, the words of encouragement I would leave are that having a support system is everything.
And it doesn't need to be scary and it doesn't need to be shameful.
Yeah, thank you so much. And thank you, audience, for tuning in to Securing Sexuality, your source for the information you need to protect yourself and your relationships. From the bedroom to the cloud, we are here to help you navigate safe sex in a digital age. Be sure to check out our website, Securing Sexuality, for links to Relay and more information about the topics we've discussed here today, including next year's in-person conference.
And join us again for more fascinating conversations about the intersection of sexuality and technology. Have a great week.
Join us in Detroit! October 19 & 20, 2023
Proudly Sponsored by The Bound Together Foundation
An IRS approved 501(c)3 nonprofit organization
Michigan Charitable Solicitation Registration# 64801