
How does Tinder develop the features that keep you safe?

Image: A person holds an iPhone with the Tinder app open.

Who decides what we need to stay safe online? And how do they know what features we’d benefit from?

At Tinder, one person playing an integral role in the dating app’s safety features is Rory Kozoll, Tinder’s senior vice president of product integrity. Kozoll leads the team that develops in-app tools and resources which aim to keep users’ interactions respectful and safe.

Tinder has launched a slew of new safety updates and features, most notably a long-press reporting function, which lets you tap and hold a chat message to start the reporting process directly. This makes it easier to flag harassment, hate speech, or any other offensive messages that violate the app’s Community Guidelines.
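As a rough illustration of that flow, the sketch below shows how a long-press on a chat message might hand the message straight to a reporting flow. It is a minimal sketch only; the function names, report categories, and data shapes are assumptions for illustration, not Tinder’s actual code or API.

```python
# Minimal, hypothetical sketch of a long-press-to-report flow.
# Tinder has not published its implementation; names and categories
# here are illustrative only.

from dataclasses import dataclass, field

REPORT_CATEGORIES = [
    "Harassment",
    "Hate speech",
    "Other Community Guidelines violation",
]

@dataclass
class ReportDraft:
    message_id: str    # the message the user long-pressed
    message_text: str  # carried into the report automatically
    categories: list = field(default_factory=lambda: list(REPORT_CATEGORIES))

def on_message_long_press(message_id: str, message_text: str) -> ReportDraft:
    """Called by the chat UI when a user taps and holds a message.

    Starting the report from the message itself means the offending text
    travels with the report, so the user doesn't have to dig it up later.
    """
    return ReportDraft(message_id=message_id, message_text=message_text)

# Example: the UI would open a report screen pre-filled with this draft.
draft = on_message_long_press("msg_123", "example offensive message")
print(draft.categories)
```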

According to a recent survey conducted by Opinium on behalf of Tinder, 72 percent of 18–25-year-olds are as concerned for their emotional safety as they are for their physical safety. The survey, which looks broadly at online interactions, also found that 40 percent of 18–25-year-olds have witnessed hate speech online, and that 30 percent of people admit to sending harmful messages online that they later came to regret.

Alongside these updates, Tinder is also expanding its existing ‘Does This Bother You?’ and ‘Are You Sure?’ features to broaden its categorisation of hate speech, harassment, and sexual exploitation.

For women and marginalised genders, being on dating apps or social media, or simply existing online, can go hand in hand with sexual harassment: receiving non-consensual, unwanted sexual messages, as well as experiencing violations such as cyberflashing.

How does Tinder know which safety features users need?

Kozoll spoke to Mashable about how Tinder’s safety tools are developed and the four main sources of information that feed into the process.

“Our members will tell us something has bothered them and that will give us the signal that we need to unpack and try to understand what the offence may be, and how we can be a part of diminishing that offence,” he says. “The second source is the things we can see very clearly in our data. And the third is we work with a lot of outside partners, both in the gender safety space and in the LGBTQIA space and other underrepresented groups to inform us.”

The fourth source is “a little bit more art than science,” Kozoll says, referring to “product intuition”. Tinder’s own employees use the app themselves, and they report back and discuss their experiences to inform what they think needs to change on the platform.

Tinder’s ‘Does This Bother You?’ feature came from a real-life experience.

In the case of Tinder’s ‘Does This Bother You?’ feature, a real-life incident led to this tool being introduced on the app. The tool uses machine learning to flag potentially offensive messages, prompting an automated message to appear for message recipients when harmful language enters a conversation. With this prompt, users have the instant option to report the bad behaviour should they wish to.
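As a rough sketch of the kind of pipeline described here, the example below scores an incoming message and, if it looks harmful, shows the recipient a prompt with a report option. The classifier stub, threshold, keyword list, and function names are all assumptions made for illustration; they are not Tinder’s published implementation.

```python
# Hypothetical sketch of a 'Does This Bother You?'-style check.
# The classifier stub, threshold, and names are illustrative only;
# Tinder has not published how its production system works.

from dataclasses import dataclass

HARM_THRESHOLD = 0.8                               # illustrative cut-off
KEYWORD_LIST = {"example_slur", "example_threat"}  # placeholder terms

@dataclass
class ScreeningResult:
    flagged: bool   # should the recipient see the prompt?
    score: float    # model's estimate that the message is harmful
    matched: list   # terms from the curated keyword list, if any

def score_message(text: str) -> float:
    """Stand-in for a trained harassment/hate-speech classifier."""
    # A real system would run a machine learning model here.
    return 0.0

def screen_incoming_message(text: str) -> ScreeningResult:
    words = set(text.lower().split())
    matched = sorted(words & KEYWORD_LIST)
    score = score_message(text)
    # Flag if either the model or the keyword list raises a concern.
    return ScreeningResult(flagged=score >= HARM_THRESHOLD or bool(matched),
                           score=score, matched=matched)

def maybe_prompt_recipient(text: str) -> None:
    """The recipient, not the sender, decides whether to report."""
    if screen_incoming_message(text).flagged:
        print("Does this bother you?  [Report]  [No, it's fine]")
```

The point of the design, as the feature is described, is that the prompt only asks; the recipient stays in control of whether anything gets reported.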


Prior to this feature being released, Kozoll and his team had been looking into categories of offensive messages. When it comes to what Kozoll describes as “more forward talk” (read: sexually explicit messages), the key factor to consider is consent.

“People may open the door to, let’s say, more forward talk. We want to make sure that we’re always toeing the line between keeping everybody safe and making sure everybody’s comfortable, and also not imposing ourselves and our own values upon our members,” he says.

Kozoll says he and his team are constantly observing real-life examples of the problems people may encounter on the app.

“I was out to dinner with my wife, walking to a restaurant in Santa Monica. This car drives by with these young guys and one of them leans out a window and catcalls. When I turned around, I could see there was a young woman by herself walking behind us. You could just see her visibly become uncomfortable with the guys catcalling,” he explains. “They kept driving and out of instinct I just turned around and said, ‘Hey, are you OK? You want to walk with us?’ Turned out she was walking to the same restaurant.” In that moment, Kozoll’s wife told him, “You don’t know how rare it is for somebody to actually just ask ‘are you OK?’”

“That was the seed — just because we don’t know for sure that these messages are problematic for this person, it never hurts to just ask them if they’re OK. And that’s where ‘Does This Bother You?’ came from,” he adds.

What actual role does Tinder want to play here?

When it comes to the challenges that Tinder’s team faces when considering safety needs, Kozoll says it’s about “figuring out where the right line is between making sure everybody’s comfortable, but also giving them the freedom to express themselves and have the kind of conversation they want to have.”

“We see ourselves as the host of a party and we’ve invited all of these guests. We hope that people will hit it off and that they’ll meet somebody exciting and new. We’re not there to tell people how to talk to each other. But we are there: if somebody looks across the room and gives us the look to say, like, ‘hey, I’m really uncomfortable here,’ we have to step in and help resolve the situation. Sometimes that means asking somebody to leave the party, and that’s the role we try to play,” he says.

So, why has Tinder widened the scope when it comes to hate speech? Kozoll says it has to do with the ways in which language evolves in society.

“Language is constantly evolving, emoji is constantly evolving, people are getting more and more creative, they’re not trying to evade anything we’re doing. But just the language is changing all the time, and so we’re having to adapt really rapidly to that,” he says.

“As we evolve our understanding, we’re going to be constantly updating these models,” Kozoll adds. “This is a forever stream of work, evolving these machine learning models and keyword lists to make them better at identifying the context that these words are showing up in, and the new words that are showing up in the lexicon as well.”
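To make the keyword-list side of that work concrete, here is a small, purely illustrative sketch of why such lists need constant upkeep: the same word can be written many ways, so text is normalised before matching. The substitution table and keyword are placeholders, and a real system would pair this with retrained machine learning models rather than rely on matching alone.

```python
# Illustrative only: one reason keyword lists need constant upkeep is that
# the same word can be spelled many ways. This sketch folds a few common
# look-alike substitutions into plain text before matching.

import unicodedata

# Hypothetical mapping of look-alike characters to their plain forms.
SUBSTITUTIONS = str.maketrans({
    "@": "a",
    "$": "s",
    "0": "o",
    "1": "i",
    "3": "e",
})

def normalise(text: str) -> str:
    """Fold accents and look-alike characters into plain lowercase text."""
    decomposed = unicodedata.normalize("NFKD", text)
    ascii_only = decomposed.encode("ascii", "ignore").decode("ascii")
    return ascii_only.lower().translate(SUBSTITUTIONS)

# Placeholder keyword list; real lists are curated and updated continually.
KEYWORD_LIST = {"hate"}

def contains_keyword(text: str) -> bool:
    words = normalise(text).split()
    return any(word.strip(".,!?") in KEYWORD_LIST for word in words)

if __name__ == "__main__":
    print(contains_keyword("I h@te this"))  # True: substitution is folded away
    print(contains_keyword("I love this"))  # False
```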

Read more about staying safe in the online dating world:

https://mashable.com/article/tinder-safety-features

