Tinder Fails To Protect Women From Abuse. But When We Brush Off Dick Pics As A Laugh, So Do We


An ABC investigation has highlighted the shocking threats of sexual assault women in Australia face when “matching” with people on Tinder.

A notable case is that of rapist Glenn Hartland. One victim who met him through the app, Paula, took her own life. Her parents are now calling on Tinder to take a stand to prevent similar future cases.

The ABC spoke to Tinder users who tried to report abuse to the company and received no response, or received an unhelpful one. Despite the immense harm dating apps can facilitate, Tinder has done little to improve user safety.

Way too slow to respond

While we don’t have much data for Australia, one US-based study found 57% of female online dating users had received a sexually explicit message or image they didn’t ask for.

It also showed women under 35 were twice as likely as their male counterparts to be called an offensive name, or physically threatened, by someone they met on a dating app or website.

Tinder’s Community Guidelines state:

your offline behaviour can lead to termination of your Tinder account.




As several reports over the years have indicated, the reality seems to be that perpetrators of abuse face little challenge from Tinder (with few exceptions).

Earlier this year, the platform unveiled a suite of new safety features in a bid to protect users online and offline. These include photo verification and a “panic button” which alerts law enforcement when a user is in need of emergency assistance.

However, most of these features are still only available in the US — while Tinder operates in more than 190 countries. This isn’t good enough.

Also, it seems while Tinder happily takes responsibility for successful relationships formed through the service, it distances itself from users’ bad behaviour.

No simple fix

Currently in Australia, there are no substantial policy efforts to curb the prevalence of technology-facilitated abuse against women. The government recently closed consultations for a new Online Safety Act, but only future updates will reveal how beneficial this will be.

Historically, platforms like Tinder have avoided legal responsibility for the harms their systems facilitate. Criminal and civil laws generally focus on individual perpetrators. Platforms usually aren’t required to actively prevent offline harm.

Nonetheless, some lawyers are bringing cases to extend legal liability to dating apps and other platforms.

The UK is looking at introducing a more general duty of care that might require platforms to do more to prevent harm. But such laws are controversial and still under development.

The UN Special Rapporteur on violence against women has also drawn attention to harms facilitated through digital tech, urging platforms to take a stronger stance in addressing harms they’re involved with. While such rules aren’t legally binding, they do point to mounting pressures.

However, it’s not always clear what we should expect platforms to do when they receive complaints.

Should a dating app immediately cancel someone’s account if they receive a complaint? Should they display a “warning” about that person to other users? Or should they act silently, down-ranking and refusing to match potentially violent users with other dates?

It’s hard to say whether such measures would be effective, or if they would comply with Australian defamation law, anti-discrimination law, or international human rights standards.

Ineffective design impacts people’s lives

Tinder’s app design directly influences how easily users can abuse and harass others. There are changes it (and many other platforms) should have made long ago to make their services safer, and make it clear abuse isn’t tolerated.

Some design challenges relate to user privacy. While Tinder itself doesn’t reveal users’ precise locations, many location-aware apps such as Happn, Snapchat and Instagram have settings that make it easy for users to stalk other users.

Some Tinder features are poorly thought out, too. For example, the ability to completely block someone is good for privacy and safety, but also deletes the entire conversation history — removing any trace (and proof) of abusive behaviour.

We’ve also seen cases where the very systems designed to reduce harm are used against the people they’re meant to protect. Abusive actors on Tinder and similar platforms can exploit “flagging” and “reporting” features to silence minorities.

In the past, content moderation policies have been applied in ways that discriminate against women and LGBTQI+ communities. One example is users flagging LGBTQI+ content as “adult” so it is removed, while similar heterosexual content is left alone.

Tackling the normalisation of abuse

Women frequently report unwanted sexual advances, unsolicited “dick pics”, threats and other types of abuse across all major digital platforms.

One of the most worrying aspects of toxic and abusive online interactions is that many women — even though they may feel uncomfortable, uneasy, or unsafe — ultimately dismiss them. For the most part, poor behaviour has become a cliché, posted on popular social media pages as entertainment.


It could be that such dismissals happen because the threat doesn’t seem imminently “serious”, or because the woman doesn’t want to be seen as “overreacting”. Either way, this trivialises and downplays the abuse.

Messages such as unwanted penis photos are not a laughing matter. Accepting ordinary acts of abuse and harassment reinforces a culture that supports violence against women more broadly.

Thus, Tinder isn’t alone in failing to protect women — our attitudes matter a lot as well.

All the major digital platforms have their work cut out to address the online harassment of women that has now become commonplace. Where they fail, we should all work to keep the pressure on them.

About the Authors

Rosalie Gillett, Research Associate in Digital Platform Regulation, Queensland University of Technology and Nicolas Suzor, Professor, Queensland University of Technology

This article is republished from The Conversation under a Creative Commons license. Read the original article.
