Google, Instagram lack effective policies against self-harm: report

Stanford researchers ranked 39 online platforms, including social networks, search engines and gaming platforms, to assess the clarity of their public-facing policies on self-harm


Facebook, in its policy documents, addresses issues related to suicide, euthanasia, suicide notes and live-streaming of suicide attempts. Instagram and Reddit, however, sit at the other end of the spectrum, lacking a comprehensive primary policy document on self-harm. This reveals how unevenly online platforms frame their public-facing policies, according to a study by Stanford University.

The team of researchers aggregated the policies of 39 online platforms, including search engines, social networks, creator platforms, gaming platforms and dating apps. They then ranked the platforms by policy comprehensiveness across several categories and found that only a few platforms provided suicide prevention resources, and that only some succeeded in down-ranking self-harm content.

The team published its findings in a study titled ‘Self-Harm Policies and Internet Platforms’. Among search engines, it scored Google 1 out of 3 points, higher than rivals DuckDuckGo, Baidu and Bing. It also noted that the search giant’s policies were general rather than service-specific.


Among social networks, Facebook ranked highest for clarity of policies. Micro-blogging platform Twitter and Instagram followed, although their policies were not as clear as Facebook’s. Reddit and Parler, on the other hand, had lacklustre community guidelines, the study noted.

Creator platforms performed better than social networks and search engines, the study found, with TikTok securing the top position for clarity of policies. YouTube and Twitch followed, with Alphabet’s video-streaming service providing additional resources, including hotlines and websites, in 27 countries.


Among gaming platforms, Xbox and Minecraft had no policies relating to suicide, self-injury or eating disorders. PlayStation Network, Epic Games and Roblox only had policies addressing “self-harm” without specifying categories.

Messaging apps Signal, iMessage, WhatsApp and Telegram, and audio-only app Clubhouse, also had no policies against self-harm. The team further noted incomplete self-harm policies among dating apps such as Tinder and Grindr, which have more than 18 million users in India, according to Statista.

