
Facebook 'tightening' policy on self-harm and suicide in an effort to protect users' mental health



  • Facebook says it will start removing content that depicts self-harm
  • That will include ‘graphic cutting images’, which it says could trigger users
  • Instagram will also remove such content from its Explore tab
  • The company is seeking a mental health expert to join its safety team 

Facebook said it will tighten its grip on content relating to suicide and self-harm in an effort to make the platform and its sister site, Instagram, safer.

In a blog post, Facebook announced several policy changes that will affect how content relating to self-harm and suicide is treated once posted to its platform. 

The company says it will ‘no longer allow graphic cutting images to avoid unintentionally promoting or triggering self-harm.’ 

That policy will apply ‘even when someone is seeking support or expressing themselves to aid their recovery,’ the company said.

Facebook said it will start to remove content depicting self-harm in an effort to avoid triggering users who may be dealing with similar issues


The new policy will also encompass images of healed self-inflicted cuts, which the company says it will temper with a ‘sensitivity screen’ that users must click through to access the underlying content.   

Likewise, Instagram will start to deprioritize content that depicts self-harm, removing it from the Explore tab and sequestering it from the company’s suggestion algorithm.  

To help promote healthy dialogue on suicide and self-harm, Facebook says it will also direct users to guidelines developed by Orygen, the National Centre of Excellence in Youth Mental Health, when they search for content relating to those topics. 

The guidelines are meant to ‘provide support to those who might be responding to suicide-related content posted by others or for those who might want to share their own feelings and experiences with suicidal thoughts, feelings or behaviors,’ said Facebook.

According to Facebook, the changes come as the result of input from mental health professionals and experts in the field of suicide prevention. 

In February, Facebook vowed to help weed out graphic content depicting self-harm, citing experts from ten countries who had advised that the platform should ‘allow people to share admissions of self-harm and suicidal thoughts, but should not allow people to share content promoting it’.  

To oversee its effort, Facebook said it will also hire a health and well-being expert as a part of its safety team.

That person will be responsible for coordinating with external experts and organizations to address issues relating to mental health, including ‘suicide, self-harm, eating disorders, depression, anxiety, addiction, nutrition, healthy habits, vaccinations’ and more.

That coordination will apparently go both ways, said Facebook. For the first time, the platform said it will begin sharing data with academics on how Facebook’s users talk about suicide. 


On Instagram, pictures of self-harm will no longer be promoted in the platform’s Explore tab in an effort to curb its reach. Stock image

Researchers will have access to a tool called CrowdTangle, which lets users search for and track specific content on the platform. 

While the tool has been used primarily by publishers and media companies to identify trends on Facebook as they gather steam, Facebook says it will now grant access to two unnamed researchers for their efforts in the field of suicide prevention. 

Facebook and other large companies like YouTube and Twitter have been under an unprecedented amount of pressure from lawmakers and concerned users to crack down on toxic content emanating from their platforms.

This month, YouTube said it had banned more than 17,000 accounts for spreading ‘hateful content’ while Twitter has rolled out a number of new policy changes surrounding what it considers a violation of its user agreement.  

WHERE TO FIND HELP  

For confidential help, call the National Suicide Prevention Lifeline at 1-800-273-8255 or visit http://www.suicidepreventionlifeline.org/

For confidential support on suicide matters call the Samaritans on 08457 90 90 90 or visit a local Samaritans branch or visit http://www.samaritans.org

 

 


