Facebook has responded to reports of murder and suicide content appearing on its platform. In a statement, the company said such content has no place on Facebook.

“This is an appalling incident and our hearts go out to the family of the victim. There is absolutely no place for content of this kind on Facebook and it has now been removed,” the statement said.

Facebook recently updated the tools and resources it offers to people who may be thinking of suicide, as well as the support it offers their concerned friends and family members. The update includes suicide prevention tools integrated into Facebook Live, so people can get help in real time.

If someone posts something on Facebook that makes you concerned about their well-being, you can reach out to them directly or report the post to Facebook. The company has teams working around the world, 24/7, who review incoming reports and prioritize the most serious ones, such as those involving suicide.

Facebook provides people who have expressed suicidal thoughts with a number of support options. For example, it prompts them to reach out to a friend and even offers pre-populated text to make it easier to start a conversation. Facebook also suggests contacting a helpline and offers other tips and resources people can use to help themselves in that moment.

Suicide prevention tools have been available on Facebook for more than 10 years and were developed in collaboration with mental health organizations such as Save.org, the National Suicide Prevention Lifeline, Forefront and Crisis Text Line, and with input from people who have personal experience thinking about or attempting suicide. In 2016, Facebook expanded the availability of the latest tools globally – with the help of over 70 partners around the world – and improved how they work based on new technology and feedback from the community.