
Facebook testing AI for suicide prevention tools

[Image: Facebook has amped up suicide prevention measures across its platforms, including putting AI to use detecting suicidal behavior]

Facebook is expanding its suicide prevention tools and rolling them out to its Facebook Live and Messenger platforms. It's also testing AI for detecting posts that indicate suicidal or self-injurious behavior.

The social media giant has had some form of suicide prevention measures in place for over a decade. If a Facebook user posts something that raises concern for their well-being, their friends can reach out to the person directly or report the post to Facebook. According to the company's blog, Facebook has a 24/7 team dedicated to reviewing high-priority reports like these, and that team can reach out to the user with support options.

Similar functionality is being rolled out to Facebook Live, the company's live video broadcasting platform. People watching a video will now have the option to reach out to the broadcaster directly or report the video to Facebook. The person broadcasting, in turn, will see a set of resources and tips on their end.

[Image: Facebook Live's new suicide prevention measures in action]

Live support for individuals struggling with suicidal thoughts will also be coming to Messenger. These services are offered by Facebook in conjunction with its partner organizations, which include the Crisis Text Line, the National Eating Disorders Association and the National Suicide Prevention Lifeline.

And, in an effort to streamline reporting and get at-risk users access to self-help tools more quickly, Facebook is putting artificial intelligence to work. It is testing pattern recognition tools that automatically detect posts likely to indicate thoughts of suicide. If the tools work as intended, they could speed up the user reporting process or bypass it altogether.
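Facebook hasn't published the details of its detection system, but the general idea of pattern recognition over post text can be sketched with a toy scorer: phrases associated with risk contribute weighted scores, and posts above a threshold get routed to the human review team. Everything below (phrases, weights, threshold, function names) is invented for illustration; a production system would use a trained classifier over far richer signals, not a hand-written phrase list.

```python
import re

# Hypothetical illustration only -- not Facebook's actual model.
# Each regex is a risk phrase with an invented weight.
RISK_PATTERNS = {
    r"\bgoodbye\b": 1.0,
    r"\bcan't go on\b": 2.0,
    r"\bend it all\b": 3.0,
}
REVIEW_THRESHOLD = 2.5  # invented cutoff for escalation

def risk_score(post: str) -> float:
    """Sum the weights of all risk phrases found in the post text."""
    text = post.lower()
    return sum(w for pat, w in RISK_PATTERNS.items() if re.search(pat, text))

def flag_for_review(post: str) -> bool:
    """True if the post should be escalated to the 24/7 review team."""
    return risk_score(post) >= REVIEW_THRESHOLD
```

In this sketch, flagging replaces (or supplements) a friend's manual report as the trigger for review, which is the streamlining the article describes.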

Of course, these tools are not a substitute for direct action in times of crisis. If you encounter a direct threat of suicide or worry that someone is truly in danger, contact the authorities – not Facebook – immediately.

Source: Facebook
