In an interesting development, Facebook has turned to artificial intelligence to help prevent suicide among its users, and it intends to update its tools and services to that end. With this, Facebook has yet again underlined its efforts to build a safe community on and off the platform.
Citing on its blog the statistic that one person around the world dies by suicide every 40 seconds, Facebook is working on what it sees as the best form of prevention: helping those in distress be heard by people who care about them. Facebook Live, its live-streaming feature, and its Messenger service are among the places where it intends to bring the suicide prevention tools.
Facebook gives users the option of reaching out to the company directly, or of reporting a post to it if they find the post concerning. “We have teams working around the world, 24/7, who review reports that come in and prioritize the most serious reports like suicide”, Facebook said in the blog post.
Though suicide prevention tools have been available on Facebook for more than 10 years, this is the first time the company has used AI for the purpose. It is not, however, Facebook’s first use of AI: the company already uses artificial intelligence to monitor offensive material in live video streams.
How would it use AI?
Pattern recognition algorithms are what Facebook is working on. Though it has long offered help to those thought to be at risk of suicide, it has relied on other users to report such cases to Facebook. To reduce that human dependence and make the process more autonomous, algorithms are being developed and trained on previous posts that have been reported or flagged.
“This artificial intelligence approach will make the option to report a post about ‘suicide or self injury’ more prominent for potentially concerning posts like these”, the company’s blog post said.
The pattern recognition would also identify posts that are likely to include thoughts of suicide. For instance, talk of pain or sadness, or comment phrases like “are you okay” and “I am worried”, would be identified and reviewed by the network’s community operations team, which would then take immediate action.
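To make the idea concrete, the kind of signal matching described above can be sketched as a naive phrase-matching flagger. This is purely illustrative: Facebook’s actual system is a trained classifier backed by human review, and the phrase lists and function below are hypothetical.

```python
# Illustrative sketch only -- NOT Facebook's actual model, which is
# trained on posts previously reported or flagged by users.

# Hypothetical phrase lists, loosely based on the signals the article mentions:
# expressions of distress in the post, and concerned replies in the comments.
DISTRESS_PHRASES = ["so much pain", "can't go on", "want to end it"]
CONCERNED_REPLY_PHRASES = ["are you okay", "i am worried", "worried about you"]

def flag_for_review(post_text, comments):
    """Return True if the post or its comments match any concerning phrase.

    A real system would score posts with a trained model and route them
    to human reviewers; this does simple case-insensitive substring matching.
    """
    text = post_text.lower()
    if any(phrase in text for phrase in DISTRESS_PHRASES):
        return True
    return any(
        phrase in comment.lower()
        for comment in comments
        for phrase in CONCERNED_REPLY_PHRASES
    )

print(flag_for_review("I am in so much pain lately", []))            # True
print(flag_for_review("Great day at the beach", ["Are you okay?"]))  # True
print(flag_for_review("Great day at the beach", ["Nice!"]))          # False
```

Note that the second post is flagged purely on the strength of a concerned comment, which mirrors the article’s point that replies like “are you okay” are themselves treated as signals worth reviewing.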
Facebook has partnered with organizations such as Crisis Text Line, the National Eating Disorder Association and the National Suicide Prevention Lifeline to reduce suicides, which, according to a National Center for Health Statistics study, rose 24% in the US alone between 1999 and 2014.
In a nutshell, this is what FB intends to do:
- Integrate suicide prevention tools to help people in real time on Facebook Live
- Offer live chat support from crisis support organizations through Messenger
- Streamline the reporting of posts about suicide with the help of artificial intelligence