How Facebook can now look out for suicidal posts
Facebook will employ artificial intelligence to spot users who may be at risk of suicide, prompting people to talk to friends or contact a helpline if their posts show signs they may be considering killing themselves.
Suicide prevention services have been available on Facebook for more than 10 years, but it is now testing artificial intelligence as a way of identifying users who may be at risk.
Its algorithm, which is currently being trialled in the US, will flag posts that are likely to include suicidal thoughts, Facebook said, using pattern recognition based on posts that have previously been reported.
Reporting tools will also be integrated into Facebook Live, so people who are watching a video will be able to report it and "reach out to the person directly". The news follows the death of Naika Venant, a 14-year-old who used the social media platform to livestream her suicide in Miami in January.
"There is one death by suicide in the world every 40 seconds, and suicide is the second leading cause of death for 15 to 29-year-olds," the company said. "Facebook is in a unique position - through friendships on the site - to help connect a person in distress with people who can support them."
Through its suicide prevention tools, Facebook can prompt users to reach out to a friend who may be in need of support, and can also suggest contacting a helpline.
If a video is reported to Facebook, the company will be able to contact emergency workers if a person appears to be in imminent danger.
From Wednesday, people will see the option to send a message to someone in real time directly from the organisation's page or through suicide prevention tools.