Facebook Expands Self-Harm Prevention Program That Monitors Users’ ‘Thoughts Of Suicide’

Facebook is expanding its artificial intelligence-based suicide prevention efforts. The company said today that it plans to eventually monitor and respond to expressions of suicidal intent on Facebook “worldwide”, excluding the European Union.

Image: Getty

Today, Facebook’s vice president of product management, Guy Rosen, published a blog post about the company’s efforts to detect and report users who express “thoughts of suicide”. The post states that Facebook is “[u]sing pattern recognition to detect posts or live videos where someone might be expressing thoughts of suicide, and to help respond to reports faster”. But Rosen said the company is also relying on more human reviewers who investigate suicide- and self-harm-related posts. This “Community Operations team” comprises thousands of workers, including a “dedicated group of specialists who have specific training in suicide and self harm”. Reviewers use software that points them to portions of videos that show a spike in comments and reactions.
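For a rough sense of how a comment-spike pointer like the one Rosen describes might work, here is a minimal sketch. It is not Facebook’s code; the bucket size, the threshold and the function names are assumptions chosen purely for illustration.

```python
# Illustrative sketch only (not Facebook's implementation): bucket comment and
# reaction timestamps along a video's timeline and surface the segments whose
# activity is well above the average, so a reviewer can jump straight to them.
from collections import Counter

def spike_segments(event_timestamps, bucket_seconds=30, threshold=1.5):
    """Return (start, end) video-time segments whose activity exceeds
    `threshold` times the average bucket activity."""
    buckets = Counter(int(t // bucket_seconds) for t in event_timestamps)
    if not buckets:
        return []
    average = sum(buckets.values()) / len(buckets)
    return [
        (b * bucket_seconds, (b + 1) * bucket_seconds)
        for b, count in sorted(buckets.items())
        if count >= threshold * average
    ]

# Example: seconds into the video at which comments/reactions arrived.
activity = [12, 15, 18, 95, 96, 97, 98, 99, 101, 102, 103, 240]
print(spike_segments(activity))  # -> [(90, 120)]
```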

These workers also use pattern-recognition software to determine the order in which they should review flagged posts and videos, and to determine which local authorities they should alert about potential suicide attempts. The flagging software looks for certain phrases, such as “Can I help?” and “Are you OK?”
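Purely as an illustration of phrase-based flagging and review ordering, not Facebook’s actual software, a minimal sketch might look like the following; the phrase list, the scoring and the assumption that comments are what get scanned are all illustrative choices.

```python
# Illustrative sketch only (not Facebook's system): count how many of a post's
# comments contain a concerned phrase, then sort the review queue so the
# highest-signal posts are reviewed first.
CONCERN_PHRASES = ("can i help", "are you ok")

def concern_score(texts):
    """Count how many of the given texts contain a concerned phrase."""
    return sum(
        any(phrase in text.lower() for phrase in CONCERN_PHRASES)
        for text in texts
    )

def review_order(flagged_posts):
    """Sort flagged posts so those with the most signals come first."""
    return sorted(
        flagged_posts,
        key=lambda post: concern_score(post["comments"]),
        reverse=True,
    )

posts = [
    {"id": 1, "comments": ["nice photo"]},
    {"id": 2, "comments": ["Are you OK?", "Can I help?", "thinking of you"]},
]
print([p["id"] for p in review_order(posts)])  # -> [2, 1]
```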

Facebook did not respond to a Gizmodo request for more information on how the machine-learning software flags posts that might include suicidal thoughts, which could lead to people getting reported to their local authorities.

Facebook first announced in May that it was hiring 3000 people to help monitor the social networking site for comments, posts and videos about murder and suicide. Ryan Calo, a professor at the University of Washington who specialises in cyber law and privacy, expressed concern to Reuters over the notion of Facebook scanning comments and posts for harmful activity. “Once you open the door, you might wonder what other kinds of things we would be looking for,” Calo told Reuters.

Rosen wrote that, in the last month, Facebook connected with first responders on more than 100 cases stemming from suicide-related Facebook posts. That does not include the reports that Facebook has received from concerned users. Facebook also produced a video of police officers in Chautauqua County, New York, discussing a case in which Facebook reported a post by a person threatening to commit suicide. According to the sheriff and deputy in the video, local authorities tracked the person down through a “mobile phone ping” and took her to a hospital.

Rosen wrote that Facebook is expanding its use of suicide-prevention AI outside of the United States and plans to eventually apply the system across the world, excluding the European Union. Facebook did not respond to a Gizmodo request for information on what other countries will be included in this effort.

If depression is affecting you or someone you know, call Lifeline on 13 11 14.

[Facebook Newsroom]

