Facebook and Real-Time Screening for Suicide: A "Tricky Public Health Role"
January 22, 2019 | Physician Practice News
Facebook may be running "the world's largest suicide threat screening and alert program," according to a December 31, 2018, New York Times article. In conducting what some say amounts to public health screening, the Times reports, the social media company is taking on numerous challenges. After a series of Facebook users live-streamed their suicides in 2017, the company launched a campaign to identify posts suggesting a person is at "immediate suicide risk" and to intervene.

Posts are scanned by computer algorithms, and users are encouraged to report concerning content. Human reviewers evaluate flagged material and, if they conclude that "serious self-harm" is imminent, may contact local emergency responders. In one reported case, Facebook workers provided the exact location of a suicidal man's phone. Critics quoted by the Times note that it is unknown how many times Facebook has intervened to stop a suicide, what criteria the company uses, and whether these efforts are effective and safe.