When a Baldwin County man committed suicide last week on Facebook Live, it drew national attention. It also highlighted the fact that the intersection of social media and suicide is murky territory where even experts hesitate to make simple judgments.
“It is still a very rare event,” said Phillip Smith, an assistant professor of psychology at the University of South Alabama, of suicides involving real-time social media communication. “Our understanding is still developing.”
Based on information from the Baldwin County Sheriff’s Office, the case unfolded on the evening of Tuesday, April 26. A woman had contacted the Sheriff’s Office with concerns about her boyfriend, 49-year-old James M. Jeffrey of Robertsdale. The couple was in the middle of a breakup and he had stopped responding, she told officers.
While deputies were on the way to his residence on Ponderosa Farm Road at about 11 p.m. Tuesday, “the Sheriff’s Office Communications Division received reports that Jeffrey committed suicide and filmed the event on Facebook Live.” He was found dead, and a video was confiscated by investigators.
By the next day, when the BCSO released its account of the case, Jeffrey’s page identified him as deceased, and the video in question had been taken down. That appeared to be in keeping with Facebook’s stated policy:
“We don’t allow the promotion of self-injury or suicide. We work with organizations around the world to provide assistance for people in distress. We prohibit content that promotes or encourages suicide or any other type of self-injury, including self-mutilation and eating disorders. We don’t consider body modification to be self-injury. We also remove any content that identifies victims or survivors of self-injury or suicide and targets them for attack, either seriously or humorously. People can, however, share information about self-injury and suicide that does not promote these things.”
Organizations that attempt to help suicidal people, such as the National Suicide Prevention Lifeline, caution that news of suicides should be handled very carefully “to reduce the risk of contagion.” There’s a potential, they say, that discussion of one case could inadvertently glamorize it in the minds of others who are at risk of suicidal thinking.
But social media can dynamite that sense of caution, by giving bystanders an unfiltered look into a situation. That’s especially true of real-time services such as Facebook Live, which allows for live broadcasting and leaves a completed video behind for the record.
Smith, whose research focuses on suicide, said that even an innocuous memorial page on social media can develop a “gravitation” for others. They may see it as proof that their own suicide would make others care about them; the thought of having their own posthumous memorial might be appealing.
And yet, Smith said, he doesn’t buy the argument that social media simply isolates some people. “It’s a part of their communication,” he said of people who express at-risk feelings or behavior online. “Even if it’s not working, it’s an attempt to reach out.”
Facebook has been making efforts to prevent situations like the Robertsdale case from happening. Its help center includes an extensive page on suicide prevention resources, both for those who’ve had suicidal thoughts and those who’ve seen warning signs in the behavior of others. It has sections addressing the issue among military personnel and veterans, LGBT people and law enforcement officers.
In February, Facebook founder Mark Zuckerberg wrote that suicide prevention measures were a component of his vision to make Facebook a force in building safer communities: “To prevent harm, we can build social infrastructure to help our community identify problems before they happen. When someone is thinking of suicide or hurting themselves, we’ve built infrastructure to give their friends and community tools that could save their life,” he wrote. “Going forward, there are even more cases where our community should be able to identify risks related to mental health, disease or crime.”
He went on to say that the company is exploring the use of artificial intelligence for a variety of purposes, including specifically to identify and respond in real time to suicide threats:
“There are billions of posts, comments and messages across our services each day, and since it’s impossible to review all of them, we review content once it is reported to us. There have been terribly tragic events — like suicides, some live streamed — that perhaps could have been prevented if someone had realized what was happening and reported them sooner. There are cases of bullying and harassment every day, that our team must be alerted to before we can help out. These stories show we must find a way to do more.
Artificial intelligence can help provide a better approach. We are researching systems that can look at photos and videos to flag content our team should review. This is still very early in development, but we have started to have it look at some content, and it already generates about one-third of all reports to the team that reviews content for our community.”
Zuckerberg said, “It will take many years to fully develop these systems.”
Following his February statement, TechCrunch reported in March that Facebook was adding suicide prevention tools to its Live and Messenger services. Primarily, the tools would make it easier for other people to reach out to those at risk, or to notify Facebook, if they saw warning signs.
The TechCrunch report also highlighted the delicacy of the issue. “Some might say we should cut off the livestream [in a suicidal situation], but what we’ve learned is cutting off the stream too early could remove the opportunity for that person to receive help,” said Facebook researcher Jennifer Guadagno. “What we heard from various people is any extra friction in someone reaching out for support can be the thing that stops them from getting support,” Facebook product manager Vanessa Callison-Burch told TechCrunch.
Smith said that myths about suicide abound, including the argument that it is the coward’s way out of a situation.
Smith said that according to his research and that of others, people who’ve been saved from suicide by interventions tend to express gratitude. Looking back, they often describe spiraling into a temporary mental state where their basis for a life-or-death decision was distorted.
In the long run, artificial intelligence might help spot people who’ve fallen into that kind of “acute state,” Smith said.
But in the meantime, he said, the most human form of intervention can make a difference: simply expressing a sympathetic interest.
Smith said that part of his work with a suicide prevention student group at USA is teaching participants to spot warning signs of distress in others, and encouraging them to be more willing to make a connection. Many times, he said, someone can make a difference simply by being willing to listen.
“One of the things we’re trying to do at South is encouraging people to be comfortable talking to people in crisis,” he said.
Suicide resources include the National Suicide Prevention Lifeline, 1-800-273-TALK (8255). Alabama’s Suicide Prevention Center provides an online listing of resources throughout the state.