Deepak Kumar Rohilla, 32, of Sonipat district in Haryana, hanged himself while streaming his suicide on Facebook (FB) Live last week. His sister Neelam Rani and his friends watched him kill himself on FB, pleading with him helplessly. “Don’t do this brother, we all are there for you,” Neelam commented on the Live post.
She couldn’t stop him. “I called up the neighbours, who rushed to our house, but it was too late,” said Neelam.
Deepak, who left behind a young wife and five-year-old son, scribbled a suicide note on a wall in his room. He said he was “facing threat to his life from a married woman in his neighbourhood and her alleged lover after he told her husband about their illicit relationship.”
“Deepak wrote on the wall that the woman, an assistant sub-inspector, had an affair with an inspector and after he came to know about it, he narrated everything to her husband,” Sonipat superintendent of police (SP) Ashwin Shenvi said.
His recent Facebook posts had hinted at his growing depression and suicidal thoughts.
Deepak even uploaded a black-and-white image of a silhouetted figure hanging from a rope on April 15, which seems to indicate how he was planning his death. When some of his friends asked why he put it up, he cryptically asked them to “wait and watch”.
This is not the first time someone has used a social media platform to “share” his/her anguish and live stream a suicide. FB Live streaming of suicides and murders is becoming alarmingly frequent. On April 3, 24-year-old Arjun Bhardwaj killed himself by jumping off the 19th floor of a five-star hotel in Bandra, Mumbai. Bhardwaj, too, streamed his death on FB Live, presenting it like a tutorial on how to commit suicide.
While the original FB Live video was deleted by Facebook (it reportedly got 36,336 views in two weeks), it is still available on YouTube, where it has notched up 4,933 views. The writer of this article reached out to Bhardwaj’s family in Mumbai, but they declined to comment.
In the US in January this year, Jay Bowdy, an aspiring actor, shot himself dead in his car on a Los Angeles street after threatening to kill himself on Facebook Live.
Last October, 22-year-old Erdogan Ceren from Turkey sat in front of his computer camera holding a shotgun while on FB Live. He was unsuccessful in his first attempt to take his life, but made a second attempt and was later found dead by his family.
What is Facebook doing?
The disturbing and increasing number of such episodes of people streaming suicides on FB Live has led netizens to object to social media platforms making the live streaming option available to all their users without any filtering mechanism in place.
Facebook recognises the problem, and is trying to limit the broadcast of violent and disturbing videos besides working to help prevent suicides. In a blog post on its site, the company said it “is in a unique position — through friendships on the site — to help connect a person in distress with people who can support them”. It added that it was updating the tools and resources it offers to people who may be contemplating suicide, as well as the support it offers to their friends and family. According to the blog, it is taking a three-pronged approach:
- Integrated suicide prevention tools to help people in real time on Facebook Live
- Live chat support from crisis support organisations through Messenger
- Streamlined reporting for suicide, assisted by artificial intelligence
During the annual F8 developers conference held in San Jose, California, in the US on April 18-19, Facebook CEO Mark Zuckerberg spoke about the murder video posted on Facebook showing Steve Stephens gunning down 74-year-old Robert Godwin Sr. He said the company would do everything to prevent such tragedies: “We have a lot of work, and we will keep doing all we can to prevent tragedies like this from happening.”
True to Zuckerberg’s word, Facebook is using artificial intelligence (AI) and developing algorithms to scan users’ posts, comments and other online behaviour for early warning signs of suicidal ideation or intent, so it can take preemptive action by reporting them to the community, law enforcement agencies or rescue teams. According to a BBC report, Facebook is testing AI in the US to identify users who may be at risk of killing themselves.
AI offers the possibility of spotting people with suicidal tendencies more accurately than most other methods, giving friends, family and authorities the opportunity to intervene before thoughts turn into action. According to a Wired article, one study used machine learning to predict with 80-90% accuracy whether or not someone would attempt suicide, as far as two years into the future.
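Neither Facebook nor the researchers behind that study have published their models, but at its core such screening is a text-classification problem: train a model on posts that clinicians have labelled as concerning or not, then score new posts and route high-scoring ones to human reviewers. The sketch below is purely illustrative and is not Facebook’s actual system; it assumes the open-source scikit-learn library, and the example posts, labels and threshold are invented.

```python
# A toy illustration of machine-learning risk screening. This is NOT Facebook's
# actual system; the posts, labels and threshold below are invented examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical posts labelled by clinicians: 1 = concerning, 0 = not concerning.
posts = [
    "I can't take this anymore, there is no way out",
    "wait and watch, it will all be over soon",
    "had a great time with friends at the match today",
    "looking forward to the long weekend trip",
]
labels = [1, 1, 0, 0]

# TF-IDF features plus logistic regression: the simplest plausible text classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(posts, labels)

# Score a new post. In a real system, anything above the threshold would be
# routed to trained human reviewers, never acted on automatically.
new_post = "I feel completely alone and don't see the point of going on"
risk = model.predict_proba([new_post])[0][1]
print(f"risk score: {risk:.2f}")
if risk > 0.5:
    print("flag for human review")
```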
Supplementing AI at FB is human intervention. “We have teams working around the world, 24/7, who review reports that come in and prioritize the most serious reports like suicide. We provide people who have expressed suicidal thoughts with a number of support options,” the blog post adds.
The company says suicide prevention tools have been available on the platform for over 10 years and they are now being integrated into Facebook Live.
The tools provide support to people with suicidal intent both before and during the act, and also help their friends and family get them help. “People watching a live video have the option to reach out to the person directly and to report the video to us,” says the blog post. In addition, a person sharing a live video depicting any kind of self-harm will see a set of resources on their screen and can choose to reach out to a friend, contact a helpline or see tips.
A Facebook spokesperson said that in India the company is working closely with mental health experts such as The Live Love Laugh Foundation, a mental health organisation founded by actor Deepika Padukone, and AASRA, a crisis intervention centre for the lonely, distressed and suicidal, to develop resources for people with suicidal intent and their loved ones.
“Our in-product tools provide resources to help someone reach out on Facebook to a friend who may be struggling and also has resources specifically for a person who may be expressing suicidal thoughts,” the spokesperson said.
In June last year, Varun Malik, an IT professional, slit his wrist and posted a suicide note on his Facebook wall. Friends who saw his post rushed to save him. They also informed the police. That very day Facebook India had announced a suicide prevention feature, similar to ones in the US and UK, that would enable Indian users to flag posts by their friends who they felt were contemplating self-harm or suicide.
“Our in-product tools provide resources to help someone reach out on Facebook to a friend who may be struggling and also has resources specifically for a person who may be expressing suicidal thoughts” — Facebook spokesperson
The Facebook India spokesperson added that they were aware of at least four instances earlier this month (he didn’t give any details) in which suicide attempts being streamed live on FB were interrupted by the victims’ friends, who saw the posts and alerted the authorities.
SP Shenvi told FactorDaily that he contacted Facebook India after Deepak’s death and asked them to alert law enforcement agencies about early indications of such impending, extreme acts. FB had not responded until the night of April 24. “They usually take one to two weeks to respond to our requests. We had also asked them to flag such videos and posts to the police so that we can prevent such incidents,” he said. However, he said that Facebook may have removed the suicide video based on the police’s email request.
The video is, however, still available on YouTube. In an email response, a YouTube spokesperson said, “We have strict Community Guidelines that prohibit videos that contain violent or graphic content, harmful and dangerous content. And we review content that anyone flags to us 24 hours a day. We also work closely with law enforcement agencies as per the due process of law and act quickly to remove material that violates our policies.” YouTube’s policy on violent and graphic content is available on its website.
According to research, an immense amount of information on the topic of suicide is available on the internet and on social media. In fact, “suicide”, “suicide methods”, “how to kill yourself” and “best suicide methods” are among the most popular searches on Google and YouTube.
A final plea for empathy on social media
About 800,000 people commit suicide worldwide every year, and India accounts for 17% of them. The male-female suicide ratio in India is 2:1. On average, 300 people commit suicide in India every day. Consuming poison (33%), hanging (38%) and self-immolation (9%) were the primary methods used to commit suicide in 2012, and family problems were the most cited reason.
Noted sociologist Dr Jitender Prasad from Central University of Haryana said many victims seek to draw attention to the anguish or desperation that is leading them to have suicidal thoughts. “It happens when they lose faith in the system; they resort to social media tools like Facebook and YouTube to vent their frustration,” said Dr Prasad.
But acts of threatened and attempted suicide don’t seem to be evoking the empathy the people behind them are looking for on social media. Bhardwaj’s desperate suicide spectacle drew callous comments poking fun at him on the video posted on YouTube. Some users, including news channels, have shared the video.
Dr Mahesh Kumar, a professor of psychology at Delhi University, attributes the rising trend of suicides on social media to the enormous power it offers to connect to millions with the click of a mouse. “A social media user immerses himself in the virtual world so completely that he is connected with everybody despite sitting alone. But, he feels (he’s) alone when sitting in a group or among family members,” Kumar said. He added that China runs digital detox clinics for patients who become social media addicts, and that there is an urgent need for that sort of intervention in India.
Also read: The deadly addiction in your home that you are aware of — and comfortable with
The policy interventions in China, meanwhile, are cold comfort for a sister in Sonipat.
Neelam blames technology for her brother’s act of live-streaming his own death: “Earlier, people used to connect with each other by sitting together and talking. Things have changed completely. Now, everyone is busy with their smartphones. They share their feelings, miseries, ups and downs on Facebook and other social media platforms.”
She fears more people may be prompted to replicate the act if her brother’s suicide video is still on the internet.
Research shows that when suicides are reported or shown in detail, including how the person did it, there is a sharp rise in “copycat” suicides. Vulnerable people are triggered and influenced by the gratuitous details of suicide stories. For someone who feels suicidal, it is not uncommon to feel compelled to watch such videos and step closer to the edge.
Additional inputs by Prakriti Singhania. Lead visual: Angela Anthony Pereira.
Sat Singh is a Rohtak-based independent journalist and a member of 101Reporters.com, a pan-India network of grassroots reporters.
Updated on April 26 at 11.55pm to remove some hyperlinks.
Disclosure: FactorDaily is owned by SourceCode Media, which counts Accel Partners, Blume Ventures and Vijay Shekhar Sharma among its investors. Accel Partners is an early investor in Flipkart. Vijay Shekhar Sharma is the founder of Paytm. None of FactorDaily’s investors have any influence on its reporting about India’s technology and startup ecosystem.