TikTok Scrambles to Remove Graphic Suicide Video
Mainstream social media platforms struggle to take down graphic video of a man’s suicide by gunshot
- Social media suicide video
- Facebook’s initially slow reaction
- Spread of video dangerously exposes many
- YouTube monetizes some clips and images of the video
Social media suicide video
Social media platforms have long struggled to control information and violent content. Most recently, Ronnie McNutt livestreamed his suicide on Facebook, and the footage spread to other popular platforms such as TikTok and YouTube. TikTok has struggled in particular because its popular “For You” page relies on a recommendation algorithm that serves an endless stream of content it predicts each user will like.
Facebook’s initially slow reaction
It took Facebook nearly three hours to take down the original video of Ronnie McNutt. Facebook claims it has also been removing reuploads of the video. Facebook spokesperson Drew Pusateri said, “We removed the original video from Facebook last month on the day it was streamed and have used automation technology to remove copies and uploads since that time,” and added that the company’s “thoughts remain” with McNutt’s family.
Facebook typically relies on contract human reviewers to screen violent and sexual content. However, in a community standards enforcement report, the company admitted that this workforce had been partly sidelined by the pandemic: “With fewer content reviewers, we took action on fewer pieces of content on both Facebook and Instagram for suicide and self-injury, and child nudity and sexual exploitation on Instagram.”
Spread of video dangerously exposes many
The spread was especially dangerous for TikTok users because of the app’s recommendation algorithm. The lip-syncing video app surged in popularity near the end of last year, and its growth accelerated during the global coronavirus pandemic. A TikTok spokesperson told CNN Business that when the suicide video began to circulate, “Our systems have been automatically detecting and flagging these clips for violating our policies against content that displays, praises, glorifies, or promotes suicide.” TikTok has also been “banning accounts that repeatedly try to upload clips.”
TikTok has also taken steps to make the video harder to find. For example, when users search for terms like “suicide video,” the results are blocked and users are instead shown the National Suicide Prevention Lifeline number and other suicide prevention resources. Although TikTok requires users to be at least 13 years old, it, like many other social media platforms, does not verify that birth dates are accurate. For those under 13, TikTok offers a limited version of the app that includes additional safety and privacy protections.
YouTube monetizes some clips and images of the video
YouTube is another major platform accused of being too soft and too slow in its crackdown on the suicide video’s spread. McNutt’s friend and co-host Josh Steen told TechCrunch that he found many cases in which clips and images of the original video were being monetized with ads from Squarespace and The Motley Fool. Some uploaders used the footage to attract viewers.
Although YouTube had been taking down the videos once they were reported, several had already garnered nearly half a million views. Some critics therefore argue that by the time YouTube mounted a proper response, the damage was done: the banned videos had already reached a wide audience.
Steen said, “These companies still aren’t fully cooperating and still aren’t really being honest.” He added, “This is exactly why I created #ReformForRonnie because we kept seeing over and over again that their reporting systems did nothing. Unless something changes it is just going to keep happening.”