Trending · September 8, 2020

TikTok Is Warning Users About A Video Showing A Suicide

TikTok is trying to stop the video from spreading by taking down clips and banning users who repeatedly share it.

WASHINGTON, DC – AUGUST 07: In this photo illustration, the download page for the TikTok app is displayed on an Apple iPhone on August 7, 2020 in Washington, DC. On Thursday evening, President Donald Trump signed an executive order that bans any transactions between the parent company of TikTok, ByteDance, and US citizens due to national security reasons. The president signed a separate executive order banning transactions with China-based tech company Tencent, which owns the app WeChat. Both orders are set to take effect in 45 days. (Photo Illustration by Drew Angerer/Getty Images)

TikTok is warning users to be on the lookout for videos of a man killing himself that are spreading on the social media platform.

The suicide video has been circulating on the app since at least Sunday night, TikTok spokesperson Hilary McQuaide told BuzzFeed News.

“Our systems have been automatically detecting and flagging these clips for violating our policies against content that displays, praises, glorifies, or promotes suicide,” she said.

“We are banning accounts that repeatedly try to upload clips, and we appreciate our community members who’ve reported content and warned others against watching, engaging, or sharing such videos on any platform out of respect for the person and their family.”

The video comes from a Facebook Live recording made last week by a Mississippi man who killed himself on camera.

Trolls are also inserting sections from the video into other seemingly harmless clips in an effort to trick people into watching it.

Some TikTok users have been filming videos to warn others about the footage, showing a screenshot from it (a bearded man sitting at a desk) so viewers know what to be on the lookout for.

Unlike other apps where users must subscribe to or befriend others to see their content, TikTok users frequently encounter videos from people they do not follow via their For You pages.

TikTok's efforts to remove the video were first reported Sunday by The Verge.

This is by no means the first suicide to be aired on Facebook. In 2017, BuzzFeed News found at least 45 instances of violence — suicides, shootings, murders, torture, and child abuse — that were streamed via Facebook Live since it first launched in December 2015.

Facebook now uses artificial intelligence to identify posts from users indicating thoughts of suicide or self-harm.

In the past, websites like Reddit have also come under fire for not acting quickly enough to remove videos of suicide or other violent acts.

The National Suicide Prevention Lifeline is 1-800-273-8255, and directories of international suicide helplines are available online. You can also text TALK to 741741 for free, anonymous 24/7 crisis support in the US from the Crisis Text Line.
