A video of a man taking his own life is spreading across TikTok, despite attempts to take it down.
The video apparently spread on Facebook and Instagram before being shared on TikTok.
Users report that the distressing footage is being embedded in other, seemingly unrelated videos, so that it can appear without warning in users’ feeds.
Media reports say the video was streamed live on Facebook on 31 August by a Mississippi man.
It also spread on 4chan, an anonymous message board notorious for its extreme content.
The video has now made its way onto TikTok, where other users have begun posting warnings. These tell viewers that if they see the opening frame of the video - a white man with a beard sitting at a desk - they should immediately swipe away or close the app.
TikTok said it was using automated systems to detect and block the clips.
“Our systems have been automatically detecting and flagging these clips for violating our policies against content that displays, praises, glorifies, or promotes suicide,” TikTok said in a statement.
“We are banning accounts that repeatedly try to upload clips, and we appreciate our community members who’ve reported content and warned others against watching, engaging, or sharing such videos on any platform out of respect for the person and their family.”
This is not the first time a video of a suicide has spread on social media, nor the first time technology giants have struggled to remove graphic content.
In March 2019, when videos of the Christchurch terrorist attack were being spread on social media, Facebook said that it could not remove them quickly from its platform because of their visual similarity to video games.
“If thousands of videos from livestreamed video games are flagged by our systems, our reviewers could miss the important real-world videos, where we could alert first responders to get help on the ground,” Guy Rosen, Facebook’s vice-president of integrity, said at the time.
Between April and June of that year, it had to take down 834,000 pieces of content from its site.
When life is difficult, Samaritans are here – day or night, 365 days a year. You can call them for free on 116 123, email them at jo@samaritans.org, or visit www.samaritans.org to find your nearest branch.