As a 15-year-old girl who likes to stay socially active online through apps like Instagram, Snapchat, BeReal, and Twitter, I can report that I have fallen victim to the endless scroll multiple times. TikTok is the main perpetrator guilty of this tactic, keeping users on the app for as long as possible by recommending an infinite stream of videos based on your activity. When you think about it, this ruse is very smart from a design perspective: it gets customers attached to your product while recruiting more people to produce content by the day, resulting in endless videos popping up on your feed. But there is one major flaw in this system; endless scrolling means endless possibilities, including triggering content. The For You page, tailored to each user by the algorithm, is also slightly skewed, pushing more popular content to you based on the demographics of your country. Hearing this, you might wonder: why is this system so bad? Well, what if I told you that TikTok pushes pro-anorexia and pro-bulimia content within minutes of a new profile being created?
The Center for Countering Digital Hate (CCDH) conducted a study in which researchers set up fake accounts posing as 13-year-old users, listing their interests as self-image and mental health. The researchers discovered that in no more than two minutes, the algorithm had recommended eating disorder content and suicide-related content. I myself can vouch for this, and I can confirm that the content is overwhelmingly concerning. Some of it includes slideshows of skinny individuals eating very small portions of food, with captions like “meal planning” or “cute meals to make.” Other videos promote suicide, with captions describing plans to swallow bottles of pills or with sounds behind the videos heavily tied to poor mental health and self-harm.
During the study, each researcher created two different accounts: one that showed more interest in suicidal and eating disorder content, and one that ignored the same content being shown to it.
Around 1,200 lawsuits are being filed against social media companies like TikTok under the claims of children’s mental health worsening because of the apps. — CBS News
“TikTok is able to recognize user vulnerability and seeks to exploit it… It’s part of what makes TikTok’s algorithms so insidious; the app is constantly testing the psychology of our children and adapting to keep them online,” says Imran Ahmed, CEO of the CCDH. Some parents have reported a rise in self-harm and even suicide following the content being shown on social media apps. Hopefully these lawsuits succeed, resulting in lower death rates among adolescents and young adults.
If you see this type of negative content on any social media platform, please report it. Remember, no change can be made if we don’t try to eliminate the recommendation of this content. If you or a loved one are having troubling thoughts or are considering suicide, call the National Suicide Prevention Lifeline at 1-800-273-TALK (8255), or call 988 to reach the Suicide and Crisis Lifeline.
By Jada Strong, Freshman, Whitney M. Young Magnet