
This is how social media giants are feeding dangerous content to children

WION Web Team
New Delhi, India | Written by: Bharat Sharma | Updated: Jul 20, 2021, 06:40 PM IST
File photo. Photograph: AFP

Story highlights

A new report reveals how social media platforms expose children to inappropriate content about unrealistic body types, dieting, self-harm and suicide. WION explains how companies deliver harmful content to minors under the guise of "targeted content".

Just minutes after logging into their social media accounts, children can be bombarded with dangerous content promoting self-harm, sexualisation, misleading diets and false ideas about weight loss, a disturbing study has revealed.

Through the "endless scroll" features, kids are sucked into a vortex of "targeted" content.

The study sheds light on just how extensive this dangerous practice is.

Social media giants like Facebook, Instagram and Twitter have come under the scrutiny of governments across the globe, with new regulation drives targeting their platforms.

Many administrations are growing wary of the role these companies play in disseminating content on their platforms, especially "targeted content".

Targeted content may be defined as content created with a niche audience in mind to elicit a specific response. The easiest example? The little advertisements you see on Instagram that weirdly pitch products you have been browsing on other portals. You become the target, and the purchase of the advertised product is the expected response.

With each positive response to such targeted advertisements, the algorithms that pitch personalised content to users become more refined. While adults may recognise that they are the targets of a marketing strategy, children may be inadvertently exposed to dangerous products and content.


Graphic online content

New research sheds light on the disturbing content targeting aimed specifically at children. According to researchers at Revealing Reality, who undertook the project, accounts belonging to minors are fed inappropriate material soon after joining a social media platform.

What is "inappropriate material", you wonder? In this case, it refers to images of self-harm, including razors and cuts.

The researchers set up social media accounts designed to resemble the online avatars of children, based on information from kids aged between 13 and 17. They took into account the kinds of profiles kids generally follow and the nature of the content they are expected to "like" across platforms.

Within hours of being created, these fake accounts representing kids were fed content about dieting and sexualisation. In addition, pornography and other explicit content was easily accessible through them.


Worryingly, just hours after signing up, the accounts were approached by unknown adults, suggesting that part of this targeted content works to connect kids with adults on social media. The experiment was commissioned by the 5Rights Foundation and the Children's Commissioner for England, which are urging governments to formulate rules regulating the design and business models of online platforms.

Researchers believe such content is especially dangerous for kids grappling with body image issues, as they are constantly fed unrealistic ideas of an ideal body type.

Molly Russell, a 14-year-old girl, took her own life after viewing graphic self-harm and suicide-related content online. Her father, Ian Russell, told Sky News that social media companies prioritise profit over safety, and he is urging governments to bring online content in line with what kids are shown in the physical world.

Facebook, Instagram and TikTok were named explicitly in the report. In response, both Facebook (which also owns Instagram) and TikTok issued the same stock response, claiming they have robust measures to ensure kids are safe on their platforms.


Weight-loss frenzy

The same research also revealed how Instagram floods teenagers' feeds with weight-loss content. While accounts created for girls received more content about dieting and losing weight, ghost accounts for boys were constantly fed images of models with unrealistic body types.

In one instance, an account set up for a 17-year-old girl liked a post about dieting that Instagram had surfaced in its "Explore" tab, where people discover new content from users they may not be "following". Soon, suggestions in the "Explore" section shifted radically towards weight loss, with an amplification of "distorted body shapes". Similar patterns were noted across the other accounts created for girls.

Instagram claims the study represents only a fraction of the teenage experience on its platform, saying the content shown in the research is the kind that appears only when users actively search for it.


The United Kingdom has taken the need to protect children from harmful content extremely seriously.

Six weeks from now, social media companies will be required to follow a strict set of rules for age-appropriate content, which the country's Information Commissioner's Office is calling the pursuit of a "child-safe internet".

Starting September, companies will be expected to curate child-friendly content on their platforms by default, instead of the current model, which assumes every user is an adult until they state otherwise. With this, the United Kingdom is setting a precedent for other countries where there are no safeguards for children in the online sphere.