
Instagram logo: Does the recommendation algorithm steer some users toward disturbing pictures of children?

Photo: FILIP SINGER / EPA

Instagram's recommendation algorithm has apparently suggested videos to certain users in which minors can be seen in sexualized poses. That is the finding of an investigation by the »Wall Street Journal«, which set up test accounts on several previously unused devices to probe the algorithm's suggestions.

According to the »Wall Street Journal«, users who followed young gymnasts, cheerleaders and young influencers were automatically shown videos advertising sex videos featuring adults. The accounts of the young girls were already followed by a conspicuously large number of adult male users, and Instagram also served "disturbing" and sexualized videos of underage girls to people who followed these older male fans.

Copy of a TikTok feature

According to the report, Instagram showed certain users videos of young girls stroking their clothed torsos or imitating sexual acts. Such videos, which do not depict nudity but show children in a problematic, sexualized manner, are popular among pedocriminals and are frequently traded in relevant darknet forums. Child protection experts warn that they can lower inhibitions and normalize the consumption of abuse imagery.

The current investigation concerns Instagram's Reels feature, which displays an endless stream of short videos that Instagram compiles based on users' interests. The feature copies TikTok's recipe for success and is meant to compete with that app. Back in June, the »Wall Street Journal« had already revealed that Instagram's algorithm suggested pedocriminal posts to users and that relevant hashtags and pedocriminal networks were comparatively easy to find on the platform.

At the time, Instagram pointed to the considerable effort its specialized teams devote to tracking the ever-changing tactics of pedocriminals and taking down their networks. The company also said it had set up a task force in response to the investigation, which improved the automated detection of suspicious users; as a result, tens of thousands of accounts were blocked.

Confronted with the current findings, Instagram's parent company Meta said that it deletes or restricts four million suspicious videos per month. The company criticized the newspaper's test, saying it created an artificial scenario that is not comparable to what billions of users experience on the platform.

Instagram's Success vs. Child Safety?

The renowned child protection organization »Canadian Centre for Child Protection«, which uses technical means to track the distribution of child abuse recordings on the internet, arrived at similar results in comparable tests, the »Wall Street Journal« reported. According to the report, Instagram also regularly showed videos and pictures of children who appear in the database of the National Center for Missing and Exploited Children, the world's most important registry of victims depicted in abuse imagery. Moreover, Instagram continued to serve videos of "children and adults in sexualized poses," even though the company had already been informed of the research findings in August.

Current and former Meta employees confirmed that the issues were known internally, the report says. Changes to the recommendation algorithm that could prevent the spread of harmful content in certain niches might, however, reduce usage time across the entire user base. Instagram employees themselves appear to be aware of this dilemma: protecting users from potentially abusive or harmful content can in some cases collide with the company's goal of making the app as widely used as possible.

Disney ads apparently shown between inappropriate videos

It could prove particularly problematic for Instagram that advertising was also served between the sexualized videos. Advertisers generally do not want to appear next to inappropriate or disturbing content. In a similar case at YouTube, protests from advertisers prompted the company to step up its defenses against problematic content.

In the current case, the »Wall Street Journal« found that advertisements for the dating app Bumble, for Pizza Hut, for Walmart and for the newspaper itself were shown after sexualized videos. Ads from the Disney Group also appeared in such contexts, which is why the company has apparently already approached Meta and raised the issue "at the highest level," as a company spokeswoman said. Disney has been one of Meta's largest advertisers in recent years.

Match Group, the dating app developer behind Tinder among other apps, has since paused all of its ads in Reels. Meta noted that it has since launched new safety features for advertisers.

hpp