Pablo R. Suanzes, Brussels Correspondent

Updated Monday, February 19, 2024 - 13:15


The European Commission on Monday launched a formal procedure to assess whether TikTok may have infringed the European Digital Services Act with regard to the protection of minors, advertising transparency and access to data for researchers, as well as the risk management of addictive design and harmful content.

"Today we opened an investigation into TikTok for suspected non-compliance with transparency and child-protection obligations: addictive design and screen-time limits, the 'rabbit hole' effect, age verification, and default privacy settings. Enforcing the European law is necessary for a safer internet for young people," said the commissioner responsible for the Internal Market, the Frenchman Thierry Breton, who already has similar proceedings open against other digital giants.

Based on the preliminary investigation carried out to date, and in particular after the analysis of the risk assessment report sent by TikTok in September 2023, as well as "the company's responses to formal requests for information from the Commission on illegal content, protection of minors and access to data", the French commissioner's team has decided to launch this formal procedure.

In a statement published this afternoon, Brussels lays out the points on which the company's preliminary explanations failed to convince, and for which it now faces a file that may end in sanctions. The EU is concerned about the platform's algorithm, "which can stimulate addictions or create so-called 'rabbit hole effects'." The in-depth evaluation, they state, "is necessary to counteract the potential risks for the exercise of the fundamental right to the physical and mental well-being of the person, respect for the rights of the child and its impact on radicalization processes." In other words, the concern is not only compulsive consumption but also political or religious radicalization. "The measures, and in particular the age verification tools used by TikTok to prevent minors from accessing inappropriate content, may not be reasonable, proportionate and effective," say the Commission's experts.

The other big problem, one of the issues highlighted in the first conversations between the Commission and the company, concerns the requirements of the European Digital Services Act to guarantee "a high level of privacy, security and protection of minors, particularly with regard to the default privacy settings for minors as part of the design and operation of their recommendation systems."

As almost always happens in these investigations, European officials complain about the lack of access to the company's data and systems. The opening of a formal proceeding does not prejudge the outcome. Companies are sometimes exonerated, but more often the result is specific demands, behavioral commitments, operational changes or even a break-up of the business structure, given the risk of multimillion-euro sanctions. Brussels also notes that what is decided on the points above does not preclude measures over other infringements, "for example in relation to a provider's obligations concerning the dissemination of illegal content, such as terrorist content or online child sexual abuse, or the notification of suspected crimes."

From now on, another phase begins, but unlike Competition proceedings, for example, the Digital Services Act sets no legal deadline for concluding formal investigations. "The duration of an in-depth investigation depends on several factors, among them the complexity of the case, the degree of cooperation of the company concerned with the Commission and the exercise of the rights of defense," they warn in their statement.