
TikTok under investigation in Europe for failing to protect children and more

Popular short video app TikTok is under investigation in Europe for a number of potential violations of the Digital Services Act (DSA), including failure to protect the privacy and safety of children.

The company is also accused of putting users' mental health at risk by deliberately seeking to make the app addictive and sending people down “rabbit holes” that can lead to radicalization …

Digital Services Act (DSA)

The Digital Services Act (DSA) is European legislation designed to protect people from false information, disguised advertising, and from having their personal data used to target them without their knowledge.

It initially applied to 19 tech giants found to operate very large platforms, meaning services that reach very large numbers of people. The primary goal of the DSA is to prevent the spread of harmful misinformation and disinformation on social media, and a secondary goal is to ensure that companies are transparent about the reasons for any recommendations they make.

The second factor potentially impacts Apple.

X is already under investigation

The social network formerly known as Twitter became the first company to come under investigation under the DSA, back in December of last year.

Social network X is the subject of a formal European Union investigation to determine whether the company broke the law in no fewer than eight ways.

At the top of the list is “dissemination of illegal content in the context of Hamas terrorist attacks against Israel,” but the blue checkmarks are also under the microscope again.

TikTok is also under investigation

TikTok has now been named as the second company to come under investigation for potential DSA violations.

Key among them is the commitment to “provide a high level of privacy, safety and security for minors, particularly with respect to the default privacy settings for minors as part of the design and operation of their recommendation systems.”

In addition, the EU is concerned that TikTok's algorithms may be deliberately designed to be addictive and to serve up extreme content.

Algorithmic systems may stimulate behavioral addictions and/or create so-called “rabbit hole effects.” Such an assessment is required to counter potential risks to the exercise of the fundamental right to a person's physical and mental well-being, respect for the rights of the child, as well as its impact on radicalization processes.

The Commission says it is not yet possible to predict how long the investigation may take. This depends on a number of factors, including the extent to which TikTok cooperates.

TikTok told us:

TikTok has pioneered features and settings to protect teenagers and keep children under 13 off the platform, issues the entire industry is grappling with. We will continue to work with experts and industry representatives to ensure the safety of young people on TikTok, and look forward to explaining this work in detail to the Commission.

Photo by Bolivia Intelligente on Unsplash