The European Commission's investigation will determine whether Meta has violated the Digital Services Act.
Meta is once again in hot water over the procedures (or lack thereof) it employs to protect minors. The European Commission has opened formal proceedings to assess whether the owner of Facebook and Instagram has breached the Digital Services Act (DSA) by contributing to children's addiction to social media and by failing to ensure high levels of safety and privacy for them.
Specifically, the Commission's investigation will examine whether Meta is properly assessing and mitigating risks posed by its platforms' interfaces. It is concerned that their designs could "exploit the weaknesses and inexperience of minors and cause addictive behavior, and/or reinforce the so-called 'rabbit hole' effect." Such an assessment is required to counter potential risks to children's physical and mental well-being and to the exercise of their fundamental rights.
The proceedings will also examine whether Meta takes the required measures to prevent minors from accessing inappropriate content, whether its age verification methods are effective, and whether it gives children strong, easily accessible privacy settings by default.
Meta is subject to the DSA's requirements as one of the very large online platforms and search engines, those with 45 million or more monthly users in the European Union. Designated companies' obligations include being transparent about their advertising and content moderation decisions, sharing their data with the Commission, and assessing the risks their systems pose in areas such as child protection, mental health, and gender-based violence.
In response to the formal proceedings, Meta pointed to features such as parental supervision settings, quiet mode, and its automatic restriction of content for teenagers. "We have spent the better part of a decade establishing over fifty tools and regulations that are designed to protect young people," a Meta spokesperson told Newtechmania, adding that the company's goal is to ensure they have safe, age-appropriate experiences online. "This is a challenge that the entire industry is facing," the spokesperson said, noting that Meta looks forward to discussing the specifics of its work with the European Commission.
Meta, however, has repeatedly failed to make the protection of young people a priority. Alarming precedents include Instagram's algorithm recommending content featuring child sexual exploitation, and allegations that the company designs its platforms to be addictive for young people while surfacing psychologically harmful content, such as material promoting eating disorders and body dysmorphia.
Meta is also well known as a hub for spreading false information to users of all ages. The European Commission already opened formal proceedings against the company on April 30, citing concerns that include deceptive advertising, data access for researchers, and the absence of an "effective third-party real-time civic discourse and election-monitoring tool" ahead of June's European Parliament elections. Earlier this year, Meta announced that CrowdTangle, a platform that has shown the public how fake news and conspiracy theories spread across social media platforms like Facebook and Instagram, will be fully shut down in August.