The European Commission has launched its first investigative actions under the Digital Services Act's guidelines on the protection of minors, focusing squarely on whether the biggest platforms keep children away from illegal products and harmful material.
Officials have asked Snapchat to explain how it blocks under-13s from using its service, as its own terms require, and what tools it uses to prevent the sale of illegal goods, including vapes and drugs, to children. YouTube has been pressed for details of its age-assurance system and of how its recommender systems prevent harmful content, such as material promoting eating disorders, from reaching minors. Apple’s App Store and Google Play have been asked to set out how they manage the risk of users, including children, downloading illegal or otherwise harmful apps, such as gambling apps and so-called nudify tools, and how they apply and enforce age ratings.
“We will do what it takes to ensure the physical and mental well-being of children and teens online. Platforms have the obligation to ensure minors are safe on their services,” said Henna Virkkunen, Executive Vice-President for Tech Sovereignty, Security and Democracy, adding that the Commission, together with national authorities, is now assessing whether the measures taken so far are actually protecting children.
To ensure consistent enforcement across large and small platforms, Brussels is also working with national regulators to identify services that pose the greatest risk to children, with further action to follow.