Prime Minister Pedro Sánchez has formally requested that Spain’s Public Prosecutor’s Office examine whether large platforms could face criminal liability over the spread of AI-generated child sexual abuse material (CSAM). The request targets X, Meta and TikTok.
Dr Pain of the University of Nevada, Reno said, “Spain’s decision in February 2026 to ask prosecutors to investigate major social media platforms marks a significant escalation in Europe’s ongoing effort to regulate the digital environment.”
Spanish Prime Minister Pedro Sánchez tweeted, “Today, the Council of Ministers will invoke Article 8 of the Organic Statute of the Public Ministry to ask it to investigate the crimes that X, Meta and TikTok may be committing through the creation and dissemination of child pornography using their AI.
“These platforms are jeopardising the mental health, dignity, and rights of our sons and daughters.
“The State cannot allow this. The impunity of the giants must end.”
In a letter first reported by El País, the government pointed to what it described as the abundance of such material circulating on social media.
Spain’s constitutional structure means the executive cannot order prosecutors to open a case. Attorney General Teresa Peramato must first consult the Board of Prosecutors of the Supreme Court before deciding on any investigation.
How Does This Connect To EU Law?
“From an EU policy perspective, Spain’s move aligns with, and also intensifies, the regulatory trajectory established by the Digital Services Act (DSA),” Dr Pain said.
The Digital Services Act already requires large platforms to mitigate systemic risks, including harm to minors and the spread of illegal content. Spain’s action tests whether national criminal law can operate alongside EU-level enforcement.
“By asking national prosecutors to examine criminal offences, Madrid is effectively testing the outer limits of platform accountability within the European legal framework. This could create a precedent whereby member states pursue parallel enforcement strategies alongside Brussels.”
The European Commission has opened its own investigation into X over sexually explicit deepfakes generated by its Grok AI chatbot. French authorities recently raided X’s Paris headquarters in a related probe. Spain’s move increases pressure on platforms already facing action across Europe.
“Another key implication concerns regulatory fragmentation within the EU,” Dr Pain added. If multiple member states pursue overlapping criminal or administrative cases, platforms could face uneven enforcement across the single market.
What Evidence Is Motivating Madrid’s Action?
“At its core, Spain’s action reflects growing alarm about the intersection of generative AI and platform amplification,” Dr Pain said.
The Spanish government has cited evidence that one in five young people in Spain, mostly girls, have reported AI-generated fake nude images of themselves being shared online. That statistic underpins the urgency behind Sánchez’s request.
Earlier this month, Sánchez announced he would ban children under 16 from accessing social media. He is also proposing that repeat violations by tech executives be treated as criminal offences and that algorithm manipulation be criminalised. Spain’s Youth Minister Sira Rego has floated banning X outright. Deputy Prime Minister Yolanda Díaz said she had left the platform, arguing that remaining users were “feeding the politics of hatred.”
TikTok rejected any suggestion that it tolerates such material. A spokesperson said, “CSAM is abhorrent and categorically prohibited on our platform. TikTok has robust systems in place to thwart attempts to exploit or harm young people, and we continue to prioritize and invest in advanced technologies to stay one step ahead of bad actors.”
What Are The Risks For Digital Rights In Europe?
“Critically, the Spanish initiative reflects a broader European shift toward child-safety framing as the primary justification for digital regulation,” Dr Pain said.
He said that when regulation is driven mainly through child protection narratives, fundamental rights such as privacy, anonymity and freedom of expression may receive less attention.
“Indeed, critics warn that Spain’s wider package of measures, including a proposed social-media ban for under-16s, could expand state and platform surveillance.” The prime minister has previously advocated reducing online anonymity and increasing traceability of users, proposals that digital rights groups fear could chill speech.
“Ultimately, Spain’s probe represents both an opportunity and a risk for EU internet safety,” Dr Pain said. It presses the European Union to confront AI-driven sexual abuse and platform amplification, and it forces lawmakers to consider how to protect minors without undermining digital rights across Europe.
What Do Other Experts Say?
On how this move impacts the EU, Trevor Horwitz, CISO and Founder at TrustNet, said: “If Spain moves to restrict social media access for minors or increases enforcement under the Digital Services Act, there are direct privacy implications. For example, if stronger age verification or age assurance is required, platforms typically need to process additional personal data to determine a user’s age. That can increase the volume and sensitivity of data being collected.
“However, it’s important to note that the DSA does not override data protection law. Any measure, if reinforced within the EU, remains subject to the GDPR. Platforms must comply with data minimization, purpose limitation, and lawful processing requirements. Even when the objective is child protection, companies are still required to ensure that any personal data collected is necessary, proportionate, and adequately protected.
“What this means, in practical terms, is that child safety enforcement and privacy compliance operate simultaneously. Platforms will need to demonstrate that protections for minors are implemented in a way that complies with existing EU data protection rules. Regulators are not only examining whether platforms reduce systemic risks to children, but also whether those controls are implemented within the boundaries of established privacy law.”