Dutch Groups Sue X and Grok Over AI ‘Nudify’ Features, Seek Daily Fines
Two Netherlands-based advocacy organizations have filed a lawsuit against social media platform X and its AI tool Grok, accusing them of enabling users to create sexually explicit images of people without consent by digitally removing clothing from photos.
The groups, Offlimits and Fonds Slachtofferhulp, have initiated summary proceedings and are asking the court to impose a penalty of €100,000 per day if the companies fail to disable such features immediately.
In a press statement, the organizations said AI tools like Grok can be used on a large scale to generate and distribute illegal images. They argue that every day this content is created and shared, new victims are harmed, making urgent action essential.
Fonds Slachtofferhulp director Ineke Sybesma stressed that while legislation and oversight take time, the damage is happening in real time. According to her, the number of victims is rising rapidly, and individuals should not have to pay the price for unchecked technological capabilities.
Offlimits director Robert Hoving echoed the urgency, saying AI-generated or AI-edited images are increasingly being used to intimidate, shame, and sexually harass people online. He warned that the accessibility of such tools allows mass production and rapid distribution of abusive content, describing the situation as a “slow-motion disaster” that demands intervention.
The lawsuit calls on the court to immediately block any features that allow users to portray individuals as nude or partially exposed without their consent. It also seeks action against tools that could facilitate the creation of content qualifying as child sexual abuse material (CSAM). Until the platforms comply, the organizations want daily financial penalties enforced.
Lawyer Otto Volgenant, representing the groups, argues that so-called “nudify” features violate key European regulations, including the General Data Protection Regulation and the Digital Services Act. He also contends that such tools breach criminal law, civil law standards, and portrait rights, as well as established European case law, which sets strict limits on distributing sexual images without consent.
The case is scheduled for a hearing on March 12, 2026, at the Amsterdam District Court.
Meanwhile, the United Kingdom is moving in a similar direction. The government has proposed an amendment to the Crime and Policing Bill that would require technology companies to act against non-consensual intimate images within 48 hours of being notified.
UK Technology Secretary Liz Kendall said the measure places responsibility squarely on tech firms that have the power and resources to respond. According to her, the step is crucial to making the online environment safer, fairer, and more respectful, particularly for women and girls.
The outcome of the Amsterdam hearing could set an important precedent for how European regulators handle AI tools that blur the line between innovation and abuse.