The EU's foundational data privacy law, governing how organizations collect, process, and store the personal data of EU residents. Article 22 restricts decisions based solely on automated processing that produce legal or similarly significant effects on individuals, with direct implications for AI-driven decisions.
If you process any personal data of EU residents — including via AI tools that analyze customer emails, user behavior, or employee data — GDPR applies. You need a lawful basis for processing, a signed Data Processing Agreement with every AI vendor that touches personal data, and a process for responding to data subject access requests. AI tools that make significant automated decisions about individuals require additional safeguards.
Up to €20M or 4% of global annual turnover, whichever is higher
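The two-part cap works as a maximum, not a choice: the applicable ceiling is the greater of the fixed amount and the turnover percentage. A minimal sketch of that arithmetic (function name and figures are illustrative, not from the regulation's text):

```python
def max_gdpr_fine(global_annual_turnover_eur: float) -> float:
    """Upper bound on a GDPR Art. 83(5) fine:
    EUR 20M or 4% of worldwide annual turnover, whichever is higher."""
    return max(20_000_000, 0.04 * global_annual_turnover_eur)

# For a company with EUR 1B in turnover, the 4% prong dominates:
print(max_gdpr_fine(1_000_000_000))  # 40000000.0 (EUR 40M, not EUR 20M)

# For a EUR 100M-turnover company, the fixed EUR 20M floor applies:
print(max_gdpr_fine(100_000_000))  # 20000000.0
```

The practical consequence: for large enterprises the fixed €20M figure is irrelevant, because 4% of turnover exceeds it.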
Italy's data protection authority (the Garante) temporarily banned ChatGPT from processing Italian users' data, citing GDPR violations: no lawful basis for mass collection of training data, no age verification to prevent minors from accessing the service, and failure to provide adequate transparency about data collection. OpenAI had 20 days to comply or face a permanent ban.
Outcome: ChatGPT was blocked for Italian users for approximately one month (March 31 – April 28, 2023). OpenAI resolved the ban by implementing an age verification mechanism, adding a GDPR privacy notice, and providing an opt-out mechanism for Italian users' data. The Garante later opened a separate formal investigation.
Clearview AI scraped billions of photos from the internet to build a facial recognition database sold to law enforcement. The UK ICO and French CNIL both found this violated data protection law: no lawful basis for collecting biometric data at scale, individuals had no knowledge their images were being used, and Clearview failed to respond adequately to data subject access requests.
Outcome: ICO fined Clearview £7.5M and ordered deletion of UK residents' data. CNIL fined Clearview €20M. Italy, Australia, Canada, and Greece also took enforcement action. Clearview was effectively banned from operating in Europe.
Amazon retained children's voice recordings collected by Alexa indefinitely — even after parents requested deletion — in violation of the Children's Online Privacy Protection Act (COPPA). Amazon used the retained data to improve Alexa's AI models despite being told to delete it. A separate violation related to Ring doorbell cameras allowed employees and contractors to access private customer video footage.
Outcome: $25M civil penalty for the Alexa COPPA violations; $5.8M in disgorgement for the Ring privacy violations. Amazon was required to delete all children's data collected in violation of COPPA, prohibited from using that data for training AI, and required to implement a comprehensive children's data deletion program.