The AI-Privacy Intersection Small Teams Miss
When your team adopts AI tools, you are almost certainly processing personal data through them — even if you do not think of it that way.
A sales rep pasting a prospect's name and email into ChatGPT to draft an outreach email. A support agent summarizing a customer complaint in an AI writing tool. A recruiter uploading a CV to an AI screener. In each case, personal data is leaving your environment and entering a third-party AI system.
Data protection law was not written with AI tools in mind, but it applies to them fully.
This guide covers what small teams need to know to use AI tools without creating privacy compliance exposure — under both GDPR (EU) and CCPA (California).
Core Concepts
Personal data (GDPR term) / personal information (CCPA term): Any data that identifies or can identify a natural person. Names, email addresses, IP addresses, location data, or any combination of data that could single out an individual. Under GDPR this definition is broad: pseudonymized data that could be re-identified still counts, and only data that is truly anonymized, with no realistic path back to an individual, falls outside it.
Data controller: Your organization. You decide why and how personal data is processed. You are responsible for compliance.
Data processor: A vendor that processes data on your instructions. AI tool providers (OpenAI, Anthropic, Google, Microsoft, etc.) are processors when you send them personal data to process on your behalf.
Data Processing Agreement (DPA): The contract required between controller and processor under GDPR Article 28. Without one, your use of an AI tool for personal data is non-compliant.
The Three-Question Test
Before your team uses an AI tool with any data, ask:
1. Does this data include personal information?
If yes — names, emails, phone numbers, customer IDs, any data linked to an identifiable person — privacy rules apply.
2. Do we have a DPA with this vendor?
- Enterprise/paid plan, DPA signed: Likely compliant (check the details).
- Paid plan, no DPA signed yet: Request it from the vendor before using personal data.
- Free tier: Almost certainly no DPA. Do not use personal data.
3. Is there a lawful basis for this processing?
Under GDPR, you need a legal reason to process personal data. For most AI use cases in small businesses, this will be legitimate interests (you have a genuine business reason and it does not override individuals' rights) or contract performance (you need to process the data to deliver a service the person signed up for). Document your lawful basis.
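As a sketch, the three-question test can be wired into a pre-use check. Everything below (the record fields, the function name) is illustrative, not an API from any vendor or regulation:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AIToolUse:
    # Hypothetical fields describing a proposed use of an AI tool
    contains_personal_data: bool
    dpa_signed: bool
    lawful_basis: Optional[str]  # e.g. "legitimate interests" or "contract"

def may_proceed(use: AIToolUse) -> tuple:
    """Apply the three-question test; returns (allowed, reason)."""
    if not use.contains_personal_data:
        return True, "no personal data: privacy rules not triggered"
    if not use.dpa_signed:
        return False, "personal data but no signed DPA: request one first"
    if not use.lawful_basis:
        return False, "no documented lawful basis for this processing"
    return True, "proceed under documented basis: " + use.lawful_basis

# A free-tier tool with a prospect's email pasted in fails at question 2
allowed, reason = may_proceed(AIToolUse(True, False, None))
```

The point of encoding the test is that the questions are ordered: a "no" at any step stops the use before the next question even matters.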
What GDPR Requires When Using AI Tools
Get a DPA
For any AI vendor processing personal data of EU residents on your behalf, you need a signed DPA. Most major AI vendors offer this on their business plans:
- OpenAI: Available on ChatGPT Team and ChatGPT Enterprise, and to API customers who accept the data processing terms
- Anthropic (Claude): Available through their API usage terms and enterprise agreements
- Google (Gemini/Workspace AI): Covered under Google Workspace DPA for business accounts
- Microsoft (Copilot): Covered under Microsoft's data processing terms for commercial accounts
Action: Check every AI tool in your tool register. If it touches personal data and lacks a DPA, either get one or stop using personal data in that tool.
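The audit step above can be as simple as a filter over your tool register. A minimal sketch; the register rows and field names here are made-up examples, and in practice this data often lives in a spreadsheet:

```python
# Hypothetical tool register: one row per AI tool in use
tool_register = [
    {"tool": "ChatGPT Team",       "personal_data": True,  "dpa_signed": True},
    {"tool": "Free AI summarizer", "personal_data": True,  "dpa_signed": False},
    {"tool": "Coding assistant",   "personal_data": False, "dpa_signed": False},
]

def dpa_gaps(register):
    """Tools that touch personal data but lack a signed DPA."""
    return [row["tool"] for row in register
            if row["personal_data"] and not row["dpa_signed"]]

print(dpa_gaps(tool_register))  # ['Free AI summarizer']
```

Each tool this returns needs either a signed DPA or a team rule that personal data never goes into it.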
Opt out of model training
Many AI vendors' default terms allow them to use your inputs to train future models. Under GDPR, using personal data to train a model is a new processing purpose that likely requires separate consent or a lawful basis.
Most enterprise plans include a training opt-out. Free tiers often do not. Confirm training opt-out status for every tool that handles personal data.
Check data transfer rules
If you are in the EU and your AI vendor processes data in the US, you need a valid data transfer mechanism (Standard Contractual Clauses are the most common). Most major vendors' DPAs include SCCs. Check this when reviewing the DPA.
Know your retention periods
How long does the vendor retain your inputs and outputs? Does that align with your own data retention policy? This is a required DPA clause — if it is missing or vague, ask the vendor directly.
What CCPA Requires When Using AI Tools
CCPA applies to for-profit businesses that meet at least one of these thresholds: annual gross revenue over $25 million, buy/sell/share personal information of 100,000+ consumers or households, or derive 50%+ of annual revenue from selling personal information.
If you are subject to CCPA:
Check whether AI use qualifies as a "sale" or "share"
Under CCPA, "selling" personal information (exchanging it for money or other valuable consideration) or "sharing" it for cross-context behavioral advertising triggers disclosure and opt-out rights. If your AI vendor uses customer data for model training or shares it with partners, this may qualify.
Action: Review the vendor's data use terms specifically for CCPA language. If they "share" data as defined by CCPA, you may need to:
- Update your privacy policy to disclose this
- Honor opt-out requests from California residents
Use Service Provider agreements
CCPA's equivalent of GDPR's DPA is the Service Provider agreement, which restricts the vendor from using your data for its own purposes. Major AI vendors typically include CCPA Service Provider language in their DPAs.
Data subject requests extend to AI-processed data
If a California resident submits a request to know, delete, or correct their personal information, and that data was processed through an AI tool, you may need to make a deletion request to the AI vendor as well. Know whether your AI vendor's DPA includes a provision for this.
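One way to make those downstream deletion requests tractable is to log which vendor processed which person's data, so a request can be fanned out. A sketch with an invented log format (nothing here is a standard schema):

```python
# Hypothetical processing log: one row per (data subject, AI vendor) event
processing_log = [
    {"subject": "jane@example.com", "vendor": "OpenAI"},
    {"subject": "jane@example.com", "vendor": "Anthropic"},
    {"subject": "bob@example.com",  "vendor": "OpenAI"},
]

def vendors_to_notify(subject_email, log):
    """AI vendors needing a downstream deletion request for this person."""
    return sorted({row["vendor"] for row in log
                   if row["subject"] == subject_email})

print(vendors_to_notify("jane@example.com", processing_log))  # ['Anthropic', 'OpenAI']
```

Without some record like this, answering "which vendors saw this person's data?" within the CCPA response window is guesswork.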
High-Risk Use Cases to Evaluate First
Some AI use cases carry higher privacy risk and deserve closer review:
| Use case | Risk | What to check |
|---|---|---|
| Pasting customer emails into AI writing tools | High | DPA, training opt-out |
| AI meeting transcription (customer calls) | High | DPA, recording consent, retention |
| AI analysis of CVs/job applications | High | GDPR lawful basis, automated decision rules |
| AI chatbot handling customer queries | High | DPA, privacy policy disclosure |
| AI summarizing support tickets | Medium | DPA, training opt-out |
| AI drafting internal documents (no customer data) | Low | Training opt-out preferred |
| AI coding assistant (no personal data in prompts) | Low | Training opt-out preferred |
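The table above can double as a triage order for review meetings: evaluate high-risk uses first. A sketch with the table contents abbreviated to illustrative keys:

```python
# Risk tiers from the table above (abbreviated, illustrative labels)
RISK = {
    "customer emails in writing tools": "high",
    "meeting transcription":            "high",
    "CV/application analysis":          "high",
    "support ticket summaries":         "medium",
    "internal drafts":                  "low",
}

def review_order(risk):
    """Use cases sorted so high-risk items are evaluated first."""
    rank = {"high": 0, "medium": 1, "low": 2}
    return sorted(risk, key=lambda use_case: rank[risk[use_case]])

print(review_order(RISK)[0])   # customer emails in writing tools
print(review_order(RISK)[-1])  # internal drafts
```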
Practical Steps: What to Do This Month
1. Audit your AI tool register for personal data use. For each tool, note whether personal data is routinely entered. Mark these as priority for DPA review.
2. Get DPAs signed for all high-priority tools. Contact vendors for enterprise plan DPA templates. Most can be signed within a week.
3. Enable training opt-out wherever available. Even for tools where it is not strictly required, opting out is good practice.
4. Update your privacy policy. If you disclose categories of vendors or describe how you process customer data, AI tool vendors should appear in that disclosure.
5. Add a "no personal data in free-tier AI" rule to your AI acceptable use policy. Make it explicit. The AI acceptable use policy template has a placeholder for this.
6. Brief the team. The most common violation is accidental: someone not realizing that a prospect's email address is personal data. Thirty seconds in the next team meeting prevents most incidents.
A Note on AI-Specific Risks
Beyond standard data protection, AI tools introduce some privacy risks not covered by conventional compliance frameworks:
Inference and re-identification. AI models can infer sensitive attributes (health conditions, political views) from seemingly innocuous data. Data that appears non-personal can become personal in combination with an AI's inference capabilities.
Prompt injection and data extraction. Malicious content in inputs can trick AI tools into revealing data from other users' sessions in some architectures. This is an active area of vendor security work but worth monitoring.
Model memorization. AI models can memorize and reproduce training data verbatim in some circumstances. If your data was used to train a model, fragments of it could appear in outputs to other users. Training opt-out mitigates but does not eliminate this risk.
These are emerging areas where regulation is still catching up. Staying informed through your quarterly governance review is the practical response.
This guide covers general principles and is not legal advice. For GDPR or CCPA obligations specific to your organization, consult a qualified legal advisor.