AI Vendor Due Diligence in 30 Minutes (Questions + Scoring Sheet)
Enterprise vendor questionnaires assume dedicated security engineers. This version assumes you have a calendar opening, a notes doc, and the willingness to say no to vendors who cannot answer basic data questions.
When to use this
- Evaluating SaaS assistants, hosted inference APIs, or bundled copilots
- Renewing a contract that predates your AI policy
- Comparing two vendors before you standardise a workflow
Pair this page with the downloadable AI vendor evaluation checklist so you capture the same evidence for every vendor.
The pass/fail gate (5 minutes)
Ask the vendor—or read their docs—for these binary answers:
- Can enterprise customers opt out of training on customer content?
- Is a Data Processing Agreement (DPA) available and countersignable?
- Are subprocessors listed with geography?
- Are SOC 2 Type II / ISO 27001 reports available under NDA?
- Can you export prompts and outputs or at least audit logs on demand?
If you get two or more “no” answers for a use case touching customer data, pause the pilot until leadership accepts the residual risk in writing.
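If you are comparing several vendors, a throwaway script can tally the gate for you. Below is a minimal sketch, assuming you record each answer as a boolean; the question keys, the `gate_result` helper, and the example answers are illustrative only, not taken from any vendor's documentation.

```python
# Illustrative sketch: tally "no" answers on the pass/fail gate.
# Question keys and the answers dict are hypothetical examples.

GATE_QUESTIONS = [
    "training_opt_out",      # Can enterprise customers opt out of training?
    "dpa_available",         # Is a countersignable DPA available?
    "subprocessors_listed",  # Are subprocessors listed with geography?
    "audit_reports",         # SOC 2 Type II / ISO 27001 available under NDA?
    "export_on_demand",      # Can prompts/outputs or audit logs be exported?
]

def gate_result(answers: dict[str, bool], touches_customer_data: bool) -> str:
    """Return a recommendation based on the number of 'no' answers."""
    noes = sum(1 for q in GATE_QUESTIONS if not answers.get(q, False))
    if touches_customer_data and noes >= 2:
        return f"PAUSE pilot ({noes} gaps): needs written risk acceptance"
    return f"Proceed to deep questions ({noes} gaps noted)"

# Example: a vendor with no countersignable DPA and no export path
print(gate_result(
    {"training_opt_out": True, "dpa_available": False,
     "subprocessors_listed": True, "audit_reports": True,
     "export_on_demand": False},
    touches_customer_data=True,
))
```

The point of scripting it is consistency: every vendor gets judged against the same five questions, and the output string drops straight into your notes doc.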
Deep questions (15 minutes)
Data lifecycle
- What is the default retention for prompts, outputs, and embeddings?
- Who can access customer content for support debugging?
- How fast can you delete data on exit—hours, days, or “contact legal”?
Security + reliability
- How does the vendor separate tenants at the application and model layers?
- What SLAs apply to uptime and inference latency—and what are the remedies?
- How are incidents communicated—status page, email, contractual notice?
Model behaviour
- Who holds liability for outputs produced with default prompting, the vendor or the customer?
- What safety filters exist—and can they be tuned for regulated domains?
Scoring sheet (copy into your doc)
Rate each area 1 (weak) to 3 (strong):
| Area | Notes | Score |
|---|---|---|
| Data control | | |
| Transparency | | |
| Security proofs | | |
| Exit + portability | | |
| Commercial fit | | |
Total the five scores:
- 9–15: proceed with standard contract language
- 6–8: proceed with compensating controls (logging, redaction, human review)
- ≤5: find an alternative or limit the vendor to non-production experiments
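If you want the banding computed for you, here is a minimal sketch of the same arithmetic; the area keys, the `recommendation` helper, and the example scores are made up for illustration.

```python
# Illustrative sketch: map the five 1-3 area scores to the bands above.

AREAS = ["data_control", "transparency", "security_proofs",
         "exit_portability", "commercial_fit"]

def recommendation(scores: dict[str, int]) -> str:
    """Sum the area scores and return the matching recommendation band."""
    missing = [a for a in AREAS if a not in scores]
    if missing:
        raise ValueError(f"Score every area before deciding: {missing}")
    total = sum(scores[a] for a in AREAS)
    if total >= 9:
        return f"{total}/15: proceed with standard contract language"
    if total >= 6:
        return f"{total}/15: proceed with compensating controls"
    return f"{total}/15: find an alternative or keep to non-production experiments"

# Example vendor: strong security proofs, weak transparency and exit terms
print(recommendation({"data_control": 2, "transparency": 1,
                      "security_proofs": 3, "exit_portability": 1,
                      "commercial_fit": 2}))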
How to document the decision
- Store the completed checklist in your policy repository
- Link the vendor record inside your AI use-case inventory
- Add a calendar reminder 30 days before renewal to re-run scoring
Related reading
- Shadow AI prevention — stop unapproved vendors before they spread
- AI monitoring tools comparison — when spreadsheets stop being enough
- Risk assessment guide — translate vendor gaps into tracked risks
Subscribe via the form on this page if you want the procurement email template we send to subscribers each month; it mirrors the question list above, so you can forward it to vendors verbatim.