The Consumer Financial Protection Bureau (CFPB) has issued guidance clarifying that lenders must provide specific, accurate reasons when taking adverse action (denial, unfavorable terms) in credit decisions — even when the decision was made by an AI or machine learning model. "The model said no" is not a legally sufficient adverse action notice under the Equal Credit Opportunity Act (ECOA) and Regulation B.
If your product uses AI to make or assist in credit decisions — loan underwriting, credit line setting, insurance pricing, rental screening — you must be able to explain, specifically and accurately, the factors that led to each adverse decision. AI models that cannot produce interpretable reasons for their outputs are non-compliant. The CFPB has explicitly rejected "checklist" adverse action reasons that do not reflect the factors the model actually used. This pushes lenders toward explainable AI, or toward hybrid systems that pair model scoring with auditable factor attribution.
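To make the "auditable factor attribution" idea concrete, here is a minimal, purely illustrative sketch (not legal advice, and not the CFPB's prescribed method): a linear scoring model where each factor's contribution to the score is directly inspectable, so adverse action reasons can be ranked by how much each factor actually reduced the applicant's score. All feature names, weights, and thresholds are hypothetical.

```python
# Hypothetical linear credit-scoring model. Because the score is a weighted
# sum, every factor's contribution is exactly attributable and auditable.
WEIGHTS = {
    "payment_history": 0.40,      # higher is better
    "credit_utilization": -0.35,  # higher utilization hurts
    "account_age_years": 0.15,
    "recent_inquiries": -0.10,
}
APPROVAL_THRESHOLD = 0.50

def score(applicant: dict) -> float:
    """Weighted sum of normalized factor values (all assumed in [0, 1])."""
    return sum(WEIGHTS[f] * applicant[f] for f in WEIGHTS)

def adverse_action_reasons(applicant: dict, top_n: int = 2) -> list[str]:
    """Rank factors by shortfall: how much score each one cost the
    applicant relative to the best contribution that factor could make."""
    shortfalls = {}
    for f, w in WEIGHTS.items():
        best = max(w * 0.0, w * 1.0)   # best achievable contribution
        shortfalls[f] = best - w * applicant[f]
    ranked = sorted(shortfalls, key=shortfalls.get, reverse=True)
    return ranked[:top_n]

applicant = {
    "payment_history": 0.3,
    "credit_utilization": 0.9,
    "account_age_years": 0.5,
    "recent_inquiries": 0.6,
}

if score(applicant) < APPROVAL_THRESHOLD:
    # These reasons reflect the model's actual factors, not a generic
    # checklist — the property the CFPB guidance requires.
    print(adverse_action_reasons(applicant))  # → ['credit_utilization', 'payment_history']
```

A real deployment would map factor names to the consumer-facing reason language required on the notice; the point here is only that the reasons are derived from, and traceable to, the model's actual computation.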
Knowing violations carry civil money penalties of up to $1 million per day, and ECOA also provides affected applicants a private right of action.