CFPB says creditors must provide reasons for taking adverse action, even when relying on AI | Goodwin

ABSTRACT

On May 26, 2022, the Consumer Financial Protection Bureau (“CFPB”) issued a Consumer Financial Protection Circular (the “Circular”), confirming that creditors must provide specific reasons for taking adverse action against an applicant, even when the creditor relies on black-box models or complex algorithms to make credit decisions. While black-box models and complex algorithms are widely used by creditors in credit-granting decisions, the reasoning behind some of the model and algorithm outputs may not be known or fully understood by creditors or model developers. Nevertheless, the CFPB has confirmed that the use of such models and algorithms does not relieve creditors of their obligation to disclose to consumers specific and accurate reasons for taking adverse action, as required by the Equal Credit Opportunity Act and its implementing Regulation B (collectively, “ECOA”).

ADVERSE ACTION UNDER ECOA

The ECOA prohibits any creditor from discriminating against any applicant, with respect to any aspect of a credit transaction, on the basis of race, color, religion, national origin, sex, marital status, or age (provided the applicant has the capacity to contract); because all or part of the applicant’s income derives from any public assistance program; or because the applicant has in good faith exercised a right under the Consumer Credit Protection Act.[1] To promote transparency and fairness in the credit underwriting process, the ECOA requires creditors taking adverse action against consumers to provide those consumers with a written statement setting forth the specific principal reason(s) for the adverse action. It is critical that the reasons provided in the adverse action notice be “accurate” with respect to the information in the consumer’s application or the circumstances that failed to meet the creditor’s underwriting criteria. It is not sufficient to provide vague or general statements that the adverse action was based on the creditor’s internal standards or policies, or that the consumer did not achieve a qualifying score on the creditor’s credit scoring system.[2]

RISE OF BLACK-BOX CREDIT MODELS OR COMPLEX ALGORITHMS

In the Circular, the CFPB acknowledges that although financial institutions have long used complex underwriting and other computational methods to make credit risk decisions, they have still been able to provide specific adverse action disclosures in compliance with ECOA. More recently, however, the financial industry has begun to rely on models and algorithms built on increasingly detailed datasets and complex methodologies, often including some form of artificial intelligence (“AI”) that processes large volumes of data, which can make it extremely difficult to identify the specific criteria that led to the denial of a consumer’s credit application. Some credit algorithms and models use machine learning, which can effectively change creditworthiness standards or reinforce model biases over time. Many companies that develop or market these decision-making models to financial institutions view the technology as proprietary, providing users with little information about how, or on what basis, results are produced. This lack of access to, or understanding of, the decision and model rationale may prevent creditors from being able to articulate the specific reason(s) for an adverse credit decision.

Previously, the CFPB had shown support for AI, machine learning, and the use of alternative data to expand consumer access to credit. In 2017, the CFPB granted its first no-action letter to Upstart Network, Inc., a company using alternative data to make credit and pricing decisions.[3] The CFPB also said it hoped to “facilitate the use of this promising technology [AI/ML] to expand access to credit and benefit consumers.”[4] That supportive stance stands in stark contrast to the ominous tone of the recent Circular, particularly when juxtaposed with a CFPB blog post published in 2020, under the previous administration, by Patrice Alexander Ficklin, Director of the CFPB’s Office of Fair Lending. Her 2020 blog post reflected a more supportive spirit toward industry participants’ innovation and experimentation with AI models, citing ECOA’s flexibility[5] and recognizing that “the existing regulatory framework has built-in flexibility that can be compatible with AI algorithms.”[6]

The CFPB, however, has since added a disclaimer to that 2020 blog post, warning that it “provides an incomplete description of the ECOA and Regulation B adverse action notice requirements” and that the ECOA “does not permit creditors to use technology for which they cannot provide specific reasons for adverse actions,” instead referring readers to the Circular.[7] This shift in the CFPB’s policy stance is less surprising given that CFPB Director Rohit Chopra has long spoken out about the built-in risk of bias he perceives in aggregated data and black-box decision models, and the need for consumer protection laws to address that risk, specifically distinguishing between AI or machine learning in general and black-box models that cannot be explained.[8] In 2021, Director Chopra again foreshadowed this position: “[a]lgorithms can help eliminate biases, but black box underwriting algorithms do not create a more level playing field and only exacerbate the biases that fuel them.”[9]

CFPB CONFIRMS THAT SPECIFIC REASONS FOR ADVERSE ACTION ARE REQUIRED, REGARDLESS OF TECHNOLOGY USED

Given the increased use of black-box models and algorithms, and under the direction of Director Chopra, the Circular answered the question: when creditors rely on complex algorithms that prevent them from accurately identifying the specific reasons for denying credit or taking other adverse action, must those creditors still comply with the ECOA requirement to provide a statement of specific reasons to applicants against whom adverse action is taken? The CFPB’s short answer: “Yes.”[10]

The CFPB confirms that the adverse action requirements of the ECOA apply equally to all creditors, regardless of the technology used to make the credit decision. Accordingly, the CFPB asserts that the ECOA does not permit the use of black-box models or complex algorithms for credit decisions “when it means [creditors] cannot provide the specific and accurate reasons for an adverse action.” Ultimately, a creditor’s lack of understanding of the decision-making technology it employs does not excuse non-compliance with the ECOA.[11]

The CFPB also notes that it is considering the use of black box models and algorithms beyond adverse action notices, referencing its recent spotlight on automated valuation models.

While the CFPB and the industry seem to recognize the potential benefits of these new technologies in the credit decisioning space, it is clear that the CFPB is focused on transparency in the credit decision process and on ensuring compliance with the ECOA. Creditors should be mindful of the potential for consumer harm that could result from the models or technology they choose to use in the credit decision-making process, and should take reasonable steps to ensure understanding and transparency of those models or technology. Creditors should also establish and maintain a strong fair lending program[12] and model risk management framework,[13] leveraging industry standards and practices, to ensure models are properly integrated, validated, and monitored.


[1]15 U.S.C. 1691(a); 12 CFR 1002.1(b).
[2]15 U.S.C. 1691(d)(2)(A); 12 CFR 1002.9(b)(2).
[3]See CFPB Announces First No Action Letter to Upstart Network | Consumer Financial Protection Bureau (consumerfinance.gov).
[4]See Innovation Spotlight: Providing Adverse Action Notices When Using AI/ML Models | Consumer Financial Protection Bureau (consumerfinance.gov).
[5]12 CFR 1002.9(b)(2)-3 and 1002.9(b)(2)-4.
[6]See https://www.consumerfinance.gov/about-us/blog/innovation-spotlight-providing-adverse-action-notices-when-using-ai-ml-models/
[7]See https://www.consumerfinance.gov/about-us/blog/innovation-spotlight-providing-adverse-action-notices-when-using-ai-ml-models/
[8]See Prepared Speech by Commissioner Rohit Chopra at the FTC Competition and Consumer Protection Hearings, George Mason University, Antonin Scalia School of Law – October 15, 2018; Commissioner Chopra’s Comment on the Department of Housing and Urban Development’s Proposed Rule Regarding the Discriminatory Effects of the Fair Housing Act Standard – October 16, 2019 (ftc.gov); Prepared speech by Commissioner Rohit Chopra at the Silicon Flatirons Conference (ftc.gov); Remarks by Director Rohit Chopra at a Joint DOJ, CFPB, and OCC Press Conference on Trustmark National Bank Enforcement Action | Consumer Financial Protection Bureau (consumerfinance.gov).
[9]See Remarks by Director Rohit Chopra at a Joint DOJ, CFPB, and OCC Press Conference on Trustmark National Bank Enforcement Action | Consumer Financial Protection Bureau (consumerfinance.gov).
[10]See Consumer Financial Protection Circular 2022-03: Adverse Action Notification Obligations in Relation to Credit Decisions Based on Complex Algorithms | Consumer Financial Protection Bureau (consumerfinance.gov).
[11]See CFPB acts to protect the public from black-box credit models using complex algorithms | Consumer Financial Protection Bureau (consumerfinance.gov).
[12]See 201510_cfpb_ecoa-narrative-and-procedures.pdf (consumerfinance.gov).
[13]See SR 11-7: Supervisory Guidance on Model Risk Management (federalreserve.gov); OCC Bulletin 2011-12: Sound Practices for Model Risk Management: Supervisory Guidance on Model Risk Management (occ.treas.gov); fil17022.pdf (fdic.gov); SR 21-8: Interagency Statement on Model Risk Management for Bank Systems Supporting Bank Secrecy Act/Anti-Money Laundering Compliance (federalreserve.gov).
