Fed watchdog: Fair Housing Act, ECOA must evolve with realities of AI

At an event marking the 55th anniversary of the passage of the Fair Housing Act (FHA) during the National Fair Housing Alliance 2023 National Conference, Federal Reserve Vice Chair for Supervision Michael Barr delivered remarks urging that both the FHA and the Equal Credit Opportunity Act (ECOA) evolve to reflect the realities and dangers posed by emerging technologies in the mortgage space.

“As our financial system evolves, it is critical that we adapt our application of the Fair Housing Act and ECOA to deal with technological change and other developments,” Barr said in his speech.

There are potentially positive implications that come with such technological advances, including providing “a window” into the creditworthiness of a person who may not have a “standard credit history,” he said. New artificial intelligence technologies, including machine learning, could also make use of such data “at scale and at low cost to expand credit to people who otherwise can’t access it,” Barr added.

However, there is also the potential for these technologies to exacerbate existing issues related to lending equality, Barr explained.

“While these technologies have enormous potential, they also carry risks of violating fair lending laws and perpetuating the very disparities that they have the potential to address,” he said. “Use of machine learning or other artificial intelligence may perpetuate or even amplify bias or inaccuracies inherent in the data used to train the system or make incorrect predictions if that data set is incomplete or nonrepresentative. There are also risks that the data points used could be correlated with a protected class and lack a sufficient nexus to creditworthiness.”

Barr called “digital redlining in marketing” — defined as “the use of criteria to exclude majority-minority communities or minority applicants” — one such risk that these technologies could introduce into lending, noting it has already been the “subject of several settlements.”

Additionally, if lenders select their target audiences based on characteristics commonly associated with a protected class, a form of “digital redlining” becomes more likely.

“New technologies can also result in ‘reverse redlining,’ or steering in the advertisement of more expensive or otherwise inferior products to minority communities,” he said. “These risks are amplified when a model is opaque and lacks a sufficient degree of explainability—the degree to which the bank can understand how data, variables, and other features inform the credit decisions.”

The banking and lending ecosystems themselves are still navigating these emerging technologies, he said. As a result, the Fed is aiming to ensure that its oversight of such practices “keeps pace” with their implementation.

“Through our supervisory process, we evaluate whether firms have proper risk management and controls, including with respect to those new technologies,” he said.