Why AI alone won’t solve biased underwriting

Jin Han is the chief legal officer for Pipe Technologies, a San Francisco, California-based fintech that provides a capital platform for small and medium-sized businesses. 

Even after decades of industry and regulatory efforts to improve fair access to business financing, major gaps remain, especially for women-, minority-, and immigrant-owned businesses.  

Many of these gaps stem from structural bias in the current risk rating system, compounded by additional biases introduced in both human-driven and algorithmic (AI-powered or not) underwriting processes. Increasing financial access for underserved communities requires more than following the law; we must tackle the structural disadvantages these communities face.

Much of the argument that AI and machine learning can make underwriting impartial borders on wishful thinking. No algorithm is immune to the biases inherent in the data it analyzes, and none is a substitute for thoughtful consideration at the underwriting policy level.
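
To see why, consider a minimal sketch in Python, using entirely synthetic data and invented feature names, of how a model trained on historical approval decisions learns to reproduce the disparity already baked into those decisions:

# Minimal sketch: a classifier trained on historically biased approvals
# reproduces that bias. All data, features, and thresholds are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical applicant pool: group 1 has systematically shorter credit
# histories (a structural fact, not a signal of underlying risk).
group = rng.integers(0, 2, n)                    # 0 = established, 1 = underserved
history_years = rng.exponential(8 - 5 * group)   # thinner files in group 1
credit_score = 600 + 12 * history_years + rng.normal(0, 25, n)

# Historical labels: past underwriters approved largely on score, so the
# label itself encodes the structural disadvantage.
approved = (credit_score > 680).astype(int)

X = np.column_stack([credit_score, history_years])
model = LogisticRegression(max_iter=1000).fit(X, approved)

# The "impartial" algorithm faithfully learns the disparity it was shown.
for g in (0, 1):
    rate = model.predict(X[group == g]).mean()
    print(f"group {g} approval rate: {rate:.0%}")

Nothing in the training step is discriminatory on its face; the bias arrives with the labels. That is exactly why the fix has to happen at the policy level rather than inside the model.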

More discussion of how to methodically achieve financial inclusion is sorely needed. Too often, advances in AI and machine learning are touted as a cure-all for uneven access to financing. What we need is a plan for how these technologies can actually help this effort, rather than take it over.

AI technology is a great tool for ingesting and analyzing data accurately, and for automating the implementation of fair, well-designed underwriting policies. We believe the winning strategy combines the efficiency gains of recent advances with thoughtful, human-powered selection of risk data that benefits members of underserved communities. Here is how we think about this problem:

Traditional underwriting methodology limits

Traditional risk assessment models that rely primarily on FICO or Vantage scores have clear benefits for capital providers. Positive payment and credit history demonstrate a business owner's responsibility, management acumen, and willingness to meet their obligations. Relying on credit bureau data also shortens underwriter review time and makes the underwriting process easier to automate. A good credit score is a reliable indicator of business responsibility that can substitute for time-consuming research and interviews.
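
This is also what makes score-first underwriting so easy to automate: the policy reduces to a handful of rules. A hypothetical sketch of such a rule, with thresholds and field names invented for illustration:

# Hypothetical, simplified score-first underwriting rule of the kind that
# credit-bureau data makes easy to automate. Thresholds and field names
# are invented for this example.
def underwrite(application: dict) -> str:
    """Approve, refer, or decline based almost entirely on bureau data."""
    score = application["fico_score"]
    years = application["years_in_business"]

    if score >= 700 and years >= 2:
        return "approve"   # fast path: no human review needed
    if score >= 640:
        return "refer"     # manual review queue
    return "decline"       # a thin file or low score ends it here

# A newer, thin-file business fails before any other strength is weighed.
print(underwrite({"fico_score": 615, "years_in_business": 1}))  # -> decline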

Despite these benefits, traditional underwriting models have long been criticized for perpetuating unequal access to financing. As recently as 2023, women- and minority-owned businesses were disproportionately denied financing, or were more likely to be approved for less than they applied for, according to data from the Federal Reserve's Small Business Credit Survey.

The reason for this disparity is straightforward. Women and members of minority or recent immigrant groups tend to be newer entrepreneurs, often operating brick-and-mortar retail businesses with thinner margins. Some recent immigrants may also be debt-averse and place less importance on building payment or credit history.

These historical and cultural factors make credit score-based underwriting ill-suited to increasing financial access. While considering a business's operational history for risk underwriting is value-neutral on its face, its impact falls disproportionately on newer entrants to business ownership.

It is not easy to address problems with a system that has clear predictive power and does not exhibit obvious discriminatory intent. But it is simply not possible to solve the financial access problem unless the industry finds a way to reduce its reliance on credit scores and business history.  
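
As one illustration of what reducing that reliance could look like, here is a hedged sketch of a composite eligibility score in which bureau data is deliberately down-weighted in favor of other, policy-selected signals. The weights, fields, and scaling are hypothetical and are not Pipe's actual model:

# Hypothetical composite score (0-100, higher is stronger) in which the
# bureau score is one down-weighted input among several policy-selected
# signals. Weights, fields, and scaling are illustrative only.
def composite_score(app: dict) -> float:
    bureau = min(app["fico_score"] / 850, 1.0)           # normalized bureau score
    revenue = min(app["monthly_revenue"] / 50_000, 1.0)  # scale of operations
    on_time = app["on_time_payment_rate"]                # e.g., rent, suppliers

    # The policy choice lives here, in human-chosen weights, not inside a
    # model: bureau data gets 30% of the weight instead of dominating.
    return 100 * (0.30 * bureau + 0.35 * revenue + 0.35 * on_time)

app = {"fico_score": 615, "monthly_revenue": 40_000, "on_time_payment_rate": 0.97}
print(f"composite score: {composite_score(app):.0f}")  # the thin-file business now scores well

The substance of the fairness decision sits in the human-selected signals and weights, which is where we believe it belongs; the algorithm merely executes that policy at scale.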

This is a big challenge. But magical thinking around AI is not the answer. There's a different, more methodical approach to tackling this problem. We've seen it work at Pipe, and I believe it will continue to increase financial access.
