Your AI credit models are fine, but their training data is problematic

December 4, 2024

AI systems built to assess creditworthiness are trained on data that implicitly accepts past discriminatory lending decisions as legitimate signals about borrowers today, writes Deon Crasto, of Velocity Global.

Artificial intelligence in lending promises faster decisions and broader access to credit, but it often perpetuates existing inequities. Be wary: your AI lending model might not be as fair and objective as it appears.

Don’t believe me? Let’s look at a few instances. First, car loans: researchers at the University of Bath reported that women were disproportionately favored for loan originations compared with their male counterparts, even after controlling for other financial factors. With mortgages, we see a similar story. A 2024 examination that used leading large language models to determine creditworthiness found that Black applicants were at higher risk of being denied than their white counterparts. And it’s not just race. The pattern extends across age, postal codes and even the college you attended.

At the end of the day, lenders are looking for deterministic factors to underwrite products — and that’s what’s going on here. I know all too well. I ran the product for the data science and decisioning team at OnDeck Capital, and we looked at every data point we could get our hands on. And I mean it. Got a bad Yelp rating? It was accounted for in our model. Your Foursquare check-ins were down? Oh, we know. We even considered factors like seasonality in cash flow and how businesses in your neighborhood were doing. Our machine learning, or ML, models were designed to process thousands of data points to make lending decisions in seconds.


But I’m here to give you an alternate narrative. I think your AI models are fine (for now), but your data is fundamentally flawed. The issue isn’t in the algorithms themselves, but in the historical data we’re feeding them. Models are trained on datasets that go back decades. So, if a certain group has historically been denied loans at higher rates, ML models will implicitly associate membership in that group with “high risk.” The model doesn’t know it’s being unfair; it’s simply learning from the patterns we’ve provided.
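
To make this concrete, here is a minimal sketch, using synthetic data and scikit-learn, of how a model trained on historically biased approval labels reproduces that bias: the two groups below have identical finances and identical true repayment ability, yet the model scores them differently because the labels it learned from encode past underwriters’ decisions. The feature names and numbers are illustrative assumptions, not anyone’s production model.

```python
# Hedged sketch (synthetic data, illustrative feature names): a model trained on
# historically biased approval labels reproduces that bias for equally qualified
# applicants. Requires numpy and scikit-learn.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 20_000

# Two groups with identical finances and identical true ability to repay.
group = rng.integers(0, 2, n)            # 1 = historically disfavored group
income = rng.normal(50, 12, n)           # income in $ thousands, same distribution for both
true_repay_prob = 1 / (1 + np.exp(-(income - 50) / 10))

# Historical labels: past underwriters denied the disfavored group more often,
# so "approved" encodes the old bias rather than true repayment ability.
historical_approved = (rng.random(n) < true_repay_prob - 0.25 * group).astype(int)

# The protected attribute (or a proxy such as ZIP code) leaks into the features.
X = np.column_stack([income, group])
model = LogisticRegression().fit(X, historical_approved)

# Two applicants with identical incomes now get different approval scores.
applicants = np.array([[55.0, 0.0], [55.0, 1.0]])
print(model.predict_proba(applicants)[:, 1])   # lower score for the disfavored group
```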

The problem is exacerbated by what we in the industry call “thin files” — credit reports with limited history. This disproportionately affects young adults and recent immigrants, arguably the two groups most in need of access to credit. Their only alternative is to build credit by taking on loans, often unaffordable ones on unfavorable terms, creating a Catch-22 that can trap people in a cycle of debt.

The impact of thin files on creditworthiness is staggering. According to a recent study by LexisNexis, banks in the U.K. could be denying loans to 80% of adults with thin credit files, many of them low-risk customers. These applications, typically deemed high risk by traditional lending models, are often auto-declined through “hard cuts,” a process in which applications are eliminated for lacking specific criteria deemed necessary for approval. When those criteria are missing, the model disregards any other relevant financial information, however exhaustive, effectively shutting these individuals out of credit.
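
As an illustration of how such a rule layer works, here is a hedged sketch of a hard-cut filter placed in front of a scoring model. The field names, thresholds and stand-in scorer are assumptions for the example, not any lender’s actual criteria; the point is that an applicant with a thin file is declined before the rest of their financial data is ever evaluated.

```python
# Hedged sketch of a "hard cut" layer: applications missing a required criterion
# are auto-declined before the scoring model ever sees the applicant's other data.
# Field names, thresholds and the stand-in scorer are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Application:
    credit_history_months: Optional[int]   # None means a thin or absent file
    annual_income: float
    bank_balance: float

def hard_cut(app: Application) -> Optional[str]:
    """Return a decline reason if the application fails a hard cut, else None."""
    if app.credit_history_months is None or app.credit_history_months < 12:
        return "insufficient credit history"
    return None

def score_with_model(app: Application) -> float:
    """Stand-in for the ML scorer that would weigh thousands of data points."""
    return min(app.annual_income / 100_000 + app.bank_balance / 50_000, 1.0)

def decide(app: Application) -> str:
    reason = hard_cut(app)
    if reason:
        # Auto-decline: income, balances and cash flow are never even considered.
        return f"declined ({reason})"
    return "approved" if score_with_model(app) > 0.5 else "declined (low model score)"

# A financially healthy applicant with no credit history is still shut out.
thin_file = Application(credit_history_months=None, annual_income=70_000, bank_balance=15_000)
print(decide(thin_file))   # declined (insufficient credit history)
```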

So how do we solve this? I’d argue the future of lending models needs to account for a more holistic picture of creditworthiness. We need to diversify our data sources, implement rigorous back-testing for biases and make our models as transparent as possible. Transparency should also extend to the consumer, allowing them to understand the factors influencing their creditworthiness.
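
One simple form that bias back-testing can take is comparing outcomes across groups on a held-out set of decisions. The sketch below computes a disparate-impact ratio (the lowest group approval rate divided by the highest) and flags the model when it falls below the widely cited four-fifths benchmark; the column names and toy data are assumptions for the example.

```python
# Hedged sketch of a bias back-test: compare approval rates across groups on a
# held-out set and flag the model when the ratio falls below the common
# "four-fifths" benchmark. Column names and data are illustrative only.
import pandas as pd

def disparate_impact(results: pd.DataFrame, group_col: str, approved_col: str) -> float:
    """Ratio of the lowest group approval rate to the highest."""
    rates = results.groupby(group_col)[approved_col].mean()
    return rates.min() / rates.max()

backtest = pd.DataFrame({
    "group":    ["A", "A", "A", "B", "B", "B", "B"],
    "approved": [1,    1,   0,   1,   0,   0,   0],
})

ratio = disparate_impact(backtest, "group", "approved")
print(f"disparate impact ratio: {ratio:.2f}")   # about 0.38, well below 0.80
if ratio < 0.80:
    print("flag: review model outcomes before deployment")
```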


It shouldn’t just be about smarter algorithms. It should be about smarter, fairer and more complete data. And on some level, it’s about ensuring algorithmic accountability — and the ethical application of AI and ML in products that have a far-reaching impact on society.
