Lawsuit calls out AI hiring practices that many banks use

January 26, 2026

  • Key insight: The use of AI to screen job candidates, a common practice among banks and other companies, is under fire for silently blacklisting qualified candidates.
  • What’s at stake: Hiring transparency failures could trigger legal, reputational and compliance costs for employers.
  • Expert quote: “AI hiring must follow existing fairness, transparency laws.” —David Seligman, Towards Justice.

Erin Kistler has a computer science degree from Ohio State University, was a program manager at Microsoft for nearly six years and has 19 years’ experience working in product management, including in AI, data management systems and user experience technology.
But over the past four years, she has received automated rejections from PayPal, Microsoft, Netflix and other employers when applying for positions for which she is qualified.

“I’ve applied to hundreds of jobs, but it feels like an unseen force is stopping me from being fairly considered,” Kistler said. “It’s disheartening, and I know I’m not alone in feeling this way.”

Each time Kistler received an automated rejection, the application had been submitted through a link that included "eightfold.ai/careers." She realized the prospective employers all work with a company called Eightfold AI, which provides artificial intelligence-based recruiting and hiring services.

Kistler is one of two plaintiffs in a lawsuit that was filed last week against Eightfold in a California court. Eightfold provides AI-based background checks and credit checks. Three of its financial-industry clients are BNY, Morgan Stanley and PayPal, none of which responded to a request for comment about the lawsuit.

The case underscores the risks banks face when they let AI do some of the grunt work of filtering vast numbers of job applications.

To be sure, employers have been aware of the dangers of using AI in hiring for years. Many banks use HireVue, whose video-interviewing system uses AI to analyze responses and which has been sued multiple times for allegedly discriminating against people with disabilities. Beyond HireVue and Eightfold, AI is embedded in many of the systems companies use for recruiting, talent management and other aspects of human resources.

According to the new lawsuit, filed by the nonprofit law firm Towards Justice, Eightfold “uses hidden AI technology to collect sensitive and often inaccurate information about unsuspecting job applicants and to score them from 0 to 5 for potential employers based on their supposed ‘likelihood of success’ on the job.”

Eightfold’s technology lurks in the background of job applications, according to the complaint, “for thousands of applicants who may not even know Eightfold exists, let alone that Eightfold is collecting personal data, such as social media profiles, location data, internet and device activity, cookies and other tracking, to create a profile about the candidate’s behavior, attitudes, intelligence, aptitudes and other characteristics that applicants never included in their job application.” 

Job applicants have no meaningful opportunity to review or dispute Eightfold’s AI-generated report before it informs a decision about whether or not they get a job, the lawsuit states.

“Just because this company is using some fancy-sounding AI technology and is backed by venture capital doesn’t put it above the law. This isn’t the wild west,” said David Seligman, executive director of Towards Justice. “AI systems like Eightfold’s are making life-altering decisions about who gets a job, who gets housing, who gets health care, and we’ve got a choice to make: Are we going to let them and their investors pull the wool over our eyes and hijack our marketplace? Or are we going to make sure they follow the laws on the books and provide the most basic things, like fairness, transparency and accuracy? That’s what this case is about.”

An Eightfold spokesman said the lawsuit’s allegations are without merit. 

“Eightfold’s platform operates on data intentionally shared by candidates or provided by our customers,” he said. Eightfold does not lurk or scrape personal web history or social media to build secret dossiers, he said. “We are deeply committed to responsible AI, transparency and compliance with applicable data-protection and employment laws.”

Eightfold uses information that applicants choose to submit about their skills, experience and education, as well as data authorized by its customers, he said.

But Eightfold’s own statements suggest that the company does ingest information from across the internet, according to Rachel Dempsey, an attorney at Towards Justice. 

“Our claims are predicated on the argument that they have created this LLM that has taken all this information from various places online,” Dempsey told American Banker. “One of their big selling points is that they say they have more than a billion pieces of data that all go into the LLM that ultimately results in these evaluations.”

Job candidates do have the opportunity to view and, if necessary, correct the data Eightfold has gathered, the Eightfold spokesman said.

“This is one of many differentiators of our platform,” he said. “Eightfold supports a dedicated experience where candidates can view the resume information used by our system, correct inaccuracies and represent their skills accurately.” 

Yet according to Dempsey, the plaintiffs in this case have never been given the ability to review the data that’s causing them to be auto-rejected, or to correct any inaccuracies. 

At the heart of the lawsuit is a straightforward privacy issue, Dempsey said.

“Moving forward, I think the fear is really getting stuck in this black box where you feel like you have been blacklisted by these employment-screening companies, and you just don’t know why,” she said.

As for Kistler, at no point during the application process did she receive a disclosure providing notice that a consumer report based on her personal data would be obtained for purposes of evaluating her employment application, according to the lawsuit.

Nor did she receive a summary of her consumer protection rights, or information regarding Eightfold’s name, address, telephone number, website address and privacy practices, or other required information under federal and state law, she says. 

Kistler alleges that during the job-application process she didn't have the opportunity to opt out of having her consumer data collected and evaluated for employment purposes. Eightfold used the consumer report information it collected to score Kistler's application against other applicants' data, the lawsuit states.

“We have the right to have certain kinds of control over the information about us that’s public,” Dempsey said. “Some of it is also fundamentally an AI issue, where one of the things that Eightfold does is it puts individual people’s profiles through this LLM that makes serious predictions about them. And those predictions may not always be correct, and they may be informed by discriminatory inputs — and without any access to understanding how these decisions are being made, what this LLM is saying about people, and how it gets to that point.”

According to Towards Justice, under the Fair Credit Reporting Act the inability to see where a model's data comes from, or to correct it, is in itself a harm.

Eightfold says it uses what it calls a Match Score to determine if a job candidate is suitable for a role. 

“It is not a universal or portable score that follows a candidate across different companies or unrelated roles, and it is not generated outside the context of a role,” the Eightfold spokesman said.

Eightfold’s AI model is "assistive … but it does not reject an applicant," he said. "Importantly, the model is only one part of the system: Match Scores are driven by how employers define their job requirements and criteria, and the AI evaluates alignment against those inputs."
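
To make the spokesman's description concrete, here is a purely hypothetical sketch, not Eightfold's actual model or scoring logic: it assumes a simple weighted-overlap scheme in which the employer defines and weights a role's requirements, and the score reflects how much of that weight a candidate's stated skills cover, rescaled to the 0-to-5 range described in the complaint. All skill names and weights below are invented for illustration.

# Hypothetical illustration only -- not Eightfold's scoring logic.
# The employer defines and weights the role's requirements; the score is the
# weighted share of those requirements the candidate covers, scaled to 0-5.
def match_score(candidate_skills, role_requirements):
    total_weight = sum(role_requirements.values())
    if total_weight == 0:
        return 0.0
    covered = sum(weight for skill, weight in role_requirements.items()
                  if skill in candidate_skills)
    return round(5 * covered / total_weight, 2)

# Example: a role that weights product management most heavily.
requirements = {"product management": 3.0, "ai/ml": 2.0, "user experience": 1.0}
print(match_score({"product management", "user experience"}, requirements))  # 3.33

A scheme along these lines would make the spokesman's point visible: the same candidate gets a different score for every role, because the score exists only relative to one employer's stated requirements and weights.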

Eightfold’s model is trained on large, diverse datasets that reflect skills, experience and job-related attributes, the company spokesman said. “Like all modern AI systems, our model learns patterns from historical data, but they are explicitly designed not to replicate past bias. We apply anonymization, rigorous fairness testing, bias mitigation techniques, and continuous monitoring to ensure outcomes are independent of protected characteristics, and our technology is used to support human decision-making — not replace it. Responsible AI and fairness are foundational to how Eightfold builds and deploys its models.”

The company provides tools like candidate masking, which hides personally identifiable information such as name or gender, according to the company spokesman. “This ensures recruiters focus on candidates’ qualifications when making hiring decisions.”

The company also undergoes regular internal and external bias testing, including independent third-party bias audits such as those required by NYC Local Law 144, the spokesman said. It conducts disparate-impact analysis to measure whether the system’s results are significantly different for one group versus another. And it does equalized odds checks to validate that the model’s accuracy is consistent across different demographic groups.
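
The two checks described here have standard, generic formulations, sketched below with invented data; this is an illustration of the metrics themselves, not Eightfold's audit code. A disparate-impact analysis compares selection rates across groups (a ratio below roughly 0.8, the "four-fifths" rule of thumb, is a common red flag), and an equalized-odds check compares error rates such as the true-positive rate across groups.

# Generic sketch of disparate-impact and equalized-odds checks on made-up data.
from collections import defaultdict

def disparate_impact_ratio(records):
    # records: iterable of (group, selected_bool).
    # Returns the lowest group selection rate divided by the highest.
    counts, selected = defaultdict(int), defaultdict(int)
    for group, sel in records:
        counts[group] += 1
        selected[group] += int(sel)
    rates = {g: selected[g] / counts[g] for g in counts}
    return min(rates.values()) / max(rates.values())

def true_positive_rates(records):
    # records: iterable of (group, actually_qualified, model_selected).
    # Equalized odds asks these rates (and false-positive rates) to be similar across groups.
    positives, hits = defaultdict(int), defaultdict(int)
    for group, actual, predicted in records:
        if actual:
            positives[group] += 1
            hits[group] += int(predicted)
    return {g: hits[g] / positives[g] for g in positives}

screening = [("A", True), ("A", False), ("B", True), ("B", True), ("B", False)]
print(disparate_impact_ratio(screening))  # 0.75 -- below the 0.8 rule of thumb
outcomes = [("A", True, True), ("A", True, False), ("B", True, True), ("B", True, True)]
print(true_positive_rates(outcomes))      # {'A': 0.5, 'B': 1.0}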

Kistler didn’t want to speculate about the reason she’s received so many auto-rejections, but Dempsey noted that the plaintiffs in the case are both women who have STEM backgrounds, are qualified for the jobs they have been applying for, and keep getting denied.

“They’re both eager to get a job, eager to work, eager to share and develop their skills, and they’re just being boxed out, and they don’t know why, and I think that’s really painful,” Dempsey said. “These AI job screening tools are a big part of why.”

A Contra Costa, California, superior court will need to work out whether Eightfold has violated any laws. Meanwhile, the frustration that people feel when they are rejected for a job by an automated system is real, whether an AI model is involved or not.

As one industry observer noted, “Those in their 20s are increasingly wary of AI, deepfakes etc., and find the black box of the job-application process disheartening and dehumanizing.”
