The lawsuit, filed in California against Eightfold AI, challenges systems that automatically analyze résumés and assign candidates numerical fit scores. The software, used by hundreds of employers, aggregates data from sources such as professional social-media profiles, compares the resulting candidate profiles with job requirements, and assigns each applicant a rating on a scale from one to five. According to the plaintiffs, such systems function much like credit bureaus: they create profiles and rankings that in practice determine access to employment, while remaining opaque to the people being evaluated.
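Eightfold has not disclosed how its scores are computed; the complaint describes only the inputs and the one-to-five output. Purely to illustrate the kind of pipeline at issue, the sketch below shows a hypothetical fit scorer: it builds bag-of-words vectors from a résumé and a job description, computes their cosine similarity, and buckets the result into a 1–5 rating. Every function name and threshold here is an assumption for explanatory purposes, not Eightfold's actual method.

```python
# Hypothetical illustration only: a toy candidate-fit scorer.
# Nothing here reflects Eightfold's actual model; all names and
# thresholds are invented for the sake of explanation.
import math
import re
from collections import Counter


def _tokenize(text: str) -> Counter:
    """Lowercase bag-of-words vector for a blob of text."""
    return Counter(re.findall(r"[a-z0-9+#]+", text.lower()))


def _cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[t] * b[t] for t in a.keys() & b.keys())
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0


def fit_score(resume: str, job_description: str) -> int:
    """Map resume/job similarity onto a 1-5 rating (assumed scale)."""
    sim = _cosine(_tokenize(resume), _tokenize(job_description))
    # Assumed bucketing: five evenly spaced thresholds over [0, 1].
    return min(5, 1 + int(sim * 5))


print(fit_score("Python engineer, 8 years, distributed systems",
                "Senior Python developer for distributed systems"))
```

Even in a toy like this, a rejected candidate cannot tell from the score alone which inputs or thresholds drove the result, which is precisely the opacity the plaintiffs object to.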
The plaintiffs argue that tools of this kind should be covered by the U.S. Fair Credit Reporting Act (FCRA), a law dating back to the 1970s that requires consumer-reporting agencies to disclose what data they collect, explain their evaluation methods, and allow individuals to correct inaccurate information. In their view, algorithmic scores used in hiring decisions meet the statutory definition of a “consumer report” for employment purposes, which would obligate companies to inform applicants what data are being collected about them and how those data are processed.
One of the plaintiffs is Erin Kistler, a software engineer with years of experience in the technology sector who submitted thousands of applications over the course of a year and found that only about 0.3% of them (roughly three per thousand) resulted in interviews. Some of her applications were evaluated by Eightfold’s system. Kistler says that candidates receive no meaningful feedback on why they were rejected or which data influenced their scores, leaving them unable to correct possible errors in their professional profiles or work histories.
This is not the first lawsuit involving algorithmic hiring tools. In 2023, a class action was filed against Workday in federal court in San Francisco, accusing its screening systems of discriminating on the basis of age, disability, and race; the court later allowed the case to proceed, holding that the evidence presented was sufficient to suggest that the algorithms might have disproportionately rejected candidates based on characteristics unrelated to their qualifications.
Lawyers for the plaintiffs in the Eightfold case argue that there is no “AI exemption” in the law and that technology companies cannot evade disclosure obligations simply because decisions are made by statistical models rather than by humans. At the same time, labor-law experts note that ranking systems could be viewed as a digital version of preliminary screening by a recruiter, raising questions about whether consumer-reporting rules truly apply in this context.
The dispute is also unfolding against a shifting regulatory backdrop. In 2024, the U.S. Consumer Financial Protection Bureau concluded that profiles and scores used in hiring fall under the Fair Credit Reporting Act, but in 2025 that guidance was withdrawn by new leadership at the agency. The lawsuit against Eightfold, filed as a proposed class action, could therefore become a precedent-setting case determining whether algorithmic recruitment systems must operate under transparency rules similar to those governing credit bureaus, or whether they will remain outside that legal framework.