How to Reduce Hiring Bias Using Applicant Tracking Systems


Gauri Asopa, Content Writer

Discover how HR teams can reduce hiring bias using applicant tracking systems through independent audits, structured scorecards, blind screening, AI governance, and DEI analytics.


Your applicant tracking system is either your most powerful bias-reduction tool or your most efficient bias amplifier, and the difference comes down entirely to how it is configured and audited. Most HR teams get this wrong, not because they lack intention, but because every competing article on this topic gives them the wrong starting sequence.

As of 2025, 94% of companies use applicant tracking systems in their hiring process. Yet only 26% of job applicants trust that AI will evaluate them fairly, a gap that signals a real and growing credibility problem that HR leaders must address head-on.

This guide is written for HR professionals and talent acquisition leaders who already have an ATS and are under pressure from DEI teams, legal counsel, or their own data to make it more equitable. We will walk you through the correct sequence: audit your liability first, then configure your features, then measure continuously.

Is Your ATS Already a Liability?

This is the step every competing article skips. They tell you to turn on blind screening and add structured scorecards. But if your ATS uses AI-powered scoring, or even weighted keyword filters, you may already be in violation of employment law before you configure a single new feature.

Before touching any ATS setting, your first action should be to answer three questions:

  1. Does your ATS use AI scoring or automated ranking? If yes, map every feature that influences candidate ranking.
  2. Are you hiring anyone who lives in New York City, Illinois, or Colorado? State-specific AEDT laws may apply even if you have no office there.
  3. When did you last run a bias audit and who conducted it? Vendor self-audits do not satisfy legal requirements.

The EEOC recovered almost $700 million for discrimination victims in FY2024, highlighting the legal exposure and costs associated with biased hiring practices.

The US Legal Landscape: Which Laws Apply to Your Hiring Process

The regulatory map for AI-powered hiring is patchy but consequential. Federal enforcement has deprioritized disparate impact cases under the 2025 executive order changes, but state laws have not moved and in some cases have strengthened.

New York City Local Law 144 (LL 144)

This is the most technically detailed AEDT law in the US and the one most teams misunderstand. Three things you must know:

  • Remote-hire trigger: LL 144 applies whenever you hire a New York City resident for any role, including fully remote. You do not need a NYC office. Violations carry $500–$1,500 per day per tool.
  • Employer obligation: Compliance is your responsibility, not your vendor's. A vendor's internal audit does not satisfy LL 144; you need an independent third-party audit conducted annually.
  • Candidate rights: You must provide candidates with at least 10 business days' advance notice before using an AEDT and offer a reasonable alternative assessment on request.

Illinois HB 3773 and Colorado SB 24-205

Illinois requires written disclosure to applicants when AI is used in hiring and mandates annual demographic reporting of AI screening outcomes. Colorado's law sets algorithmic discrimination standards for high-risk AI systems, including hiring tools. Both laws remain in full force regardless of federal posture.

Title VII and the Four-Fifths Rule

Adverse impact under Title VII is still measured using the four-fifths (80%) rule: if one demographic group's selection rate is below 80% of the highest-selected group's rate, adverse impact is indicated. But the EEOC's 2023 technical assistance document clarified a critical point that most content gets wrong: the four-fifths rule is a rule of thumb, not a safe harbor. Smaller disparities can still support a finding of adverse impact, and employers remain responsible for adverse impact caused by vendor-supplied tools.
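The four-fifths calculation is simple enough to script against your own funnel data. A minimal Python sketch, using hypothetical applicant and selection counts:

```python
# Four-fifths (80%) rule check. Funnel numbers below are illustrative.
def selection_rate(selected, applicants):
    return selected / applicants

def adverse_impact_ratio(rates):
    """Compare each group's selection rate to the highest group's rate."""
    highest = max(rates.values())
    return {group: rate / highest for group, rate in rates.items()}

# Hypothetical data: group -> (applicants, selected)
funnel = {"group_a": (200, 60), "group_b": (180, 36)}
rates = {g: selection_rate(sel, apps) for g, (apps, sel) in funnel.items()}
ratios = adverse_impact_ratio(rates)

for group, ratio in ratios.items():
    flag = "ADVERSE IMPACT INDICATED" if ratio < 0.8 else "ok"
    print(f"{group}: rate {rates[group]:.0%}, impact ratio {ratio:.2f} -> {flag}")
```

Here group_b's 20% selection rate is two-thirds of group_a's 30%, below the 0.8 threshold, so the tool flags it for investigation. As the EEOC guidance notes, passing this check alone is not a safe harbor.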

Where Hiring Bias Actually Lives in an ATS Workflow

The framing that ATS systems are bias reducers is dangerously incomplete. University of Washington research (2024) found significant racial, gender, and intersectional bias in how three leading large language models ranked job applicants based on names alone. The bias is not hypothetical. It is measurable, documented, and present in the systems most organizations are running today.

Bias enters ATS workflows at four distinct stages:

Stage 1: Job Description Language

Coded language such as 'rockstar,' 'ninja,' and 'dominant' statistically deters women from applying. Poorly worded job descriptions exclude qualified candidates before the ATS processes a single resume. Use a language audit tool like Textio or Gender Decoder as a standard pre-publication step.
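As a rough in-house complement to dedicated tools, a naive coded-language scanner can flag obvious terms before a posting goes live. The word list below is illustrative, not exhaustive:

```python
import re

# Naive coded-language scanner. A sketch, not a substitute for Textio
# or Gender Decoder; the term list is illustrative only.
CODED_TERMS = {"rockstar", "ninja", "dominant", "aggressive", "guru"}

def audit_job_description(text):
    """Return coded terms found in a job description, case-insensitively."""
    words = set(re.findall(r"[a-z]+", text.lower()))
    return sorted(words & CODED_TERMS)

jd = "We need a dominant sales ninja who thrives under pressure."
print(audit_job_description(jd))  # -> ['dominant', 'ninja']
```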

Stage 2: Keyword Filters

This is the configuration failure that no competitor article addresses: keyword lists are written by recruiters, and recruiters carry assumptions. When you hard-filter for 'Stanford,' 'Goldman Sachs,' or 'Fortune 500' experience, you are not filtering for skill; you are filtering for socioeconomic access. The keyword list is a direct encoding of historical hiring bias into an automated system.
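To make the contrast concrete, here is a sketch comparing a prestige-keyword filter with a skills-based filter. The resume fields, keyword lists, and required skills are all hypothetical:

```python
# Prestige filter vs. skills filter. All field names and lists are
# hypothetical, for illustration only.
resume = {
    "schools": ["City College"],
    "employers": ["Local Fintech"],
    "skills": ["python", "sql", "financial modeling"],
}

PRESTIGE_FILTER = {"stanford", "goldman sachs"}    # encodes access, not skill
REQUIRED_SKILLS = {"python", "financial modeling"}  # encodes the job itself

def passes_prestige(r):
    tokens = {t.lower() for t in r["schools"] + r["employers"]}
    return bool(tokens & PRESTIGE_FILTER)

def passes_skills(r):
    return REQUIRED_SKILLS <= {s.lower() for s in r["skills"]}

print(passes_prestige(resume), passes_skills(resume))  # False True
```

The same qualified candidate is rejected by the prestige filter and accepted by the skills filter, which is precisely the socioeconomic-access effect described above.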

Stage 3: AI Scoring Models

AI models trained on historical hiring data inherit historical bias. Research cited by Forbes shows that AI hiring systems often inherit or exacerbate human biases embedded in training data, because models are trained on historical hiring decisions that themselves reflect decades of inequitable practices.

Stage 4: Video and Async Assessment Tools

AI-analyzed video interviews introduce additional bias vectors: lighting, accent, camera quality, and background environment all influence AI scoring in ways that correlate with socioeconomic status and ethnicity. If your ATS integrates a video screening tool, it requires its own independent bias audit, separate from the audit of the core ATS.

▶ VIDEO | Reducing Bias in the Recruitment and Hiring Process
Gary L. Davis, a talent management and diversity leader with 10+ years of experience, walks through systematic approaches to bias reduction in recruitment, directly applicable to ATS configuration and workflow design.

The Right Sequence: Compliance First, Configuration Second

Every major competitor article starts with feature configuration. That is the wrong order. Here is the correct sequence:

  1. Map your AEDT exposure: List every ATS feature that automates candidate ranking or rejection. Note which states your candidates reside in.
  2. Pull your vendor contract: Check for an audit-rights clause. Verify that your vendor will share demographic outcome data with an independent auditor. If they won't, or the contract is silent, this must be renegotiated before any compliance work begins.
  3. Commission an independent bias audit: Not a vendor self-assessment. An independent third-party audit covering all AEDT features, with intersectional demographic cross-tabs.
  4. Remediate by priority: Address highest-risk features first (AI scoring models), then mid-risk (keyword filters), then lower-risk (structured scorecards).

ATS Feature Toolkit: What to Configure and How

Once your compliance baseline is established, these are the features with the strongest evidence base for reducing bias at scale. Collaborative hiring reduces the impact of any single individual's bias and prevents groupthink by requiring multiple team members to score candidates independently. Regularly auditing ATS data is critical to ensure automated tools aren't magnifying historical human biases.

Resume Anonymization and Blind Screening

Anonymization removes name, photo, graduation year, address, and institution from resume display during initial screening. Franklin Electric reduced time-to-hire by 55% across 20 countries after implementing anonymized screening, demonstrating that equity and efficiency are not in tension.
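Conceptually, blind screening is a field-redaction step applied before a record reaches reviewers. A minimal sketch, with hypothetical field names:

```python
# Blind-screening sketch: strip identifying fields from a candidate record
# before reviewers see it. Field names are hypothetical.
IDENTIFYING_FIELDS = {"name", "photo_url", "address", "graduation_year", "institution"}

def anonymize(candidate: dict) -> dict:
    """Return a copy of the record with identifying fields removed."""
    return {k: v for k, v in candidate.items() if k not in IDENTIFYING_FIELDS}

candidate = {
    "name": "Jane Doe",
    "institution": "State University",
    "graduation_year": 2014,
    "skills": ["python", "sql"],
    "years_experience": 8,
}
print(anonymize(candidate))  # keeps only skills and years_experience
```

In a real ATS this redaction happens at the display layer, so the underlying record stays intact for later stages and for demographic auditing.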

Structured Evaluation Scorecards

Structured interview questions reduce bias more reliably than any other single intervention. Pair scorecards with effective unconscious bias training, which does more than raise awareness; it measurably reduces bias in attitudes and behaviors at work, from hiring decisions to everyday workplace interactions.

DEI Analytics and Funnel Tracking

Tracking diversity ratios at every pipeline stage is how you find where bias is operating, not just whether it is operating. Awareness training is the first step to unraveling unconscious bias because it allows employees to recognize that everyone holds biases and to identify their own.

Skills-Based Matching Configuration

Blind hiring practices are most effective when combined with structured interviews, as this approach helps prevent bias from re-entering the process at later stages, such as during face-to-face interviews. Several US states have moved toward skills-based hiring practices to expand talent pools, a signal that skills-based approaches are becoming the regulatory expectation, not just a best practice.

For specific implementation, listen to Joseph Fuller's Future of Work Podcast episode on redefining 'qualified'; it provides an evidence-based framework for restructuring ATS criteria around competencies rather than credentials.

Independent Bias Audits for Applicant Tracking Systems

An independent bias audit is a structured analysis of your ATS's demographic outcomes by a third party with no commercial relationship to your ATS vendor. This is a legal requirement under LL 144, and it is best practice under Title VII regardless of geography.

What a Compliant Audit Covers

  • Analysis of adverse impact at every automated decision point in the hiring funnel
  • Four-fifths calculations with statistical significance testing
  • Intersectional cross-tab analysis (race × sex at minimum)
  • Documentation of training data sources and model update history for AI-scored features
  • A written audit summary suitable for regulatory submission
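The intersectional cross-tab requirement above can be sketched as follows. The records and category labels are illustrative, and a real audit would also apply statistical significance testing on much larger samples:

```python
from collections import defaultdict

# Intersectional cross-tab sketch (race x sex). Records and category
# labels are hypothetical; real audits use full applicant populations.
records = [
    {"race": "A", "sex": "F", "selected": True},
    {"race": "A", "sex": "F", "selected": False},
    {"race": "A", "sex": "M", "selected": True},
    {"race": "B", "sex": "F", "selected": False},
    {"race": "B", "sex": "F", "selected": False},
    {"race": "B", "sex": "M", "selected": True},
]

counts = defaultdict(lambda: [0, 0])  # (race, sex) -> [selected, total]
for r in records:
    key = (r["race"], r["sex"])
    counts[key][0] += r["selected"]
    counts[key][1] += 1

rates = {g: sel / total for g, (sel, total) in counts.items()}
highest = max(rates.values())
for group, rate in sorted(rates.items()):
    print(group, f"rate={rate:.2f}", f"impact_ratio={rate / highest:.2f}")
```

Note how the intersectional view can expose a disparity (here, for one race-by-sex cell) that single-axis race or sex totals would average away.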

The Vendor Data-Sharing Problem

This is a real and underreported operational risk: some ATS vendors refuse to share the demographic outcome data needed to conduct a bias audit. Before signing or renewing any ATS contract, negotiate an explicit audit-rights clause that requires the vendor to provide disaggregated demographic outcome data to any auditor you authorize.

Audit Costs and Timeline

Independent audits for mid-size ATS implementations typically range from $5,000 to $25,000 depending on the number of AEDT features, data volume, and the depth of intersectional analysis required. Budget for annual re-audits; LL 144 requires them, and best practice under Title VII supports them.

What to Measure: Applicant Tracking Metrics for Fair Hiring

Reporting that your bias-reduction program is 'in place' is not the same as proving it is working. These are the metrics that demonstrate actual progress toward unbiased hiring:

Pipeline Diversity Ratios by Stage

Track the demographic composition of your candidate pipeline at application, screen, interview, offer, and hire stages. The goal is not identical numbers at every stage; it is understanding where the funnel narrows disproportionately, and why.
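Stage-to-stage pass-through rates per group show exactly where the funnel narrows. A sketch with illustrative counts:

```python
# Pipeline pass-through rates per group, stage by stage. Counts are
# illustrative; stage names follow the funnel described above.
stages = ["application", "screen", "interview", "offer", "hire"]
counts = {
    "application": {"group_a": 500, "group_b": 500},
    "screen":      {"group_a": 300, "group_b": 200},
    "interview":   {"group_a": 120, "group_b": 60},
    "offer":       {"group_a": 30,  "group_b": 12},
    "hire":        {"group_a": 25,  "group_b": 10},
}

for prev, curr in zip(stages, stages[1:]):
    rates = {g: counts[curr][g] / counts[prev][g] for g in counts[prev]}
    print(f"{prev} -> {curr}: " +
          ", ".join(f"{g} {r:.0%}" for g, r in rates.items()))
```

In this hypothetical data, both groups start equal at application, but group_b's pass-through drops sharply at the screen and interview stages, pinpointing where to investigate first.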

Time-to-Hire by Demographic Group

Disparate time-to-hire is a documented form of process bias. If candidates from certain groups consistently wait longer between stages, that gap requires investigation even if final selection rates appear balanced.
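A simple way to surface this gap is to compare median days-to-offer per group. The records below are illustrative:

```python
from statistics import median

# Time-to-hire gap sketch: median days from application to offer per
# group. Records are illustrative.
records = [
    {"group": "group_a", "days_to_offer": 18},
    {"group": "group_a", "days_to_offer": 22},
    {"group": "group_a", "days_to_offer": 20},
    {"group": "group_b", "days_to_offer": 29},
    {"group": "group_b", "days_to_offer": 35},
    {"group": "group_b", "days_to_offer": 31},
]

by_group = {}
for r in records:
    by_group.setdefault(r["group"], []).append(r["days_to_offer"])

medians = {g: median(days) for g, days in by_group.items()}
gap = max(medians.values()) - min(medians.values())
print(medians, f"gap={gap} days")  # a persistent gap warrants investigation
```

Medians are preferable to means here because a few stalled requisitions can skew averages without reflecting systematic process bias.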

Interview Pass Rates by Interviewer

Structured scorecards generate data. Analyzing scorecard ratings by interviewer surfaces individual raters whose scores diverge significantly from panel consensus, a signal of either bias or a calibration gap that training can address.
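A minimal calibration check averages each interviewer's deviation from panel consensus. The scores and flag threshold below are illustrative:

```python
from statistics import mean

# Interviewer calibration sketch: flag raters whose average deviation
# from panel consensus exceeds a threshold. Scores (1-5 scale),
# interviewer names, and the threshold are illustrative.
scores = {  # candidate -> {interviewer: score}
    "c1": {"ivy": 4, "raj": 4, "sam": 2},
    "c2": {"ivy": 5, "raj": 4, "sam": 2},
    "c3": {"ivy": 3, "raj": 3, "sam": 1},
}

deviations = {}
for candidate, panel in scores.items():
    consensus = mean(panel.values())
    for interviewer, score in panel.items():
        deviations.setdefault(interviewer, []).append(score - consensus)

THRESHOLD = 1.0
for interviewer, devs in deviations.items():
    avg_dev = mean(devs)
    flag = "review calibration" if abs(avg_dev) > THRESHOLD else "ok"
    print(f"{interviewer}: avg deviation {avg_dev:+.2f} -> {flag}")
```

A consistently large deviation does not prove bias on its own; it identifies which interviewer to examine and, if needed, retrain.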

Post-Hire Performance and Retention by Source

If your bias-reduction measures are working, you should see improved quality-of-hire across demographic groups over time, not just more diverse hiring. Improved post-hire performance demonstrates that equitable process and quality-of-hire are reinforcing, not competing, goals.

Conclusion

Reducing bias using applicant tracking systems is not a feature toggle. It is a compliance discipline, a measurement practice, and a change management challenge that runs across your entire hiring infrastructure. The organizations getting this right share a common approach: they audit before they configure, treat keyword lists as bias sources rather than neutral filters, require independent third-party audits rather than accepting vendor self-assessments, and build monitoring into their standard operating calendar. That discipline is a competitive advantage in talent acquisition.

Start with the audit. Map your legal exposure. Then build the configuration, monitoring, and measurement infrastructure that makes bias reduction a durable process, not a one-time project.

Frequently Asked Questions

How to measure bias reduction success in ATS?

Bias reduction success in an ATS can be measured through diversity hiring metrics, structured interview-to-selection ratios, and candidate progression rates across different demographic groups. Companies should also track time-to-hire fairness, adverse impact ratios, and candidate feedback scores.

How does bias occur in applicant tracking systems?

Bias in ATS platforms often occurs through historical hiring data, keyword filtering, and poorly configured screening rules. If past hiring patterns favored certain backgrounds, the system may unintentionally prioritize similar candidates. Bias can also emerge from resume parsing limitations, job description language, and recruiter-driven filtering criteria.

What ATS features help reduce hiring bias?

Modern ATS platforms reduce hiring bias through blind hiring tools, AI-driven skill matching, structured interview workflows, and diversity analytics dashboards. Features like anonymized resume review, standardized scorecards, and inclusive job description recommendations help recruiters focus more on qualifications and competencies rather than personal identifiers.

How to configure ATS settings to minimize bias?

Organizations should configure ATS systems to prioritize skills-based screening instead of demographic or background-based filters. Removing identifiers such as name, gender, age, and photo during initial screening can improve fairness. Recruiters should also standardize evaluation criteria, use work sample tests, enable structured interview scoring, and regularly review AI recommendations for consistency.

Which ATS platforms are best for reducing bias?

ATS platforms with strong DEI capabilities typically include blind screening, AI fairness monitoring, structured assessments, and diversity reporting tools. Solutions like Greenhouse, Lever, Workday, and Zimyo offer features that support more inclusive and data-driven hiring practices.

How to audit your ATS for unconscious bias?

An ATS bias audit should include reviewing hiring data, screening rules, AI recommendations, and candidate drop-off patterns across demographics. Companies should test whether certain groups are disproportionately filtered out at specific hiring stages.


Gauri Asopa

Senior Marketing Executive at Zimyo


I believe great content isn't just written — it's felt. As a Senior Marketing Executive at Zimyo, I craft stories around HR tech, payroll, compliance, and modern workplace trends. Whether it's a blog, brand campaign, or email sequence, I love turning complex ideas into clear, engaging narratives. My journey has always been rooted in curiosity — about people, patterns, and what makes a message truly stick. When I'm not writing, I'm curating mood boards, collecting new books, or getting lost in lofi playlists and timeless aesthetics.
