
What Is Intersectional Bias in AI Hiring?

Intersectional bias occurs when discrimination affects people at the intersection of multiple protected characteristics more severely than any single characteristic alone.

**Example:** An AI hiring tool might show no bias against women overall and no bias against Black candidates overall, but still discriminate against Black women specifically. The bias only becomes visible when you analyze the intersection of gender AND race.
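This hidden-intersection effect can be reproduced with a few lines of arithmetic. The numbers below are invented purely for illustration (they are not OnHirely data): each single-axis comparison passes the four-fifths rule, yet the Black-women cell fails it.

```python
# Illustrative numbers only: (gender, ethnicity) -> (applicants, hired).
data = {
    ("man",   "white"): (100, 50),
    ("woman", "white"): (100, 52),
    ("man",   "black"): (100, 52),
    ("woman", "black"): (20, 7),
}

def rate(groups):
    """Aggregate selection rate over a set of intersection cells."""
    n = sum(data[g][0] for g in groups)
    hired = sum(data[g][1] for g in groups)
    return hired / n

women = rate([g for g in data if g[0] == "woman"])   # 59/120 ~ 0.492
men   = rate([g for g in data if g[0] == "man"])     # 102/200 = 0.510
black = rate([g for g in data if g[1] == "black"])   # 59/120 ~ 0.492
white = rate([g for g in data if g[1] == "white"])   # 102/200 = 0.510

# Both single-axis four-fifths checks pass (ratio >= 0.8):
print(women / men)    # ~ 0.964
print(black / white)  # ~ 0.964

# But the Black-women intersection fails against the best-performing cell:
best = max(hired / n for n, hired in data.values())   # 0.52
print(data[("woman", "black")][1] / 20 / best)        # ~ 0.673 < 0.8
```

Because the disadvantaged cell is small, it barely moves either marginal rate, which is exactly why single-axis audits miss it.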

**Why it matters:**

  • Single-axis analysis misses critical discrimination patterns
  • Intersectional groups (e.g., older women of color) face compounded disadvantages
  • Courts increasingly recognize intersectional discrimination claims

**How OnHirely detects it:**

  • Cross-tabulates gender × ethnicity × age combinations
  • Runs four-fifths rule analysis on each intersection
  • Applies Fisher's exact test for small sample sizes
  • Flags intersectional patterns that single-axis analysis misses
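The detection steps above can be sketched roughly as follows. The cohort numbers, function names, and thresholds are illustrative assumptions, not OnHirely's actual implementation; Fisher's exact test is computed one-sided from the hypergeometric distribution using only the standard library.

```python
from math import comb

# Hypothetical cohort: (gender, ethnicity) -> (applicants, selected).
COHORT = {
    ("woman", "black"): (18, 5),
    ("woman", "white"): (90, 46),
    ("man",   "black"): (85, 44),
    ("man",   "white"): (95, 50),
}

def fisher_exact_less(a, b, c, d):
    """One-sided Fisher's exact test via the hypergeometric distribution:
    P(group-A selections <= a) with the 2x2 table's margins held fixed."""
    N, K, n = a + b + c + d, a + c, a + b   # total, total selected, group-A size
    lo = max(0, n - (N - K))
    return sum(comb(K, k) * comb(N - K, n - k)
               for k in range(lo, a + 1)) / comb(N, n)

def audit(cohort, ratio_threshold=0.8, small_n=30):
    """Four-fifths rule on every intersection, with an exact test for small cells."""
    rates = {g: sel / n for g, (n, sel) in cohort.items()}
    ref = max(rates, key=rates.get)          # highest-selection-rate group
    findings = []
    for group, (n, sel) in cohort.items():
        if group == ref:
            continue
        ratio = rates[group] / rates[ref]
        if ratio >= ratio_threshold:
            continue                         # passes the four-fifths rule
        p = None
        if n < small_n:                      # small cell: back the flag with Fisher
            rn, rsel = cohort[ref]
            p = fisher_exact_less(sel, n - sel, rsel, rn - rsel)
        findings.append((group, round(ratio, 3), p))
    return ref, findings

ref, findings = audit(COHORT)
print(ref, findings)
```

With these toy numbers only the (woman, black) cell is flagged, and because it has fewer than 30 applicants the flag is accompanied by an exact p-value. The test is one-sided because adverse-impact analysis asks specifically whether the group is selected *less* often than the reference.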

OnHirely Pro plan includes full intersectional analysis — a critical capability for comprehensive bias detection.


Still Have Questions?

Start a free audit and see how OnHirely makes AI hiring compliance simple.
