What Is California AB 331?
California Assembly Bill 331 establishes requirements for deployers and developers of automated decision tools (ADTs) used in consequential decisions, including employment. It is one of the most detailed state-level frameworks for AI governance in hiring and reflects California's broader push toward AI accountability.
AB 331 differs from NYC's LL144 in two key ways: it requires impact assessments rather than bias audits alone, and it places obligations on both deployers (employers) and developers (vendors).
Who Must Comply?
AB 331 applies to:
- Deployers: Any entity that uses an automated decision tool to make or assist with consequential decisions. In the hiring context, this means any California employer using AI in employment decisions
- Developers: Any entity that creates, codes, or substantially modifies an automated decision tool
If you use AI to screen resumes, rank candidates, assess skills, analyze interviews, or make any other hiring-related decisions in California, you are likely a deployer under AB 331.
The AB 331 Compliance Checklist
Impact Assessments
- [ ] Identify all automated decision tools used in employment decisions
- [ ] Conduct an impact assessment for each tool before deployment and annually thereafter
- [ ] Document the purpose and intended use of each tool
- [ ] Analyze the tool's outputs for differential impact across protected categories including race, color, ethnicity, sex, religion, age, national origin, disability, veteran status, and genetic information
- [ ] Assess the tool's validity — is it actually measuring what it claims to measure for the employment decision at hand?
- [ ] Evaluate alternatives — are there less discriminatory alternatives that achieve the same business purpose?
- [ ] Document safeguards in place to mitigate identified risks
- [ ] Record the assessment date and maintain assessment records for at least three years
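The differential-impact item above is the most readily quantifiable step. A minimal sketch of one common approach, comparing each group's selection rate to the highest group's rate (the "four-fifths rule" style of impact ratio); the group names and counts are illustrative, not real applicant data:

```python
# Hypothetical sketch: selection-rate impact ratios across protected
# categories, one quantitative input to an impact assessment.

def impact_ratios(selections: dict[str, tuple[int, int]]) -> dict[str, float]:
    """selections maps group -> (selected, total). Returns each group's
    selection rate divided by the highest group's rate."""
    rates = {g: sel / tot for g, (sel, tot) in selections.items() if tot > 0}
    top = max(rates.values())
    return {g: round(r / top, 3) for g, r in rates.items()}

data = {
    "group_a": (48, 120),  # 40% selection rate
    "group_b": (30, 100),  # 30% selection rate
}
print(impact_ratios(data))  # {'group_a': 1.0, 'group_b': 0.75}
```

A ratio well below 1.0 (conventionally, below 0.8) is a common trigger for the "evaluate alternatives" and "document safeguards" items that follow.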
Notice Requirements
- [ ] Pre-use notice: Inform candidates that an automated decision tool will be used
- [ ] Purpose disclosure: Explain what the tool evaluates and how it influences the decision
- [ ] Opt-out information: Provide instructions for how candidates can request an alternative process
- [ ] Contact information: Include details for how candidates can request more information or contest a decision
- [ ] Accommodate requests for alternative evaluation processes in a timely manner
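The notice elements above lend themselves to a simple completeness check before any notice goes out. A sketch only; the field names are our assumptions, not statutory language:

```python
# Illustrative record capturing the four notice elements listed above,
# with a completeness check. Field names are assumptions, not statute text.
from dataclasses import dataclass, fields

@dataclass
class ADTNotice:
    tool_name: str
    purpose: str               # what the tool evaluates and how it is used
    opt_out_instructions: str  # how to request an alternative process
    contact_info: str          # where to ask questions or contest a decision

def notice_is_complete(notice: ADTNotice) -> bool:
    # Every element must be non-empty before the notice is sent.
    return all(getattr(notice, f.name).strip() for f in fields(notice))

n = ADTNotice("Resume Screener",
              "Ranks resumes against posted job requirements",
              "Email hr@example.com to request human review",
              "hr@example.com")
print(notice_is_complete(n))  # True
```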
Data and Privacy
- [ ] Minimize data collection to what is reasonably necessary for the tool's stated purpose
- [ ] Disclose data sources used by the tool
- [ ] Establish data retention policies and document them
- [ ] Ensure data security for all candidate information processed by the tool
- [ ] Provide data access — candidates should be able to request their data
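A documented retention policy is easiest to enforce mechanically. A hedged sketch that flags records older than a stated window; the three-year window below mirrors the assessment record-keeping item earlier, but your own policy may differ:

```python
# Sketch: flag candidate records that exceed a documented retention window.
# The window length here is an assumption for illustration.
from datetime import date, timedelta

RETENTION = timedelta(days=3 * 365)  # example policy: roughly three years

def expired(record_date: date, today: date) -> bool:
    return today - record_date > RETENTION

print(expired(date(2020, 1, 1), date(2024, 1, 1)))  # True
print(expired(date(2023, 6, 1), date(2024, 1, 1)))  # False
```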
Ongoing Monitoring
- [ ] Establish continuous monitoring processes for deployed tools
- [ ] Track outcome metrics across demographic groups over time
- [ ] Document any corrective actions taken when disparities are identified
- [ ] Reassess after significant changes to the tool, data sources, or applicant population
- [ ] Maintain audit trail of all monitoring activities and findings
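Tracking outcome metrics over time raises the question of when an observed gap is worth escalating. One common test, sketched here with stdlib-only code and illustrative numbers, is a two-proportion z-test on selection rates between two groups:

```python
# Sketch of one monitoring check: a two-sided, two-proportion z-test asking
# whether a selection-rate gap between groups is statistically significant.
# Counts are illustrative, not real applicant data.
import math

def two_proportion_p_value(sel_a: int, n_a: int, sel_b: int, n_b: int) -> float:
    p_a, p_b = sel_a / n_a, sel_b / n_b
    pooled = (sel_a + sel_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    return math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value

p = two_proportion_p_value(48, 120, 30, 100)
print(f"p = {p:.3f}")  # e.g. flag the tool for review if p < 0.05
```

Statistical significance is one input; a disparity can warrant corrective action even when a small sample keeps the p-value above the threshold.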
Governance and Accountability
- [ ] Designate a responsible officer or team for ADT compliance
- [ ] Develop internal policies governing the use of automated decision tools
- [ ] Train relevant staff on AB 331 requirements and internal policies
- [ ] Establish an escalation process for flagged tools or disputed decisions
- [ ] Document vendor due diligence for third-party AI tools
Key Differences from NYC LL144
Understanding how AB 331 differs from LL144 helps employers operating in both jurisdictions:
| Requirement | NYC LL144 | California AB 331 |
|---|---|---|
| Focus | Bias audit (impact ratios) | Impact assessment (broader scope) |
| Protected categories | Race/ethnicity, sex | All protected categories |
| Developer obligations | None specified | Yes — developers have obligations |
| Validity assessment | Not required | Required |
| Less discriminatory alternatives | Not required | Must evaluate |
| Notice timing | 10 business days before use | Before use (timing flexible) |
| Penalties | $500-$1,500 per violation | Civil penalties, private right of action |
Common Mistakes to Avoid
- Treating AB 331 like LL144 — AB 331 requires broader impact assessments, not just statistical bias audits
- Ignoring developer obligations — if you build or significantly customize AI hiring tools, you have developer obligations too
- Skipping validity assessment — AB 331 requires you to demonstrate that the tool actually measures job-relevant qualities, not merely that it operates without bias
- Inadequate documentation — AB 331's documentation requirements are extensive. Incomplete records create enforcement risk
- One-time compliance — AB 331 requires ongoing monitoring, not just an annual snapshot
How OnHirely Helps with AB 331
OnHirely's platform addresses the core analytical requirements of AB 331. The bias audit functionality calculates impact ratios across all protected categories — not just race and sex. The statistical testing layer determines whether observed disparities are significant. And the generated reports provide structured documentation that supports your impact assessment process. While OnHirely handles the quantitative analysis, employers should pair it with qualitative assessments of validity, alternatives, and safeguards to achieve full AB 331 compliance.