AI in Recruiting and Employment Decision-Making
- Paul Peter Nicolai

- Jan 13
- 4 min read
ED NOTE: Although what is described here applies only to California employers, the concerns it addresses are widespread. Employers should anticipate regulation in other states soon.
The use of artificial intelligence (AI) in employment decision-making is here, reshaping how employers find, assess, and promote talent. As employer use of AI has increased, so has the development of AI regulation at the state and local level, including in California.
California took several steps in 2025 to regulate the development and use of AI in employment, ensuring that California employers’ use of AI tools is free from discrimination and bias.
On October 1, 2025, the California Civil Rights Council’s (CRC) Employment Regulations Regarding Automated-Decision Systems took effect (CRC Regulations) under the Fair Employment and Housing Act (FEHA). Every California employer covered by FEHA must practice algorithmic accountability when using Automated Decision Systems (ADS) and AI in employment decisions.
The purpose of the CRC Regulations is to ensure that innovation promotes fairness and equity, rather than undermining them. An AI tool’s efficiency should not replace human oversight, judgment, and analysis. According to the CRC Regulations, human involvement is necessary not only to understand how the tool affects a candidate or employee’s opportunities but also to decide when and how to intervene when an ADS is used.
Automated Decision System
Under the CRC Regulations, an ADS is "a computational process that makes a decision or facilitates human decision-making regarding an employment benefit…derived from or using artificial intelligence, machine learning, algorithms, statistics, or other data processing techniques."
The Regulation’s definition of an Artificial Intelligence System is similarly broad—any machine-based system that infers, from the input it receives, how to generate outputs, whether those outputs are predictions, recommendations, or decisions.
That scope captures most of the AI-based technology now shaping employment decisions, like:
Resume filters that rank or score candidates;
Online assessments measuring aptitude, personality, or “fit;”
Algorithms targeting specific audiences for job postings;
Video-interview analytics evaluating tone, word choice, or expression; and
Predictive tools drawing on third-party data.
If a tool influences an employment outcome, directly or indirectly, it likely qualifies as an ADS under the CRC Regulations.
Key Compliance Duties and Risks
The CRC Regulations establish a framework that blends civil rights principles with technical oversight. Employers must now take the following steps when implementing ADS and Artificial Intelligence Systems:
Prevent Discrimination: It is unlawful to use any ADS or selection criteria that create a disparate impact on a protected class under FEHA. The liability analysis does not stop with the question of intent; impact must also be considered.
Conduct Bias Testing and Audits: ADS tools must undergo anti-bias testing or independent audits that are timely, repeatable, and transparent. A single validation at launch is not enough and will not establish that reasonable measures were taken. Fairness checks must be integrated into regular, systematic maintenance practices.
Provide Notice and Transparency: Applicants and employees must receive pre-use and post-use notices explaining when and how ADS tools are used, what rights they have to opt out, and how to appeal or request human review.
Assert an Affirmative Defense Through Good-Faith Efforts: Employers facing claims under FEHA may defend themselves by showing reasonable, well-documented anti-bias measures, including but not limited to audits, corrective actions, and continuous oversight. But that defense is only as strong as the evidence supporting it.
Assume Responsibility for Vendors and Agents: Employers cannot outsource accountability. Bias introduced by a vendor or third-party platform remains the employer’s legal and ethical burden.
Retain Records for Four Years: FEHA now requires retention of ADS-related documentation for at least four years, including but not limited to data inputs, outputs, decision criteria, audit results, and correspondence.
Through these requirements, the CRC clarifies that automation in decision-making is not banned, but employers must act responsibly when using such tools.
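To make the bias-testing duty concrete, the sketch below computes the adverse impact ratio (the "four-fifths rule") familiar from EEOC selection guidance. This is a hypothetical illustration only: the CRC Regulations do not prescribe this particular metric, and the applicant counts shown are invented.

```python
# Hypothetical sketch of one common bias-testing metric: the adverse impact
# ratio ("four-fifths rule") from EEOC guidance. The CRC Regulations do not
# mandate this specific test; it is shown only as an example of a repeatable,
# transparent fairness check.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of a group's applicants that the ADS advanced."""
    return selected / applicants

def adverse_impact_ratio(group_rate: float, highest_rate: float) -> float:
    """Ratio of one group's selection rate to the highest group's rate."""
    return group_rate / highest_rate

# Invented example numbers: an ADS advanced 30 of 100 applicants in group A
# and 48 of 100 applicants in group B.
rate_a = selection_rate(30, 100)              # 0.30
rate_b = selection_rate(48, 100)              # 0.48
ratio = adverse_impact_ratio(rate_a, rate_b)  # 0.625

# A ratio below 0.8 is the conventional red flag for disparate impact.
flagged = ratio < 0.8
```

Run regularly (not just at launch), a check like this produces the kind of dated, repeatable audit record the regulations contemplate.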
Practicing Algorithmic Accountability
At the core of its framework, the CRC Regulations emphasize algorithmic accountability. Algorithmic accountability means technology must be combined with human judgment. Employers cannot claim ignorance of how an algorithm functions or what data an AI tool uses. Under the CRC Regulations, an employer that deploys AI without understanding its foundation and logic now risks negligence and potential liability.
The CRC emphasizes the need to keep human input in decision-making despite AI use. At a minimum, employers must include a human element in the employment decision process to comply with the CRC Regulations. Accountability involves transparency in procedures, traceability of data, and intervention if fairness is at risk. It means working with AI and making use of its strengths while maintaining ethical, legal, and managerial responsibilities.
Best Practices
To comply with the CRC Regulations, facilitate a culture of algorithmic accountability, and reduce risk, employers should consider:
Invest in Education and Awareness - Empower human resources and leadership teams with a foundational understanding of ADS, its potential, its blind spots, and the social dynamics it can amplify. Oversight begins with literacy.
Engage Independent Auditors - External bias audits and model validations deliver credibility and objectivity. They also bolster an employer’s affirmative defense by showing due diligence.
Adopt Continuous Review and Monitoring - Bias is not a linear risk, and it can change as data, users, and markets evolve. Regular audits, outcome monitoring, and feedback loops should become part of daily governance. Consult with outside counsel to establish an appropriate cadence for audit-related protocols.
Institutionalize Documentation - Implement systems to capture, retain, and preserve ADS-related records, including but not limited to inputs, model parameters, audit logs, and decisions. These records must be kept for at least four years.
Maintain Human Oversight - Employers should create decision processes that encourage human involvement, review, challenge, correction, and intervention.