OPINION | Can a job seeker claim discrimination if the recruiter used AI?

A recent landmark case in California – where a job applicant alleged that bias in AI had cost him more than 100 jobs – raised questions that are likely to arise in South Africa as well, says Preeta Bhagattjee.


A federal judge in California recently heard a claim for workplace discrimination involving AI. While the case turned on specific and quite novel circumstances, such disputes are likely to become more common and may, in future, reach the South African workplace as well.

The California court heard a proposed class action lawsuit brought by a job applicant (Derek Mobley) against Workday Inc., a U.S. provider of on-demand financial management, human capital management, and student information systems software. The lawsuit alleged that Workday’s AI-powered recruiting software perpetuates existing biases against job applicants based on race, age, and disability.

Mobley claimed that he was overlooked for more than a hundred jobs because of these prejudices.

In a landmark ruling, the California court held that Workday could qualify as an employer under federal anti-discrimination laws because it performed screening functions normally carried out by its clients. Workday did not screen Mobley for direct employment with Workday itself; rather, the results of its AI-powered recruiting software determined whether or not to present Mobley as a candidate to its clients, who were seeking candidates for employment. In that sense, Workday was acting in the traditional role of a recruiting firm.

While the California court dismissed claims of intentional discrimination and ruled that Workday is not a “staffing agency,” it maintained that the company’s AI tools could subject it to discrimination liability since they perform critical hiring functions. The court noted that Workday’s customers delegated their traditional hiring functions, including rejecting applicants, to the algorithmic decision-making tools Workday provided.

Could this also happen in South Africa?

This finding may have implications for South African employers who use service providers such as Workday, either when they engage the service provider directly as a recruiter for an open position or when the employer uses software, tools or platforms purchased or licensed from the service provider to identify or screen candidates for a job.

This is because workers in South Africa are protected from unfair discrimination under the Employment Equity Act (EEA). The Act provides that no person may unfairly discriminate, directly or indirectly, against any worker in any employment policy or practice on any ground, including race, sex, gender, pregnancy, marital status, family responsibilities, ethnic or social origin, colour, sexual orientation, age, disability, religion, HIV status, conscience, belief, political opinion, culture, language, birth or any other ground whatsoever.

The EEA extends this protection to applicants for employment.

Given these protections against unfair discrimination, already enshrined in South African employment law and practice, and the broad scope of the grounds they cover, a South African employer that engages a third party such as Workday as a recruiter, or that itself uses software, tools or platforms purchased or licensed from such a third party to identify or screen candidates, may in future be exposed to claims of unfair discrimination.

Where the employer engages a recruitment agency, a court assessing whether the agency's use of AI or algorithmic decision-making tools results in unfair discrimination against an individual may take into account that the EEA prohibits any person, not only an employer, from unfairly discriminating, including against an applicant for a job with the recruiter's client.

How do such claims work?

Such a claim could potentially be brought against the prospective employer, who could theoretically be held liable under South African law based on the actions of the recruiter as its agent, or on the prospective employer’s reliance on or use of the discriminatory results of the algorithmic decision-making tools.

Additionally, if the prospective employer uses software, tools, or platforms provided by a third party such as Workday, and those tools or processes produce discriminatory outcomes, not only may the prospective employer face legal claims, but it is also possible that the provider of the algorithmic decision-making tools may itself face legal claims, as the developer of the tool. This is aligned with the legislative landscape taking shape in various countries around the world to manage the potential harms that can arise from the development and use of AI systems, where developers and implementers of AI systems can be held liable for, among other things, biases in their AI tools.

For example, the EU AI Act classifies AI systems intended for recruitment or selection of employees and monitoring and evaluating employee performance as high-risk AI. AI systems that use automated decision-making to profile individuals in, among other things, a work environment are also high-risk AI.

This category of AI is subject to strict obligations under the legislation. These include requirements governing the quality of the data used to train and operate the AI tool, as well as measures to prevent bias, ensure accuracy, address cybersecurity risks and provide transparency about how the AI tool operates. In addition, there must be human oversight to ensure consistency, prevent harm and ensure responsible use.

Algorithmic decision-making tools may also need to comply with specific South African laws, including the Protection of Personal Information Act (POPIA). Under POPIA, as a general rule, a person may not be subjected to a decision that has legal consequences for them and that is based solely on the automated processing of personal information intended to provide a profile of that person.

While the decisions of the courts of a foreign jurisdiction are not binding on SA courts, foreign case law may be considered and found persuasive where a new point of law needs to be considered. As such, the approach of the California court in Workday should be noted.

Employers in South Africa should be aware that they could inadvertently become involved in claims of unfair discrimination by employees or applicants when recruiters use AI. Recruiters themselves may face similar claims.

Not only must algorithmic decision-making tools be carefully calibrated to remove biases that could constitute grounds for discrimination and to ensure compliance with all applicable laws and regulations, but the contracts for services between the parties must also be carefully drafted to provide commercial protection should such legal risks arise.

Preeta Bhagattjee is Director and Head of Technology & Innovation, and Bradley Workman-Davies is a Director, at Werksmans Attorneys.

News24 encourages freedom of speech and the expression of diverse views. The views of columnists published on News24 are therefore their own and do not necessarily represent the views of News24.
