[authors: Bre Timko and Dave Schmidt, DCI Consulting]
Artificial intelligence (AI) continues to revolutionize many industries, and the employment space is no exception. According to the Society for Human Resource Management (SHRM), almost a quarter of organizations leverage automation or artificial intelligence for HR-related activities. Of these organizations, the majority (79%) specifically apply AI to selection (e.g., recruitment and hiring processes). Artificial intelligence has many uses in employee selection and recruitment, including automatically sourcing applicants, reviewing or screening applicant resumes, pre-selecting applicants for interviews, administering or scoring skills assessments, conducting video interviews, and administering and scoring game-based assessments (SHRM, 2022).
Although the use of artificial intelligence in employee selection is increasing, it is not without concerns and criticism. For example, AI tools may face backlash if the algorithms used lack transparency and explainability. In other words, when AI systems are complex and difficult to understand, it is difficult for candidates and recruiters (and sometimes even algorithm developers) to understand why certain decisions are made (Ravi, 2023). A lack of algorithmic transparency may reduce trust in the recruitment or hiring process, thereby increasing concerns about fairness and accountability (Ravi, 2023). Concerns about the use of artificial intelligence in employee selection have led to a significant increase in regulation in this area. Most recently, New York City Local Law 144 went into effect on July 5, 2023. The law applies when an automated employment decision tool (AEDT) is used in hiring or promotion decisions for individuals who work, at least part-time, in a New York City office; who work fully remotely but whose position is associated with a New York City office; or who are screened with an AEDT by an employment agency located in New York City.
Local Law 144 requires an independent auditor to conduct an annual bias audit, the results of which must be publicly posted on the employer’s website. The law also requires that applicants be notified1 of the organization’s use of an AEDT in the selection process, including the job qualifications and characteristics assessed by the AEDT, the types of data used, the sources of such data, the organization’s data retention policy, and instructions on how applicants can request an alternative selection process or accommodation.2 Notably, the City has defined AEDT in a way that goes beyond artificial intelligence methods such as machine learning and natural language processing. The law applies not only to selection tools that use these advanced technologies, but also to many existing tools that rely on complex algorithms, all of which must comply with the law.
New York City’s law is just the tip of the iceberg when it comes to regulating the use of artificial intelligence in employee selection; we will likely continue to see new state and local laws proposed and enacted on this issue for the foreseeable future. Several such laws are in effect or under consideration. DCI’s State Legislation Tracker provides an overview of each piece of legislation. In addition to Local Law 144, three other laws in this area are currently in effect. These laws either require applicant consent to use AI-based technology (i.e., the Illinois Artificial Intelligence Video Interview Act and Maryland House Bill 1202) or require an inventory of AI technologies used by state agencies (i.e., Connecticut Senate Bill 1103).
Laws currently in the proposed stage (six in total as of this writing) may place an even higher burden on organizations. Each of these proposed AEDT-focused laws references the need for algorithmic explainability and transparency, as well as applicant notification requirements. Some also include adverse impact analysis (i.e., a “bias audit”), consideration of job relatedness (i.e., validity), provision of alternatives, and data privacy and/or data retention provisions.
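At their core, the adverse impact analyses these laws call for compare selection rates across demographic groups. As a rough illustration only, and not the methodology prescribed by any particular statute, the widely used "four-fifths rule" of thumb can be sketched as follows (the function name and applicant counts are hypothetical):

```python
# Illustrative sketch of the selection-rate comparison at the heart of an
# adverse impact analysis ("bias audit"). All data below is hypothetical.

def impact_ratios(selected, applicants):
    """Return each group's selection rate divided by the highest group's rate."""
    rates = {g: selected[g] / applicants[g] for g in applicants}
    top = max(rates.values())
    return {g: rates[g] / top for g in rates}

# Hypothetical counts of applicants screened by an AEDT, by demographic group.
applicants = {"group_a": 200, "group_b": 150}
selected = {"group_a": 80, "group_b": 45}

ratios = impact_ratios(selected, applicants)
for group, ratio in ratios.items():
    # Under the four-fifths rule of thumb, a ratio below 0.80 warrants scrutiny.
    print(group, round(ratio, 2), "review" if ratio < 0.8 else "ok")
```

Here group_b's selection rate (30%) is 75% of group_a's (40%), so it falls below the four-fifths threshold. Real bias audits (e.g., under Local Law 144) define the required calculations and categories in regulation, so this sketch is only a conceptual starting point.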
Below, we take a closer look at two proposed laws (one in California, one in Washington, D.C.) where we may see increased activity over the next year.
California’s proposed law, which would go into effect on January 1, 2025, would require notices, bias audits submitted to the California Department of Civil Rights, and, “if technically possible,” alternative non-AEDT evaluations for candidates who opt out of the AEDT. As currently written, the proposed law states that a bias audit must include:
- A statement of the purpose of the AEDT and its expected benefits,
- A description of the AEDT’s outputs and how they are used,
- A summary of the types of data collected and processed,
- The extent to which the AEDT’s use is consistent with the developer’s statement of use (i.e., actual use versus intended use),
- An analysis of potential adverse impact on protected categories (subgroups),
- A description of the safeguards implemented to reduce risk,
- A description of how the AEDT’s use is monitored, and
- Information about how the AEDT’s validity or job relatedness is evaluated.
Notably, the requirement to evaluate an AEDT’s validity and job relatedness sets this law apart from many other AI-focused laws. The requirement to assess validity is significant given its importance within the broader legal framework governing selection procedures and its utility in demonstrating that a selection instrument measures what it is intended to measure. For these reasons, organizations are often advised to consider validation.
Like many enacted and proposed AEDT laws, Washington, D.C.’s Stop Discrimination by Algorithms Act of 2023 requires that applicants be notified that an AEDT will be used in selection decisions before any algorithm-based tool is used. As currently written, the proposed law states that the notice must:
- Include information about how personal information is used in algorithmic eligibility determinations,
- Be clear, concise, and complete, and available in English and in any non-English language spoken by at least 500 people in Washington, D.C.,
- Be updated within 30 days of the organization changing its collection or use practices, and
- Be displayed consistently and prominently on the organization’s website.
Bias audits must be conducted annually and submitted to the District of Columbia Attorney General’s Office. If any risks of unlawful disparate impact are identified during the audit, the audit must also identify reasonable measures to address those risks.
Considerations around AI-related laws extend beyond employers’ use of AI-based assessments: the laws reach not only organizations that employ AI technologies in their selection processes, but also the vendors responsible for developing those technologies. Take New Jersey Assembly Bill 4909, for example, which requires a bias audit before an AEDT is sold. The audit results must be provided to the purchaser and bundled with the tool at no additional cost. Even when vendors are not the primary target of AI-focused laws, their role is critical. Vendors provide clients with information about their assessments, enabling each employer to determine which laws apply to their use and to ensure they have the data needed to comply, often by completing a bias audit.
There is a lot of activity around the use of AI-based technology in employee selection. While the focus of this blog is on unpacking state and local laws seeking to regulate this area, several converging forces in the federal arena will also significantly shape the regulatory landscape over the next 12-18 months, including the White House’s Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence, released on October 30, 2023. This area will continue to grow more complex, and we will no doubt see more activity in the coming months and years.
Get the latest legislative updates and compliance insights from our strategic partner DCI Consulting Group. Their State Legislation Tracker provides the latest information on proposed and enacted laws and regulations, as well as compliance implications and guidance.
1 According to an FAQ document provided by the City, notice must be posted on the employer’s job site at least 10 days before the first use of the AEDT.
2 It’s important to note that organizations are not required to provide non-AEDT alternatives unless required by other laws (for example, the Americans with Disabilities Act (ADA)).