In the latest edition of The Wright Toolbox:
OFCCP Publishes Guidance On The Use Of Artificial Intelligence In The Employment Context For Federal Contractors
On April 29, 2024, in keeping with Executive Order 14110, the Office of Federal Contract Compliance Programs (OFCCP) published guidance on the use of Artificial Intelligence (“AI”) in the employment process and on ensuring compliance with Equal Employment Opportunity (EEO) requirements (the “Guidance”). The OFCCP recognizes that the use of AI systems has the potential to “perpetuate unlawful bias and automate unlawful discrimination, among other harmful outcomes.” The purpose of the Guidance is to answer questions and share “promising practices” to clarify federal contractors’ legal obligations, promote EEO, and mitigate the potentially harmful impacts of AI in employment decisions.
The OFCCP defines AI as “a machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments. AI systems use machine and human based inputs to perceive real and virtual environments; abstract such perceptions into models through analysis in an automated manner; and use model inference to formulate options for information or action.” See 15 U.S.C. § 9401(3). OFCCP further defines “automated systems” in the employment context as “software and algorithmic processes, including AI, that are used to automate workflows and help people complete tasks or make decisions.” Automated systems can include “workplace algorithms that inform all aspects of the terms and conditions of employment including, but not limited to, pay or promotion, hiring or termination algorithms, virtual or augmented reality workplace training programs, and electronic workplace surveillance and management systems.” For example, an automated system may help a federal contractor’s HR professional sift through hundreds or thousands of resumes, identifying applicants that meet basic requirements for a job. A federal contractor could also use AI to determine which criteria to use when making employment decisions – for instance, to define the parameters by which the resumes are filtered and reviewed.
The Guidance makes clear that covered federal contractors are obligated by law to ensure that they do not discriminate in employment and that they take affirmative action to ensure employees and applicants are treated without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. These EEO obligations extend to the federal contractor’s use of automated systems, including AI, when making employment decisions. OFCCP investigates the use of AI during compliance evaluations and complaint investigations to determine whether a federal contractor is in compliance with its nondiscrimination obligations. OFCCP examines any measure, combination of measures, or procedure that a federal contractor uses to make employment decisions, including decision-making tools that use an AI system. Employment decisions can include hiring, promotion, termination, and compensation, among other decisions.
The Guidance notes the following examples of federal contractors’ compliance obligations related to AI:
Federal contractors must:
- Maintain records and ensure confidentiality of records consistent with all OFCCP-enforced regulatory requirements. For example, contractors must keep records of resume searches, both from searches of external websites and internal resume databases, that include the substantive search criteria used.
- Cooperate with OFCCP by providing the necessary, requested information on their AI systems.
- Make reasonable accommodation to the known physical or mental limitations of an otherwise qualified applicant or employee with a disability as defined in OFCCP’s regulations, unless the federal contractor can demonstrate that the accommodation would impose an undue hardship on the operation of its business. The reasonable accommodation obligation extends to the contractor’s use of automated systems, including but not limited to, electronic or online job application systems.
Among the risks identified by the OFCCP in the Guidance is that AI has the potential to embed bias and discrimination into a range of employment decision-making processes which, if not designed and implemented properly, can replicate or deepen inequalities already present in the workplace and may violate workers’ civil rights. For example, a resume scanner programmed to reject applicants with gaps in their resumes may automatically reject applicants who took time off for the birth of a child or for medical treatment for a disability, creating an adverse impact on women or individuals with disabilities and potentially violating laws enforced by OFCCP.
To help minimize the risks of using AI and automated systems in the employment process, OFCCP notes that federal contractors must:
- Understand and clearly articulate the business needs that motivate the use of the AI system.
- Analyze job-relatedness of the selection procedure.
- Obtain results of any assessment of system bias, debiasing efforts, and/or any study of system fairness.
- Conduct routine independent assessments for bias and/or inequitable results.
- Explore potentially less discriminatory alternative selection procedures.
Federal contractors are responsible for meeting their nondiscrimination and affirmative action obligations under the laws enforced by OFCCP and are responsible for their use of third-party products and services, such as tools for employment screening and selections, recordkeeping, and automated systems, including AI. The Guidance states that “[a] federal contractor using another entity’s product or service cannot delegate its nondiscrimination and affirmative action obligations.”
The Guidance provides a list of “promising practices” for the use of AI in the employment context, which are not expressly required, but are actions contractors may consider to help avoid potential harm to workers and promote trustworthy development and use of AI. The OFCCP states that “[t]hese practices are rooted in the concept that federal contractors should have a sufficient understanding of the design, development, intended use, and effects of any AI system they use in their employment practices such as hiring, promotions, terminations, and compensation, among others.”
Promising Practices on Providing Notice that the Federal Contractor is Using AI
Federal contractors should provide advance notice and appropriate disclosure to applicants, employees, and their representatives if the contractor intends to use AI in the hiring process or in employment decisions, so that individuals can understand how they are being evaluated.
Promising Practices on Federal Contractors’ Use of an AI System
Federal contractors should:
- Standardize the system to ensure all candidates/applicants go through the same process, and establish, in advance, procedures the employer or the third party administering the process will follow to receive and promptly respond to reasonable accommodation requests.
- Routinely monitor and analyze whether the use of the AI system is causing a disparate or adverse impact before implementation, during use at regular intervals, and after use. If such an impact exists, take steps to reduce it or use a different tool. This should include assessing whether the use of historical data in the creation of an AI system may reproduce patterns of systemic discrimination.
- Not rely solely on AI and automated systems to make employment decisions, and ensure there is meaningful human oversight of any such decisions supported by AI.
- Provide training about the AI system and its appropriate use to all staff, especially those responsible for monitoring and analyzing the AI system.
- Establish an internal governance structure that sets clear procurement and use case standards for new AI technologies and ongoing monitoring requirements.
Promising Practices on Obtaining a Vendor-Created AI System
Federal contractors should be able to verify:
- the specific provisions in the contract with the vendor regarding records related to the AI system, ensuring that those provisions require the vendor to maintain records consistent with all OFCCP-enforced regulatory requirements and provide OFCCP with access to such records during a compliance evaluation;
- the source and quality of the data being collected and analyzed by the AI system, and that it is representative, objective, robust, and pertinent to the employment decision;
- whether the vendor documents and maintains the data used in collecting, cleaning, training, and building algorithms, and the rationale for why the vendor used those data points;
- the vendor’s protections and privacy policy on data provided by the contractor;
- critical information about the vendor’s algorithmic decision-making employment tool, e.g., captured data, scoring system, and the basis for selection or elimination of applicants/candidates;
- the screening tool(s) and data used to filter in candidates (such as listed job skills, keywords, or other criteria), data used to filter out candidates (such as gaps in employment history), and data used to prioritize candidates based on job skills, keywords, or other criteria;
- the predictive nature of the system, e.g., the relatedness of the AI system’s prediction model to the specific job(s) for which the contractor intends to use it in the selection process or other employment decisions;
- any differences between the data upon which the AI system was trained, developed, and validated and the contractor’s candidate pool or labor market;
- any differences between the AI system that was developed and validated and the AI system in operational use by the contractor;
- the applicability of the validation results to the contractor’s selection process;
- the reliability and safety of the system, i.e., the ability of the system to resist manipulation and prevent discriminatory outcomes;
- the transparency and explainability of the system, e.g., that the basis of decisions about candidates can be clearly communicated to them; and
- the results of any assessment of system bias, debiasing efforts, and/or any study of system fairness.
The Guidance provides links to numerous resources relating to this topic for further analysis of the issues. If you have any questions regarding this matter, please do not hesitate to contact any member of the Wright, Constable & Skeen Government Contracts practice group.