In response to President Biden’s Executive Order 14110 calling for a coordinated U.S. government approach to ensuring the responsible and safe development and use of AI, the U.S. Department of Labor Wage and Hour Division (WHD) issued Field Assistance Bulletin No. 2024-1 (the “Bulletin”).
This Bulletin, published on April 29, 2024, provides guidance on the application of the Fair Labor Standards Act (FLSA) and other federal labor standards in the context of increasing use of artificial intelligence (AI) and automated systems in the workplace.
Importantly, reinforcing the DOL’s position expressed in the Joint Statement on Enforcement of Civil Rights, Fair Competition, Consumer Protection, and Equal Opportunity Laws in Automated Systems, the WHD confirms that the long-standing federal laws it enforces continue to apply to new technologies such as workplace AI. The WHD also notes that, although AI and automated systems may streamline tasks for employers, improve workplace efficiency and safety, and enhance workforce accountability, implementing such tools without responsible human oversight may create compliance challenges.
The Bulletin discusses multiple ways in which AI interacts with the FLSA, the Family and Medical Leave Act (“FMLA”), the Providing Urgent Maternal Protections for Nursing Mothers Act (“PUMP Act”), and the Employee Polygraph Protection Act (“EPPA”). The Bulletin makes the following pronouncements regarding the potential compliance issues that may arise due to the use of AI to perform wage-and-hour tasks:
AI and the FLSA
Hours Worked
- Tracking work time – The WHD cautions employers that rely on automated timekeeping and monitoring systems to apply proper human oversight to ensure that employee work time is tracked accurately. Employers may not delegate the analysis of time actually worked by an employee to a system that uses keystrokes, eye movements, internet browsing, or other activity to measure productivity or time worked. In addition to this WHD guidance, employers using AI to track work time cannot engage in prohibited employee surveillance specifically proscribed by the National Labor Relations Board (NLRB) in Memorandum GC 23-02, entitled “Electronic Monitoring and Algorithmic Management of Employees Interfering with the Exercise of Section 7 Rights.”
- Monitoring break time – Employers cannot disclaim liability for paying wages by simply relying on automated systems that predict working or break time, auto-populate time entries, and/or automatically deduct meal or rest periods. Employees must be relieved of duty in order for time to be counted as unpaid break time, and employers retain the responsibility of ensuring accurate records even when using AI or other automated systems to assist in timekeeping (see the illustrative sketch following this list).
- Waiting time – Employers that use automated scheduling and/or task assignment systems should be aware that time spent by employees, already on duty, waiting for their next task to be assigned or their schedule to be updated likely counts as hours worked. Employers must be sure that employees are completely relieved from duty and can use the time effectively for their own purposes before categorizing the time as non-compensable.
- Work performed at multiple geographic locations – When using geolocation software to monitor employees’ location, employers should utilize human oversight to ensure that the software accurately records compensable travel time and/or work performed in different geographic locations. Employers using geolocation software must also comply with the NLRB’s directives in Memorandum GC 23-02.
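To make the human-oversight point in the break-time bullet above concrete, the sketch below shows one way an automated timekeeping tool could avoid silently deducting meal periods. This is a minimal illustration only, not a design prescribed by the Bulletin; the field names, the 30-minute deduction, and the attestation step are all hypothetical policy choices.

```python
from dataclasses import dataclass

# Hypothetical policy value: the length of the meal period the system would deduct.
MEAL_DEDUCTION_MINUTES = 30


@dataclass
class Shift:
    employee_id: str
    minutes_recorded: int              # raw minutes captured by the timekeeping system
    attested_duty_free_meal: bool      # employee confirmed an uninterrupted, duty-free meal break
    needs_human_review: bool = False


def compensable_minutes(shift: Shift) -> int:
    """Return the minutes to be paid for a shift.

    The meal deduction is applied only when the employee has attested to a
    bona fide, duty-free break; otherwise the full recorded time is paid and
    the shift is flagged for a person to review, rather than silently reduced.
    """
    if shift.attested_duty_free_meal:
        return shift.minutes_recorded - MEAL_DEDUCTION_MINUTES
    shift.needs_human_review = True
    return shift.minutes_recorded


# Example: an employee who worked through lunch keeps the full recorded time and is flagged.
shift = Shift(employee_id="1234", minutes_recorded=480, attested_duty_free_meal=False)
print(compensable_minutes(shift), shift.needs_human_review)  # 480 True
```

The choice to flag, rather than deduct, whenever the employee has not confirmed a duty-free break reflects the WHD’s expectation that the employer, not the software, remains responsible for accurate records.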
Calculating Wages Owed under the FLSA
Employers that use AI or other automated systems to calculate wage rates should exercise proper human oversight to ensure that the system calculates rates correctly and that employees are paid in accordance with federal and state minimum wage, overtime, and other wage requirements. The WHD appears particularly concerned about the potential for AI systems to miscalculate pay for employees who are paid multiple or different wage rates based on different metrics.
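For the multiple-rate scenario the WHD highlights, the arithmetic an automated payroll tool would need to get right under the commonly used weighted-average (“blended rate”) method is: total straight-time earnings divided by total hours worked sets the regular rate, and overtime hours earn an additional half-time premium on that rate. The sketch below is a minimal illustration under that assumption; it ignores state-law variations, agreed “rate in effect” arrangements, and other adjustments to the regular rate, and the function name is illustrative only.

```python
def weekly_pay_weighted_average(hours_by_rate: dict[float, float]) -> float:
    """Total weekly pay for an employee working at two or more hourly rates,
    using the weighted-average ("blended") regular rate.

    hours_by_rate maps each hourly rate to the hours worked at that rate
    during a single workweek.
    """
    total_hours = sum(hours_by_rate.values())
    straight_time = sum(rate * hours for rate, hours in hours_by_rate.items())
    if total_hours <= 40:
        return straight_time
    regular_rate = straight_time / total_hours                   # weighted average of all rates
    overtime_premium = 0.5 * regular_rate * (total_hours - 40)   # half-time premium on overtime hours
    return straight_time + overtime_premium


# Example: 30 hours at $20/hr plus 15 hours at $16/hr in the same workweek.
# Straight time = $840; regular rate = $840 / 45 ≈ $18.67;
# overtime premium = 0.5 × $18.67 × 5 ≈ $46.67; total ≈ $886.67.
print(round(weekly_pay_weighted_average({20.0: 30, 16.0: 15}), 2))
```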
AI and the FMLA
Processing Leave Requests
AI systems that process leave requests, track time off, integrate absence calendars, determine eligibility for FMLA leave, and the like, could produce errors such as improperly denying a leave request, miscalculating a leave entitlement, or testing for eligibility too frequently. The WHD acknowledges that humans performing these tasks can make the same mistakes, but notes that the impact of a faulty algorithm may be felt across the entire workforce.
Certifications to Support FMLA Leave
Using an AI system to determine whether a leave is FMLA-qualifying may be particularly risky in the event the system, for example, asks the employee to disclose more medical information than the FMLA allows, imposes an improper certification deadline, or incorrectly interprets the medical information submitted by the employee’s provider. Again, employers using AI in these circumstances must be wary of the potential to cause systemic violations across the workforce.
FMLA Interference and Retaliation
Employers using AI systems to make other employment decisions – such as promotional or restructuring decisions – must be careful not to feed FMLA or other permitted leave data into any such decision-making system. Using an AI system that tracks authorized employee leave and then identifies such leave as a negative factor in employment actions may be considered to constitute FMLA interference or retaliation.
AI and Nursing Employee Protections
The PUMP Act provides nursing employees the right to reasonable break time and a location to express breast milk while at work. Thus, employers that use AI or other automated systems to track work hours, set employee schedules, assign tasks, manage break time, and assess worker productivity must be prepared to accommodate nursing employees’ rights at work. Employers should monitor any such systems to ensure that they do not improperly limit the length, frequency, or timing of a nursing employee’s breaks, penalize such employees for failing to meet productivity standards that do not account for lawful breaks under the PUMP Act, or require employees to work longer hours to make up for nursing breaks.
AI and the EPPA
The EPPA generally prohibits employers from using lie detector tests (polygraphs) on employees or applicants. The WHD cautions that employers may not employ AI systems that use eye measurements, voice analysis, micro-expressions, or other body movements to circumvent the provisions of the EPPA. Any such AI system employed to detect truthfulness must comply with the EPPA and the limited exemptions provided therein.
AI and Prohibited Retaliation
The WHD ends the Bulletin with the obvious – employers may not use AI systems to retaliate against employees, including by targeting employees who engage in protected activity, using automated surveillance systems to monitor employees suspected of filing complaints with the WHD, or deploying AI to predict whether employees or work locations will engage in protected activity.
Implications for Employers
The Bulletin serves as a reminder to employers that AI tools or systems used for any workplace purpose – including tracking time, scheduling, and administering leave – should be deployed only after thorough diligence by the employer and with continued human oversight. Federal contractors should also be aware of the Office of Federal Contract Compliance Programs’ guidance with respect to the use of workplace AI, a summary of which can be found here.