DOL’s Wage & Hour Division Delivers Guidance on AI
Thursday, May 23, 2024

On April 29, 2024, in compliance with President Biden’s October 2023 Executive Order on artificial intelligence, the Department of Labor’s Wage & Hour Division (WHD) issued guidance on the risk that employers using AI tools to monitor or augment worker productivity may violate the Fair Labor Standards Act (FLSA). The 12-page Field Assistance Bulletin No. 2024-1 (FAB) covers more than just the expected topics of hours worked and wages paid; it also touches on AI’s implications for other worker protection laws, including the Family and Medical Leave Act (FMLA), the recently effective Providing Urgent Maternal Protections for Nursing Mothers Act (PUMP Act), and the Employee Polygraph Protection Act (EPPA).

Tracking Time Worked

The first half of the FAB is devoted to outlining the risks that AI poses when used to monitor employee productivity and break time. The WHD flags AI and automated systems that companies may use to track when employees are actively working or idle, for example by monitoring computer keystrokes, mouse clicks, or an employee’s presence in front of a computer’s camera. Employers are reminded that time spent working must be paid “regardless of the level of productivity or performance of the employee,” so these tracking systems are not determinative of whether an employee’s time is “work time,” and thus compensable.

Likewise, software or AI systems that predict when and if an employee has taken a break may not be 100% accurate and may conflate non-computer work time with break time. Breaks are non-compensable only when workers are fully relieved of their duties—a subjective assessment that may not be well suited to computer automation. For workers who might be assigned tasks by an automated system, such as warehouse workers, time spent waiting on the system to assign a task may be compensable when the employee is not free to use that time for their own purposes. Under the FLSA, this “engaged to wait” time is compensable.

Finally, the FAB warns that AI-powered systems that use geolocation to track employee “clock-ins and -outs” may not adequately account for worker tasks that occur at multiple locations or locations away from the “main” office. Using construction workers as an example, the WHD notes that time spent at job sites without geolocation infrastructure, or time spent traveling between job sites, might not be appropriately tagged as compensable work time by the system. Analogously, workers whose job responsibilities include intermittent travel outside of an office setting, like non-commissioned salespeople, could face similar issues.

A discussion of worker monitoring using AI would be incomplete without noting that the National Labor Relations Board will be taking a hard look at AI systems used to surveil workers. Indeed, the FAB references the NLRB’s guidance and cautions that AI-based surveillance may constitute retaliation under the FLSA as well. More coverage of the NLRB guidance can be found in a prior post by Hunton attorneys.

Calculating Wages

The FAB moves on to address the use of AI or automated systems to calculate workers’ wage rates, including systems that combine different wage rates, e.g., hourly and piece rates depending on the task. In an apparent reference to “gig workers,” the FAB spotlights algorithms that determine workers’ rates of pay based on “fluctuating supply and demand, customer traffic, geographic location, worker efficiency or performance, or the type of task performed by the employee.” Even where such tools and data are used, the minimum wage calculation is still the same: total pay divided by the number of hours worked.
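To make that arithmetic concrete, here is a minimal sketch (hypothetical numbers and function names, not drawn from the FAB) of the workweek check the WHD describes: total compensation divided by total hours worked, compared against the applicable minimum wage.

```python
# Hypothetical illustration: however an algorithm sets pay (hourly, piece rate,
# or demand-based rates), FLSA minimum wage compliance is checked the same way:
# total pay for the workweek divided by total hours worked.

FEDERAL_MINIMUM_WAGE = 7.25  # USD per hour (federal floor; states may set higher)

def effective_hourly_rate(total_pay: float, hours_worked: float) -> float:
    """Return the effective hourly rate for a workweek."""
    return total_pay / hours_worked

def meets_minimum_wage(total_pay: float, hours_worked: float,
                       minimum_wage: float = FEDERAL_MINIMUM_WAGE) -> bool:
    """True if the workweek's effective rate is at or above the minimum wage."""
    return effective_hourly_rate(total_pay, hours_worked) >= minimum_wage

# Example with made-up numbers: $280 earned across piece-rate and hourly tasks
# over a 40-hour workweek is $7.00/hour, below the $7.25 federal floor.
print(effective_hourly_rate(280.00, 40))   # 7.0
print(meets_minimum_wage(280.00, 40))      # False
```

Whatever rate-setting algorithm produced the weekly total, the compliance test looks only at the week’s total pay and total hours.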

Other Worker Protection Laws Potentially Impacted by AI

The remainder of the FAB addresses other workplace areas where workers are protected by laws administered by the DOL and where AI might pose risks.

Under the FMLA, workers become eligible for protected leave after 12 months of employment and 1,250 hours of service. Because of the same flaws AI can exhibit in monitoring break time, these systems may also miscalculate an employee’s hours of service, leading to an improper eligibility determination for coverage under the FMLA. Additionally, once an automated system determines that an employee is eligible for FMLA leave, employers are not permitted to “retest” the employee’s eligibility until the beginning of a new 12-month period or upon a different FMLA request. A system that continually tracks FMLA eligibility violates this rule. Once an FMLA leave request is made, employers may ask the employee to submit a certification of the need for FMLA leave, i.e., a “doctor’s note.” An AI system that determines the sufficiency of a health care provider’s certification is at risk of requesting more medical information from the employee than the law permits. Because the FMLA provides an independent cause of action for workers, any or all of these scenarios could expose an employer to liability under the law.
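As a rough, hypothetical sketch (not from the FAB) of why miscounted hours matter for the eligibility thresholds above:

```python
# Hypothetical sketch of the FMLA eligibility thresholds described above:
# 12 months of employment and 1,250 hours of service. Real determinations
# involve additional factors (e.g., worksite size) not modeled here.

def fmla_eligible(months_employed: float, hours_of_service: float) -> bool:
    """Apply the two numeric thresholds discussed in the FAB."""
    return months_employed >= 12 and hours_of_service >= 1250

print(fmla_eligible(14, 1300))  # True
print(fmla_eligible(14, 1200))  # False: an undercount of hours flips the result
```

A system that undercounts hours of service, as the FAB warns break-tracking tools can, would flip the result of exactly this kind of check.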

The FAB also addresses the 2023 PUMP Act, which guarantees “reasonable break times” for nursing mothers to express breast milk at work. Reiterating AI’s potential problems tracking break time, the FAB warns employers against penalizing nursing mothers for taking lactation breaks. An automated system that factors breaks into “productivity scores” or future work scheduling could violate the FLSA if those inputs negatively impact nursing mothers.

The rarely-mentioned Employee Polygraph Protection Act received some attention in the FAB, with the WHD cautioning that AI systems that measure an employee’s truthfulness through voice, “micro-expression,” or other behavioral analysis, may be prohibited by the Act. These “lie-detector” tests are only permitted under certain exceptions, such as in the security and pharmaceutical industries, or for workers reasonably suspected of involvement in workplace theft and embezzlement.

The FAB wraps up with a blanket warning to employers hoping to use AI or other automated systems as a defense to retaliation claims under the FLSA: blaming the algorithm will not avoid liability. Employers are therefore urged to take a measured approach to AI-based employee monitoring and to always keep a human in the loop to double-check the system’s homework.
