Happy Privacy Day: Emerging Issues in Privacy, Cybersecurity, and AI in the Workplace
Wednesday, January 29, 2025

As the integration of technology in the workplace accelerates, so do the challenges related to privacy, cybersecurity, and the ethical use of artificial intelligence (AI). Human resource professionals and in-house counsel must navigate a rapidly evolving landscape of legal and regulatory requirements. This National Privacy Day, it’s crucial to spotlight emerging issues in workplace technology and the associated implications for data privacy, cybersecurity, and compliance.

Here, we explore practical use cases that raise these issues, highlight key risks, and provide actionable insights for HR professionals and in-house counsel to manage these concerns effectively.

1. Wearables and the Intersection of Privacy, Security, and Disability Law

Wearable devices have a wide range of use cases, including interactive training, performance monitoring, and navigation tracking. Wearables such as fitness trackers and smartwatches became more popular in HR and employee benefits departments when they were deployed in wellness programs to monitor employees’ health metrics, promote fitness, and provide a basis for doling out insurance premium incentives. While these tools offer benefits, they also collect sensitive health and other personal data, raising significant privacy and cybersecurity concerns under the Health Insurance Portability and Accountability Act (HIPAA), the Americans with Disabilities Act (ADA), and state privacy laws.

Earlier this year, the Equal Employment Opportunity Commission (EEOC) issued guidance emphasizing that data collected through wearables must align with ADA rules. More recently, the EEOC withdrew that guidance in response to an Executive Order issued by President Trump. Still, employers should evaluate whether their use of wearables raises ADA issues, such as ensuring that use of the devices is voluntary when collecting confidential medical information, avoiding improper disability-related inquiries, and using aggregated or anonymized data to reduce the risk of discrimination claims.

Beyond ADA compliance, cybersecurity is critical. Wearables often collect sensitive data and transmit it to third-party vendors. Employers must assess these vendors’ data protection practices, including encryption protocols and incident response measures, to mitigate the risk of breaches or unauthorized access.

Practical Tip: Implement robust contracts with third-party vendors, requiring adherence to privacy laws, breach notification, and security standards. Also, ensure clear communication with employees about how their data will be collected, used, and stored.

2. Performance Management Platforms and Employee Monitoring

Platforms like Insightful and similar performance management tools are increasingly being used to monitor employee productivity and/or compliance with applicable law and company policies. These platforms can capture a vast array of data, including screen activity, keystrokes, and time spent on tasks, raising significant privacy concerns.

While such tools may improve efficiency and accountability, they also risk crossing boundaries, particularly when employees are unaware of the extent of monitoring and/or where the employer lacks effective data minimization controls. State laws like the California Consumer Privacy Act (CCPA) can place limits on these monitoring practices, particularly if employees have a reasonable expectation of privacy. These laws also can require additional layers of security safeguards and administration of employee rights with respect to data collected and processed using the platform.

Practical Tip: Before deploying such tools, assess the necessity of data collection, ensure transparency by notifying employees, and restrict data collection to what is strictly necessary for business purposes. Implement policies that balance business needs with employee rights to privacy.

3. AI-Powered Dash Cams in Fleet Management

AI-enabled dash cams, often used for fleet management, combine video, audio, GPS, telematics, and/or biometrics to monitor driver behavior and vehicle performance, among other things. While these tools enhance safety and efficiency, they also present significant privacy and legal risks.

State biometric privacy laws, such as Illinois’s Biometric Information Privacy Act (BIPA) and similar laws in California, Colorado, and Texas, impose stringent requirements on biometric data collection, including obtaining employee consent and implementing robust data security measures. Employers must also assess the cybersecurity vulnerabilities of dash cam providers, given the volume of biometric, location, and other data they may collect.

Practical Tip: Conduct a legal review of biometric data collection practices, train employees on the use of dash cams, and audit vendor security practices to ensure compliance and minimize risk.

4. Assessing Vendor Cybersecurity for Employee Benefits Plans

Third-party vendors play a crucial role in processing data for retirement plans, such as 401(k) plans, as well as health and welfare plans. The Department of Labor (DOL) emphasized in recent guidance the importance of ERISA plan fiduciaries assessing the cybersecurity practices of such service providers.

The DOL’s guidance underscores the need to evaluate vendors’ security measures, incident response plans, and data breach notification practices. Given the sensitive nature of data processed as part of plan administration—such as Social Security numbers, health records, and financial information—failure to vet vendors properly can lead to breaches, lawsuits, and regulatory penalties, including claims for breach of fiduciary duty.

Practical Tip: Conduct regular risk assessments of vendors, incorporate cybersecurity provisions into contracts, and document the due diligence process to demonstrate compliance with fiduciary obligations.

5. Biometrics for Access, Time Management, and Identity Verification

Biometric technology, such as fingerprint or facial recognition systems, is widely used for identity verification, physical access, and timekeeping. While convenient, the collection of biometric data carries significant privacy and cybersecurity risks.

BIPA and similar state laws require employers to obtain written consent, provide clear notices about data usage, and adhere to stringent security protocols. Additionally, biometrics are uniquely sensitive because they cannot be changed if compromised in a breach.

Practical Tip: Minimize reliance on biometric data where possible, ensure compliance with consent and notification requirements, and invest in encryption and secure storage systems for biometric information. Check out our Biometrics White Paper.

6. HIPAA Updates Affecting Group Health Plan Compliance

Recent changes to the HIPAA Privacy Rule, including provisions related to reproductive healthcare, significantly impact group health plans. The proposed HIPAA Security Rule amendments also signal stricter requirements for risk assessments, access controls, and data breach responses.

Employers sponsoring group health plans must stay ahead of these changes by updating their HIPAA policies and Notice of Privacy Practices, training staff, and ensuring that business associate agreements (BAAs) reflect the new requirements.

Practical Tip: Regularly review HIPAA compliance practices and monitor upcoming changes to ensure your group health plan aligns with evolving regulations.

7. Data Breach Notification Laws and Incident Response Plans

Many states have updated their data breach notification laws, lowering notification thresholds, shortening notification timelines, and expanding the definition of personal information. Employers should revise their incident response plans (IRPs) to align with these changes.

Practical Tip: Ensure IRPs reflect updated laws, test them through simulated breach scenarios, and coordinate with legal counsel to prepare for reporting obligations in case of an incident.

8. AI Deployment in Recruiting and Retention

AI tools are transforming HR functions, from recruiting to performance management and retention strategies. However, these tools require vast amounts of personal data to function effectively, increasing privacy and cybersecurity risks.

The EEOC and other regulatory bodies have cautioned against discriminatory impacts of AI, particularly regarding protected characteristics like disability, race, or gender. (As noted above, the EEOC recently withdrew its AI guidance under the ADA and Title VII following an Executive Order by the Trump Administration.) For example, the use of AI in hiring or promotions may trigger compliance obligations under the ADA, Title VII, and state laws.

Practical Tip: Conduct bias audits of AI systems, implement data minimization principles, and ensure compliance with applicable anti-discrimination laws.

9. Employee Use of AI Tools

Moving beyond the HR department, AI tools are fundamentally changing how people work. Tasks that used to require time-intensive manual effort—creating meeting minutes, preparing emails, digesting lengthy documents, creating PowerPoint decks—can now be completed far more efficiently with assistance from AI. The benefits of AI tools are undeniable, but so too are the associated risks. Organizations that rush to implement these tools without thoughtful vetting processes, policies, and training will expose themselves to significant regulatory and litigation risk.

Practical Tip: Not all AI tools are created equal—either in terms of the risks they pose or the utility they provide—so an important first step is developing criteria to assess, and then going through the process of assessing, which AI tools to permit employees to use. Equally important is establishing clear ground rules for how employees can use those tools. For instance: What company information are they permitted to use to prompt the tool? What are the processes for ensuring the tool’s output is accurate and consistent with company policies and objectives? And should employee use of AI tools be limited to internal functions, or should employees also be permitted to use these tools to generate work product for external audiences?

10. Data Minimization Across the Employee Lifecycle

At the core of many of the above issues is the principle of data minimization. The California Privacy Protection Agency (CPPA) has emphasized that organizations must collect only the data necessary for specific purposes and ensure its secure disposal when no longer needed.

From recruiting to offboarding, HR professionals must assess whether data collection practices align with the principle of data minimization. Overcollection not only heightens privacy risks but also increases exposure in the event of a breach.

Practical Tip: Develop a data inventory mapping employee information from collection to disposal. Regularly review and update policies to limit data retention and enforce secure deletion practices.
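To make the data inventory concept concrete, here is a minimal sketch of how a retention check might be modeled in code. All names, data categories, and retention periods below are hypothetical illustrations—actual retention schedules should be set with counsel based on applicable law and business need.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical data inventory entry. Field names, categories, and
# retention periods are illustrative assumptions, not legal guidance.
@dataclass
class InventoryEntry:
    data_category: str   # e.g., "recruiting resumes"
    purpose: str         # business purpose justifying collection
    collected_on: date
    retention_days: int  # retention limit set by policy/counsel

    def is_past_retention(self, today: date) -> bool:
        """True if the record has exceeded its retention period
        and should be flagged for secure deletion."""
        return today > self.collected_on + timedelta(days=self.retention_days)

inventory = [
    InventoryEntry("recruiting resumes", "hiring decision", date(2023, 1, 15), 365),
    InventoryEntry("payroll records", "wage compliance", date(2024, 6, 1), 2555),
]

today = date(2025, 1, 29)
to_delete = [e.data_category for e in inventory if e.is_past_retention(today)]
print(to_delete)  # only the resume data has exceeded its one-year retention
```

A simple, regularly run check like this helps enforce the deletion side of data minimization rather than leaving disposal to ad hoc cleanup.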

Conclusion

The rapid adoption of emerging technologies presents both opportunities and challenges for employers. HR professionals and in-house counsel play a critical role in navigating privacy, cybersecurity, and AI compliance risks while fostering innovation.

By implementing robust policies, conducting regular risk assessments, and prioritizing data minimization, organizations can mitigate legal exposure and build employee trust. This National Privacy Day, take proactive steps to address these issues and position your organization as a leader in privacy and cybersecurity.
