On January 16, 2024, New Jersey Governor Phil Murphy signed into law the New Jersey Data Protection Act, making New Jersey the thirteenth state to enact a comprehensive state privacy law. The law imposes a wide range of obligations on businesses and nonprofits that collect and use the personal data of New Jersey residents, including requirements to notify consumers about the collection and disclosure of their personal data. Following the example of California and Colorado, whose data privacy laws call for the promulgation of implementing regulations, the Director of the Division of Consumer Affairs in the Department of Law and Public Safety is empowered to enact rules and regulations necessary to effectuate the Act’s purposes. The Act becomes effective on January 16, 2025.
Applicability
The Act will apply to entities that conduct business in New Jersey or produce products or services targeted to New Jersey residents. Specifically, the Act applies to controllers that annually process the personal data of 100,000 or more consumers, excluding data that is processed solely for the purpose of completing a payment transaction. Alternatively, the Act will apply to organizations that meet the “sales threshold,” that is, organizations that control or process the personal data of at least 25,000 consumers and derive revenue, or receive a discount on the price of any goods or services, from the sale of personal data.
While the threshold requirements for applicability are fairly standard, organizations must remain vigilant concerning their information practices involving cookies, pixels and similar tracking technologies. Collecting personal data through cookies can potentially (and sometimes inadvertently) make an entity subject to the Act under the sales threshold. Specifically, the term “sale” is defined broadly to mean “the sharing, disclosing, or transferring of personal data for monetary or other valuable consideration by the controller to a third party.” This definition is similar to the one used by the California Consumer Privacy Act. As the California Attorney General’s enforcement efforts demonstrate, where a business discloses or makes available consumers’ personal data to third parties through online tracking technologies such as pixels, web beacons, software development kits, third-party libraries and cookies, in exchange for monetary or other valuable consideration, including for analytics purposes or for free or discounted services, such practices meet the definition of a “sale.” See The People of the State of California v. Sephora USA, Inc., Final Judgment and Permanent Injunction.
Pursuant to the Act, “controller” means an individual or legal entity that, alone or jointly with others, determines the purpose for and methods of processing personal data. Because this definition is not limited to for-profit businesses and the Act contains no nonprofit exemption, the law will apply to nonprofits, distinguishing the Act from most other comprehensive consumer privacy laws but following the example of Colorado, Delaware and Oregon. In terms of covered data, the Act regulates practices relating to personal data, that is, any information linked or reasonably linkable to an identified or identifiable person. De-identified data and publicly available information are not within the scope of the Act.
Controller Obligations
The Act imposes several obligations on controllers. As an initial matter, it requires that controllers limit the collection of personal data to what is adequate, relevant and reasonably necessary in relation to the purposes for which such data is processed. Furthermore, the controller’s information practices and purposes for processing must be disclosed to consumers in a privacy notice, as discussed below. It also is incumbent upon the controller to establish, implement and maintain administrative, technical and physical data security practices. The Act specifies that this obligation extends to securing personal data from unauthorized access during both storage and use. It also requires the controller to assess and ensure that the security practices are appropriate to the volume and type of data the entity is processing. The appropriateness of the security practices is likely to be addressed in the forthcoming implementing regulations.
As a baseline rule, processing of sensitive data is not permitted unless the controller first obtains consent. Consent must be manifested by a clear affirmative act signifying a consumer’s freely given, specific, informed and unambiguous agreement to allow processing of personal data pertaining to the consumer. Significantly, acceptance of general or broad terms of use will not satisfy the consent requirement, nor will any agreement obtained through the use of “dark patterns.”* In other words, in contrast to, for example, the California Consumer Privacy Act, the Act requires opt-in treatment for the processing of sensitive data. The definition of sensitive data is similar to those of the other state laws but also includes a broader category of financial information as well as a consumer’s status as transgender or nonbinary. Personal data collected from a known child is included in the definition of sensitive data and must be processed in accordance with the Children’s Online Privacy Protection Act.
The controller also has a duty to provide consumers with a privacy notice that is reasonably accessible, clear and meaningful. The notice must include, at a minimum:
- The categories of the personal data that the controller processes
- The purpose for processing personal data
- The categories of all third parties to which the controller may disclose a consumer’s personal data
- The categories of personal data that the controller shares with third parties
- How individuals may exercise their consumer rights
- The process by which the controller notifies consumers of material changes to the privacy notice, along with the effective date of the notice
- An active electronic mail address or other online mechanism that the consumer may use to contact the controller
Like several other state privacy laws, the Act mandates that controllers enter into written agreements with processors, that is, the entities that process personal data on the controller’s behalf. These agreements govern the processing procedures, and the Act provides a list of specific requirements that must be addressed in such data processing agreements.
The Act also imposes a duty on controllers to conduct a data protection assessment before processing personal data that presents a heightened risk of harm to individuals. These requirements are similar to the draft rules presented by the California Privacy Protection Agency in December 2023. Essentially, the risk assessment (the California term for a data protection assessment) will require businesses to create a robust inventory of their processing activities and then weigh the benefits of such processing against the potential risks to individuals’ privacy. The Act lists targeted advertising, profiling, selling personal data and processing sensitive data as examples of processing that presents a heightened risk.
Consumer Rights
The Act grants consumers rights similar to those of the other state privacy laws. Individuals have the right to (1) confirm whether a controller is processing their personal data, (2) correct inaccuracies, (3) delete their personal data, (4) obtain a portable copy of their personal data, and (5) opt out of processing for the purposes of the sale of their personal data, targeted advertising and profiling. In addition, the Act creates certain opt-in requirements for minors. A controller may not process personal data for targeted advertising, sale or profiling without express consent from the consumer where the controller knows, or willfully disregards, that the consumer is at least 13 years old but younger than 17 years old.
The Act has additional requirements regarding opt-out methods. Entities subject to the Act that engage in targeted advertising or the sale of personal data must comply with opt-out requests delivered passively through Universal Opt-Out Mechanisms (UOOMs). UOOMs allow users to express to all websites, through a single setting, their preference not to be tracked on the internet. The duty to comply with UOOM requests will start six months after the Act’s effective date. Notably, controllers must not make use of a default setting that opts a consumer into the processing or sale of personal data, unless the controller has determined that the consumer has selected such default setting and the selection clearly represents the consumer’s affirmative, freely given and unambiguous choice to opt in. This provision is interesting, given that opt-in consent is not required for processing non-online personal data for purposes of sale but appears to be required for online data.
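For controllers that operate websites, one widely deployed UOOM is the Global Privacy Control (GPC) signal, which participating browsers transmit as a “Sec-GPC: 1” request header. The Act does not name any particular mechanism, so the following TypeScript sketch is purely illustrative: it assumes GPC would qualify as a UOOM and shows how a server might detect the signal and suppress sale and targeted-advertising processing for that request.

```typescript
// Illustrative sketch only. It assumes the Global Privacy Control (GPC) signal,
// sent by some browsers as the "Sec-GPC: 1" request header, is one example of a
// universal opt-out mechanism; the Act itself names no specific mechanism, and
// the Division of Consumer Affairs may specify recognized mechanisms by regulation.

interface IncomingRequest {
  // HTTP headers, keyed by lowercase header name
  headers: Record<string, string | undefined>;
}

// Returns true when the request carries a universal opt-out signal.
function hasUniversalOptOut(req: IncomingRequest): boolean {
  return req.headers["sec-gpc"] === "1";
}

// Hypothetical usage: skip the "sale"/targeted-advertising pipeline for
// consumers whose browsers transmit the opt-out signal.
const request: IncomingRequest = { headers: { "sec-gpc": "1" } };
if (hasUniversalOptOut(request)) {
  console.log("Universal opt-out received: do not sell or share this consumer's personal data.");
} else {
  console.log("No universal opt-out signal detected for this request.");
}
```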
Enforcement
The New Jersey Division of Consumer Affairs in the Department of Law and Public Safety will have the authority to enforce the Act. There is a 30-day cure period that will expire 18 months after the effective date. The Act declares that a violation of its provisions shall be “an unlawful practice and a violation of P.L. 1960, C. 39 (C.56:8-1 et seq.)” (the New Jersey Consumer Fraud Act). Initial violations are punishable by fines of up to $10,000, and subsequent violations by fines of up to $20,000. Notably, no private right of action is specifically available under the Act.
Conclusion
Organizations with a national footprint that are already familiar with similar data privacy laws will need to update their compliance programs to meet New Jersey–specific requirements. For some organizations with New Jersey–centric operations, this new law may be the first time they need to build a comprehensive consumer privacy program. Entities that may become subject to the New Jersey Data Protection Act should carefully review their information practices, draft privacy policies and develop comprehensive internal compliance programs. A controller subject to the Act also should be mindful of, and prepare for, any implementing regulations to be issued by the Director of the Division of Consumer Affairs in the Department of Law and Public Safety.
Elisabeth Axberger, a recent law graduate, is a co-author of this article.
* The California Privacy Rights Act regulations offer extensive guidance on and examples of dark patterns. For example, the use of confusing language or interactive elements is a dark pattern, as is presenting “Yes” and “No” choices next to the statement “Do Not Sell or Share My Personal Information”; the double negative makes the choice confusing.