
A Fresh Take

Insights on US legal developments

Reposted from Freshfields Technology Quotient

The California Privacy Protection Agency Releases a First Draft of Automated Decisionmaking Opt-Out and Access Regulations

The California Privacy Protection Agency (CPPA) has been steadily advancing rulemaking under the California Consumer Privacy Act (CCPA) on requirements for cybersecurity audits, risk assessments, and automated decisionmaking. Most recently, the CPPA released its first draft of proposed regulations on automated decisionmaking opt-out and access rights in order to facilitate discussion and public participation. While certain features of the draft automated decisionmaking regulations are similar to those in other privacy laws, several new features are notable.

What Is “Automated Decisionmaking Technology”?

The draft regulations broadly define “automated decisionmaking technology” (ADMT) as “any system, software or process—including one derived from machine-learning, statistics, or other data-processing or artificial intelligence—that processes personal information and uses computation… to make or execute a decision or facilitate human decisionmaking” (emphasis added). Significantly, this suggests that “automated decisionmaking technology” would not be limited to technology used to make automated decisions but may more broadly extend to software or other technology used to assist human decisionmaking.

The draft regulations provide that ADMT includes “profiling,” defined as “any form of automated processing of personal information to evaluate certain personal aspects relating to a natural person and in particular to analyze or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location, or movements.”

Proposed Rights to Opt Out of Certain Uses of ADMT

The draft regulations would require businesses subject to the CCPA to give new rights to California residents (or “consumers”) to opt out of the use of ADMT for the following purposes:

Decisions that Produce Legal Effects

The draft regulations would give consumers the right to opt out of the use of ADMT in making a decision that produces “legal or similarly significant effects”: i.e., a decision that results in access to, or the provision or denial of, financial or lending services, housing, insurance, education enrollment or opportunity, criminal justice, employment or independent contracting opportunities or compensation, healthcare services, or essential goods or services.

At first glance, the language about “legal or similarly significant effects” may appear to be inspired by Article 22 of the EU General Data Protection Regulation (GDPR), which generally provides that an individual has the right not to be subject to a decision “based solely on automated processing…which produces legal effects concerning him or her or similarly significantly affects him or her.” However, there are substantial differences between GDPR and the draft regulations. On one hand, the draft regulations would restrict not only decisions “based solely on automated processing” but also the use of ADMT to “facilitate human decisionmaking.” In this respect, the draft regulations appear to regulate a broader scope of automated decisionmaking than GDPR. On the other hand, the draft regulations would require providing only an opt-out from such a use of ADMT, rather than requiring opt-in consent or another lawful basis as typically required for automated decisionmaking under GDPR.

Profiling of Employees, Other Workers, or Students

The draft regulations would give consumers the right to opt out of the use of ADMT to profile them when they are acting as an employee, independent contractor, job applicant, or student. The draft explains that this type of profiling includes, for example, profiling an employee using keystroke loggers, productivity or attention monitors, video or audio recording or live-streaming, facial or speech recognition or detection, automated emotion assessment, location trackers, speed trackers, and web-browsing, mobile-application, or social-media monitoring tools.

This provision also demonstrates how the draft regulations differ from automated decisionmaking restrictions under other state consumer data privacy laws. For example, the Colorado Privacy Act gives Colorado consumers the right to opt out of “profiling in furtherance of decisions that produce legal or similarly significant effects concerning a consumer,” but the Colorado Privacy Act applies only to consumers acting in an individual or household context and excludes individuals acting in an employment context. As a result, the draft CCPA regulations on ADMT would have a wider impact on businesses by affecting their activities as employers as well.

Profiling Consumers in a Publicly Accessible Place

The draft regulations would give consumers the right to opt out of the use of ADMT to profile them while they are in a “publicly accessible place,” which is defined as a “place that is open to or serves the public.” Some may be surprised by the breadth of the examples included in the definition, which lists shopping malls, stores, restaurants, cafes, movie theaters, amusement parks, convention centers, stadiums, gymnasiums, hospitals, medical clinics or offices, transportation depots, transit, streets, or parks. This extensive definition of publicly accessible places, and the accompanying list of examples, may present a new challenge for businesses addressing these types of opt-out requirements for the first time in this context. For example, this opt-out right would affect businesses using facial-recognition technology or automated emotion assessment in publicly accessible places like shopping centers.

Other Potential Opt-Out Rights from Use of ADMT

Additionally, the draft regulations propose several other potential opt-out rights from the use of ADMT for discussion by the CPPA Board:

  • Profiling for Behavioral Advertising

The draft regulations include an option that would give consumers the right to opt out of profiling for behavioral advertising, such as ADMT that evaluates preferences and interests to display online advertisements. While the opt-out rights for other types of profiling do not have to be provided if the business can rely on an exception, the draft regulations provide that businesses using profiling for behavioral advertising could not rely on an exception and would need to honor opt-out requests for this type of profiling.

It is also interesting that while the draft regulations use the term “behavioral advertising,” the CCPA itself uses the term “cross-context behavioral advertising” (emphasis added). It is not clear whether the CPPA intentionally chose a term different from the CCPA-defined “cross-context behavioral advertising” for this opt-out right, or whether this is simply a drafting oversight.

  • Profiling Individuals Under 16

The draft regulations include an option that would give consumers the ability to opt out of any profiling where the business has actual knowledge that the consumer is under the age of 16. While the other ADMT opt-out rights for profiling are limited in scope to profiling in certain contexts, this would be a full-stop ability to opt out of any profiling where a business has actual knowledge that a consumer is under the age of 16. For consumers under the age of 13, this would change to an obligation to obtain opt-in consent from the consumer’s parent or guardian, in addition to any verifiable parental consent that may be required under the Children’s Online Privacy Protection Act (COPPA).

  • Processing to Train ADMT

Lastly, the draft regulations provide an option that would give consumers the right to opt out of the use of their personal information to train ADMT.

Pre-use Notice Requirement

The draft regulations would require businesses to provide a detailed “pre-use notice.” The notice would include a description of the right to access information about the use of ADMT, the right to opt out of certain uses of ADMT, and the purpose for which the ADMT will be used, along with additional information such as a “plain language” explanation of the logic used in the ADMT (including key parameters that affect the output), the intended output of the ADMT, how the business plans to use the output to make a decision, and whether the use of ADMT has been evaluated for validity, reliability, and fairness. The challenge of providing a plain language explanation of ADMT has long been discussed, and the draft regulations would present the additional challenge of how to present these explanations as part of a pre-use notice.

The draft regulations make a point of requiring a descriptive purpose in the pre-use notice. Specifically, the draft calls out that generic purpose descriptions, such as “to improve our services,” would be insufficient to meet the plain language requirement. Businesses may know that drafting sufficient purpose language for such notices can be difficult, and this proposed requirement deserves attention because it appears to demand greater clarity in purpose descriptions than businesses may currently provide.

Notice of Adverse Decision Based on ADMT

If a business makes a decision using ADMT that results in the denial of goods, services, or other opportunities (e.g., employment opportunities) to the consumer, the draft regulations would require the business to notify the consumer of that decision, including information about their right to request access to information regarding the business’ use of ADMT in making that decision and how the consumer may file a complaint with the CPPA and California Attorney General.

Exemptions

While there are several proposed requirements that may present a new aspect of compliance for businesses, there are some additional clarifications on exemptions and exceptions that may be helpful. The draft proposes that businesses would not need to offer a right to opt out of the use of ADMT if that use is consistent with the reasonable expectations of consumers, as outlined in CCPA Regulation § 7002, and the ADMT is used solely for a permissible purpose. Permissible purposes include, for example, preventing security incidents or resisting malicious, deceptive, fraudulent, or illegal actions directed at the business.

What's Next

The CPPA clarifies that it has not yet started the formal rulemaking process, which will include a public comment period, but these draft regulations will be further discussed at the next CPPA meeting on December 8, 2023.

 

Tags

ai, cyber security, data, data protection