A Fresh Take

Insights on US legal developments

More California Privacy Rules: California Agency Issues Proposed Regulations on Automated Decisionmaking, Cybersecurity Audits, and Risk Assessments

The California Privacy Protection Agency (CPPA) has opened the public comment period on its long-awaited proposed regulations on automated decisionmaking technology (ADMT), cybersecurity audits, privacy risk assessments, and general application of the CCPA to insurance companies (the “draft regulations”). If adopted in their current form, these draft regulations would impose substantial new obligations on companies subject to the California Consumer Privacy Act (CCPA), including detailed notice and procedural requirements for use of ADMT and formal cybersecurity audit and/or privacy risk assessments for many covered businesses.

Below, we provide an overview of key topics in the current draft regulations and next steps.

New Proposed Rules for Automated Decisionmaking Technology

The draft regulations regarding ADMT would apply to businesses using ADMT for “significant decisions” that have a “legal or similarly significant effect” on consumers. “Significant decisions” are defined by the draft regulations to include decisions related to essential goods and services and criminal justice, as well as financial or lending services, insurance, healthcare, housing, educational, employment, or independent contracting opportunities. Decisions affecting compensation or work status, or that involve profiling in the workplace or educational settings or profiling for use in targeted advertising and marketing, are also likely to fall within the ambit of the regulations.

The draft regulations would require businesses to provide a clear and conspicuous Pre-use Notice to consumers before using ADMT for such “significant decisions.” This notice would, among other things:

  • Explain that ADMT is being used and describe the purpose;
  • Describe the logic or key parameters involved in the decision-making process;
  • Describe what the ADMT is designed to output or generate (e.g. a numerical score) and how that will be used;
  • Where applicable, explain the consumer's right to opt out of the use of ADMT or to appeal the decision;
  • Describe the consumer's right to access ADMT (see below); and
  • State that the business is prohibited from retaliating against a consumer for exercising any CCPA rights.

This Pre-use Notice addresses two other rights provided by the draft regulations. First, consumers must be given a right to “access ADMT,” meaning the right to an explanation of, and relevant information about, the ADMT. The information that must be provided significantly overlaps with the information required in the Pre-use Notice. Second, consumers must be provided a right to opt out of ADMT, with limited exceptions, for example, where ADMT is used for fraud detection and prevention or where the business provides the consumer with an opportunity to appeal the decision to a human reviewer. Other exceptions may apply in defined circumstances, and regulated companies should consult counsel to ensure proper application of these exceptions when finalized. If the right to opt out of ADMT applies, the business must provide at least two means for the consumer to opt out, including one via the primary medium through which the business interacts with the consumer (e.g. online via a link in the Pre-use Notice).

Finally, when a business is using physical or biological identification or profiling in its ADMT, it must conduct an evaluation to ensure that the ADMT works as intended and does not result in discrimination based on protected characteristics. 

Cybersecurity Audits

The draft regulations would require covered businesses to conduct annual cybersecurity audits if they engage in activities where the “processing of consumers’ personal information presents significant risk to consumers’ security,” including businesses that: 

  • Process personal information of 250,000 or more consumers or households or the sensitive personal information of 50,000 or more consumers; or 
  • Derive 50 percent or more of their annual revenue from selling or sharing consumers’ personal information. 

These audits would require a qualified, objective, independent auditor applying generally accepted standards, such as those established by the National Institute of Standards and Technology (NIST) or the International Organization for Standardization (ISO), to evaluate the adequacy of the business's technical, administrative, and physical safeguards for protecting personal information. Each audit would need to include the name, affiliation, and relevant qualifications of each auditor, as well as a certification that each auditor completed an independent, objective, and impartial review and did not primarily rely on assertions or attestations by the business's management. Audit results would need to be reported to the business's board, governing body, or highest-ranking executive responsible for the cybersecurity program. The scope of the audit includes, but is not limited to, authentication, encryption, zero trust architecture, access controls, asset inventory and management, vulnerability scans, penetration testing, network segmentation, oversight of service providers, and data retention schedules, as well as assessing the effectiveness of incident response, business continuity, and disaster recovery protocols.

The findings of the audit must be documented in a report, which would need to assess the effectiveness of the business's cybersecurity program, identify any gaps and the measures taken to address those gaps, note the titles of the individuals responsible for the cybersecurity program, and include the date that the report was presented to the board, governing body, or highest-ranking executive responsible for the program. Businesses required to perform cybersecurity audits would also be required to submit a written certification of completion to the CPPA annually.

Risk Assessments

The draft regulations would require many, if not most, businesses subject to the CCPA to undertake formal privacy-related risk assessments, and to submit annual certifications and documentation about these assessments to the CPPA.

Under the draft regulations, businesses subject to the CCPA would be required to conduct risk assessments for any of the following types of activities:

  • “selling” personal information (within the broad meaning of this term under the CCPA) or “sharing” personal information for cross-contextual behavioral advertising;
  • processing sensitive personal information (such as Social Security numbers or other government identifiers, health-related data, precise geolocation data, biometric data, and information about children under the age of 16);[1]
    • Even businesses that do not collect such information in the commercial context are likely to collect sensitive personal information in the employment context, such as Social Security numbers for tax reporting and citizenship or immigration status to verify right to work in the US. Thus, covered businesses with employees who are California residents are likely to need to perform privacy risk assessments in the employment context.
  • using ADMT for a “significant decision” concerning a consumer or for “extensive profiling,” as discussed above; or
  • processing personal information to train ADMT or AI that is capable of being used to establish individual identity, for the generation of a deepfake, or for the operation of generative models, such as large language models.

Given the breadth of these covered activities (such as sales/sharing related to targeted advertising), it is likely that most CCPA-covered businesses would be required to perform regular risk assessments. These risk assessments would need to include detailed information about the processing, address a number of operational elements, and assess risks and safeguards. Additional requirements would apply to risk assessments for ADMT-related activities.

Businesses would need to perform risk assessments before initiating the covered processing activities, and to review and update them at least every three years, or sooner whenever there is a material change in the processing. The draft regulations also detail requirements for annual certifications and submissions to the CPPA.

Next Steps

The public will have the opportunity to provide formal written comments through January 14, 2025, at 6 p.m. PST. The CPPA will also hold a virtual public hearing for oral comments on January 14, 2025.

[1] “Sensitive personal information” includes Social Security numbers or other government-issued identification numbers, health-related data, precise geolocation data, biometric or genetic data, racial or ethnic origin, citizenship or immigration status, religious or philosophical beliefs, sexual orientation, union membership, account log-in credentials, contents of certain consumer communications unless the business is the intended recipient of the communication, and information about children under 16.
