California’s New Draft Regulations on AI: What You Should Know

Client Alert

This week, the California Privacy Protection Agency (CPPA) released preliminary draft regulations on automated decisionmaking technology (or "ADMT") that would require disclosures and associated opt-out and access rights related to businesses' use of artificial intelligence (AI) and AI-adjacent technologies. The draft regulations are far from final: Released in advance of the CPPA's December 8, 2023 board meeting, the draft is expressly intended to facilitate Board discussion and public participation before the formal rulemaking process begins.

The draft marks a major step forward in the efforts by federal, state and international governments to regulate uses of AI while balancing consumer protection with the technology's vast promise. For example, European regulators have spent much of the year negotiating, and working to finalize, the European Union's sweeping Artificial Intelligence Act ("AI Act"), though that process will now extend into 2024. In September, California Governor Gavin Newsom signed an executive order that directed state agencies to study the security and privacy risks of AI and authorized state employees to experiment with integrating AI tools into state operations. And last month, President Biden released an executive order addressing a wide range of AI-related challenges and proposing a coordinated approach to ensure responsible government use of AI.

Here’s what you should know: 

1) What is automated decisionmaking technology (ADMT)?

The draft regulations broadly define ADMT to include “any system, software, or process—including one derived from machine-learning, statistics, or other data-processing or artificial intelligence—that processes personal information and uses computation as whole or part of a system to make or execute a decision or facilitate human decisionmaking.” ADMT also includes profiling, which is defined as “any form of automated processing of personal information to evaluate certain personal aspects relating to a natural person and in particular to analyze or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location, or movements.”

The definition's inclusion of technology that aids in human decisionmaking extends beyond the focus of the European Union's General Data Protection Regulation (GDPR) and the UK GDPR on fully automated decision-making. In this respect, the proposed regulations align more closely with the Colorado Privacy Act (CPA), which governs "human reviewed automated processing" and "human involved automated processing" in addition to "solely automated processing." But unlike the CPA, the draft regulations do not differentiate between fully and partially automated decision-making.

2) Regulated uses 

The draft regulations focus on ADMT uses that may significantly impact consumers, including “decision[s] that produce[ ] legal or similarly significant effects concerning a consumer,” such as decisions to provide or deny financial or lending services, housing, insurance, education, criminal justice, employment or compensation, health care services, or essential goods or services. Other regulated uses include profiling a consumer while the consumer is acting in their capacity as an employee, job applicant or student; or in a publicly accessible place. The focus on potential legal and similarly significant impacts follows the lead of the Virginia Consumer Data Protection Act (“VCDPA”), which went into effect at the beginning of this year, as well as its analogs in other states such as Colorado and Connecticut.

The CPPA's governing board will discuss whether the regulations should apply to profiling a consumer for behavioral advertising purposes or profiling a consumer whom the business has actual knowledge is under the age of 16. The draft also proposes potential options for additional consumer protections around the use of personal information to train ADMT.

3) “Pre-use Notice” obligations

The proposed rules deploy a series of disclosure obligations and consumer rights that recall the current generation of privacy laws. Specifically, the CPPA would require businesses to provide a “Pre-use Notice” to inform consumers about the business’s use of ADMT. These mandated notices would include:

  • A “plain language” explanation of the purposes for which the business may be using ADMT;
  • A description of the consumer’s right to opt out of the business’s use of ADMT and instructions for how to submit an opt-out request;
  • A description of the consumer’s right to access information about the business’s use of ADMT and instructions for how to submit an access request; and
  • A “simple and easy-to-use method” for the consumer to obtain additional information about the business’s use of ADMT, including the technology’s logic and intended output, the role of any human involvement, and any assessments of the technology’s validity, reliability and fairness.

4) Opt-out rights

The regulations would provide consumers the right to opt out of certain uses of ADMT, except where such use is “necessary to achieve and used solely for” security, fraud detection, protection of consumers’ life and physical safety, or essential service delivery.

Businesses would be required to designate two or more methods for consumers to submit opt-out requests that are “easy for the consumer to execute.” Businesses that interact with consumers online must provide an interactive web form for consumers to submit such opt-out requests. 

5) Access rights

Consumers would also have the right to access information about a business's ADMT usage, including why it uses the technology, the intended output of the technology, and how the business has used or plans to use the output to make decisions with respect to the consumer. A business responding to such a request would also be required to provide the consumer with instructions on how to exercise other rights under the California Consumer Privacy Act ("CCPA") and how to file a complaint about the business's use of ADMT with the business, the CPPA and the California Attorney General's Office.

Where a business makes an ADMT-related decision to deny a consumer goods or services that produces one of the legal or similarly significant effects described above, the business must notify the consumer of the decision, explain the consumer's access right (including how to exercise it), and provide the consumer with information on how to file a complaint.

6) Risk assessments

The draft ADMT regulations are intended to work alongside the draft risk assessment regulations released by the CPPA in September. Businesses would be required to conduct risk assessments where the processing of personal information "presents significant risk to consumers' privacy," including through uses of ADMT subject to the proposed regulations.

What’s next?

The CPPA's governing board is scheduled to discuss the draft regulations at its December 8, 2023 meeting, and the formal rulemaking process is expected to begin in 2024. While far from final, the CPPA's new draft regulations should be added to the growing list of nonbinding draft regulations and industry standards from which businesses may draw in modeling their AI governance programs. We will continue to monitor the California rulemaking process and provide additional guidance along the way.

For more information and resources about Manatt's AI practice, please visit our dedicated Artificial Intelligence webpage.
