A New Landmark for Consumers’ Control Over Their Personal Information: CPPA Proposes Regulatory Framework for Automated Decision-making Technology
Today, the California Privacy Protection Agency released draft automated decision-making technology (ADMT) regulations that define important new protections related to businesses’ use of these technologies. The proposed regulations would implement consumers’ right to opt out of, and access information about, businesses’ uses of ADMT, as provided for by the California Consumer Privacy Act (CCPA). The Agency Board will provide feedback on these proposed regulations at the December 8, 2023 board meeting, and the Agency expects to begin formal rulemaking next year.
“Once again, California is taking the lead to support privacy-protective innovation in the use of emerging technologies, including those that leverage artificial intelligence,” said Vinhcent Le, Member of the California Privacy Protection Agency’s Board and the New Rules Subcommittee that drafted the proposed regulations. “These draft regulations support the responsible use of automated decision-making while providing appropriate guardrails with respect to privacy, including employees’ and children’s privacy.”
“Automated decision-making technologies and artificial intelligence have the potential to transform key aspects of our lives. We’re proud that California is meeting the moment by giving consumers more control over these technologies,” said Ashkan Soltani, Executive Director of the California Privacy Protection Agency. “We thank staff and the New Rules Subcommittee for their incredible work on the draft regulations and look forward to receiving additional input from the Agency Board and the public as we move through the appropriate process.”
The draft regulations outline how the new privacy protections that Californians voted for in 2020 could be implemented. Specifically, they propose requirements for businesses using ADMT in any of the following ways:
- For decisions that tend to have the most significant impacts on consumers' lives. This would include, for example, decisions about their employment or compensation.
- Profiling an employee, contractor, applicant, or student. This would include, for example, using a keystroke logger to analyze their performance or tracking their location.
- Profiling consumers in publicly accessible places, such as shopping malls, medical offices, and stadiums. This would include, for example, using facial-recognition technology or automated emotion assessment to analyze consumers’ behavior.
- Profiling a consumer for behavioral advertising. This would include, for example, evaluating consumers’ personal preferences and interests to display advertisements to them.
The draft also proposes potential options for additional protections around businesses’ use of consumers’ personal information to train these technologies.
For the above uses of ADMT, the draft regulations would provide consumers with the following protections:
- Businesses would be required to provide “Pre-use Notices” informing consumers how the business intends to use ADMT, so that consumers can decide whether to opt out or proceed, and whether to access more information.
- The ability to opt out of the business’s use of ADMT (except in certain cases, such as to protect life and safety).
- The ability to access more information about how the business used ADMT to make a decision about the consumer.
These draft requirements would work in tandem with the risk assessment requirements that the Board is also considering at the December 8, 2023 board meeting. Together, these proposed frameworks would provide consumers with control over their personal information while ensuring that automated decision-making technologies, including those powered by artificial intelligence, are designed and used with privacy in mind.
Contact: press@cppa.ca.gov