by Traverse Legal, reviewed by Enrico Schaefer - March 25, 2026 - Business Law, Corporate Law, Privacy
The California Privacy Protection Agency (CPPA) finalized the 2026 regulation package in September 2025, and the rules took effect on January 1, 2026. The CCPA 2026 Regulations do more than update existing language. They change how businesses handle consumer requests, privacy notices, and user choice mechanisms. They also add compliance duties tied to automated decision-making technology (ADMT), cybersecurity audits, risk assessments, and the scope of CCPA coverage for insurance companies.
For businesses that collect personal information from California residents, these rules now shape both legal compliance and operational design. Privacy compliance no longer stops at a privacy policy and footer link. It now reaches request workflows, product design, internal controls, and data governance.
The timing split matters. Core updates took effect on January 1, 2026. Other obligations follow later deadlines.
Key dates include:
- January 1, 2026: the core regulation updates take effect.
- January 1, 2027: the automated decision-making technology (ADMT) rules become a major compliance issue.
- April 1, 2028: businesses subject to the risk assessment rules begin submitting summaries and attestations.
- Cybersecurity audit certifications follow a phased schedule with later deadlines.
That structure makes 2026 the year businesses need to fix current compliance gaps and prepare for the next layer of obligations.
The 2026 rules update several existing CCPA duties. They tighten how businesses receive, process, and respond to consumer requests. They also expand what businesses must do for opt-out requests, privacy disclosures, and correction workflows.
Some of the biggest changes include:
- Consumers must be able to confirm whether an opt-out request, including one sent through Global Privacy Control, has been honored.
- Requests to know can reach personal information going back to January 1, 2022, where retention exceeds 12 months.
- Corrected information must stay corrected, even when recurring sources resend inaccurate data.
- Methods for submitting requests and obtaining consent must be easy to understand and symmetrical.
These are not minor updates. They require businesses to build request workflows that work across systems and to preserve corrected data over time.
California also tightened the rules on privacy choice design. Methods for submitting CCPA requests and obtaining consent must be easy to understand and symmetrical. A business cannot make the privacy-protective choice harder, slower, or less visible than the alternative.
This rule targets dark patterns directly. It affects cookie banners, preference centers, account settings, and other privacy controls. If acceptance is simple but opt-out is buried or confusing, the design creates risk.
The 2026 package also expands California privacy compliance into new areas. The California Privacy Protection Agency (CPPA) approved rules covering:
- automated decision-making technology (ADMT);
- cybersecurity audits;
- risk assessments;
- when insurance companies must comply with the CCPA.
Some less visible changes also matter. A consumer denied a request to correct health information may submit a written statement of up to 250 words contesting accuracy. Personal information of consumers under 16 is now treated as sensitive personal information for right to limit analysis.
These rules reach far beyond privacy notices. They affect consumer rights, correction procedures, youth data, automated systems, risk review, and security governance.
Beginning January 1, 2026, a business must give consumers a way to confirm whether an opt-out request has been honored, including a request submitted through Global Privacy Control. The CPPA says a business may do this by displaying “Opt-Out Request Honored” or showing the status through a privacy setting.
This requirement goes beyond collecting a signal. The business must connect the request path to backend logic that reflects the consumer’s actual status. If it cannot show whether the request took effect, the process remains incomplete.
The rules also block a common mistake. When a business knows the consumer’s identity, it cannot treat the later absence of an opt-out preference signal as consent to opt back in.
The 2026 regulations expand the reach of requests to know. If a business retains personal information for more than 12 months, the consumer must be able to request access to information going back to January 1, 2022.
This change puts pressure on retention and retrieval practices. The business must know:
- what personal information it retains and for how long;
- where that information lives across systems and vendors;
- how to retrieve and produce responsive records going back to January 1, 2022.
The regulations also limit a business’s ability to rely on poor process as an excuse. A business cannot claim disproportionate effort where it failed to put adequate request handling procedures in place.
Some of the most important 2026 changes sit outside the usual banner and consent discussion.
First, corrected information must remain corrected. If a business receives inaccurate data from a recurring source, it must prevent that data from overriding a correction already made. The business must also either identify the source to the consumer or inform the source that the information is inaccurate and must be corrected.
Second, if a business denies a request to correct health information, the consumer may submit a written statement of up to 250 words contesting the accuracy of the information. If the consumer requests it, the business must make that statement available to any person to whom it disclosed the contested information.
Third, personal information of consumers under 16 is now treated as sensitive personal information and may trigger the right to limit. Businesses with teen users or family account structures should review whether their current data use practices still fit within the permitted uses in the regulations.
The 2026 regulations move privacy compliance into interface design. Methods for submitting CCPA requests and obtaining consent must be easy to understand and symmetrical. A business cannot make the privacy-protective choice longer, harder, or more time-consuming than the less protective option.
The regulations give clear examples of what fails this test. These designs create risk:
- an opt-out path that takes more steps, clicks, or time than the path to accept;
- toggles or buttons with unclear labels that leave the consumer guessing what each choice does;
- designs that bury, obscure, or delay the privacy-protective option while keeping acceptance simple.
Businesses should review every consumer-facing privacy control, not only the main cookie banner. That includes preference centers, pop-ups, mobile toggles, account settings, checkout prompts, and financial incentive flows. The CPPA’s enforcement advisory warns businesses to offer symmetrical choices and clear language.
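A design review can make the symmetry test concrete by comparing the effort each path demands. The sketch below is purely illustrative: the step counts, flow names, and the `is_symmetrical` helper are my own assumptions, not anything drawn from the regulations.

```python
# Hypothetical design-review check: the privacy-protective path (opt-out)
# should not require more effort than the less protective path (accept).

def is_symmetrical(accept_steps: int, opt_out_steps: int) -> bool:
    """Return True when opting out takes no more steps than accepting."""
    return opt_out_steps <= accept_steps


# Illustrative inventory of consumer-facing flows: (accept steps, opt-out steps).
flows = {
    "cookie banner": (1, 3),      # accept in 1 click, opt out in 3: fails
    "preference center": (2, 2),  # equal effort: passes
}

# Flag every flow where the opt-out path is longer than acceptance.
failures = [name for name, (a, o) in flows.items() if not is_symmetrical(a, o)]
```

A check like this will not settle close cases, but it catches the obvious failure the CPPA has flagged: an acceptance path that is materially shorter than the opt-out path.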
The regulations also impose specific rules for apps and nontraditional interfaces. For mobile apps, the privacy policy must be accessible through the app’s platform or download page and through a link within the app, such as in the settings menu. A website footer will not cure an in-app notice gap.
Connected devices create a timing issue. When a business sells or shares personal information collected through a connected device, the consumer must encounter the notice before or at the time the device begins collecting that information. The same timing rule applies in augmented and virtual reality environments. When covered collection, sale, sharing, or certain sensitive information uses occur there, the consumer must encounter the notice before entering the environment or before encountering the business in that environment.
Businesses should review:
- where the privacy policy link appears in each mobile app, including the settings menu and the platform download page;
- how and when connected devices present notice before collection begins;
- how notices surface in augmented and virtual reality environments before the consumer enters them or encounters the business there.
Global Privacy Control now sits at the center of California opt-out compliance. Businesses must treat a qualifying opt-out preference signal as a valid request to opt out of sale and sharing. They must also provide a way for consumers to confirm the status of that request.
That means signal detection alone is not enough. The business needs a working chain from browser or device signal, to backend suppression logic, to a status display that the consumer can understand. If the consumer cannot tell whether the opt-out took effect, the process is weak.
Businesses should test GPC across:
- the browsers and extensions that send the signal;
- desktop and mobile experiences;
- logged-in and logged-out sessions tied to the same consumer.
The regulations also block a common shortcut. When a business knows the consumer’s identity, the absence of an opt-out preference signal does not allow the business to treat the consumer as opted back in by default. A prior opt-out choice still matters.
This is especially important in account-based environments. A consumer may send a signal in one session and log in later from a device where the signal is not present. The business cannot treat that silence as permission to resume sales or sharing if it has already tied the earlier opt-out to a known consumer.
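One way to picture the required chain, from signal detection to suppression logic to a confirmable status, is a minimal backend sketch. Everything here is hypothetical: `OptOutStore`, its methods, and the status strings are illustrative assumptions. The two points it encodes do come from the rules described above: a qualifying signal (commonly sent as the `Sec-GPC: 1` header) must be honored as an opt-out, and its later absence cannot opt a known consumer back in.

```python
# Illustrative sketch of honoring a Global Privacy Control signal.
# Class and method names are assumptions, not regulatory or library terms.

from dataclasses import dataclass, field


@dataclass
class OptOutStore:
    """Tracks opt-out status per known consumer identity."""
    opted_out: set[str] = field(default_factory=set)

    def record_gpc_signal(self, consumer_id: str) -> None:
        # A qualifying opt-out preference signal is a valid request to
        # opt out of sale and sharing.
        self.opted_out.add(consumer_id)

    def status(self, consumer_id: str) -> str:
        # Backend status a consumer-facing page could display.
        if consumer_id in self.opted_out:
            return "Opt-Out Request Honored"
        return "No Opt-Out on File"

    def may_sell_or_share(self, consumer_id: str, gpc_header_present: bool) -> bool:
        if gpc_header_present:
            self.record_gpc_signal(consumer_id)
        # Key rule: for a known consumer, the absence of the signal in a
        # later session does NOT undo an earlier opt-out.
        return consumer_id not in self.opted_out
```

In use, a signal in one session suppresses sale and sharing, and a later signal-free session from the same known consumer stays suppressed; the `status` string gives the consumer something to confirm.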
If a business retains personal information for longer than 12 months, it may need to produce personal information going back to January 1, 2022, in response to a request to know. That makes retrieval readiness a legal issue, not only a records issue.
A business needs to know:
- which systems and vendors hold retained personal information;
- how far back its records actually go;
- how quickly it can retrieve and produce responsive data.
The regulations also limit a common excuse. A business cannot rely on disproportionate effort if it fails to put adequate request-handling processes in place.
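A minimal sketch of the lookback logic follows, under the assumption that the business tracks a collection date for each record. The function and field names are mine, not the regulation's, and the 12-month comparison is deliberately simplified.

```python
# Hypothetical sketch: selecting records responsive to a request to know.
# If retention exceeds 12 months, the lookback reaches January 1, 2022.

from datetime import date

LOOKBACK_FLOOR = date(2022, 1, 1)  # earliest date the 2026 rules can reach


def responsive_records(records: list[dict], today: date) -> list[dict]:
    """Return records a business may need to produce for a request to know."""
    # Simplified 12-month cutoff (ignores leap-day edge cases).
    twelve_months_ago = date(today.year - 1, today.month, today.day)
    # Does the business retain anything older than 12 months?
    retains_beyond_12_months = any(
        r["collected"] < twelve_months_ago for r in records
    )
    # Long retention pulls the lookback floor all the way to Jan 1, 2022.
    floor = LOOKBACK_FLOOR if retains_beyond_12_months else twelve_months_ago
    return [r for r in records if r["collected"] >= floor]
```

The point of the sketch is the branch: once any record is held past 12 months, the production window widens to the 2022 floor, which is why retrieval readiness is a legal issue and not only a records issue.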
Correction rights now demand stronger controls. Corrected information must remain corrected. If a business accepts a correction request and later allows bad source data to overwrite that correction, the problem remains unresolved.
The CPPA also requires source-level action. Where inaccurate information came from another source, the business must either provide the consumer with the source’s name or inform the source that the information is inaccurate and must be corrected.
Health information adds another layer. If a business denies a request to correct health information, the consumer may submit a written statement of up to 250 words contesting the accuracy of the information. If the consumer asks, the business must make that statement available to any person to whom it disclosed the contested information.
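As a sketch of what "corrected information must remain corrected" can mean in practice, one approach is to lock corrected fields against recurring feeds and attach any consumer statement to the record. The class below is an illustrative assumption, not a prescribed design.

```python
# Hypothetical record store: corrected fields are locked against overwrite
# by recurring source data, and disputed records can carry a consumer
# statement of up to 250 words.

class ConsumerRecord:
    def __init__(self) -> None:
        self.fields: dict[str, str] = {}
        self.corrected: set[str] = set()  # fields locked by a granted correction
        self.statements: list[str] = []   # consumer statements on disputed data

    def apply_correction(self, name: str, value: str) -> None:
        self.fields[name] = value
        self.corrected.add(name)

    def ingest_from_source(self, name: str, value: str) -> bool:
        # Recurring source data must not override a correction already made.
        # A False return is the trigger to notify or identify the source.
        if name in self.corrected:
            return False
        self.fields[name] = value
        return True

    def attach_statement(self, statement: str) -> None:
        # A denied health-information correction may carry a written
        # statement of up to 250 words contesting accuracy.
        if len(statement.split()) > 250:
            raise ValueError("statement exceeds 250 words")
        self.statements.append(statement)
```

The rejected-write path matters as much as the lock itself: it is the natural hook for the source-level duty to identify the source to the consumer or tell the source the data is inaccurate.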
California’s ADMT rules become a major compliance issue on January 1, 2027. The final regulations define automated decision-making technology as technology that processes personal information and uses computation to replace human decision-making or substantially replace human decision-making. They also define meaningful human involvement. The human reviewer must know how to interpret and use the output, review the output and other relevant information, and have the authority to make or change the decision.
That definition narrows the focus. The rule does not cover every tool with analytics or machine learning features. It targets systems used to make significant decisions about a consumer, especially when the technology drives the outcome without meaningful human review. The regulations identify significant decisions in areas such as lending, housing, education, employment, and healthcare.
For most businesses, the right questions are straightforward:
- Does the system process personal information?
- Does it make a significant decision about a consumer, or substantially replace human decision-making in one?
- Is there meaningful human involvement, with a reviewer who can interpret the output, review it alongside other relevant information, and change the decision?
The ADMT rules also create consumer-facing obligations. Businesses using ADMT for significant decisions must provide a pre-use notice, offer access rights tied to the use of the technology, and in many cases provide a right to opt out. California has moved automated decision-making into the broader CCPA framework for notice, choice, and request handling.
The rule still has limits. Advertising alone does not qualify as a significant decision under the final text. Businesses should still review ad tech for other CCPA issues, but they should not stretch the ADMT rule beyond what the regulation covers.
Businesses using automated tools in high-impact decision settings should map those systems now, assess whether human review is meaningful, and prepare notices and request workflows before 2027. That timing point is an inference based on the final compliance date and the scope of the required obligations.
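The scoping questions above can be reduced to a rough screen. This is a simplification of the regulatory definitions, the field names are my own shorthand, and borderline systems still need legal review.

```python
# Hypothetical scoping screen for the 2027 ADMT duties.
# Field names are shorthand, not the regulation's defined terms.

from dataclasses import dataclass


@dataclass
class SystemProfile:
    processes_personal_info: bool
    replaces_human_decision: bool       # fully or substantially
    significant_decision: bool          # e.g. lending, housing, employment
    meaningful_human_involvement: bool  # reviewer can interpret, review, override


def admt_obligations_likely(p: SystemProfile) -> bool:
    """Rough screen for whether the ADMT duties deserve a closer look."""
    return (
        p.processes_personal_info
        and p.replaces_human_decision
        and p.significant_decision
        and not p.meaningful_human_involvement
    )
```

Note the last condition: genuinely meaningful human involvement takes a system out of the core ADMT concern, which is why assessing whether review is meaningful, not merely present, is the hard part of the mapping exercise.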
California now treats risk assessments as a formal compliance control. Starting January 1, 2026, a business must conduct a risk assessment before starting several covered activities, including selling or sharing personal information, processing sensitive personal information, and using or training certain automated technologies. The CPPA’s guidance also says the assessment must identify the business purpose, the personal information and operational elements involved, the benefits and negative impacts of the activity, and the safeguards used to address those impacts.
This rule goes beyond paperwork. The assessment is part of the decision process, not a document created later to justify what the business already chose to do. The regulations ask whether the activity creates a significant risk to consumers’ privacy or security and whether safeguards reduce that risk enough to proceed.
For many businesses, the first trigger points will be:
- selling or sharing personal information;
- processing sensitive personal information;
- using or training certain automated technologies.
The reporting timeline shows that California views risk assessments as an ongoing governance system. The regulations took effect on January 1, 2026, but businesses subject to the risk assessment rules do not begin submitting summaries and attestations until April 1, 2028. The duty to conduct the assessment for covered activities starts first.
Scope also matters. California remains broader than many privacy regimes because employment-related and business-to-business data remain in scope under the CCPA framework. Companies focused only on marketing data may miss where their actual exposure sits.
A business needs a repeatable method to identify triggering activities, gather the required facts, evaluate consumer impact, and preserve the record. That practical point follows from the content that the regulations require in a risk assessment.
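One way to make the method repeatable is to capture each assessment in a structured record whose fields track the elements the CPPA guidance describes. The structure below is a hypothetical sketch, not a prescribed template; the field names paraphrase the guidance summarized above.

```python
# Hypothetical record structure for a CCPA risk assessment.
# Field names paraphrase the elements described in CPPA guidance.

from dataclasses import dataclass


@dataclass
class RiskAssessment:
    activity: str                    # the covered activity being assessed
    business_purpose: str
    personal_information: list[str]  # categories of data and operational elements
    benefits: list[str]
    negative_impacts: list[str]
    safeguards: list[str]            # controls addressing the negative impacts

    def missing_elements(self) -> list[str]:
        """List required elements that are still empty, to gate the decision."""
        required = {
            "business_purpose": self.business_purpose,
            "personal_information": self.personal_information,
            "benefits": self.benefits,
            "negative_impacts": self.negative_impacts,
            "safeguards": self.safeguards,
        }
        return [name for name, value in required.items() if not value]
```

Because the assessment is part of the decision process, a gate like `missing_elements()` belongs before the activity launches, not in a file created afterward.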
The cybersecurity audit rules also follow a phased schedule. The CPPA states that the regulations took effect on January 1, 2026, but businesses get additional time to comply with the audit requirements: the certification deadlines are staggered over the following years, with the largest businesses certifying first.
Revenue alone does not decide the issue. The regulations tie the audit requirement to businesses whose processing of personal information presents a significant risk to consumers’ security. Companies need to examine both revenue and the nature of the processing at issue.
Planning needs to start early. Audit readiness depends on knowing which systems, datasets, and controls fall within scope and what evidence supports the company’s cybersecurity program. That planning point is an inference drawn from the structure of the audit and certification requirements.
The audit requirement calls for a structured review, not a general statement that the company takes security seriously. The final regulations define a cybersecurity program as the policies, procedures, and practices that protect personal information from unauthorized access, destruction, use, modification, or disclosure, and from unauthorized activity resulting in loss of availability. They also define a cybersecurity audit report and specify what it must contain.
The rules also require executive certification. That shifts cybersecurity audits into a governance issue for legal, compliance, and leadership teams.
If a company has fragmented controls, weak documentation, or poor visibility into how vendors and internal systems handle personal information, the audit process will expose those gaps. That is an inference based on the certification structure and report requirements in the final regulations.
The final regulation package also addresses narrower issues that can create real compliance exposure. Insurance is one of the clearest examples. The CPPA states that the regulations clarify when insurance companies must comply with the CCPA. That matters because the California Insurance Code does not remove every insurance-related data practice from CCPA analysis. Businesses in this sector need to map which personal information falls under insurance-specific rules and which personal information remains subject to the CCPA.
Mobile apps bring their own disclosure obligations. When a business collects personal information through a mobile application, it may provide a link to the notice on the application’s download page and within the app, such as through the settings menu. App operators should make sure those links appear where users can actually find them.
Connected devices and immersive environments create a timing problem. The regulations require notice in a manner that ensures the consumer encounters it before or at the time the device begins collecting personal information that the business sells or shares. The same timing logic applies in augmented and virtual reality environments when covered collection and use occur there.
Financial incentive programs also deserve review. The CCPA allows businesses to offer financial incentives tied to the collection, retention, sale, or sharing of personal information. Those programs still need to fit within the broader rules on consent, consumer choice, and non-manipulative design. A rewards flow or sign-up prompt that pressures users into privacy waivers can create exposure. The last sentence is an inference based on the statute and the regulations’ symmetry rules.
The clearest enforcement risk in 2026 sits in the gap between what a business says and what its systems actually do. California has already warned businesses about dark patterns. In September 2024, the CPPA’s Enforcement Division stressed that privacy choices must be symmetrical and easy to understand. The agency also said dark patterns are judged by effect, not intent.
Recent enforcement confirms that point. In March 2025, the CPPA announced a settlement with Honda, alleging that Honda's online privacy management tool failed to offer privacy choices in a symmetrical or equal way. The CPPA also alleged that Honda required excessive personal information to process certain privacy requests and shared personal information with ad tech companies without contracts containing required terms.
Global Privacy Control creates another obvious risk area. The 2026 rules require businesses to honor qualifying opt-out preference signals and provide a way for consumers to confirm opt-out status. A business that claims to support GPC but cannot detect the signal, apply the suppression logic, or display the resulting status creates a visible compliance problem. The same is true when the privacy policy promises one thing while the website, app, or backend process delivers something else.
Documentation is its own enforcement issue. A business may believe it made reasonable compliance choices, but it will struggle to defend those choices without records showing what it did and why. California now ties compliance to process design, risk assessment content, cybersecurity audit reports, and consumer request handling.
If a regulator asks how a company honors GPC, preserves corrected information, maps retained data back to January 1, 2022, or decides whether an activity triggers a risk assessment, the business needs more than a general statement. It needs evidence. That can include:
- logs showing GPC signals detected and suppression applied;
- change histories showing corrected information preserved against incoming source data;
- data maps and retention schedules covering records back to January 1, 2022;
- risk assessment files identifying the activity, the data involved, and the safeguards chosen;
- records of who reviewed and approved each compliance decision.
The examples above are inferences based on the obligations imposed by the regulations and CPPA guidance.
Start with the interfaces consumers actually see. Audit cookie banners, privacy settings, account controls, incentive flows, and request forms for symmetry, clarity, and accurate labeling. Remove design choices that make opt-out harder than acceptance. Test those interfaces across desktop, mobile, and logged-in environments.
Next, test Global Privacy Control from end to end. Confirm that your systems:
- detect a qualifying opt-out preference signal;
- apply the opt-out to sale and sharing in backend suppression logic;
- display a status the consumer can confirm;
- never treat the later absence of a signal as permission to opt a known consumer back in.
These points come directly from the final regulations and CPPA guidance.
Then move to data governance. Confirm whether you retain personal information beyond 12 months and whether you can retrieve responsive data going back to January 1, 2022 when required. Review correction workflows to make sure corrected information stays corrected, incoming source data does not overwrite the correction, and disputed health information can carry a consumer statement when the rules require it.
After that, inventory future state risk. Identify any automated systems that may qualify as ADMT in significant decision settings. Map activities that trigger risk assessments, including sale or sharing, sensitive personal information processing, and certain automated or training-related uses. Assess whether your processing profile may bring the business into the cybersecurity audit framework.
Finally, preserve compliance evidence. Keep records showing what changed, when it changed, how the system behaves, what legal decisions were made, and who approved them. That record will matter if a regulator asks how the business implemented the CCPA 2026 Regulations instead of merely describing them. The final sentence is an inference drawn from the enforcement materials and the documentation-heavy structure of the regulations.
The CCPA 2026 Regulations changed more than the privacy notice language. They changed how businesses need to design interfaces, handle requests, preserve corrections, evaluate higher risk processing, and prepare for the next round of governance obligations. California has already shown through guidance and enforcement that weak design, weak signal handling, and weak process discipline can create real exposure.
That is why 2026 is the build year. Businesses need to fix the visible issues now and use the same year to map systems, tighten records, and prepare for ADMT, risk assessment, and cybersecurity audit requirements before those later deadlines mature. Companies that move early will put themselves in a stronger position than companies that wait for a request, complaint, or enforcement inquiry to expose the gaps.
Now is the time to review your privacy interfaces, test your request workflows, and identify which parts of your business may trigger the next round of California compliance obligations.
As a founding partner of Traverse Legal, PLC, Enrico Schaefer has more than thirty years of experience as an attorney for both established companies and emerging start-ups. His extensive experience includes navigating technology law matters and complex litigation throughout the United States.
This page has been written, edited, and reviewed by a team of legal writers following our comprehensive editorial guidelines. This page was approved by attorney Enrico Schaefer, who has more than 20 years of legal experience as a practicing Business, IP, and Technology Law litigation attorney.