Regulators are making it clear that 2026 will be a big year for scrutiny of algorithmic pricing: the use of personal data and algorithmic tools to set individualized prices. Armed with preexisting consumer protection and privacy laws, as well as new algorithmic-pricing-specific legislation, state and federal regulators are opening inquiries that highlight the risk of regulation by enforcement. This is an area to watch for consumer-facing companies with algorithmic or dynamic pricing models.
While dynamic or personalized pricing is quite common and does not necessarily violate existing laws, concerns have been raised that it may strike consumers as unfair. Algorithmic pricing reached national prominence in 2024, when the US Federal Trade Commission (FTC) initiated a 6(b) market study and released staff findings in January 2025. The report raised concerns about risks to consumers arising from the use of highly granular or sensitive personal data, such as precise location, mouse movements, and demographics; a lack of transparency or consumer awareness; and the potential for unfair or discriminatory pricing. Then-Commissioner, and now Chair, Andrew Ferguson noted that customers may perceive these models as unfair even if they do not run afoul of existing laws, and that their concerns could undermine trust in digital marketplaces.
A year later, reporting indicates the FTC has gone beyond its study and opened a probe into the use of AI-driven tools to generate different prices for different customers. It remains to be seen how broadly the agency will investigate, including whether it will pursue additional information from the recipients of the 6(b) letters or from their corporate customers.
In addition to the FTC, states have begun legislating and using their investigative powers in this area. For example, California Attorney General Rob Bonta recently announced an investigation into how businesses use personal data to set targeted prices for goods and services. The California Attorney General has broad authority to investigate under the state’s privacy, consumer protection, and other laws. The full scope of the investigation is not public, but Attorney General Bonta’s office has begun issuing inquiry letters to grocers, hotels, and retailers with a significant online presence, requesting detailed information on their use of consumer data, pricing experiments, and compliance measures across privacy, competition, and civil rights laws.
And California isn’t alone. New York’s Algorithmic Pricing Disclosure Act, signed by Governor Kathy Hochul in May 2025 and effective since November, now requires retailers using personal data to set individualized prices to post a conspicuous, all-caps disclosure: “THIS PRICE WAS SET BY AN ALGORITHM USING YOUR PERSONAL DATA.” The law, a challenge to which remains on appeal, is the first of its kind in the US and reflects a potential trend toward mandatory transparency in data-driven pricing. The state is also considering a bill to outlaw algorithmic price discrimination based on protected characteristics like race and age. Other states, including Pennsylvania, Texas, and New Mexico, are considering legislation similar to New York’s.
With efforts in Congress to pass AI and other tech-related legislation largely stalled, companies should expect the FTC, state legislatures, and state attorneys general to continue stepping up, often framing their efforts as part of an “affordability” agenda. Paying close attention to investigations like California’s will provide forward-looking insight into the evolving compliance landscape for data-driven pricing practices.
Companies should consider the following to prepare for such inquiries:
- Conducting a coordinated review of AI and pricing disclosures in public-facing materials (including marketing) to ensure accuracy and clear notice that consumer data may be used in pricing algorithms, which may include requesting opt-in consent;
- Testing algorithms for dynamic pricing—both before use and periodically thereafter—to ensure appropriate controls are in place to prevent improper bias and discrimination;
- Auditing services and pricing practices to ensure they are in compliance with applicable privacy laws;
- Assessing the compliance of practices used by third-party entities with whom companies may contract to acquire data; and
- Monitoring legislative developments and investigations to ensure ongoing compliance and risk calibration.
