What Businesses Need to Know about Surveillance Pricing
What Is “Surveillance Pricing”?
“Surveillance” or “differential” pricing is the growing practice of adjusting prices, fees, or offer terms based on detailed data collected about a specific consumer or a small microsegment of consumers, rather than on broad market factors like supply, demand, or time of day. It’s dynamic pricing supercharged by “surveillance grade” data collection and “profiling.”
Companies may rely on information such as location, device type, browsing history, purchase history, demographics, and even inferred financial stress or “pain points” to estimate the maximum amount a given consumer will pay before walking away.
Who Is Using Surveillance Pricing?
This tactic is no longer reserved for large platforms or sophisticated AI teams. Surveillance pricing allegations may target any company relying on consumer data to influence pricing or offers.
- Ecommerce businesses experimenting with personalization or dynamic pricing
- Subscription businesses using retention and churn analytics
- Retailers working with third party pricing, marketing, or loyalty vendors
- Hospitality, travel, and ticketed events
How Surveillance Pricing Works
At a high level, surveillance pricing follows a simple order of operations: gather data, profile the user, make a prediction about willingness to pay, and then push a tailored price or offer. However, that process can involve complex data ecosystems and machine learning models.
The Data Behind Surveillance Pricing
To understand how surveillance pricing works, start with the data inputs. Common sources include:
- Browsing behavior, such as pages viewed, time on page, and cart abandonments
- Purchase history and loyalty program records
- Location data (GPS, IP based location, neighborhood)
- Device characteristics, like using an older phone versus a high end device
- Demographic and inferred attributes, such as income bracket or household size
- Third party data brokers that supply additional profiling information
Algorithmic Pricing and “Willingness to Pay”
From there, companies may deploy algorithmic pricing systems that translate those data points into real time pricing decisions. These systems allegedly set the highest price an individual customer is likely to accept.
In practice, the algorithmic pricing model might:
- Score users on a scale of price sensitivity
- Use experiments (A/B tests) to see which users accept which prices
- Continuously retrain on new data to refine those predictions
Two consumers loading the same page, at the same time, could see different prices with no clear explanation.
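To make the mechanics concrete, here is a deliberately simplified sketch of the kind of scoring-and-pricing pipeline described above. Every feature, weight, and price in it is hypothetical, invented for illustration; real systems use machine learning models rather than hand-set rules.

```python
# Toy illustration of a "willingness to pay" pricing pipeline.
# All features, weights, and prices are hypothetical examples.

BASE_PRICE = 100.00

def price_sensitivity_score(profile: dict) -> float:
    """Score a user's estimated price tolerance on a 0-1 scale
    (0 = highly price sensitive, 1 = likely to accept higher prices)."""
    score = 0.5
    if profile.get("device") == "high_end":
        score += 0.2   # newer device treated as a proxy for higher willingness to pay
    if profile.get("cart_abandonments", 0) > 2:
        score -= 0.2   # frequent cart abandonment suggests price sensitivity
    if profile.get("repeat_buyer"):
        score += 0.1   # loyal customers assumed less likely to comparison shop
    return max(0.0, min(1.0, score))

def personalized_price(profile: dict) -> float:
    """Shift the base price up or down by as much as 15% based on the score."""
    score = price_sensitivity_score(profile)
    return round(BASE_PRICE * (0.85 + 0.30 * score), 2)

# Two consumers loading the same page at the same moment see different prices.
price_a = personalized_price({"device": "high_end", "repeat_buyer": True})   # 109.0
price_b = personalized_price({"device": "budget", "cart_abandonments": 3})   # 94.0
```

Even this crude sketch shows why the practice draws scrutiny: the price gap is driven entirely by inferences about the individual, not by supply, demand, or cost.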
Surveillance Pricing vs. Dynamic Pricing
Not all dynamic pricing is “surveillance pricing.” The transportation industry (airlines and rideshare services, for example) has used dynamic pricing for years, usually based on factors like time, demand, inventory, and competition. What makes surveillance pricing different is the expanding shift from market level variables to person level variables.
Traditional dynamic pricing: Adjusts based on time, demand, and inventory; usually visible and predictable.
Surveillance pricing: Adjusts based on personal data and profiling; highly individualized and often opaque.
From the perspective of a regulator or a complaining consumer, the key question is not just “Why did the price change?” but “Why did it change for me specifically?”
How Surveillance Pricing Enters a Business
Many business owners may not set out to implement “surveillance pricing.” It can emerge as a byproduct of personalization, A/B testing, or third party optimization tools unless actively constrained. It tends to happen in stages:
- A marketing or sales team wants more personalized offers
- A vendor promises improved conversion or higher lifetime value
- Pricing experiments are layered on top of behavioral data
- Over time, discounts disappear for some users—and prices rise for others
At that point, pricing is no longer reacting to the market. It is reacting to the individual customer.
Surveillance Pricing Under Mounting Scrutiny
Lawmakers, regulators, and consumer advocates are increasingly voicing concerns that surveillance pricing may be unfair, deceptive, discriminatory, or exploitative – even when it appears technologically sophisticated.
Privacy and Transparency
First, there are privacy considerations. Surveillance pricing may depend on extensive tracking and profiling – cookies, browser fingerprinting, app level tracking, location data, loyalty programs, and more. Many consumers have no meaningful visibility into what data is collected, why it is being collected, and how it will influence prices.
Second, transparency may be limited. A consumer rarely sees a notice that says, “You are being shown a higher price because our system believes you are less price sensitive.” This can create serious risks under unfair and deceptive practices laws when disclosures and expectations do not match reality.
Discrimination and Exploitation Risks
Opponents also argue that surveillance pricing may magnify inequities. When pricing models lean heavily on demographic or location data, they can produce allegedly disparate impacts on protected classes or vulnerable populations.
For example, algorithms might infer a higher “willingness to pay” based on a consumer living in a historically affluent neighborhood or using a newer device, while charging lower prices to others – or vice versa, targeting consumers perceived as financially stressed with higher prices for essentials. Even if those outcomes are emergent rather than intentional, they can raise discrimination and fairness concerns for regulators and class action plaintiffs.
Current Legal Landscape
Is surveillance pricing illegal? The legal landscape is evolving quickly, but a few themes are emerging across federal and state law: it depends on where you operate, what data you use, how you disclose it, and how the practice operates in real life.
Federal Consumer Protection and the FTC
At the federal level, the Federal Trade Commission has squarely put surveillance pricing on its radar. The FTC has launched a market study into surveillance pricing, issued 6(b) orders to gather information from companies, and released staff perspectives indicating that a wide range of personal data – including precise location and browsing history – is already being used to target consumers with different prices for the same goods and services.
The FTC’s core tools are Section 5 of the FTC Act (unfair or deceptive acts or practices) and sector specific rules. A surveillance pricing program could draw fire if, for example:
- Consumers are misled or not meaningfully informed about how data will affect prices
- Algorithms exploit vulnerable consumers in ways the FTC views as unfair
- Data is collected or combined in ways that conflict with privacy promises
While there is not yet a federal statute that bans surveillance pricing outright, FTC enforcement theories are developing, and businesses should assume scrutiny will intensify rather than fade.
New York’s Algorithmic Pricing Disclosure Law
At the state level, New York has moved aggressively. New York’s Algorithmic Pricing Disclosure Act (N.Y. Gen. Bus. Law § 349a) requires businesses using algorithmic pricing based on personal data to provide conspicuous disclosure to consumers.
In practice, this means covered companies must clearly explain that prices may be set or adjusted using personal data and algorithmic techniques. The New York Attorney General can enforce the law, beginning with cease and desist letters and escalating to injunctive relief and civil penalties for ongoing violations. For organizations experimenting with personalized pricing, New York has become a focal jurisdiction that cannot be ignored.
California’s Privacy Framework
California’s Consumer Privacy Act (CCPA), as amended, does not directly outlaw surveillance pricing, but it imposes limits on how businesses may use personal information. In particular, the CCPA’s “purpose limitation” principle requires that data collection and use remain consistent with consumers’ reasonable expectations and disclosed purposes. If a company uses consumer data, originally collected for one purpose, to drive undisclosed individualized prices, it may be on unstable ground under California law.
California lawmakers are trying to go further. A 2026 bill, AB 2564, would prohibit retailers from engaging in “surveillance pricing” as defined in the bill, subject to certain exceptions. The bill targets customized prices based on personally identifiable information collected through “electronic surveillance technology,” including data from third parties, and proposes substantial civil penalties per violation, with enhanced penalties for intentional violations. It also declares that any waiver of its protections is void and that its remedies are cumulative of other laws.
Federal Legislative Proposals
On Capitol Hill, some lawmakers are pushing for a nationwide floor. The proposed “One Fair Price Act,” introduced by Senator Ruben Gallego and cosponsored by others, would prohibit companies from using customers’ personal data to set individualized prices. The bill seeks to outlaw surveillance pricing outright, at least in its most consumer specific form, and would establish federal protections against data driven price discrimination.
Whether and when such federal legislation will pass remains uncertain. But for companies building pricing models, the introduction of the bill itself is a signal: legislators view surveillance pricing as a foundational issue.
What Are the Risks of Surveillance Pricing Today?
From a compliance perspective, a useful question is: under what circumstances is surveillance pricing likely to be treated as unlawful or high risk? The answer turns on several overlapping bodies of law.
Consumer Protection and UDAP/UDAAP
Surveillance pricing can implicate general consumer protection laws at both the federal and state levels. Practices that are unfair, deceptive, or abusive (for example, misrepresenting how data will be used, creating dark patterns around consent or opt-out, or quietly charging higher prices to vulnerable consumers) may trigger regulatory action or litigation.
State UDAP statutes (unfair and deceptive acts and practices) are famously broad, and plaintiffs’ lawyers have a long history of using them to challenge novel marketing practices. A personalized pricing program that is opaque, poorly disclosed, or perceived as exploitative could fit easily into that playbook.
Privacy Laws and Data Use Restrictions
Privacy laws like the CCPA, Virginia’s CDPA, and similar statutes impose requirements around notice, consent, purpose limitation and sensitive data. If your surveillance pricing program relies on precise geolocation, financial status, or health indicators, or repurposes data in ways consumers did not expect, you may be out of alignment with these statutes.
Privacy laws increasingly give consumers the right to opt out of certain forms of profiling, automated decision making, and the sale or sharing of personal information. If a consumer exercises those rights, continuing to use their data to set individualized prices could become a compliance issue.
Anti-Discrimination and Civil Rights Considerations
Even if surveillance pricing is facially neutral, it may produce disparate impacts on protected classes. Where pricing relates to housing, credit, employment, insurance, or similar domains, antidiscrimination laws may be directly implicated if protected characteristics or close proxies (like neighborhood) influence pricing decisions.
Regulators have signaled that they see algorithmic systems as subject to the same antidiscrimination rules as analog ones. That means companies may need to test for disparate impact, document mitigation steps, and be prepared to explain how their models work – or at least what guardrails are in place.
Disclosure Requirements
In jurisdictions like New York, failing to provide required disclosures about algorithmic pricing can itself be a statutory violation, regardless of whether the underlying pricing is otherwise fair. That raises the stakes for multistate businesses, which must decide whether to regionalize disclosures or adopt a nationwide standard that meets the strictest state’s rules.
Key Enforcement and Oversight Trends
Beyond the statutes on the books, enforcement and oversight trends tell you where regulators are most likely to focus energy next.
FTC Market Studies and Future Actions
The FTC’s surveillance pricing study, based on compulsory 6(b) orders, has already yielded initial findings that companies routinely use location, browsing history, and other personal data to target prices. Those findings may be a precursor to enforcement actions, guidance, or even new rulemaking.
Because the FTC can frame surveillance pricing as both a privacy and a consumer protection issue, it has a wide lane to bring cases if it identifies egregious practices. That is especially true for models that target essential goods and services or vulnerable populations.
State Attorney General Investigations
State attorneys general, particularly in California and New York, are already active in the algorithmic pricing space. California’s Attorney General has reportedly launched investigative sweeps in sectors like retail, grocery, and hospitality to understand how companies are using consumer data in pricing and to test compliance with state privacy and consumer protection law.
New York’s AG is charged with enforcing the Algorithmic Pricing Disclosure Act and can seek injunctive relief and penalties for noncompliant companies. That combination of privacy obligations, disclosure rules, and aggressive enforcement posture makes these states bellwethers for national practices.
Recently, other states have also considered proposals regarding data-driven pricing practices, including for issues involving discrimination and personal and sensitive data. For example, Colorado, Texas, Vermont, and Illinois have seen legislative efforts in this area, in some cases focusing on certain types of industries or specific practices.
Congressional and Legislative Attention
Congressional committees have begun investigating the use of surveillance pricing across industries, sending oversight letters and requesting detailed information from companies in travel, hospitality, and related sectors. At the same time, legislative proposals like the One Fair Price Act keep the issue front and center in policy debates.
For companies, this means your pricing practices may be scrutinized not just in courtrooms or regulatory offices but in hearings and public reports. Reputational risk and political risk are now part of the surveillance pricing calculus.
When Does Pricing Become Legally Indefensible?
Across federal and state regimes, enforcement risk tends to crystallize around four recurring fault lines:
- Expectation misalignment - When consumers reasonably believe prices are uniform or market based, but in practice pricing is individualized using personal data
- Proxy-based discrimination - Even neutral pricing inputs (such as location data, device characteristics, and behavioral indicators) may be legally sensitive when they function as stand-ins for income, race, age, or financial stress
- Lack of meaningful choice for sensitive data - Sensitive data increases compliance requirements, and certain laws require that consumers not only be informed, but also given practical alternatives to access a non-personalized price or to opt out of profiling
- Absence of internal controls - Companies that cannot identify which features influence pricing, cannot test for harmful outcomes, or cannot produce documentation of safeguards are structurally disadvantaged in investigations
These themes appear repeatedly across FTC inquiries, state privacy enforcement actions, and legislative drafting choices, even where the statutory language differs.
Practical Risk Areas for Businesses Using Surveillance Pricing
If your organization is experimenting with or already deploying surveillance pricing, where are you most exposed? A few recurring themes stand out across enforcement and legislative activity.
Opaque or Misaligned Disclosures
One of the biggest red flags is a mismatch between what your privacy notice and marketing materials say and what your systems actually do. If you tell consumers you use data “to improve the customer experience,” but in practice you are using it to push higher prices without clear disclosure, regulators may see that as deceptive.
Disclosures required under laws like New York’s Algorithmic Pricing Disclosure Act must also be conspicuous and understandable. Boilerplate buried in a privacy policy may not cut it.
Use of Sensitive or Unexpected Data
Another risk hotspot is reliance on particularly sensitive or unexpected data types (precise geolocation, health adjacent data, or indicators of financial hardship) especially where consumers have not explicitly consented to that use. Using such data to ratchet up prices can look less like legitimate segmentation and more like exploitation.
Privacy laws increasingly treat sensitive data differently and often require opt-in consent or special handling. Surveillance pricing strategies built on this kind of information may demand heightened controls and legal review.
Lack of Governance Around Algorithms
Many organizations underinvest in governance for pricing algorithms. They deploy models, watch revenue metrics, and assume success means the model is “working.” But from a legal standpoint, you need to know more:
- What features are driving pricing decisions?
- Are there guardrails to prevent protected class proxies from dominating?
- Are you regularly testing for disparate impact or unfair outcomes?
Without governance and documentation, it is hard to defend a system when regulators or plaintiffs’ counsel come knocking.
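As one illustration of the kind of testing described above, a basic disparate impact check might compare average personalized prices across groups in a pricing log and flag gaps beyond a chosen tolerance. The group labels, tolerance, and data below are hypothetical; real audits would use richer statistical methods and legally informed group definitions.

```python
# Hypothetical disparate impact check over logged pricing decisions.
# Group labels, prices, and the 5% tolerance are illustrative only.
from statistics import mean

def average_price_by_group(records: list[dict]) -> dict:
    """Bucket logged (group, price) records and average each group's prices."""
    groups: dict = {}
    for r in records:
        groups.setdefault(r["group"], []).append(r["price"])
    return {g: mean(prices) for g, prices in groups.items()}

def disparate_impact_flag(records: list[dict], tolerance: float = 0.05) -> bool:
    """Flag if any group's average price exceeds the lowest group's
    average by more than the tolerance (default 5%)."""
    averages = average_price_by_group(records)
    lowest = min(averages.values())
    return any(avg > lowest * (1 + tolerance) for avg in averages.values())

log = [
    {"group": "zip_A", "price": 104.0},
    {"group": "zip_A", "price": 106.0},
    {"group": "zip_B", "price": 95.0},
    {"group": "zip_B", "price": 97.0},
]
flagged = disparate_impact_flag(log)  # zip_A averages roughly 9% above zip_B
```

Running checks like this on a schedule, and keeping the results, is exactly the kind of documentation that makes a pricing system defensible when an inquiry arrives.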
Pillars of Dynamic Pricing and Advertising Compliance
Given the speed of legislative developments and the sensitivity of the underlying practices, businesses should not treat surveillance pricing as “just another optimization tactic.” A structured review of the following areas with an experienced advertising and privacy law firm can help reduce risks.
Mapping Data Flows and Legal Obligations
A first step in any legal review is to map how data flows through your surveillance pricing pipeline – from collection, to enrichment, to modeling, to real time decisioning. Counsel can then overlay applicable laws and regulations, including:
- State privacy laws (e.g., CCPA and similar statutes)
- Sector-specific requirements where applicable
- UDAP statutes and FTC consumer protection standards
- New York’s disclosure statute and any similar emerging laws
This type of analysis helps identify where you may need to adjust notices, obtain consent, narrow data uses, or redesign aspects of your pricing program.
Updating Disclosures, Policies, and UX
Legal review should also encompass your external facing materials – privacy policies, terms of service, in-flow notices, and user interface design. Counsel can help align your descriptions of data use with what your systems actually do, craft jurisdiction specific disclosures (for example, New York algorithmic pricing notices), and identify dark pattern risks in consent and opt-out flows.
In some cases, it may be advisable to offer consumers clearly labeled “standard pricing” options or to avoid surveillance pricing altogether in certain sensitive product lines.
Building Governance, Testing, and Documentation
Finally, counsel can work with technical and compliance teams to build governance frameworks around your pricing models:
- Documenting model objectives and constraints
- Implementing testing protocols for bias and disparate impact
- Establishing review processes for significant model updates
- Creating incident response plans for algorithmic failures
If a regulator or committee sends you a letter asking, “How does your surveillance pricing system work, and what safeguards are in place?”, having that governance framework in place can be the difference between a manageable inquiry and a prolonged investigation.
The Future of Surveillance Pricing
Surveillance pricing sits at the intersection of cutting-edge data science and some of the most sensitive questions in consumer protection and privacy law. Although AI-powered dynamic pricing promises granular optimization, it may come at the cost of extensive tracking, opaque personalization, and a growing sense among consumers and regulators that the deck may be stacked. Today’s legal landscape is a patchwork, and as scrutiny increases, business owners face three realistic paths:
- Avoid or limit surveillance pricing altogether, especially for essential goods or trust-based products
- Use pricing personalization with hard guardrails, clear disclosures, and governance
- Accept higher risk knowingly, with full documentation and oversight
If your organization is using or considering surveillance pricing, now is the time to step back and understand where the legal fault lines lie. Contact Kronenberger Rosenfeld, LLP to partner with experienced advertising and privacy counsel to review your data collection, profiling, and pricing practices, tighten your disclosures, and build governance around your algorithms.
FAQs
1. What is surveillance pricing?
Surveillance pricing is generally when a business uses detailed personal and behavioral data – like your location, browsing history, or purchase patterns – to set or adjust the price you see for a product or service, rather than relying only on general market factors.
2. How does surveillance pricing work?
From a regulator’s viewpoint, companies collect data about you across sites and apps, feed that data into pricing algorithms that estimate your willingness to pay, and then show you individualized prices or offers in real time based on those predictions.
3. Is surveillance pricing illegal?
Surveillance pricing is not categorically illegal everywhere, but it may violate consumer protection, privacy, or antidiscrimination laws depending on how it is implemented, what data it uses, how transparent it is, and where consumers are located.
4. Are there laws against surveillance pricing?
Yes. This is an evolving area where existing general laws and also pricing-specific laws come into play. For example, New York’s Algorithmic Pricing Disclosure Act requires conspicuous disclosure of algorithmic pricing based on personal data, and California has proposed AB 2564 to prohibit certain forms of surveillance pricing, while federal proposals like the One Fair Price Act would ban the practice nationally if enacted.
This entry was posted on Friday, April 17, 2026 and is filed under Resources & Self-Education, Internet Law News.