Algorithmic Price-Fixing: US States Hit Control-Alt-Delete on Digital Collusion

Key Takeaways

Over the past year, several U.S. states have introduced legislation restricting the use of algorithmic pricing. State lawmakers and regulators are increasingly concerned that these tools may facilitate tacit collusion (i.e., where firms appear to be colluding, but there is no evidence of an agreement or communication), thereby circumventing liability under Section 1 of the Sherman Act and analogous state statutes.

Tacit collusion is generally not actionable under the federal antitrust laws without more, because Section 1 of the Sherman Act requires that an agreement be shown. In other words, parallel conduct, standing alone, is insufficient to establish an antitrust violation unless accompanied by either an agreement or certain “plus factors” indicating that coordination, as opposed to independent decision-making, is afoot. The same is true of those state antitrust laws that follow the Sherman Act.

When it comes to algorithmic pricing, there has long been a concern among regulators that the existing antitrust laws cannot reach a scenario where tacit collusion occurs through an algorithm. For example, an algorithm may be programmed to monitor and react to competitor pricing or other terms. Humans may expect conscious parallelism to ensue from such a setup, but without an actual agreement, such a scenario is often hard to convert into a cause of action. Or, as another example, perhaps a firm licenses an algorithmic tool from a third party, and both the firm and its competitors may all input competitively sensitive data into that tool—but there is no direct agreement between competitors. A “hub and spoke” conspiracy theory may be deployed by regulators and private plaintiffs in such scenarios, but it is far from foolproof. [1]
 

Three states—California, New York, and Connecticut—have enacted laws arguably intended to expand the definition of collusion beyond that of the Sherman Act. Each state’s law has its own nuances, creating a complex and diverging environment for companies operating across state lines. However, all three statutes appear to share the same goal: to make illegal certain algorithmic practices that are harder to reach under current antitrust jurisprudence.[2]

This article analyzes and compares these three new laws while providing general considerations for minimizing legal exposure. As more regulators bring algorithmic pricing lawsuits and state attorneys general engage with their legislatures, this trend may accelerate beyond these three states, expanding state-level regulation in the near future.

Comparing the Three State Statutes

California (Assembly Bill No. 325)[3]

The California legislature made two significant amendments to the Cartwright Act, California’s state antitrust law: (1) making certain collusive activities involving algorithmic pricing illegal; and (2) lowering antitrust pleading standards. 

Algorithmic Pricing

As a result of the new law, it will be “unlawful for a person to use or distribute a common pricing algorithm as part of a contract, combination in the form of a trust, or conspiracy to restrain trade or commerce.” 

Additionally, the new law will make it “unlawful for a person to use or distribute a common pricing algorithm if the person coerces another person to set or adopt a recommended price or commercial term recommended by the common pricing algorithm for the same or similar products or services.”

The statute itself does not explain or define what “coercion” means. However, the bill’s legislative history indicates that coercion is when the person enabling the conspiracy imposes “negative consequences” for not accepting the set price or commercial term (e.g., level of service, availability, or output).[4] The legislative history broadly interprets coercion, potentially including “a financial penalty or withholding a financial benefit, deprioritizing or hiding a person’s listings or posts, or tweaking the algorithm to penalize the person’s interests.”[5]

“Common pricing algorithm” is defined broadly in the statute, and includes “any methodology ... used by two or more persons, that uses competitor data to recommend, align, stabilize, set, or otherwise influence a price or commercial term.” This creates ambiguity as to what practices, actions, or tools might fall within the scope of the statute and broadens its application beyond explicit price setting or pricing recommendations to influencing any commercial term. 

Notably, the law applies only to shared algorithmic tools (i.e., tools that are used by two or more persons). The law does not appear to apply to algorithmic pricing tools used by only one party, such as a pricing algorithm a firm develops in-house for its own use. However, even single-party (i.e., not shared) algorithmic pricing tools are not without risk under existing federal and state antitrust laws. For example, in 2015, the DOJ prosecuted an executive for using a pricing algorithm to implement a price-fixing agreement with a competitor.[6]

Finally, California’s law has no carve-out for publicly available competitor information that is inputted into algorithmic pricing technology. Therefore, an algorithm that relies on publicly available competitor data may still violate the statute, even though the use of such data is generally considered less risky.

Lowering the Pleading Standard

Almost presented as an afterthought in the statutory text itself, the new law significantly lowers the antitrust pleading standard for all violations of the Cartwright Act, not just those involving algorithmic pricing. This provision will make it easier for California plaintiffs to plead the existence of a conspiracy and reach discovery, rejecting the U.S. Supreme Court’s Twombly standard in the process. It is an attempt to remove the difficulties of pleading conscious parallelism or tacit collusion under existing antitrust law.

In an antitrust complaint under the Cartwright Act, it will be sufficient to plead factual allegations demonstrating that the existence of a contract, combination in the form of a trust, or conspiracy to restrain trade or commerce is “plausible.” The pleading will not be required to allege facts tending to exclude the possibility of independent action, as is generally required under federal law.[7]

New York (Senate Bill S.7882)[8]

The New York legislation, which amends the Donnelly Act, is narrower, targeting only the residential rental market. Namely, the statute prohibits: (1) agreements among residential owners or managers not to compete, including those facilitated by algorithms; and (2) residential owners or managers from setting or adjusting rental terms based on algorithmic recommendations that coordinate between owners or managers.

The New York statute notably requires a certain level of intent as the prohibited actions must be made “knowingly or with reckless disregard.” 

Competitor data is indirectly defined as rental-related information which can be “historical or contemporaneous.” The statute, like that of California, does not provide an exception for the use of public competitor data in algorithmic pricing tools. 

A tool that uses only a single landlord’s own data appears to fall outside the statute, in contrast to tools that aggregate data across multiple landlords and then provide rental, occupancy, and other recommendations.

It should be noted that a firm that develops revenue management software which, among other things, recommends rental prices to property managers and owners recently filed suit against New York State, challenging the statute as an unconstitutional restriction of speech under the First Amendment to the U.S. Constitution.[9]

Connecticut (Section 32 of House Bill No. 8002)[10]

The new Connecticut law, which amends the Connecticut Antitrust Act, is similar to the New York statute in focusing on the rental housing market. The statute is a brief subsection of a lengthy omnibus bill that prohibits any person from using “a revenue management device to set rental rates or occupancy levels for residential dwelling units.”

Importantly, Connecticut’s law is the only statute of the three to expressly limit its prohibition to “nonpublic competitor data,” thereby excluding information available to the general public from the statute’s algorithmic pricing prohibition.

The Connecticut statute also explicitly carves out “[a] report that publishes existing rental data in an aggregated manner but does not recommend rental rates or occupancy levels for future leases” from the definition of a “revenue management device,” thereby appearing to safeguard the use of market surveys and their data by landlords. 

Below is a chart comparing the various components of the state statutes: 

 

Effective Date
  • California (AB 325): January 1, 2026
  • New York (S.7882): December 15, 2025
  • Connecticut (HB 8002): January 1, 2026

Industries Covered
  • California: All industries
  • New York: Residential rental housing
  • Connecticut: Residential rental housing

Summary of Prohibited Conduct
  • California: It is unlawful to use or distribute a common pricing algorithm: (1) as part of a contract, combination in the form of a trust, or conspiracy to restrain trade or commerce; or (2) if a person coerces another person to set or adopt a recommended price or commercial term recommended by the common pricing algorithm for the same or similar products or services.
  • New York: It is unlawful to “knowingly or with reckless disregard”: (1) facilitate agreements among residential property owners/managers to not compete on residential dwelling units, including through the use of algorithmic devices that perform a “coordinating function”; or (2) set or adjust rental terms and prices based on recommendations from an algorithmic device that performs a “coordinating function.” “Coordinating function” means, in brief, all three of: (1) collecting “historical or current” rental-related data, such as price or occupancy; (2) analyzing or processing this data using an algorithm; and (3) recommending rents, renewal terms, occupancy levels, or other terms to a landlord.
  • Connecticut: It is unlawful to use a “revenue management device” to set rental rates or occupancy levels for residential units.

Definition of Algorithmic Pricing Technology
  • California: “‘Common pricing algorithm’ means any methodology, including a computer, software, or other technology, used by two or more persons, that uses competitor data to recommend, align, stabilize, set, or otherwise influence a price or commercial term.”
  • New York: “‘Algorithmic device’ means any machine, device, computer program or computer software that on its own or with human assistance performs a coordinating function.”
  • Connecticut: “‘Revenue management device’ means a device commonly known as revenue management software that uses one or more programmed or automated processes to perform calculations of nonpublic competitor data concerning local or state-wide rents or occupancy levels, for the purpose of advising a landlord on (A) whether to leave a unit vacant; or (B) the amount of rent that the landlord may obtain for a unit.” Excludes reports that publish existing rental data in an “aggregated manner” but do not recommend rental rates or occupancy levels for future leases.

Covered Competitor Data Inputs
  • California: Covers competitor prices and commercial terms (i.e., level of service, output, and availability). No exception for using public competitor information in the algorithmic pricing technology.
  • New York: Covers historical or contemporaneous competitor data concerning rental prices, occupancy levels, and lease termination and renewal dates. No exception for using public competitor information in the algorithmic pricing technology.
  • Connecticut: Covers “nonpublic competitor data,” including rent prices, occupancy levels, lease start and end dates, and other “similar data.” Publicly available competitor data is not within the scope of the statute.

Penalties/Remedies
  • California: Recent changes to criminal penalties under the Cartwright Act include: (1) up to $6 million in criminal corporate fines (previously $1 million); and (2) imprisonment of up to three years and up to a $1 million fine (previously $250,000) for individual criminal convictions.[11] Civil liability includes treble damages, injunctive relief, and reasonable attorneys’ fees and costs. A new provision allows a $1 million civil penalty for each violation in certain suits brought by the Attorney General or a district attorney.
  • New York: Criminal penalties under the Donnelly Act include: (1) up to $1,000,000 in criminal fines for corporations; and (2) imprisonment of up to four years and up to $100,000 in criminal fines for individuals. Civil liability includes treble damages, injunctive relief, and reasonable attorneys’ fees and costs up to $10,000. Civil fines of up to $1,000,000 for corporations and $100,000 for individuals are available for each violation in cases brought by the Attorney General.
  • Connecticut: No explicit criminal penalties under the Connecticut Antitrust Act. Civil penalties in cases brought by the Attorney General include up to $1,000,000 in civil fines for corporations and up to $100,000 in civil fines for individuals. Civil liability includes treble damages, injunctive relief, and reasonable attorneys’ fees and costs.


Practical Implications

Organizations using algorithmic pricing tools must exercise careful oversight to ensure compliance with these varied and emerging state antitrust statutes. The following issues should be considered as part of such oversight:

  • Competitor data inputs. As more states adopt laws addressing alleged anticompetitive practices surrounding algorithmic pricing tools, it is critical for market participants to examine any emerging statute for language concerning the use of public versus nonpublic competitor data. Statutory exceptions surrounding the use of public data may directly affect the permissible scope of algorithmic tools. For instance, the new Connecticut law expressly permits the use of public competitor data, while California and New York do not provide such exceptions. Furthermore, in California and New York, the statutes do not appear to apply when companies employ algorithmic pricing tools that rely exclusively on the firm’s own data, thereby providing an opportunity to reduce legal risk through the use of “insulated” systems.
  • Aggregated, anonymized, or historical competitor data. For many years, the use of aggregated, anonymized, and historical competitor data was generally viewed by U.S. antitrust authorities as presenting a lower risk of anticompetitive effects and was thus often considered to fall within a “safe harbor” for information exchanges among competitors. This approach is shifting at the federal level, a shift further underscored by the recent state laws addressing algorithmic pricing. For example, the New York algorithmic pricing law prohibits the use of historical competitor data in algorithmic pricing tools that set rental rates or terms. In contrast, the Connecticut algorithmic pricing law provides a limited exception for the use of aggregated nonpublic competitor data, but only in reports that do not recommend future rental prices or occupancy levels. Market participants should carefully examine any further state statutes regarding algorithmic pricing, as well as future FTC or DOJ policy statements, to discern the degree to which aggregated, anonymized, or historical competitor data may be exchanged without incurring antitrust risk.
  • Application of other antitrust laws. While the recently passed laws in New York and Connecticut regarding algorithmic pricing tools are directed at the rental housing market, companies in other sectors employing similar technology in these states are not insulated from antitrust risk. Exposure remains under the price-fixing provisions of the general state and federal antitrust laws.

    For example, there has recently been a flurry of antitrust litigation brought by the DOJ, various state attorneys general, and private plaintiffs involving rental housing markets. In those cases, the plaintiffs’ allegations were often based on Section 1 of the Sherman Act and similar state provisions. The complaints alleged that a technology vendor facilitated a conspiracy among competing landlords to fix or raise rental prices by coordinating pricing strategies through its revenue management software and algorithmic tools.[12] The vendor was also alleged to have facilitated more traditional collusive activities through industry events, video conference calls, and the like. The technology vendor is alleged to have acted as a “hub,” coordinating among “spoke” property management companies and enabling them to share sensitive pricing and occupancy data and align pricing decisions, even if the “spokes” (i.e., competing landlords using the same technology) did not always communicate directly. While the vendor in question is vigorously resisting and denies the allegations, such cases illustrate regulators’ heightened scrutiny, under Section 1 of the Sherman Act and analogous state laws, of the alleged use of algorithmic technologies to set pricing, output, and other commercial terms.

  • Due diligence of vendor contracts. Given the scrutiny surrounding algorithmic pricing, especially in the rental housing market, companies employing such technologies should fully understand the software products they subscribe to or otherwise utilize, including what data those products use (e.g., public or confidential), what algorithms are employed, and how outputs are formulated. It is therefore important to carefully read and understand any user contracts or agreements with algorithm vendors and to consult with experienced antitrust counsel to assess legal risk under such contracts. Any tool or algorithm that shares or otherwise aggregates data between multiple competitors and presents recommendations on prices or other terms presents antitrust risk, both for the users and the developers of the software at issue. Understanding the algorithmic pricing tools that you use can help to avoid such challenges before they arise. This is especially true in New York, given its new statute’s “knowingly or with reckless disregard” standard, which makes plain that mere ignorance of a software’s functions and inputs is no defense.
     

Endnotes
[1] Tacit collusion can mean multiple things. In the context of today’s ongoing algorithmic price-fixing litigation, multiple individuals may not reach an express agreement to fix prices, but if they all subscribe to a company that provides a common pricing algorithm, for example, that could provide sufficient material for an unlawful collusion cause of action, either under existing antitrust laws and a “hub and spoke” theory or under the new state statutes outlined infra. However, an emerging question in antitrust law is whether the use of totally independent algorithms that operate autonomously, as opposed to common pricing algorithms, could still be the basis for a cause of action based on tacit collusion. In such an autonomous scenario, there may be no agreement between humans, whether direct or to use a common algorithm, but the independent algorithms may still learn from and interact with one another, potentially leading to coordination on prices. Current statutes and caselaw may not reach this autonomous scenario, which is not the focus of these new state laws but could be the subject of future legislation.

[2] On December 11, 2025, the Trump administration issued an Executive Order (EO) concerning artificial intelligence (AI). Among other provisions, the EO authorizes several federal government agencies to issue guidance regarding the potential preemption of state AI laws, and appears to target consumer protection and diversity, equity, and inclusion issues in the context of AI use. It does not, however, appear to focus on state antitrust laws related to algorithmic pricing. https://www.whitehouse.gov/fact-sheets/2025/12/fact-sheet-president-donald-j-trump-ensures-a-national-policy-framework-for-artificial-intelligence/

[3] https://leginfo.legislature.ca.gov/faces/billNavClient.xhtml?bill_id=202520260AB325

[4] https://sjud.senate.ca.gov/system/files/2025-06/ab-325-aguiar-curry-sjud-analysis.pdf

[5] Id. 

[6] https://www.justice.gov/archives/opa/pr/former-e-commerce-executive-charged-price-fixing-antitrust-divisions-first-online-marketplace.

[7] https://www.justice.gov/atr/case-document/file/488956/dl

[8] https://legislation.nysenate.gov/pdf/bills/2025/S7882.

[9] https://storage.courtlistener.com/recap/gov.uscourts.nysd.653712/gov.uscourts.nysd.653712.1.0.pdf

[10] https://www.cga.ct.gov/2025/ACT/PA/PDF/2025PA-00001-R00HB-08002SS1-PA.PDF

[11] https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=202520260SB763.

[12] https://www.justice.gov/archives/opa/media/1364976/dl?inline

Authors

Partner: SAlfonso@perkinscoie.com
Partner: ChristopherWilliams@perkinscoie.com
Counsel: CTunca@perkinscoie.com
Associate: RBerry@perkinscoie.com