
New York Bill Would Create Private Right of Action Against Chatbot Proprietors Offering Professional Advice

SB 7263 Overview

The New York State Senate has advanced Senate Bill 7263, which would add a new Section 390-f to the General Business Law. The bill would impose liability on proprietors of AI-powered chatbots for certain professional-type outputs. 

Who and What Does the Bill Cover?

The bill casts a wide net with its definitions. An “artificial intelligence system” is defined as a machine-based system that infers from input how to generate outputs that can influence physical or virtual environments. Notably, it carves out basic software tools like antivirus programs, calculators, databases, firewalls, spellcheck, spreadsheets, and similar utilities.

A “chatbot” is any AI system or application that simulates human-like conversation via text, voice, or both to provide information and services to users. And the term “proprietor” covers any person, business, company, organization, institution, or government entity that owns, operates, or deploys a chatbot system used to interact with users. Importantly, third-party developers that merely license chatbot technology to a proprietor are expressly excluded from this definition.

The Core Prohibition: No Professional or Legal Advice

The heart of the bill is its requirement that proprietors “shall not permit” chatbots to provide any substantive response, information, or advice or to take any action that, if performed by a natural person, would constitute a crime under Education Law §§ 6512 or 6513 (relating to unauthorized practice of numerous licensed professions) or would violate Judiciary Law Article 15’s prohibitions on practicing or appearing as an attorney without being duly admitted and registered.

What makes this particularly notable is that the bill states that proprietors “may not waive or disclaim” liability merely by notifying consumers that they are interacting with a nonhuman system.

Enforcement Through Private Litigation

Rather than relying solely on state regulatory action, SB 7263 takes the unusual step (among AI laws passed to date) of creating a private right of action. Any person may bring a civil suit to recover actual damages arising from a violation. And if a proprietor is found to have willfully violated the statute, the plaintiff can also recover costs, reasonable attorneys’ fees, and disbursements.

This is a meaningful design choice. It effectively deputizes private litigants as the primary compliance enforcers, potentially expanding chatbot proprietors’ exposure well beyond traditional tort claims and professional discipline channels.

Disclosure Requirements

Beyond restricting substantive advice, the bill requires chatbot proprietors, apparently for interactions not covered by the prohibitions above, to give users clear, conspicuous, and explicit notice that they are interacting with an AI chatbot program. That notice must appear in the same language the chatbot is using, be sized for easy readability by the average viewer, and be no smaller than the largest font size of other text on the website where the chatbot is used.

The bill thus couples liability for professional-type outputs with a robust AI identification requirement for other types of outputs and use cases, a reflection of lawmakers’ concern that users may mistake AI systems for human professionals.

The Policy Rationale

The bill’s sponsors frame SB 7263 as ensuring that professional advice comes only from licensed humans, not from AI systems. The bill’s Sponsor Memo quotes a February 2025 New York Times report describing warnings by the American Psychological Association to the FTC that AI chatbots “masquerading” as therapists and programmed to reinforce rather than challenge users’ thinking could drive vulnerable individuals to harm themselves or others.

From a policy standpoint, the bill is consistent with New York’s longstanding emphasis on rigorous professional licensure and restrictions on the corporate practice of professions. SB 7263 extends those principles to AI systems that can mimic professional judgment at scale, particularly where users may reasonably perceive them as providing clinically or legally consequential guidance.

That said, there’s a fair argument that the bill may do more to expand litigation risk than to meaningfully curb unlicensed “digital practice.” Because enforcement relies on private suits tied to “substantive” chatbot outputs, the practical effect may be to push proprietors toward tightly constraining, filtering, or abandoning higher-risk use cases such as legal or health-related informational chatbots.

The Big Open Questions

If SB 7263 becomes law, its real-world impact will depend heavily on how courts interpret several ambiguous concepts.

First, what counts as a “substantive” response, information, or advice? Drawing the line between general, high-level information and guidance that crosses into professional practice will be central, and likely highly fact-dependent, for chatbots providing educational or informational content in regulated domains such as health, law, or mental health.

Second, what constitutes the “practice” of a profession? Existing unauthorized-practice case law will provide some guidance, but AI-mediated interactions may not map neatly onto human-centric precedents.

Third, who qualifies as a “proprietor”? In today’s AI supply chains—where foundational model providers, fine-tuning vendors, integrators, and front-end deployers may all be distinct entities—determining which of these entities “own[], operate[], or deploy[]” the chatbot that interacts with users could get complicated quickly. Allocating liability among multiple “proprietors” when several entities qualify under SB 7263 would make this analysis even more complex.

Practical Risk-Management Considerations

For organizations deploying chatbots, the bill raises several immediate questions worth thinking through now:

  • How should you design and configure chatbot outputs in professional domains? 
  • How should user notices be updated to meet the statute’s font, language, and conspicuousness requirements?
  • How should agreements covering third-party chatbot use be updated? Even though third-party technology licensors are expressly excluded from the “proprietor” definition, thoughtful contractual allocation of responsibility between proprietors and their technology providers remains essential.
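To make the first two bullets concrete: as an illustration only (the bill prescribes no particular technical mechanism), a proprietor's output layer might pair a regulated-domain screen with a mandatory AI-identification notice. Everything in the sketch below is a hypothetical assumption, not anything drawn from SB 7263's text: the keyword list is a crude stand-in for whatever classifier or policy engine a proprietor would actually deploy, and the notice and refusal wording are invented.

```python
# Hypothetical compliance wrapper -- illustrative only, not a statement of
# what SB 7263 requires. Keyword screening stands in for a real classifier.

# Assumed regulated-domain trigger phrases (hypothetical, non-exhaustive).
RESTRICTED_TERMS = (
    "diagnos", "prescri", "legal advice", "represent you",
    "file a lawsuit", "treatment plan", "therapy session",
)

# Assumed notice text; the statute dictates conspicuousness and language,
# not exact wording.
AI_NOTICE = ("Notice: You are interacting with an AI chatbot, "
             "not a licensed professional.")

REFUSAL = ("I can't provide professional medical, legal, or similar advice. "
           "Please consult a licensed professional.")

def guard_response(user_prompt: str, draft_response: str) -> str:
    """Return the message actually shown to the user.

    Blocks draft responses that look like regulated professional advice,
    and prepends the AI-identification notice to every reply.
    """
    text = (user_prompt + " " + draft_response).lower()
    if any(term in text for term in RESTRICTED_TERMS):
        return f"{AI_NOTICE}\n{REFUSAL}"
    return f"{AI_NOTICE}\n{draft_response}"
```

In practice a proprietor would likely rely on trained classifiers, escalation paths, and human review rather than keyword matching; the sketch's only point is that both the advice prohibition and the disclosure mandate would have to be enforced at the output layer of every interaction.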

Bottom Line

SB 7263 represents New York’s attempt to apply its well-established professional licensure and corporate practice norms to the fast-moving world of AI chatbots. While the bill is framed as ensuring professional advice stays in the hands of licensed humans, it ultimately may function more as a litigation-exposure and compliance-cost driver for chatbot proprietors than as a direct tool for curbing unlicensed digital practice.

Its effectiveness will hinge on how courts, regulators, and litigants interpret “substantive” advice, unauthorized practice, and proprietorship in the AI context—and on how the industry adapts in response to the combined pressures of liability risk, disclosure mandates, and other states’ AI statutes and regulations. Attorneys and organizations deploying or integrating chatbots in or near regulated professional domains should monitor this bill closely and be ready to recalibrate governance, product design, and risk controls if SB 7263 becomes law.

Authors

Partner
EEvans@perkinscoie.com
