California Companion Chatbot Law Now in Effect

Key Takeaways

Over the past several years, states have increasingly experimented with regulating how chatbots and other AI-driven conversational tools are used in consumer-facing contexts. Early efforts focused largely on transparency, requiring businesses to disclose when users were interacting with automated rather than human agents.

The California Companion Chatbot Law, California Senate Bill 243, went into effect on January 1, 2026, and reflects a new phase of regulation: In addition to disclosure requirements, the law imposes safety, governance, and reporting obligations. This new tack is designed to respond to concerns about how certain chatbots influence user behavior, emotional well-being, and decision-making over time, especially with regard to minors. We covered this law briefly in an earlier Update on several new California AI laws, but this Update provides more detail on the Companion Chatbot Law in particular.

What Types of Chatbots Are Covered

California’s Companion Chatbot Law does not apply to all chatbots or conversational AI tools. Instead, it regulates a narrower category the statute refers to as “companion chatbots”: chatbots that provide adaptive, human-like responses and are designed to engage users in ways that can meet social or emotional needs. Chatbots that do not meet these criteria, either because they do not sustain a relationship across multiple interactions or because they are not capable of eliciting emotional or social engagement, fall outside the law’s definition and are not subject to its requirements.

The narrow focus of the Companion Chatbot Law is underscored by the categories of AI agents expressly excluded from the “companion chatbots” definition:

  • Customer service chatbots. Chatbots used solely for customer service, business operations, productivity or analysis based on source information, internal research, or technical assistance are excluded.
  • Video game chatbots. Chatbots that operate within video games are excluded only if their responses are limited to replies about the video game. To fall within this exclusion, a chatbot must not be able to discuss mental health, self-harm, or sexually explicit conduct, and it must not be able to maintain a dialogue on other topics unrelated to the video game.
  • Voice-activated assistants. Stand-alone consumer electronic devices that function as a speaker and voice command interface, act as a voice-activated virtual assistant, and do not sustain a relationship across multiple interactions or generate outputs likely to elicit emotional responses from a user are excluded.

As conversational AI systems become more sophisticated and multipurpose, the distinction between a companion chatbot and a purely functional chatbot may not always be clear. Companies offering conversational tools that adapt to users over time, personalize interactions, or market themselves as supportive or relational should carefully assess whether their products could fall within the statute’s scope.

Core Operational Requirements

The Companion Chatbot Law imposes operational requirements that apply to the operator of a covered companion chatbot. Some requirements apply to all users, while others apply only when the operator knows that a user is a minor.

  • Required disclosure. If a reasonable person would otherwise be misled into believing they are interacting with a human, the operator of a companion chatbot must provide a clear and conspicuous notification to users that they are interacting with an AI system.
  • Required safety protocols. The operator of a covered chatbot must implement a protocol designed to prevent the chatbot from producing content related to suicidal ideation, suicide, or self-harm. At a minimum, the protocol must include measures to refer users to appropriate crisis service providers, such as a suicide hotline or crisis text line, if they express statements indicating a risk of suicide or self-harm. The operator must also publish details about its safety protocol on its website.
  • Minor suitability disclosure. The operator of a companion chatbot must include a disclosure that the companion chatbot may not be suitable for some minors.
  • Additional requirements for minors. When the operator of a companion chatbot knows that a user is a minor, the operator must implement additional safeguards: it must disclose that the minor is interacting with an AI system and, by default, provide a clear and conspicuous notification at least every three hours during continuing interactions reminding the user to take a break and that the chatbot is not human. The operator must also institute reasonable measures to prevent the companion chatbot from producing visual material depicting sexually explicit conduct or directly stating that the minor should engage in sexually explicit conduct.

These operational obligations require companies to assess both the design and behavior of their chatbots and the information they have about end users. A simplified sketch of how some of these safeguards might fit together appears below.
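
To make the control flow concrete, the following is a minimal sketch, in Python, of how an operator might wire the crisis-referral and break-reminder safeguards into a chat session. All names, the session structure, and the keyword-based risk check are illustrative assumptions rather than anything prescribed by the statute; a production protocol would require a vetted risk-detection model and legal review.

```python
import time

# Illustrative sketch only: names, structure, and the keyword-based risk
# check are assumptions for demonstration, not statutory text.

AI_DISCLOSURE = "You are chatting with an AI system, not a human."
CRISIS_REFERRAL = (
    "If you are thinking about suicide or self-harm, help is available: "
    "in the U.S., call or text 988 (Suicide & Crisis Lifeline)."
)
BREAK_INTERVAL_SECONDS = 3 * 60 * 60  # "at least every three hours" for minors


def indicates_self_harm_risk(message: str) -> bool:
    """Placeholder check; a real protocol would need a vetted classifier."""
    keywords = ("suicide", "kill myself", "self-harm", "hurt myself")
    return any(k in message.lower() for k in keywords)


class CompanionChatSession:
    def __init__(self, user_is_minor: bool):
        self.user_is_minor = user_is_minor
        self.last_break_notice = time.monotonic()

    def opening_notices(self) -> list[str]:
        # Shown where a reasonable user might otherwise believe the bot is human.
        return [AI_DISCLOSURE]

    def compliance_notices(self, user_message: str) -> list[str]:
        """Return any notices owed to the user before the chatbot replies."""
        notices = []
        # Crisis referral applies to all users who express risk statements.
        if indicates_self_harm_risk(user_message):
            notices.append(CRISIS_REFERRAL)
        # Break reminders apply only to known minors, by default at least
        # every three hours during continuing interactions.
        if self.user_is_minor:
            now = time.monotonic()
            if now - self.last_break_notice >= BREAK_INTERVAL_SECONDS:
                notices.append(
                    "Reminder: please take a break. You are talking to an AI, "
                    "not a person."
                )
                self.last_break_notice = now
        return notices
```

Keyword matching is far too crude for a real safety protocol; the point of the sketch is only the shape of the obligations: risk detection triggers a referral notice for all users, while the break reminder runs on a default timer for known minors.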

Reporting and Transparency Obligations

In addition to operational requirements, the Companion Chatbot Law imposes ongoing transparency obligations on the operator of a covered companion chatbot. Beginning on July 1, 2027, such an operator must annually report to California’s Office of Suicide Prevention on: (1) the number of times the operator issued a crisis service provider referral notification in the preceding calendar year; (2) the protocols the operator has implemented to detect, remove, and respond to instances of suicidal ideation by users; and (3) the protocols the operator has implemented to prohibit companion chatbot responses relating to suicidal ideation or actions. The information provided by operators is to be published by the Office of Suicide Prevention on its website.
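
As a rough illustration of the record-keeping this implies, an operator might maintain a structure like the following, which tracks the three reporting items over a calendar year. The field names and shape are assumptions for illustration only; the statute does not prescribe a submission format, and the Office of Suicide Prevention's actual reporting mechanics are not addressed here.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: field names and structure are assumptions,
# not a prescribed reporting format.


@dataclass
class AnnualSafetyReport:
    reporting_year: int
    # (1) Number of crisis service provider referral notifications issued.
    crisis_referral_count: int = 0
    # (2) Protocols to detect, remove, and respond to instances of
    #     suicidal ideation by users.
    detection_protocols: list[str] = field(default_factory=list)
    # (3) Protocols prohibiting companion chatbot responses relating to
    #     suicidal ideation or actions.
    prohibition_protocols: list[str] = field(default_factory=list)

    def record_referral(self) -> None:
        """Called each time a crisis referral notice is shown to a user."""
        self.crisis_referral_count += 1
```

An operator would increment the counter wherever the crisis-referral safeguard fires (for example, in compliance_notices in the earlier sketch) so the annual figure can be reported without reconstructing it from logs after the fact.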

Enforcement and Liability Considerations

The Companion Chatbot Law creates an express private right of action, allowing a user to bring a civil action against an operator that violates the law’s requirements and to seek injunctive relief, actual damages, and recovery of reasonable attorneys’ fees and costs. 

Looking Ahead

Companies that make chatbots available to users should evaluate their functionality now to determine whether the Companion Chatbot Law applies and, if so, implement any operational changes necessary to comply with its requirements. Although this is a first-of-its-kind statute, it is unlikely to be the last, and companies should expect continued experimentation among states as regulators explore different approaches to regulating conversational AI. This state activity, however, takes place against the backdrop of renewed federal interest in preempting state AI regulations, following the Trump administration’s recent Executive Order authorizing several federal agencies to issue guidance on the potential preemption of state AI laws. Accordingly, companies deploying chatbots will need to continue monitoring this area closely at both the state and federal levels.

In addition to staying current with new statutory obligations and federal activities, companies should also evaluate their chatbot designs and user engagement features against a backdrop of increased efforts by plaintiffs’ lawyers to pursue claims based on allegedly addictive features and other theories of user harm.

Authors

Partner
KSoderquist@perkinscoie.com
206.359.6129

Partner
DTaneja@perkinscoie.com
206.359.3427

Counsel
DWest@perkinscoie.com
206.359.3598
