
Texas Becomes Latest State to Address Kids’ Privacy and Safety Online

Perkins on Privacy

Texas has become the latest state to impose age-related privacy and safety restrictions on online service providers, joining Arkansas, California, Florida, and Utah. Signed by Governor Greg Abbott on June 13, 2023, the Securing Children Online through Parental Empowerment (SCOPE) Act is scheduled to go into effect on September 1, 2024, and will require digital service providers to "register" the age of potential users at account creation and implement a series of privacy and safety controls for known minors.

Applicability

The SCOPE Act applies to digital service providers that enable users to "socially interact" with others on the service; create "public or semi-public profile[s]" on the service; and "create or post content that can be viewed by other users," including content shared on message boards, in chat rooms, or on a landing page, video channel, or main feed. "Digital services," in turn, are defined as websites, applications, programs, or software that "collect[ ] or process[ ] personal identifying information" (i.e., information that is associated with or "reasonably linkable" to an individual) on the internet.

The Act does not apply to digital services that offer only email or direct messaging. A number of other services are also excluded from the Act's scope, including those where the provider primarily creates or curates news, sports, commerce, or other content (with chat and other interactive features ancillary to the service itself). The Act likewise does not apply to cloud service providers or search engines (unless they also create "harmful material," as defined in the Texas Penal Code).

Notable Requirements

Under the Act, providers that know Texas residents under 18 (known minors) use their digital service will need to implement the following privacy and safety controls:

  • Verify a known minor's parent's or guardian's identity and relationship to the minor.
  • Develop parental tools to enable known minors' verified parents or guardians to control their minor's privacy and account settings, alter the provider's duties to their minor (e.g., regarding data collection and targeted advertising), and monitor and limit their minor's time spent on the service.
  • Enable verified parents or guardians to request access to and delete their known minor's personal identifying information.
  • Limit the collection, use, and transfer of known minors' personal identifying information.
  • Prevent known minors from engaging in financial transactions on the service.
  • Prohibit targeted advertising to known minors.
  • Implement a content moderation strategy to prevent known minors' exposure to harmful material and to content that promotes or facilitates self-harm or eating disorders; substance abuse; stalking, bullying, or harassment; or child sexual exploitation or abuse. The strategy includes the following (a simplified sketch of how the filtering pieces might fit together appears after this list):

    • Developing a comprehensive list of harmful material to prevent known minors from viewing it.
    • Describing to users the different categories of harmful material that will be blocked.
    • Employing hash-sharing technology to identify harmful material.
    • Using filtering technology to assist with blocking harmful material and conducting human reviews to test the technology.
    • Creating a database of keywords used to evade the filters (e.g., misspellings or hashtags).
    • Providing the source code for relevant algorithms to independent security researchers, except to the extent it is considered a trade secret.

  • If using algorithms to recommend content to known minors, ensure that they do not interfere with the content moderation strategy.
  • Clearly disclose in the terms of service, privacy policy, or a related document how algorithms are used to provide information, including how they promote, rank, or filter information and employ personal identifying information.
  • If the digital service is an adult platform (i.e., more than one-third of the content shared on the service is harmful material or obscene content, as defined by the Texas Penal Code), verify the age of users seeking to access content on the service.
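
To make the moderation requirements above more concrete, here is a minimal Python sketch of how hash matching and keyword filtering might fit together. It is a hypothetical illustration only, not compliance guidance: the hash value, keyword set, and function names are invented for this example, and production systems typically rely on shared industry hash lists (often perceptual hashes), curated keyword databases, and human review.

```python
import hashlib

# Placeholder digest standing in for an entry from a shared industry hash list
# (hypothetical; real systems often use perceptual hashes rather than SHA-256).
KNOWN_HARMFUL_HASHES = {
    "0" * 64,  # placeholder SHA-256 hex digest
}

# Keyword database including common evasion variants (misspellings, hashtags),
# per the Act's requirement to track terms used to evade the filters.
BLOCKED_KEYWORDS = {"selfharm", "self-harm", "s3lfharm", "#selfharm"}

def matches_known_hash(file_bytes: bytes) -> bool:
    """Return True if the file's digest appears in the shared hash list."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HARMFUL_HASHES

def contains_blocked_keyword(text: str) -> bool:
    """Return True if any lowercased token matches a blocked keyword."""
    return any(token.lower() in BLOCKED_KEYWORDS for token in text.split())

def should_block_for_minor(file_bytes: bytes, caption: str) -> bool:
    """Combine hash matching and keyword filtering for a known minor's feed.
    Human review would audit and tune these automated decisions."""
    return matches_known_hash(file_bytes) or contains_blocked_keyword(caption)
```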

Enforcement

The Texas attorney general's consumer protection division may bring an enforcement action seeking an injunction, civil penalties of up to $10,000 per violation, and actual damages for violations of the Act. A known minor's parent or guardian may also file a lawsuit against the provider to seek a declaratory judgment or an injunction.

***

While the Act will likely face pre-enforcement challenges, it reflects a broader regulatory trend related to kids' privacy and safety. Considering these issues early in the product development lifecycle should help reduce the risk associated with these types of laws.
