South Carolina Enacts Sweeping Age‑Appropriate Design and Online Safety Law—Effective Immediately
On February 5, 2026, South Carolina’s governor signed House Bill 3431, the Age-Appropriate Design Code, into law. The law took effect immediately, with no cure period, and imposes expansive new design, data, and governance obligations on online services reasonably likely to be accessed by minors.
South Carolina joins a growing group of states—including California, Maryland, Nebraska, and Vermont—that have enacted some form of Age‑Appropriate Design Code (AADC) legislation. Notably, on its face, the South Carolina version reaches much further than its analogues in other states.
Scope: Covered Online Services ‘Reasonably Likely To Be Accessed by Minors’
The Act applies to “covered online services” (e.g., websites or applications) that conduct business in South Carolina, are “reasonably likely to be accessed by minors,” and meet at least one of the following thresholds:
- have more than $25 million in annual revenue;
- receive or share the personal data of at least 50,000 consumers, households, or devices; or
- derive at least 50% of their annual revenue from the sale or sharing of consumers’ personal data.

“Reasonably likely to be accessed by a minor,” in turn, is defined as “it is reasonable to expect that the covered online service would be accessed by an individual minor or by minors based on the covered online service and meet either of the following criteria: (1) the individual is known to the covered online service to be a minor as defined in Section 39-80-10(7); or (2) the covered online service is directed to children as defined by the Children's Online Privacy Protection Act, 15 U.S.C. Sections 6501-6506 and the Federal Trade Commission rules implementing that act.” When (1) is met, the covered online service must treat the particular individual as a minor. When (2) is met, the covered online service must treat all individuals using or visiting the service as minors, except when the covered online service has actual knowledge that the individual is not a minor.
Notably, “known to be a minor” includes inferences related to the user’s age. The intent of this language appears to be to require applying protections to (1) known minors on general audience services and (2) all users on sites or services that meet COPPA’s definition of “directed to children.” However, the language can also be read to suggest that an online service is in scope if it has knowledge of a single minor user, particularly because many of the substantive obligations attach to “users” and “visitors” rather than to minors or their parents, a point highlighted in NetChoice’s challenge to the law.[1]
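For illustration only, the following is a minimal sketch, in TypeScript with hypothetical type and field names, of how the coverage thresholds and minor-treatment rules described above might be modeled by an engineering team; it is an assumption-laden reading of the summary above, not statutory language or legal advice.

```typescript
// Hypothetical model of HB 3431's coverage test as summarized above.
// All names are illustrative, not statutory terms.

interface ServiceProfile {
  conductsBusinessInSC: boolean;
  reasonablyLikelyAccessedByMinors: boolean;
  annualRevenueUSD: number;
  dataSubjects: number;              // consumers, households, or devices whose data is received or shared
  revenueShareFromDataSales: number; // fraction of annual revenue from sale/sharing of personal data (0.0-1.0)
}

function isCoveredOnlineService(s: ServiceProfile): boolean {
  if (!s.conductsBusinessInSC || !s.reasonablyLikelyAccessedByMinors) {
    return false;
  }
  // Any one of the three thresholds suffices.
  return (
    s.annualRevenueUSD > 25_000_000 ||
    s.dataSubjects >= 50_000 ||
    s.revenueShareFromDataSales >= 0.5
  );
}

// Minor-treatment rules: a known (or inferred) minor is always treated as a
// minor; on a child-directed service, every user and visitor is treated as a
// minor absent actual knowledge that a particular individual is not one.
function treatAsMinor(
  knownToBeMinor: boolean,
  directedToChildren: boolean,
  actualKnowledgeNotMinor: boolean
): boolean {
  if (knownToBeMinor) return true;
  if (directedToChildren) return !actualKnowledgeNotMinor;
  return false;
}
```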
Core Obligations
HB 3431 requires covered online services to provide easily accessible tools that allow all “users” (in essence, South Carolina residents) and “visitors” (an undefined term), not just minors or their parents, to do the following; a minimal sketch of how a service might model these settings appears after the list. For any user known to be a minor, such settings (among others) must be on by default:
- Disable certain design features, including but not limited to “covered design features” that encourage engagement (e.g., infinite scrolling, auto-play, third-party likes and views, appearance filters, rewards, notifications, and in-game purchases)
- Limit a user’s time and purchases on the service
- Block/disable contact (e.g., messages, requests, reactions, likes, comments) from account holders who are not among the minor’s existing connected accounts
- Block/disable quantification of engagement (e.g., a visible count of comments, likes, views, or reactions for any item generated by the user)
- Limit visibility of a user’s account, content, location, and connections
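To make these obligations concrete, here is a minimal sketch, assuming a TypeScript service and hypothetical field names, of a settings object whose protective values are the defaults for users treated as minors; nothing below is statutory language.

```typescript
// Hypothetical settings object reflecting the user-facing tools listed above.
// Field names are illustrative, not statutory terms.

interface SafetySettings {
  coveredDesignFeaturesDisabled: boolean;    // infinite scroll, auto-play, appearance filters, etc.
  timeAndPurchaseLimitsEnabled: boolean;     // caps on time spent and purchases
  contactFromNonConnectionsBlocked: boolean; // messages, requests, reactions from non-connections
  engagementCountsHidden: boolean;           // visible tallies of likes, views, comments
  visibilityLimited: boolean;                // account, content, location, and connections
}

// Every user and visitor must be able to change these settings; for a user
// the service knows (or infers) to be a minor, the protective value is the default.
function defaultSafetySettings(treatedAsMinor: boolean): SafetySettings {
  return {
    coveredDesignFeaturesDisabled: treatedAsMinor,
    timeAndPurchaseLimitsEnabled: treatedAsMinor,
    contactFromNonConnectionsBlocked: treatedAsMinor,
    engagementCountsHidden: treatedAsMinor,
    visibilityLimited: treatedAsMinor,
  };
}
```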
The law also requires covered online services to provide all users the ability to opt out of “personalized recommendation systems” and to establish such opt-out as the default for known minor users. “Personalized recommendation system” is defined as “a fully or partially automated system used to suggest, promote, or rank content, including other users, hashtags, or material from others based on the personal data of users.” Thus, the law appears to require that adults be able to opt out of any personalization, regardless of whether the personalization is addictive or otherwise results in harm.
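As an illustration of the opt-out mechanics, here is a minimal sketch, again with hypothetical names, of gating a recommendation pipeline on the user's choice, with known minors defaulting to opted out; the ranking function is a placeholder, not a real API.

```typescript
// Hypothetical gate on a "personalized recommendation system": honor any
// user's explicit opt-out, and default known minors to opted out.

type OptOutChoice = boolean | undefined; // undefined = no explicit choice recorded

function personalizationAllowed(optOut: OptOutChoice, treatedAsMinor: boolean): boolean {
  if (optOut === undefined) {
    return !treatedAsMinor; // no choice recorded: minors default to opted out
  }
  return !optOut; // an explicit opt-out always wins, for adults and minors alike
}

// Placeholder for the service's own ranking logic (an assumption, not an API).
declare function rankByPersonalData(items: string[], personalData: unknown): string[];

function orderFeed(items: string[], personalData: unknown, allowed: boolean): string[] {
  // When personalization is off, fall back to a non-personalized ordering
  // (e.g., chronological) rather than ranking on the user's personal data.
  return allowed ? rankByPersonalData(items, personalData) : items;
}
```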
Other provisions of the law are limited to minors, in line with other AADC laws. For example, the law requires exercising reasonable care in the use of minors’ personal data and the design of the service to prevent harm to minors, including compulsive usage, psychological harm, identity theft, privacy intrusions, and discrimination. Such harms are familiar from other AADC laws, though most frame the obligation as a duty to “mitigate” the risk of harm rather than to “prevent” such harms.

The law also tethers strict data minimization requirements to minors’ data; prohibits “facilitating” targeted advertising to minors; allows collection of precise location data only when necessary to provide the service and with a visible indication to the minor that location information is being collected or used; prohibits sending push notifications at certain times (both rules are sketched below); and prohibits profiling except in very limited circumstances. The law also mandates the provision of certain parental tools, including managing the minor’s privacy settings, restricting purchases, and limiting time spent on the platform.

Finally, covered online services must establish mechanisms for parents, minors, and schools to report harm to minors; are prohibited from showing ads for certain products and from using dark patterns; must provide detailed information about their “personalized recommendation systems” and how parents can control them; and must provide detailed information about their design safety, privacy protections, and parental tools for minors.
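For the location and notification rules, a minimal sketch follows. The statute bars push notifications “at certain times,” but the specific hours are not stated in this summary, so the quiet-hours window below is purely an assumed placeholder.

```typescript
// Hypothetical checks for the location and notification rules above. The
// quiet-hours window is an assumed placeholder, not the statutory hours.

const QUIET_START_HOUR = 22; // assumption: 10 p.m. local time
const QUIET_END_HOUR = 6;    // assumption: 6 a.m. local time

function mayPushToMinor(localHour: number): boolean {
  const inQuietHours = localHour >= QUIET_START_HOUR || localHour < QUIET_END_HOUR;
  return !inQuietHours;
}

// Precise location data: collect only when necessary to provide the service,
// and only while a visible indicator tells the minor it is being collected or used.
function mayCollectPreciseLocation(necessaryForService: boolean, indicatorVisible: boolean): boolean {
  return necessaryForService && indicatorVisible;
}
```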
Audit Obligations
Unique among AADC laws adopted to date, the law requires covered online services to publish reports prepared by an independent third-party auditor that contain detailed information concerning their treatment of minors and design features. Such reports bear some similarity to privacy impact assessments mandated by other laws and to transparency or audit reports under online safety laws, but they must be submitted to the South Carolina attorney general (AG), who is to publish them on the AG’s website, raising serious concerns regarding trade secrets and confidentiality. Such reports must be issued annually on or before July 1, suggesting that the first reports must be published less than six months after the law was signed.
Enforcement and Liability
The law is enforceable by the AG and provides that covered online services “shall be liable for treble the financial damages incurred as a result of a violation,” suggesting a need for the AG to establish damages for most violations. In addition, officers and employees may be held personally liable for willful or wanton violations. Further, requirements related to the use of dark patterns are enforceable under the South Carolina Unfair Trade Practices Act, which is backed by civil penalties as well as a private right of action for individuals who suffer an “ascertainable loss of money or property, real or personal” due to unfair or deceptive methods or practices.
Key Takeaways for Companies
Given the ambiguity of the law and its ongoing legal challenge, it is difficult to identify immediate compliance priorities. Companies should monitor the litigation and, at a minimum, consider extending to South Carolina residents any compliance measures they have adopted to protect kids and teens in other states.
Endnote
[1] NetChoice filed a Complaint challenging the law on February 9, 2026.