Federal Legislation Concerning Minors and Online Services.
The federal Children's Online Privacy Protection Act of 1998 (COPPA) applies to the online collection of personal information from children under 13 years of age. It required the Federal Trade Commission (FTC) to issue and enforce regulations concerning children's online privacy. In 2000, the FTC adopted the Children's Online Privacy Protection Rule (Rule), which was subsequently revised in 2013. Under the Rule, websites and online services covered by COPPA must post privacy policies, provide parents with direct notice of their information practices, and obtain verifiable consent from a parent or guardian before collecting personal information from children.
The Rule applies to operators of commercial websites and online services directed to children under the age of 13 that collect personal information. It applies to operators of sites and online services geared toward general audiences when they have actual knowledge they are collecting information from children under age 13. Under the 2013 revisions, COPPA also applies to operators when they have actual knowledge they are collecting personal information from users of another site or online service directed to children under age 13. In certain circumstances, COPPA applies to advertising networks, plug-ins, and other third parties.
Relevant California Legislation.
California enacted Assembly Bill 2273, entitled The California Age-Appropriate Design Code Act (AB 2273), in 2022. AB 2273 required businesses providing online services, products, or features likely to be accessed by children to:
In September 2024, California enacted Senate Bill 976, entitled Protecting Our Kids from Social Media Addiction Act (SB 976). SB 976 requires:
Both AB 2273 and SB 976 have been challenged as violating the First and Fourteenth Amendments to the United States Constitution and are the subject of pending litigation. The Ninth Circuit Court of Appeals affirmed the portion of the preliminary injunction relating to AB 2273's data protection impact assessment requirement, but vacated the remainder of the injunction. The United States District Court for the Northern District of California has since heard oral arguments on a second preliminary injunction motion for AB 2273. With regard to SB 976, the Ninth Circuit Court of Appeals enjoined enforcement of SB 976 in its entirety while the appeal is pending.
Defined Terms.
The following terms are defined: "addictive feed," "addictive internet-based service or application," "business," "dark pattern," "likely to be accessed by minors," "media," "minor," "online service, product, or feature," "operator," "parent," "personal information," "precise location information," and "profiling."
Required Age Estimation for Businesses Providing an Online Service, Product, or Feature.
A business that provides an online service, product, or feature likely to be accessed by minors (business) must either: (1) estimate the age of minor users with a reasonable level of certainty appropriate to the risks that arise from the data management practices of the business; or (2) apply the privacy and data protections afforded to minors to all consumers or users of the online service, product, or feature.
Restrictions on the Collection and Use of Personal Information of Minors.
A business may not use any personal information collected to estimate age or age range for any other purpose, nor retain that personal information longer than necessary to estimate age. Age assurance must be proportionate to the risks and data practices of an online service, product, or feature.
A business may not collect, sell, share, or retain personal information from minors under the age of 13, except to comply with the age estimation requirement.
A business may not:
Requirements for Businesses Providing Online Services, Products, or Features Likely to Be Accessed by Minors.
A business must take the following actions:
Prohibition on Providing Addictive Feeds to Minors.
An operator of an addictive internet-based service or application is prohibited from providing an addictive feed to a user unless:
Time-Restricted Notifications to Minors.
Prior to January 1, 2026, a business is prohibited from sending notifications to a user whom the business knows to be a minor, unless the operator has obtained verifiable parental consent to send notifications, as follows:
Beginning January 1, 2026, a business is prohibited from sending notifications during these timeframes to a user whom the business has not reasonably determined is not a minor, unless the operator has obtained verifiable parental consent.
Required Options for Users of an Addictive Internet-Based Service or Application.
Operators of an addictive internet-based service or application must provide a mechanism through which any user may:
An operator of an addictive internet-based service or application is not required to give a parent any additional or special access to, or control over, the data or accounts of their minor child.
Compliance with the regulations established does not serve as a defense to any claim that a minor might have against the operator of an addictive internet-based service or application regarding any harm to the minor's mental health or well-being.
Enforcement Provided Under the Consumer Protection Act.
A violation of a regulation established is not reasonable in relation to the development and preservation of business, is an unfair or deceptive act in trade or commerce, and is an unfair method of competition for the purpose of applying the Consumer Protection Act.
The substitute bill adds a definition for "business" and excludes from the definition of "online service, product, or feature" an interactive gaming platform that complies with federal law and does not use dark patterns to lead or encourage minors to provide personal information.
(In support) There is more that can be done to protect children. The research is clear that addictive social media platforms are targeting children for profit. Addictive feeds compel youth to stay engaged in social media, often exposing them to hateful and dangerous content. Social media is a constant distraction, providing an endless stream of entertainment, unexpected likes, and dopamine hits. Children today are more online and more connected than ever. Children are having their sleep and learning time interrupted by notifications, so restrictions should be in place to prohibit notifications to minors when they are in school or at night. Social media is negatively affecting academic success, distorting self-perception, and challenging peer relationships. Children are not aware of the addictive response they might be having. There is an unprecedented youth mental health crisis from spending too much time online. Washington is forty-eighth in the country for mental health. There is no accountability for the social media companies, and it is not about free speech. The state needs to help with turning off the hateful algorithms.
Technology companies are already using artificial intelligence algorithms to identify the ages of users, including children. Plenty of existing technology already profiles children. Social media companies are starting to implement more policies to protect children, but these are not industry standards. These protections should tie in with a parental overlay. It is up to parents to drive guidelines, but there are factors at play that parents cannot control. Parents do everything they can to keep their kids safe, but this is not a fair fight. The government requires safe car seats in vehicles, monitors toys for safety issues, and gives movies ratings, but parents do not have these tools for social media. Parents feel lost because they do not know how to protect their kids online, especially when kids start breaking rules or sneaking access to platforms they are not allowed on. More structured time off screens is needed, which is why funding for youth development programs is also crucial.
The concerns about First Amendment violations are recognized, but this legislation was crafted based on those active lawsuits and court rulings. Age verification means proving your age without disclosing your identity. No one's data is at risk, because the data is never stored; if you do not store it, it cannot be stolen. Europe has strict data protection laws, and companies are able to operate within those laws and provide age checks. Age verification was recently considered by the Supreme Court in relation to pornography, and the court is expected to move to allow age verification.
(Opposed) Technology companies are committed to providing an age-appropriate and safe space. Ultimately, this legislation is unconstitutional for violating First Amendment rights and the Commerce Clause. There are current lawsuits challenging similar legislation in other states. A business that is likely to have minors on its site cannot retain the minors' data, yet this business has the responsibility to estimate each minor's age. Users will have to log in and provide private identifying information each time, because the only ways to verify age are with government-issued identification or biometric identification. There is greater cybersecurity risk associated with age verification. Collecting and storing this kind of information makes these companies targets for theft. Many adult users would prefer not to share their identifying information with online services, creating an unpleasant dilemma.
(In support) Representative Lisa Callan, prime sponsor; Radu Smintina, School's Out Washington; Stephan Blanford, Children's Alliance; Adam Eitmann, Office of the Attorney General; Taj Jensen; Maggie Humphreys, MomsRising; Jai Jaisimha, Transparency Coalition.ai; Trevor Greene, Superintendent, Yakima School District; Laura Marquez-Garrett, Social Media Victims Law Center; and Iain Corby, The Age Verification Providers Association.
The second substitute bill:
(In support) The internet can be great, until it isn't. Many online applications and services, such as social media, can have negative impacts on children, including increased anxiety, depression, suicidality, body image issues, and other poor mental health outcomes. There is already a youth behavioral health crisis. Many states are pursuing legislation because of these negative impacts on kids. While social media sites are taking action by creating teen accounts and providing more parental control, that is being undermined by the use of algorithms that personalize experiences and drive the constant use of social media, including by creating reward systems and providing constant notifications. The bill is about addressing the atmosphere children experience on social media. It sets common-sense limits on screen time and will help protect children's personal data from being used for profit.
The Office of the Attorney General supports this legislation as a reasonable approach to protecting kids online. Concerns around the bill's constitutionality have been taken into consideration. The bill does not compel or limit speech. The bill represents a small investment to improve mental health outcomes and will be cost-effective compared to the cost of providing behavioral health and other services.
(Opposed) Other states have passed content moderation legislation. The Supreme Court ruled that those laws violate the First Amendment, and that a personal feed is protected speech. There have also been rulings against age verification laws as unconstitutional. Litigation is very expensive, and those costs are not contemplated in the fiscal note. Online spaces can be dangerous, but companies are taking action to address this. The bill's prohibitions on retaining and maintaining data will prevent companies from keeping data required to verify the age of juvenile users. This will result in companies being unable to use tools that are already in place.
(In support) Representative Lisa Callan, prime sponsor; Taku Mineshita, Office of Governor Bob Ferguson; Adam Eitmann, Washington State Office of the Attorney General; and Stephan Blanford, Children's Alliance.