SENATE BILL REPORT
SB 5708
As of February 10, 2025
Title: An act relating to protecting Washington children online.
Brief Description: Protecting Washington children online.
Sponsors: Senators Frame and Wagoner; by request of Attorney General.
Brief History:
Committee Activity: Business, Financial Services & Trade: 2/13/25.
Brief Summary of Bill
  • Specifies requirements for, and restrictions on, a business providing an online service, product, or feature likely to be accessed by minors, concerning age estimation, the collection and use of personal information and precise location information, profiling, the use of dark patterns, privacy, and notifications during specified timeframes.
  • Prohibits an operator of an addictive Internet-based service or application from providing addictive feeds to a user under certain circumstances.
  • Requires an operator of an addictive Internet-based service or application to provide certain options to all users.
  • Defines terms, provides for the applicability of the Consumer Protection Act, and provides for construction and severability.
SENATE COMMITTEE ON BUSINESS, FINANCIAL SERVICES & TRADE
Staff: John Kim (786-7453)
Background:

In recent years, at both the federal and state level, numerous pieces of legislation concerning social media and children have been enacted or proposed.  According to the National Conference of State Legislatures, in 2023, 35 states and Puerto Rico addressed legislation concerning social media and children and 12 states enacted bills or adopted resolutions.  In 2024, the United States Senate passed two pieces of legislation concerning online services and minors, but versions of those bills were not considered by the full House of Representatives during the 118th Congress.


This discussion addresses some federal legislation and legislation enacted in the state of California in 2022 and 2024, as the current bill contains provisions that are substantially similar or identical to the 2022 and 2024 California laws.  The California laws are currently under judicial review by a federal district court in California and the United States Court of Appeals for the Ninth Circuit (Ninth Circuit Court of Appeals).  The Ninth Circuit Court of Appeals has jurisdiction over federal judicial districts in nine states, including the state of Washington.


Federal Legislation Concerning Online Services and Minors. Children's Online Privacy Protection Act of 1998. The federal Children's Online Privacy Protection Act of 1998 (COPPA) applies to the online collection of personal information of children under 13 years of age. It required the Federal Trade Commission (FTC) to issue and enforce regulations concerning children's online privacy.  In 2000, the FTC adopted the Children's Online Privacy Protection Rule (Rule), which was subsequently revised in 2013.


Under the Rule, websites and online services covered by COPPA must post privacy policies, provide parents with direct notice of their information practices, and get verifiable consent from a parent or guardian before collecting personal information from children.


The Rule applies to operators of commercial websites and online services directed to children under the age of 13 that collect personal information.  It applies to operators of sites and online services geared toward general audiences when they have actual knowledge they are collecting information from children under 13.  Under the 2013 revisions, COPPA also applies to operators when they have actual knowledge they are collecting personal information from users of another site or online service directed to children under 13.  In certain circumstances, COPPA applies to advertising networks, plug-ins, and other third parties.


Recent Federal Legislation. In July 2024, the United States Senate passed the Children and Teens' Online Privacy Protection Act (COPPA 2.0) and the Kids Online Safety Act (KOSA). COPPA 2.0 would expand the scope of entities covered under COPPA and includes provisions concerning:

  • verifiable consent by parents of children under 13 to collect data;
  • prohibitions on targeted marketing;
  • a right to content erasure;
  • information practices; and
  • a digital marketing bill of rights for minors.


The United States Senate version of KOSA would establish a duty of care for online platforms to prevent and mitigate specified harms to children and would require platforms to provide minors with options regarding their information, to disable or limit certain product features, and to opt out of personalized algorithmic recommendations.


In September 2024, the United States House Committee on Energy and Commerce advanced its versions of COPPA 2.0 and KOSA to the full House of Representatives for consideration, but the measures were not voted on by the full House during the 118th Congress.


2022 California Legislation and Pending Constitutional Challenge. In September 2022, California enacted AB 2273, entitled the California Age-Appropriate Design Code Act (AB 2273).  Some provisions of AB 2273 that are substantially similar or identical to provisions in the current bill include those that:

  • require businesses providing online services, products, or features likely to be accessed by children to estimate the child's age with a reasonable level of certainty;
  • specify default privacy settings provided to children;
  • specify requirements and restrictions on the use of personal information;
  • restrict the use of dark patterns to lead or encourage children to provide certain personal information; and
  • authorize the state's attorney general to enforce the act.


In December 2022, NetChoice, LLC (NetChoice), a trade association of online businesses, filed a complaint for declaratory and injunctive relief concerning AB 2273 against the attorney general of California in the United States District Court for the Northern District of California (Northern California District Court).  Arguments in the complaint included, in part, that AB 2273 violated the First and Fourteenth Amendments of the United States Constitution as imposing viewpoint-, content-, and speaker-based restrictions on speech that did not meet various standards of scrutiny; that the bill used vague terms and provisions in violation of due process; that it violated the Commerce Clause as imposing undue burdens on interstate commerce; and that certain provisions were preempted by federal law.


In September 2023, the Northern California District Court granted NetChoice's motion for a preliminary injunction against California's enforcement of AB 2273.  In 2024, the Ninth Circuit Court of Appeals affirmed the part of the injunction relating to a data protection impact assessment requirement but vacated the remainder of the injunction.  In January 2025, the Northern California District Court heard oral arguments on a second preliminary injunction motion by NetChoice.


2024 California Legislation and Stay Pending Appeal in the United States Court of Appeals for the Ninth Circuit.  In September 2024, California enacted SB 976, entitled the Protecting Our Kids from Social Media Addiction Act (SB 976). Some provisions of SB 976 that are substantially similar or identical to provisions in the current bill include:

  • restrictions on a website's display of addictive feeds to minors;
  • restrictions on sending notifications to known or potential minors during specified time periods without parental consent; and
  • general requirements covering addictive Internet-based services.


In November 2024, NetChoice filed a complaint for declaratory and injunctive relief concerning SB 976 against the attorney general of California in the Northern California District Court.  In its complaint, NetChoice stated that SB 976 regulates services offered by its members, including Google, which owns and operates YouTube; Meta, which owns and operates Facebook and Instagram; Nextdoor; Pinterest; and X. Arguments in the complaint included, in part, that SB 976 violated the First Amendment as a content-based and speaker-based regulation of speech that did not meet the standards of heightened or strict scrutiny.  The complaint also argued that SB 976's central coverage definition of addictive Internet-based services or applications was unconstitutionally vague and violated free speech under the First Amendment and due process under the Fourteenth Amendment.


In December 2024, the Northern California District Court granted a partial preliminary injunction, prohibiting California from enforcing provisions in SB 976 relating to restrictions on operators sending notifications to known or potential minors during specified time periods, and a requirement for operators to annually and publicly disclose certain information regarding minor users.  The court's injunction otherwise permitted California to enforce the remainder of the law.  In January 2025, NetChoice appealed the District Court's ruling to the Ninth Circuit Court of Appeals.  Also in January 2025, the Ninth Circuit Court of Appeals granted a full injunction against the enforcement of the entirety of SB 976 while the appeal is pending.  Oral arguments for the appellate case are scheduled for April 2025.

Summary of Bill:

Requirement for Certain Businesses to Estimate the Age of Minor Users or Apply Certain Protections to All Users. The bill requires a business that provides an online service, product, or feature likely to be accessed by minors to:

  • estimate the age of minor users with a reasonable level of certainty appropriate to the risks that arise from the data management practices of the business; or
  • apply the privacy and data protections afforded to minors to all consumers or users of the online service, product, or feature.


A minor is defined as an individual under 18 years of age who is located in Washington State.


The term likely to be accessed by minors is defined to mean it is reasonable to expect, based on the following indicators, that the online service, product, or feature would be accessed by minors:

  • the online service, product, or feature is directed to children as defined by COPPA;
  • the online service, product, or feature is determined, based on competent and reliable evidence regarding audience composition, to be routinely accessed by a significant number of minors;
  • the online service, product, or feature displays, provides, contains, or sells advertisements marketed to minors;
  • the online service, product, or feature has design elements that are known to be of interest to minors including, but not limited to, games, cartoons, music, and celebrities who appeal to minors;
  • a significant amount of the audience of the online service, product, or feature is determined, based on internal company research, to be minors.


The term online service, product, or feature excludes a broadband Internet access service or telecommunications service.


Restrictions on the Collection and Use of Personal Information of Minors. A business that provides an online service, product, or feature likely to be accessed by minors may not use any personal information collected to estimate age or age range for any other purpose or retain that personal information longer than necessary to estimate age. Age assurance must be proportionate to the risks and data practices of an online service, product, or feature. Such a business may not collect, sell, share, or retain personal information from minors under the age of 13.


Personal information is defined as information that identifies or is reasonably capable of being associated or linked, directly or indirectly, with a particular individual or individual's household. It includes, but is not limited to, data associated with a persistent unique identifier, such as a cookie ID, an IP address, a device identifier, or any other form of persistent unique identifier. It does not include publicly available information.


A business that provides an online service, product, or feature likely to be accessed by minors may not take any of the following actions:

  • use the personal information of any minor in a way that the business knows, or has reason to know, is materially detrimental to the physical health, mental health, or well-being of a minor;
  • collect, sell, share, or retain any personal information that is not necessary to provide an online service, product, or feature with which a minor is actively and knowingly engaged; or
  • if the end user is a minor, use personal information for any reason other than a reason for which that personal information was collected, unless the business can demonstrate a compelling reason that use of the personal information is in the best interests of minors.


Restriction on Profiling of Minors. Profiling is defined as any form of automated processing of personal information that uses personal information to evaluate certain aspects relating to an individual, including analyzing or predicting aspects concerning an individual's performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location, or movements.


A business that provides an online service, product, or feature likely to be accessed by minors may not profile a minor by default unless the business can demonstrate it has appropriate safeguards in place to protect minors and one of the following is true:

  • profiling is necessary to provide the online service, product, or feature requested and only with respect to the aspects of the online service, product, or feature with which the minor is actively and knowingly engaged; or
  • the business can demonstrate a compelling reason that profiling is in the best interests of minors.


Restriction on the Collection and Use of Precise Location Information of Minors. Precise location information is defined as information derived from technology including, but not limited to, global positioning system level latitude and longitude coordinates or other mechanisms, that directly identifies the specific location of an individual with precision and accuracy within a radius of 1750 feet.


A business that provides an online service, product, or feature likely to be accessed by minors may not:

  • collect, sell, or share any precise location information of minors by default unless the collection of that precise location information is strictly necessary for the business to provide the service, product, or feature requested, and then only for the limited time that the collection of precise location information is necessary to provide the service, product, or feature; or
  • collect any precise location information of a minor without providing an obvious sign to the minor for the duration of that collection that precise location information is being collected.


Restriction on the Use of Dark Patterns on Minors. A dark pattern is defined as a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision making, or choice.


A business that provides an online service, product, or feature likely to be accessed by minors may not use dark patterns to lead or encourage minors to provide personal information beyond what is reasonably expected to provide that online service, product, or feature, to forego privacy protections, or to take any action that the business knows, or has reason to know, is materially detrimental to the minor's physical health, mental health, or well-being.


Requirements for Businesses Regarding Privacy of Minors. A business that provides an online service, product, or feature likely to be accessed by minors must take all of the following actions:

  • configure all default privacy settings provided to minors by the online service, product, or feature to settings that offer a high level of privacy unless the business can demonstrate a compelling reason that a different setting is in the best interests of minors;
  • provide any privacy information, terms of service, policies, and community standards concisely, prominently, and using clear language suited to the age of minors likely to access that online service, product, or feature;
  • if the online service, product, or feature allows the minor's parent, guardian, or any other individual or entity to monitor the minor's online activity or track the minor's location, provide an obvious signal to the minor when the minor is being monitored or tracked;
  • enforce published terms, policies, and community standards established by the business including, but not limited to, privacy policies and those concerning minors; and
  • provide prominent, accessible, and responsive tools to help minors, or if applicable their parents or guardians, exercise their privacy rights and report concerns.


Prohibition on Providing Addictive Feeds to Minors. The bill provides it is unlawful for an operator of an addictive Internet-based service or application to provide an addictive feed to a user unless:

  • prior to January 1, 2026, the operator does not have actual knowledge that the user is a minor; or
  • commencing January 1, 2026, the operator has reasonably determined that the user is not a minor.


An operator is defined as a person who operates or provides an Internet website, an online service, an online application, or a mobile application.


An addictive Internet-based service or application is defined as an Internet website, online service, online application, or mobile application including, but not limited to, a social media platform, that offers users or provides users with an addictive feed as a significant part of the service provided by that website, online service, online application, or mobile application.


The bill excludes from the definition of an addictive Internet-based service or application:

  • an Internet website, online service, online application, or mobile application for which interactions between users are limited to commercial transactions or to consumer reviews of products, sellers, services, events, or places, or any combination thereof; or
  • an Internet website, online service, online application, or mobile application that operates a feed for the primary purpose of cloud storage.


An addictive feed is defined as an Internet website, online service, online application, or mobile application, or a portion thereof, in which multiple pieces of media generated or shared by users are, either concurrently or sequentially, recommended, selected, or prioritized for display to a user based, in whole or in part, on information provided by the user, or otherwise associated with the user or the user's device, unless any of the following conditions are met, alone or in combination with one another:

  • the information is not persistently associated with the user or user's device, and does not concern the user's previous interactions with media generated or shared by others;
  • the information consists of search terms that are not persistently associated with the user or user's device;
  • the information consists of user-selected privacy or accessibility settings, technical information concerning the user's device, or device communications or signals concerning whether the user is a minor;
  • the user expressly and unambiguously requested the specific media or media by the author, creator, or poster of the media, or the blocking, prioritization, or deprioritization of such media, provided that the media is not recommended, selected, or prioritized for display based, in whole or in part, on other information associated with the user or the user's device, except as otherwise permitted by this chapter and, in the case of audio or video content, is not automatically played;
  • the media consists of direct, private communications between users;
  • the media recommended, selected, or prioritized for display is exclusively the next media in a preexisting sequence from the same author, creator, poster, or source and, in the case of audio or video content, is not automatically played; or
  • the recommendation, selection, or prioritization of the media is necessary to comply with the other requirements in the bill.


Media is defined as text, audio, an image, or a video.


Restrictions on Notifications to Minors During Specified Timeframes. Prior to January 1, 2026, the bill provides it is unlawful for a business that provides an online service, product, or feature likely to be accessed by minors, between the hours of 12:00 a.m. and 6:00 a.m. in the user's local time zone, and between the hours of 8:00 a.m. and 3:00 p.m., Monday through Friday, from September through May, in the user's local time zone, to send notifications to a user if the business has actual knowledge that the user is a minor, unless the business has obtained verifiable parental consent to send those notifications.


Commencing January 1, 2026, it is unlawful for a business that provides an online service, product, or feature likely to be accessed by minors, during the same timeframes, to send notifications to a user who the business has not reasonably determined is not a minor, unless the business has obtained verifiable parental consent to send those notifications.

Required Options for Users of an Addictive Internet-Based Service or Application. The operator of an addictive Internet-based service or application must provide a mechanism through which any user, whether or not they are a minor, may do any of the following:

  • limit their access to any addictive feed from the addictive Internet-based service or application to a length of time per day specified by the user;
  • limit their ability to view the number of likes or other forms of feedback to pieces of media within an addictive feed;
  • require that the default feed provided to the user when entering the Internet-based service or application be one in which pieces of media are not recommended, selected, or prioritized for display based on information provided by the user, or otherwise associated with the user or the user's device, other than the user's age or status as a minor; or
  • set their account to private mode, in a manner in which only users to whom the user is connected on the addictive Internet-based service or application may view or respond to content posted by the user.


Applicability of the Consumer Protection Act. The bill provides that a violation of the chapter created by the bill is not reasonable in relation to the development and preservation of business and is an unfair or deceptive act in trade or commerce and an unfair method of competition for the purpose of applying the Consumer Protection Act.


Statutory Construction. The bill provides the following in construing its provisions:

  • the chapter created by the bill does not restrict the ability of a business that provides an online service, product, or feature to comply with Washington State or federal law, or to comply with a subpoena, warrant, court order, or other civil or criminal legal process, unless such compliance is otherwise prohibited by Washington State or federal law;
  • the chapter created by the bill may not be construed as requiring the operator of an addictive Internet-based service or application to give a parent any additional or special access to, or control over, the data or accounts of their minor child; and
  • compliance with the chapter created by the bill by the operator of an addictive Internet-based service or application does not serve as a defense to any claim that a minor, or an individual who was a minor at the time of using the Internet-based service or application, might have against the operator of an addictive Internet-based service or application regarding any harm to the mental health or well-being of the minor.


Severability. The bill provides that if any provision of this chapter or its application to any person or circumstance is held invalid, the remainder of the chapter or the application of the provision to other persons or circumstances is not affected.

Appropriation: None.
Fiscal Note: Requested on February 9, 2025.
Creates Committee/Commission/Task Force that includes Legislative members: No.
Effective Date: Ninety days after adjournment of session in which bill is passed.