NetChoice, LLC v. Yost

United States District Court, Southern District of Ohio
Feb 12, 2024
2:24-cv-00047 (S.D. Ohio Feb. 12, 2024)


NETCHOICE, LLC, Plaintiff, v. DAVE YOST, in his official capacity as Ohio Attorney General, Defendant.


Elizabeth Preston Deavers, Magistrate Judge.

OPINION AND ORDER

ALGENON L. MARBLEY, CHIEF UNITED STATES DISTRICT JUDGE.

This matter is before this Court on Plaintiff NetChoice, LLC's (“NetChoice”) Motion for Preliminary Injunction (“PI”) against Defendant Ohio Attorney General David Yost. (ECF No. 2). Having granted Plaintiff's request for a temporary restraining order (“TRO”) on January 8, 2024, this Court held a Preliminary Injunction Hearing on Plaintiff's Motion on February 7, 2024. For the reasons set forth below, this Court GRANTS Plaintiff's Motion for a Preliminary Injunction.

I. BACKGROUND

A. Factual Background

This case is about whether an Ohio state law, the Parental Notification by Social Media Operators Act, Ohio Rev. Code § 1349.09(B)(1) (“Act”), which was originally set to take effect on January 15, 2024, violates the First and Fourteenth Amendment rights of popular websites including Facebook, X (formerly Twitter), and YouTube, in addition to violating the First Amendment rights of those websites' users. The websites' and users' interests are represented by Plaintiff NetChoice, an Internet trade association, which brought this suit against Ohio Attorney General David Yost, in his official capacity, seeking declaratory and injunctive relief to prevent Yost from enforcing the law against NetChoice's members.

1. NetChoice and the Internet Landscape

NetChoice is an Internet trade association that represents several popular websites and platforms including Google, Meta, X, Nextdoor, and Pinterest, each of which, NetChoice contends, publishes, disseminates, creates, or distributes speech protected by the First Amendment. (ECF Nos. 2 at 6-7; 2-1 ¶ 11). Adults and teens alike flock to NetChoice's member websites and generate billions of “posts” every day. (ECF Nos. 2 at 7; 2-1 ¶ 6). NetChoice also details the non-legislative tools that parents have at their disposal to oversee their children's use of the Internet, including restrictions made available by devices, networks, software, and even by NetChoice's member organizations on their platforms. (ECF Nos. 2 at 7-8; 2-1 ¶ 8).

2. The Entities that the Act Seeks to Regulate

The Act, which resembles legislation enacted in other states, seeks to require certain website operators to obtain parental consent before allowing any unemancipated child under the age of sixteen to register or create an account on their platform. Specifically, the Act regulates “operator[s]” of “online web site[s], service[s], or product[s]” that (1) “target[] children,” or are “reasonably anticipated to be accessed by children”; (2) have users in the state of Ohio; and (3) allow users to do all of the following:

(a) Interact socially with other users within the confines of the online web site, service, or product;
(b) Construct a public or semipublic profile for the purpose of signing into and using the online web site, service, or product;
(c) Populate a list of other users with whom an individual shares or has the ability to share a social connection within the online web site, service, or product;
(d) Create or post content viewable by others, including on message boards, chat rooms, video channels, direct or private messages or chats, and a landing page or main feed that presents the user with content generated by other users.
§ 1349.09(A)(1); (B). The Act explains that in order to determine “whether an operator's online web site, service, or product targets children, or is reasonably anticipated to be accessed by children, the attorney general or a court may consider the following factors”:
(1) Subject matter;
(2) Language;
(3) Design elements;
(4) Visual content;
(5) Use of animated characters or child-oriented activities and incentives;
(6) Music or other audio content;
(7) Age of models;
(8) Presence of child celebrities or celebrities who appeal to children;
(9) Advertisements;
(10) Empirical evidence regarding audience composition; and
(11) Evidence regarding the intended audience.
§ 1349.09(C).

The Act contains several exceptions. Of relevance here, the Act does not apply to corners of the Internet where “interaction between users is limited to”: “(1) Reviewing products offered for sale by electronic commerce or commenting on reviews posted by other users; (2) Comments incidental to content posted by an established and widely recognized media outlet, the primary purpose of which is to report news and current events.” § 1349.09(O).

3. The Act's Requirements of Covered Operators

If an operator falls within the above-enumerated parameters, it is required to: (1) “[o]btain verifiable consent for any contract with a child, including terms of service, to register, sign up, or otherwise create a unique username to access or utilize the online web site, service, or product, from the child's parent or legal guardian” through a variety of acceptable methods; and (2) present to the parent or guardian a list of features related to content moderation and a link where they may review those features. See § 1349.09(B). In the absence of parental consent, children under the age of sixteen “shall” be denied access to the “use of the online web site, service, or product.” § 1349.09(E).

4. The Act's Enforcement Mechanism and Penalties for Non-Compliance

Should a covered operator be found to be in noncompliance with the Act, the Ohio Attorney General “shall investigate” the issue and may bring suit. § 1349.09(G); (H). A court that finds that an operator has violated the terms of the Act “shall impose a civil penalty” under the following scheme: (1) up to $1,000 per day for the first 60 days of noncompliance; (2) up to an additional $5,000 per day for days 61-90; and (3) up to an additional $10,000 per day for days 91 and beyond. See § 1349.09(I). “If an operator is in substantial compliance with this section,” however, the attorney general may not commence a civil action until providing the operator with written notice of the suspected violations and a 90-day opportunity to cure, in which the operator must provide the attorney general with “written documentation that the violation has been cured and that the operator has taken measures sufficient to prevent future violations.” § 1349.09(M).
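For a sense of scale, the sketch below computes the maximum cumulative penalty for a given number of days of noncompliance. It is an illustration only: it assumes one plausible reading of § 1349.09(I), namely that each tier's per-day cap governs the days falling within that tier, and the hypothetical max_penalty helper is not drawn from the Act or the opinion.

```python
def max_penalty(days: int) -> int:
    """Maximum cumulative civil penalty under the Act's tiered scheme,
    assuming each tier's per-day cap applies to the days within that tier."""
    total = 0
    for day in range(1, days + 1):
        if day <= 60:
            total += 1_000    # up to $1,000/day for days 1-60
        elif day <= 90:
            total += 5_000    # up to an additional $5,000/day for days 61-90
        else:
            total += 10_000   # up to an additional $10,000/day for days 91+
    return total

# 120 days of noncompliance: 60*$1,000 + 30*$5,000 + 30*$10,000 = $510,000
print(max_penalty(120))  # 510000
```

On that reading, four months of noncompliance could expose an operator to more than half a million dollars in penalties.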

B. Procedural Background

The Governor of Ohio signed the Act into law in July 2023, and it was set to take effect on January 15, 2024. Plaintiff NetChoice, however, filed this lawsuit and a Motion requesting both a TRO and PI on January 5, 2024. (ECF No. 2). This Court held a conference on Plaintiff's Motion for a TRO on January 8, 2024, and granted Plaintiff's request, thereby enjoining Attorney General Yost from enforcing the Act against NetChoice's member organizations. Attorney General Yost has now responded to NetChoice's Motion for Preliminary Injunction (ECF No. 28), and NetChoice has replied (ECF No. 29).

II. STANDARD OF REVIEW

Rule 65 of the Federal Rules of Civil Procedure provides for preliminary injunctive relief when a party believes it will suffer immediate and irreparable injury, loss, or damage. See Fed.R.Civ.P. 65. A preliminary injunction is an “extraordinary remedy” intended to preserve the status quo until trial, Winter v. Nat. Res. Def. Council, Inc., 555 U.S. 7, 24 (2008), and should only be awarded upon a clear showing that the movant is entitled to such relief, Southern Glazer's Distrib. of Ohio, LLC v. Great Lakes Brewing Co., 860 F.3d 844, 849 (6th Cir. 2017).

In determining whether to issue a preliminary injunction, the court must consider the following four factors: “(1) whether the movant has a strong likelihood of success on the merits; (2) whether the movant would suffer irreparable injury without the injunction; (3) whether issuance of the injunction would cause substantial harm to others; and (4) whether the public interest would be served by the issuance of the injunction.” Certified Restoration Dry Cleaning Network, L.L.C. v. Tenke Corp., 511 F.3d 535, 542 (6th Cir. 2007) (citation omitted). All four factors must be balanced rather than treated as prerequisites. Id.

III. LAW & ANALYSIS

A. Standing

Before turning to the preliminary injunction considerations outlined above, this Court must first determine whether NetChoice has standing to bring these claims on behalf of: (1) its members; and (2) its members' users. A litigant has standing if it “is entitled to have the court decide the merits of the dispute or of particular issues.” Warth v. Seldin, 422 U.S. 490, 498 (1975). There are two constraints that govern a party's standing: “[c]onstitutional standing addresses who has the right to invoke the power of a court (e.g., by filing a lawsuit), while prudential standing addresses what arguments a party may raise as a claim or defense.” NetChoice, LLC v. Griffin, No. 5:23-CV-05105, 2023 WL 5660155, at *8 (W.D. Ark. Aug. 31, 2023) (emphasis in original) (citing Curtis A. Bradley & Ernest A. Young, Unpacking Third-Party Standing, 131 Yale L.J. 1, 26 (2021)).

Recall that NetChoice not only brings First Amendment and Fourteenth Amendment claims on behalf of its member organizations, but it also brings a First Amendment claim on behalf of the users of those member organizations' websites and platforms.

1. Constitutional Standing

First, this Court considers whether NetChoice is entitled to bring a suit challenging the Act at all. The Supreme Court has established an “irreducible constitutional minimum” of standing containing three elements: (1) an “injury in fact” that is concrete and particularized, and actual or imminent; (2) “a causal connection between the injury and the conduct complained of”; and (3) a likelihood that the injury will be redressable by the court. Lujan v. Defenders of Wildlife, 504 U.S. 555, 560-61 (1992).

NetChoice seeks to proceed under a now well-worn theory of associational standing. NetChoice must show that: “(1) its members would have standing to sue in their own right; (2) the suit seeks to protect interests germane to the association's purpose; and (3) neither the claim asserted nor the relief requested requires the individual members of the association to participate in the lawsuit.” Griffin, 2023 WL 5660155, at *9 (citing Hunt v. Wash. State Apple Advert. Comm'n, 432 U.S. 333, 343 (1977)).

With respect to the first prong, Attorney General Yost argues that NetChoice fails to show that any of its members would suffer a First Amendment injury should the Act go into effect and that, therefore, NetChoice does not have constitutional standing to bring this lawsuit. In Attorney General Yost's view, the injury suffered must be a First Amendment injury because this lawsuit seeks to vindicate First Amendment rights. (ECF No. 28 at 17). NetChoice responds that its members will suffer economic harms and First Amendment injuries, both of which are independently sufficient to confer constitutional standing. (ECF No. 29 at 5-7).

Attorney General Yost is incorrect to suggest that NetChoice cannot predicate its constitutional standing on economic harm. Courts of Appeals that have considered whether compliance costs confer standing “have uniformly held that compliance costs associated with a regulatory regime satisfy the injury-in-fact requirement.” Tennessee v. United States Dep't of Agric., 665 F.Supp.3d 880, 898 (E.D. Tenn. 2023) (collecting cases). This is true even when the underlying claims are non-economic in nature: constitutional standing pertains to whether a litigant can bring a lawsuit at all, not which claims it can bring. See Griffin, 2023 WL 5660155, at *8 (citing Bradley & Young, Unpacking Third-Party Standing, 131 Yale L.J. at 26).

Virginia v. American Booksellers Ass'n, Inc., 484 U.S. 383, 392 (1988) is instructive. There, booksellers' organizations and bookstores challenged the constitutionality of a Virginia statute that made it a crime “to knowingly display for commercial purpose in a manner whereby juveniles may examine and peruse” certain adult materials. Id. at 387. The Supreme Court had no trouble concluding that the plaintiffs in Booksellers had shown an injury-in-fact because the law was “aimed directly at plaintiffs, who, if their interpretation of the statute [was] correct, [would] have to take significant and costly compliance measures or risk” enforcement. Id. at 392.

The same is true here. Many of NetChoice's member organizations would incur substantial compliance costs should the Act go into effect. (ECF Nos. 2-1 ¶ 14; 2-2 ¶ 13). Each member organization that believes it would be covered by the Act would need to develop a protocol for the processing of parental consent notifications in compliance with the Act. Otherwise, each faces the risk of civil liability to the tune of thousands of dollars a day for each unauthorized minor using its site. NetChoice argues that the economic risk is particularly acute because the Act is vague, therefore insufficiently apprising its members as to whether they must comply with it. As a result, some websites may needlessly accumulate expenditures to comply with the Act, even though the Attorney General has no intention of enforcing it against them. And for some of these platforms, the cost of complying with the Act would put them out of business. (ECF No. 2-2 ¶¶ 13-19).

A compliance cost injury alone is sufficient, but NetChoice also argues that its member organizations' First and Fourteenth Amendment rights will be violated by the Act. A plaintiff satisfies the injury-in-fact requirement when it alleges “an intention to engage in a course of conduct arguably affected with a constitutional interest, but proscribed by a statute, and there exists a credible threat of prosecution thereunder.” MedImmune, Inc. v. Genentech, Inc., 549 U.S. 118, 128-129 (2007). Specifically, NetChoice contends that its members have a well-established First Amendment right to “disseminate” protected speech by and to minors and adults alike, and a Fourteenth Amendment right to have laws be reasonably clear about the entities to which they apply. Attorney General Yost's arguments that NetChoice's members' First Amendment rights are not threatened are unavailing, for reasons described further below.

The second and third prongs of the associational standing test can be resolved simply. As NetChoice's Vice President and General Counsel explains, NetChoice's purpose is “to make the Internet safe for free enterprise and free expression.” (ECF No. 2-1 ¶ 3). As a result, this lawsuit centered on protecting these interests is germane to its purpose. Nor would this lawsuit require each member of the association to participate, as the nature of the suit is unlikely to require fact-intensive inquiry of each member. See Griffin, 2023 WL 5660155, at *10 (reaching the same conclusion with respect to a similar lawsuit brought by NetChoice challenging a parental notification law in Arkansas).

Having concluded that NetChoice has associational standing to bring this lawsuit, this Court need not analyze the more fact-intensive question of whether NetChoice also has organizational standing because it has had to divert its own resources to address issues caused by the Act.

2. Prudential Standing

Having established that NetChoice has constitutional standing to bring a lawsuit challenging the Act, this Court considers whether NetChoice is entitled to bring its specific claims. In other words, whether it has prudential standing. The “prudential standing rule . . . normally bars litigants from asserting the rights or legal interests of others in order to obtain relief from injury to themselves.” Warth, 422 U.S. at 509. There are exceptions, however. It is uncontroverted that NetChoice, a member-based organization, has prudential standing to bring a claim on behalf of its members. Memphis A. Philip Randolph Inst. v. Hargett, 2 F.4th 548, 557 (6th Cir. 2021) (“There is no prudential standing bar when member-based organizations advocate for the rights of their members.”).

What requires closer scrutiny, however, is NetChoice's attempt to vindicate the First Amendment rights of minor Ohioans who may wish to access its member organizations' websites. In NetChoice v. Griffin, NetChoice brought a lawsuit challenging a similar Arkansas state law that imposes an age-verification requirement on many of NetChoice's member organizations. The district court in Griffin conducted a robust analysis of NetChoice's standing to advocate on behalf of minor Arkansans, which this Court finds persuasive. See 2023 WL 5660155, at *10-12. Specifically, the court analogized NetChoice's claim on behalf of minor website users to several Supreme Court cases in which vendors and vendors' associations challenged legislation that arguably infringed on the constitutional rights of their customers. See Craig v. Boren, 429 U.S. 190, 193-95 (1976) (concluding that a beer vendor had standing to challenge Oklahoma's gender-based liquor age restrictions); Carey v. Population Servs. Int'l, 431 U.S. 678, 682-84 (1977) (concluding that a contraceptive vendor had standing to challenge a law that restricted the sale of contraceptives on behalf of its potential customers).

Most on point on this issue is, yet again, Virginia v. American Booksellers Ass'n, Inc., 484 U.S. at 393. Like NetChoice, the plaintiffs argued that the law in question violated book buyers' First Amendment rights. Id. at 389. The Booksellers Court explained that “in the First Amendment context, ‘[l]itigants . . . are permitted to challenge a statute not because their own rights of free expression are violated, but because of a judicial prediction or assumption that the statute's very existence may cause others not before the court to refrain from constitutionally protected speech or expression.'” Id. at 393 (quoting Sec'y of State of Maryland v. J.H. Munson Co., 467 U.S. 947, 956-57 (1984)). This is because would-be speakers, like the Ohioan minors here, may choose to refrain from engaging in protected activity instead of running the risk of challenging the law. Munson, 467 U.S. at 956. In other words, individuals may succumb to a speech restriction's chilling effect. “Society as a whole then would be the loser.” Id.

This exception to the traditional prudential standing rule, sometimes known as the “overbreadth exception,” Prime Media, Inc. v. City of Brentwood, 485 F.3d 343, 349 (6th Cir. 2007), applies so long as the plaintiff has constitutional standing, specifically, an injury-in-fact. As established above, NetChoice has constitutional associational standing. In Booksellers, just as in the case at hand, the legislation was “aimed directly” at the member bookstores (here, the websites), “who, if their interpretation of the statute is correct, will have to take significant and costly compliance measures.” 484 U.S. at 393.

Nor is this Court persuaded that any tension between the interests of minor users and NetChoice's members overwhelms the shared interest that the two groups have in free expression. Attorney General Yost insists that the interests of members and users are fatally divergent because the members' “primary product is their users-including Ohio children-and user data, not the content they host.” (ECF No. 28 at 11) (emphasis in original). But even assuming that this assertion is an accurate characterization of each of NetChoice's member organizations' business models, this Court fails to see how it changes the calculus. In sum, based on the record before this Court, NetChoice has standing to bring its claims both on behalf of its member organizations and on behalf of Ohioan minors.

B. Likelihood of Success on the Merits

Having concluded that NetChoice has standing to bring its claims, this Court turns to the first-and critically important-preliminary injunction factor: likelihood of success on the merits. In its Motion, NetChoice mounts a facial challenge to the Act's constitutionality, or, in the alternative, an overbreadth challenge. Specifically, NetChoice makes three arguments: (1) that the Act violates NetChoice's member organizations' First and Fourteenth Amendment rights because it is so vague that NetChoice's member organizations do not have fair notice as to whether they must comply with the Act's dictates, and if so, how; (2) that the Act imposes impermissible speaker- and content-based restrictions on First Amendment protected speech; and (3) that the Act imposes an impermissibly overinclusive and underinclusive ban on minors' access to First Amendment protected speech.

1. First Amendment: Restrictions on Protected Speech

a. The Act Regulates Protected Speech

Fundamentally, “the First Amendment bars the government from dictating what we see or read or speak or hear,” Ashcroft v. Free Speech Coal., 535 U.S. 234, 245 (2002), and protects “the right to distribute, the right to receive, the right to read and freedom of thought,” Griswold v. Connecticut, 381 U.S. 479, 482 (1965). Here, NetChoice argues that the Act imposes content and speaker-based restrictions by discriminating between websites, and by preventing minors from accessing certain protected content.

Defendant, however, seeks to cast the Act-and this case-as not about the First Amendment, but about the right to contract. Attorney General Yost argues that the Act regulates commercial activity and does not regulate speech at all, such that it should only be subject to rational basis review, an easy hurdle to clear.

Despite the “challenges of applying the Constitution to ever-advancing technology,” Brown v. Ent. Merchants Ass'n, 564 U.S. 786, 790 (2011), the First Amendment implications of the Act come into focus when social media operators are thought of as publishers of opinion work-a newspaper limited to “Letters to the Editor,” or a publisher of a series of essays by different authors. The analogy is an imperfect one-social media operators are arguably less involved in the curation of their websites' content than these traditional examples. But the comparison helps clarify that the Act regulates speech in multiple ways: (1) it regulates operators' ability to publish and distribute speech to minors and speech by minors; and (2) it regulates minors' ability to both produce speech and receive speech. And as NetChoice points out, this Court is unaware of a “contract exception” to the First Amendment. Indeed, neither party references any such authority. Like many of NetChoice's member organizations, a publisher stands to profit from engagement with consumers. That an entity seeks financial benefit from its speech does not vitiate its First Amendment rights.

Nonetheless, Attorney General Yost insists that the Act does not regulate speech, simply the ability of minors to contract, which it argues the State has authority to regulate as commercial transactions. In support of this proposition, the State cites 44 Liquormart v. Rhode Island, 517 U.S. 484 (1996), a case about commercial speech regulation that provides little support. The case does not address regulation of commercial activity that does not impinge on speech at all, as Defendant argues is the case here, but explains that “the State retains less regulatory authority when its commercial speech restrictions strike at ‘the substance of the information communicated' rather than the ‘commercial aspect of [it].'” 517 U.S. at 499. Presumably, Defendant wants this Court to infer from this case that the State retains more regulatory authority here because, in the State's view, the Act is a regulation striking at the commercial aspect of the relationship between social media platforms and their users, not the speech aspect of the relationship. But this Court does not think that a law prohibiting minors from contracting to access a plethora of protected speech can be reduced to a regulation of commercial conduct.

In sum, as NetChoice puts it, the Act “is an access law masquerading as a contract law.” (ECF No. 29 at 16). That is, the Act does implicate the First Amendment, at least to some degree, and therefore, is not subject to the deferential rational basis standard of review.

b. The Law is Content Based

Having concluded that the Act does indeed implicate the First Amendment, this Court now considers whether it should be subject to strict scrutiny or only intermediate scrutiny. Courts “apply the most exacting scrutiny to regulations that suppress, disadvantage, or impose differential burdens upon speech because of its content,” but only “an intermediate level of scrutiny” when “regulations are unrelated to the content of speech.” Turner Broad. Sys. v. FCC, 512 U.S. 622, 642 (1994).

“Strict scrutiny” requires the government to show that the law at issue is “narrowly tailored to serve compelling state interests.” KenAmerican Res., Inc. v. United States Sec'y of Lab., 33 F.4th 884, 893 (6th Cir. 2022) (quoting Reed v. Town of Gilbert, 576 U.S. 155, 163 (2015)). On the other hand, “intermediate scrutiny” validates a law “under the First Amendment if it advances important governmental interests unrelated to the suppression of free speech and does not burden substantially more speech than necessary to further those interests.” Id. (quoting Holder v. Humanitarian L. Project, 561 U.S. 1, 26-27 (2010)).

NetChoice argues that several provisions of the law discriminate based on content, whereas Defendant argues that the Act is content-neutral and that any effect it has on speech is incidental. When considering whether a regulation is content based, the principal inquiry is whether the government has regulated the speech because it agrees or disagrees with its communicative content. Id. at 643. When a law “applies to particular speech because of the topic discussed or the idea or message expressed” it is content-based on its face. City of Austin v. Reagan Nat'l Advert. of Austin, LLC, 596 U.S. 61, 69 (2022) (cleaned up). A facially content-based law cannot escape strict scrutiny, even if it has a “benign motive.” Reed v. Town of Gilbert, 576 U.S. 155, 165-66 (2015). And even if not facially content-based, a law can be content-based in its purpose or justification. Id. At its core, content-neutrality is about “whether the government has adopted a regulation of speech because of disagreement with the message it conveys.” Ward v. Rock Against Racism, 491 U.S. 781, 791 (1989). For example, a law that requires “political signs” to be smaller than “event signs” is facially content-based and subject to strict scrutiny. Reed, 576 U.S. at 164-65. It is worth noting that a law that is viewpoint-based-perhaps singling out signs in favor of Republican candidates for office-is a highly disfavored form of content-based regulation. Id. at 168-69. But a law need not discriminate based on viewpoint to be content based. Id.

Other regulations are better described as speaker based. The Supreme Court is “deeply skeptical of laws that distinguish among different speakers,” Nat'l Inst. of Fam. & Life Advocs. v. Becerra, 138 S.Ct. 2361, 2378 (2018), but speaker-based restrictions “are not automatically content based or content neutral,” Schickel v. Dilger, 925 F.3d 858, 876 (6th Cir. 2019). Speaker-based restrictions are suspect only because they are often a proxy or pretext for regulation of content. Id. But if they distinguish between speakers “based only on the manner in which speakers transmit their messages to viewers, and not upon the messages that they carry,” they are subject to only intermediate scrutiny. Turner, 512 U.S. at 645.

NetChoice argues that the Act is facially content based because it targets some websites while exempting others. Specifically, the Act only purports to govern websites that are targeted at children, or reasonably anticipated to be accessed by children. § 1349.09(B). The Act also excludes from coverage websites where interaction between users is “incidental to content posted by an established and widely recognized media outlet, the primary purpose of which is to report news and current events.” § 1349.09(O). Similar is an exemption for websites where interaction is limited to reviews for “products for sale,” but the Act does not exempt reviews of, for example, services or art. Id. All of these, NetChoice argues, are content-based restrictions. NetChoice contends that even if these are just speaker-based restrictions, they cannot be justified without reference to content, and are, therefore, content-based distinctions subject to strict scrutiny. Schickel, 925 F.3d at 876 & n.2.

“(B) The operator of an online web site, service, or product that targets children, or is reasonably anticipated to be accessed by children, shall do all of the following:

(1) Obtain verifiable consent for any contract with a child, including terms of service, to register, sign up, or otherwise create a unique username to access or utilize the online web site, service, or product, from the child's parent or legal guardian using any of the following methods:
(a) Requiring a parent or legal guardian to sign and return to the operator a form consenting to the contract by postal mail, facsimile, or electronic mail;
(b) Requiring a parent or legal guardian, in connection with a monetary transaction, to use a credit card, debit card, or other online payment system that provides notification of each discrete transaction to the primary account holder;
(c) Requiring a parent or legal guardian to call a toll-free telephone number implemented by the operator and staffed by trained personnel;
(d) Requiring a parent or legal guardian to connect to trained personnel by videoconference;
(e) Verifying a parent's or legal guardian's identity by checking a form of government-issued identification against databases of such information, and promptly deleting the parent's or legal guardian's identification from the operator's records after such verification is complete.
(2) Present to the child's parent or legal guardian a list of the features offered by an operator's online web site, service, or product related to censoring or moderating content, including any features that can be disabled for a particular profile.
(3) Provide to the child's parent or guardian a web site link at which the parent or legal guardian may access and review the list of features described in division (B)(2) of this section at another time.”


Attorney General Yost concedes that the provisions above are speaker-based, but argues that the Act should, nonetheless, not be subject to strict scrutiny because its speaker-based distinctions do not disfavor particular communicative content. Instead, Yost argues, the complained-of language simply tailors the Act's contract regulation mechanism to entities whose platforms and business practices pose a heightened risk to minors' privacy, health, and safety. (ECF No. 28 at 27-28). Specifically, Yost asserts that the legislation is concerned with operators' release of minors' personal information and data pursuant to exploitative terms of service, addictive social media features like “infinite scroll,” increased rates of mental illness in children, and a risk of exposure to sexual predation on websites that facilitate private messaging between users. (Id. at 22, 30). These “features and functions,” Defendant argues, are absent or less threatening on the exempted sites: product review sites and traditional media outlets, where users are, for example, not able to engage in private chats with other users. (Id. at 31).

Attorney General Yost's arguments are not wholly without merit. Because speaker-based distinctions are only subject to strict scrutiny if they are also facially content-based, or content-based distinctions in disguise, this Court must consider whether the Act's speaker-based provisions disfavor the messages that the operators publish, or simply the “manner in which [they] transmit” those messages. Turner, 512 U.S. at 645, 658 (emphasis added). It is a close call.

Turning first to the language that defines the broad category of operators to which the Act applies-websites that “target[] children” or are “reasonably anticipated to be accessed by children”-NetChoice argues that these are transparently content-based restrictions because whether a website “targets children” is inextricably connected to its content. (ECF No. 29 at 12). The State argues, on the other hand, that this language is simply an example of tailoring, designed to prevent overbreadth by exempting websites that are unlikely to be accessed by children. (ECF No. 28 at 27-28).

A court in the Northern District of California recently considered a similar law that only applied to websites “likely to be accessed by children.” NetChoice, LLC v. Bonta, No. 22-CV-08861-BLF, 2023 WL 6135551, at *6 (N.D. Cal. Sept. 18, 2023). There, the court pointed out that “having to view content to determine whether the statute applies does not by itself mean that the statute regulates speech,” citing a Ninth Circuit case in which the court concluded that a “law classifying workers as employees or independent contractors based on criteria including whether worker's output was ‘to be appreciated primarily or solely for its imaginative, aesthetic, or intellectual content' did not regulate speech.” Id. (citing Am. Soc'y of Journalists & Authors, Inc. v. Bonta, 15 F.4th 954, 960-61 (9th Cir. 2021)). But this Court has already concluded that the Act regulates speech in a way that a law seeking to distinguish between, for example, workers for tax purposes does not. The Act here distinguishes between speakers for purposes of establishing how minors may access the speech on those platforms.

Since this language identifies a certain topic, it is tempting to apply strict scrutiny reflexively, particularly given that facially content-based regulations cannot be rehabilitated by an apparent benign purpose. Reed, 576 U.S. at 165-66. In Reed, the Supreme Court invalidated a regulation that treated, for example, “Temporary Directional Signs” differently from “Ideological Signs.” Id. at 164. The Supreme Court reasoned that the sign code was facially content based because it discriminated based on communicative content and was, therefore, subject to strict scrutiny. Id. at 159, 164.

But relevant here is the Supreme Court's “rejection of the view that any examination of speech or expression inherently triggers” strict scrutiny. City of Austin, Texas v. Reagan National Advertising of Austin, LLC, 596 U.S. 61, 69 (2022). That is, a law may still be content neutral, even if it requires reading the speech at issue to determine if the speech or speaker is covered. Id. In City of Austin, a sign's content was only considered to determine whether it was placed on the premises relevant to its content or placed off those premises. Id. The sign's location relative to its content determined the regulation to which it was subject, but the topic was otherwise not considered. Id. The majority concluded that Austin's sign regulation was content-neutral because it did not “single out any topic or subject matter for differential treatment.” Id. at 71.

It is challenging to reconcile City of Austin with Reed. Indeed, Justice Thomas, who wrote Reed, dissented in City of Austin, saying the majority's attempt to distinguish Austin's sign code from the one in Reed was unworkable. Id. at 91 (Thomas, J., dissenting). In Justice Thomas's view, if the message matters at all when applying a regulation, the law is content based. Id. at 92.

The Act here certainly requires consideration of the content on an operator's platform to determine if it “targets children” or is “reasonably anticipated to be accessed by children.” The Act's eleven-factor list attempts to make clear that content is the essential consideration with respect to whether an operator is covered. § 1349.09(C)(1)-(11). But Justice Thomas's more rigid approach only garnered three votes in City of Austin. The majority's opinion instead requires this Court to inquire whether the Act “single[s] out any topic or subject matter for differential treatment.” City of Austin, 596 U.S. at 71. It does not.

“Attempts to” because the Act remains vague with respect to which operators it regulates, as discussed further below.

There is no indication that the State disfavors the sort of content designed to appeal to children-cartoons and the like. “Websites that children might access” is not a topic or subject matter. Indeed, even though covered platforms contain some subject matter likely to appeal to children, most also contain subject matter “as diverse as human thought.” Packingham v. North Carolina, 582 U.S. 98, 105 (2017). The “targets children” or “reasonably anticipated to be accessed by children” language tailors the Act's applicability to only the platforms that have a chance of attracting the children the Act seeks to protect. Sites that are not reasonably likely to be accessed by children need not conform with the Act's dictates. But the Act regulates much more speech than just speech that targets children. NetChoice has not shown that this language-“targets children” and “reasonably anticipated to be accessed by children”-is an example of content-based regulation.

Even more challenging is a question that the parties do not engage with fully: whether the “features and functions” that characterize social media sites are themselves communicative content. The State assumes that they are not, repeatedly pressing the argument that the Act does not regulate content, but just websites that have “features and functions” that harm youth. The State seems to conceptualize “content” as user-generated content, and it relies on the fact that the Act does not discriminate between topics about which users post or read. In the State's view, the Act simply regulates the manner in which content is conveyed to users, not the message. See Turner, 512 U.S. at 645, 658. NetChoice, on the other hand, suggests that justification based on “function” necessarily indicates that content-based regulation is afoot. But the fact that some classifications by “function or purpose” are a proxy for content-based regulation “does not mean that any classification that considers function or purpose is always content based.” City of Austin, 596 U.S. at 74. It is, unfortunately, not that simple.

At this early juncture, this Court shares the view of a district court in the Western District of Texas that considered a regulation of major social media websites' content-moderation practices. That court-albeit in a different context-reasoned that, “[u]nlike broadband providers and telephone companies, social media platforms ‘are not engaged in indiscriminate, neutral transmission of any and all users' speech.'” NetChoice, LLC v. Paxton, 573 F.Supp.3d 1092, 1107 (W.D. Tex. 2021) (Pitman, J.), vacated and remanded sub nom. NetChoice, L.L.C. v. Paxton, 49 F.4th 439 (5th Cir. 2022). Many of the operators covered by the Act, “curate both users and content to convey a message about the type of community the platform seeks to foster and, as such, exercise editorial discretion over their platform's content.” Id. at 1108. In other words, they are not “mere conduits.” Id. at 1107.

Features like “infinite scroll,” which very well may be addicting to minors and adults alike, are admittedly unlikely to convey messages. But those are not the features that the Act identifies as hallmarks of the websites it regulates. Instead, the Act covers websites that allow users to:

(a) Interact socially with other users within the confines of the online web site, service, or product;
(b) Construct a public or semipublic profile for the purpose of signing into and using the online web site, service, or product;
(c) Populate a list of other users with whom an individual shares or has the ability to share a social connection within the online web site, service, or product;
(d) Create or post content viewable by others, including on message boards, chat rooms, video channels, direct or private messages or chats, and a landing page or main feed that presents the user with content generated by other users.
§ 1349.09(A)(1)(a)-(d). The existence of functionalities allowing users to post, comment, and privately chat-in other words, to connect socially online-may very well be conveying a message about “the type of community the platform seeks to foster.” Paxton, 573 F.Supp.3d at 1108. The features that the Act singles out are inextricable from the content produced by those features. This Court therefore finds the Act's distinction on the basis of these functionalities to be content based.

The exceptions to the Act for product review websites and “widely recognized” media outlets, however, are easy to categorize as content based. It is noteworthy that the exceptions for media outlets and product review sites do, in part, define exempted speakers by the fact that “interaction between users is limited to” public comments. § 1349.09(O). Presumably, the public nature of comments-as opposed to private chats-reduces the predation risk to minors that Defendant argues covered operators pose. (See ECF No. 28-4 at 4). Even assuming, however, that requiring parental approval before a minor can engage in private user interaction is one of the Act's goals-and a constitutionally sound one-the exceptions as written still distinguish between the subset of websites without private chat features based on their content. For example, a product review website is excepted, but a book or film review website is presumably not. (ECF No. 29 at 14). The State is therefore favoring engagement with certain topics, to the exclusion of others. That is plainly a content-based exception deserving of strict scrutiny.

Given that the parties have only mentioned severability in passing, without substantive argument, this Court declines to sever these troubling exceptions at this stage.

c. The Act Violates Ohioan Minors' Rights

NetChoice also argues that the Act merits strict scrutiny because it infringes on minors' rights to both access and produce First Amendment protected speech. Generally, First Amendment protections “are no less applicable when government seeks to control the flow of information to minors.” Erznoznik v. City of Jacksonville, 422 U.S. 205, 214 (1975). In other words, the State does not possess “a free-floating power to restrict the ideas to which children may be exposed.” Brown, 564 U.S. at 794.

This Court does not address NetChoice's argument that the Act may be atextually interpreted to require age verification procedures for all users, such that it would potentially also chill adult speech. (See ECF No. 2 at 14). As NetChoice acknowledges, such an interpretation would be atextual and Defendant's counsel assured the court at the Rule 65.1 conference that it did not intend to enforce an age verification requirement.

Particularly relevant here is the Supreme Court's analysis in Brown v. Ent. Merchs. Ass'n, which invalidated a California regulation prohibiting the sale of violent video games to minors. There, the Supreme Court reasoned that even if “the state has the power to enforce parental prohibitions”-for example, enforcing a parent's decision to forbid their child to attend an event-“it does not follow that the state has the power to prevent children from hearing or saying anything without their parents' prior consent.” Id. at 795 n.3. As the Court explained, “[s]uch laws do not enforce parental authority over children's speech and religion; they impose governmental authority, subject only to a parental veto.” Id. The Act appears to be exactly that sort of law. And like content-based regulations, laws that require parental consent for children to access constitutionally protected, non-obscene content are subject to strict scrutiny.

d. Strict Scrutiny

Having concluded that NetChoice is likely to succeed on its argument that the Act is a content-based regulation, this Court considers whether the Act is likely to fail strict scrutiny. Strict scrutiny is “the most demanding test known to constitutional law,” City of Boerne v. Flores, 521 U.S. 507, 534 (1997), and requires the government to show that the law “furthers a compelling governmental interest and is narrowly tailored to that end,” Reed, 576 U.S. at 171. In other words, to survive strict scrutiny, the State must “specifically identify an actual problem in need of solving” and show that “the curtailment of free speech must be actually necessary to the solution.” Brown, 564 U.S. at 799.

As NetChoice correctly points out, Attorney General Yost toggles between several different interests in his opposition to NetChoice's Motion. Defendant argues that it seeks to regulate the ability of operators to contract with minors, not to limit minors' access to expressive content (ECF No. 28 at 22, 33), but in the same breath asserts that the State's compelling interest is in protecting minors from harms associated with covered operators' platforms, including mental health issues, data privacy issues, and sexual predation. (ECF No. 28 at 22, 31). Defendant also argues that the State has a compelling interest in protecting and advancing parents' ability to make decisions about their children's care and upbringing. (Id. at 22). This Court will address each of these purported interests in turn.

With respect to minors' ability to contract with operators, NetChoice asserts that Defendant “has not identified any harms flowing from contract terms.” (ECF No. 29 at 17). But this is an overstatement. On several occasions, Defendant cites the risk to minors of “involuntary releases of personally identifiable and other personal information and data,” and devotes two pages of its response to troubling terms of service used by Facebook and TikTok. For example, TikTok's terms of service give TikTok startlingly broad permission to use, modify, and reproduce its users' content. (ECF No. 28 at 25). Attorney General Yost also points out that courts have enforced these extensive “click-wrap” terms against minors. (Id.) Whether the State has a “compelling” interest in protecting children against these harms is less clear. Conclusively, though, the Act is not narrowly tailored to protect minors against oppressive contracts. The Act regulates access to and dissemination of speech when it could instead seek to regulate the-arguably unconscionable-terms of service that these platforms require. The Act is also underinclusive with respect to this interest. For example, as NetChoice explains, a child can still agree to a contract with the New York Times without their parent's consent, but not with Facebook.

Next, Defendant argues that scientific research supports the notion that engagement with operators' platforms can have damaging mental health effects, and subject minors to sexual predation. (ECF No. 28 at 31). Attorney General Yost encloses a report entitled “Social Media and Youth Mental Health: The U.S. Surgeon General's Advisory.” (ECF No. 28-2). The report outlines the potential risk of harm to children and teens from both: (1) exposure to harmful content; and (2) excessive use perpetuated by the features discussed above like “infinite scrolling.” (Id. at 9-11). But even if protecting children against these harms is a compelling interest, which it very well may be, see Sable Communications of California, Inc. v. F.C.C., 492 U.S. 115, 126 (1989) (explaining that “there is a compelling interest in protecting the physical and psychological well-being of minors”), the Act is not narrowly tailored to those ends. Foreclosing minors under sixteen from accessing all content on websites that the Act purports to cover, absent affirmative parental consent, is a breathtakingly blunt instrument for reducing social media's harm to children. The approach is an untargeted one, as parents must only give one-time approval for the creation of an account, and parents and platforms are otherwise not required to protect against any of the specific dangers that social media might pose. See Brown, 564 U.S. at 802 (concluding that legislation preventing minors from buying violent video games was “seriously underinclusive” because the “Legislature is perfectly willing to leave this dangerous, mind-altering material in the hands of children so long as one parent . . . says it's OK. . . . That is not how one addresses a serious social problem.”).

And finally, with respect to the rights of parents, Attorney General Yost fails to distinguish the State's purported interest from an analogous-and rejected-state interest in Brown. When the State of California tried a similar argument-that the legislation prohibiting minors from purchasing violent video games was “justified in aid of parental authority”-the Supreme Court noted that it doubted “punishing third parties for conveying protected speech to children just in case their parents disapprove of that speech is a proper governmental means of aiding parental authority.” Brown, 564 U.S. at 802. More conclusively, however, the Court detailed a series of preexisting protections to help parents-just as there are here-such that “filling the remaining modest gap in concerned parents' control can hardly be a compelling state interest.” Id. at 803. And the legislation was also overinclusive, in that it enforced a governmental speech restriction, subject to parental veto, as opposed to protecting only the interests of genuinely concerned parents. Id. at 804. That is, some parents simply may not care. Id. The same is true here.

In other words, the Act is either underinclusive or overinclusive, or both, for all the purported government interests at stake.

2. Due Process: Void for Vagueness

Laws run afoul of the Due Process Clause of the Fourteenth Amendment if they fail to “give fair notice of conduct that is forbidden or required.” FCC v. Fox Television Stations, Inc., 567 U.S. 239, 253 (2012). In addition to affording regulated parties notice, precision is also essential to ensure that laws cannot be enforced in an arbitrary or discriminatory way. Id. The need for clarity is particularly acute when laws restrict speech. See id. Having concluded above that the Act does in fact regulate speech, this Court rejects Attorney General Yost's invitation to apply a relaxed vagueness standard.

NetChoice identifies several aspects of the Act that this Court finds troublingly vague. Specifically, the Act purports to apply to operators that “target[] children” or are “reasonably anticipated to be accessed by children.” § 1349.09(B). On its face, this expansive language would leave many operators unsure as to whether it applies to their website. The legislature's apparent attempt at clarity is also unilluminating. The Act provides an eleven-factor list that the Attorney General or a court may use to determine if a website is indeed covered, which includes malleable and broad-ranging considerations like “[d]esign elements” and “[l]anguage.” § 1349.09(C). All the listed considerations are undefined.

The Act also contains an eyebrow-raising exception for “established” and “widely recognized” media outlets whose “primary purpose” is to “report news and current events,” the speaker- and content-based flavor of which is discussed above. § 1349.09(O)(2). And the Act provides no guardrails or signposts for determining which media outlets are “established” and “widely recognized.” Such capacious and subjective language practically invites arbitrary application of the law.

Attorney General Yost does not focus his argument on these examples, but instead highlights aspects of the Act that are more precisely defined. For example, he points to a few of the eleven factors that are less vague: “[e]mpirical evidence regarding audience composition” and “[p]resence of child celebrities or celebrities who appeal to children.” (ECF No. 28 at 40 (quoting § 1349.09(C))). Defendant Yost also points to the Children's Online Privacy Protection Act of 1998 (“COPPA”), a federal law that uses some of the same factors to explain which websites or online services are “directed to children,” and therefore, covered by COPPA. (ECF No. 28 at 40-41). But he points to no case where a court has concluded that COPPA's language is not vague, nor can this Court find one.

These more specific factors and the existence of the COPPA scheme do not cure vagueness in the eleven-factor list. But even if they did, they do not address the broad-ranging language in the exceptions. None of these phrases or the definitions in COPPA rehabilitates, for example, amorphous descriptors like “established” or “widely recognized.”

C. Irreparability of Harm

This Court next considers whether either NetChoice's members or minor Ohioans will suffer irreparable harm absent an injunction. See Fed.R.Civ.P. 65(b)(1)(A). Generally, “[a] plaintiff's harm from the denial of a preliminary injunction is irreparable if it is not fully compensable by monetary damages.” Overstreet v. Lexington-Fayette Urb. Cnty. Gov't, 305 F.3d 566, 578 (6th Cir. 2002).

NetChoice asserts that its members will be irreparably harmed through unrecoverable compliance costs and the risk of civil liability were the Attorney General to enforce the Act against them. In particular, NetChoice asserts that its members will need to spend money on engineering and compliance procedures, among other things, in order to comply with Ohio's law. For some of its members, these requirements are extremely burdensome. (See ECF No. 2-2). Although these are monetary harms, NetChoice persuasively argues that there is no cause of action through which they could seek to recover those compliance costs.

NetChoice also argues that the Act violates both NetChoice members' and Ohioan minors' constitutional rights, and that as a result, they will suffer irreparable harm absent a preliminary injunction. Indeed, “[w]hen constitutional rights are threatened or impaired, irreparable injury is presumed.” Mich. State A. Philip Randolph Inst. v. Johnson, 833 F.3d 656, 669 (6th Cir. 2016) (cleaned up). In fact, “[t]he loss of First Amendment freedoms,” like the ones NetChoice asserts are violated here, “for even minimal periods of time, unquestionably constitutes irreparable injury.” Roman Cath. Diocese of Brooklyn v. Cuomo, 141 S.Ct. 63, 67 (2020) (quoting Elrod v. Burns, 427 U.S. 347, 373 (1976) (plurality opinion)).

D. Balance of Equities and the Public Interest

The last two factors in the preliminary injunction balancing test merge when the government is a party. Nken v. Holder, 556 U.S. 418, 435 (2009). Attorney General Yost urges that the public interest will be served by allowing the law to go into effect and protect minors. (ECF No. 28 at 42). But as NetChoice correctly points out, “the State has no interest in enforcing laws that are unconstitutional.” EMW Women's Surgical Ctr., P.S.C. v. Friedlander, 591 F.Supp.3d 205, 215 (W.D. Ky. 2022) (cleaned up).

IV. CONCLUSION

For the reasons set forth above, this Court finds the Act unconstitutional and GRANTS Plaintiff's Motion for a Preliminary Injunction against Defendant, Attorney General Yost.

Specifically, Defendant remains ENJOINED from implementing and enforcing the Act against Plaintiff or its member organizations. The bond posted by NetChoice following entry of the Temporary Restraining Order remains in place.

IT IS SO ORDERED.

