Section 230 - Protection for private blocking and screening of offensive material

170 Analyses of this statute by attorneys

  1. The Test of Time: Section 230 of the Communications Decency Act Turns 20

    Davis Wright Tremaine LLP | August 9, 2016

    The Communications Decency Act (CDA) made it illegal to knowingly send or show minors obscene or indecent content online. Section 230 of the act, 47 U.S.C. § 230, prohibited treating online service providers as the publisher or speaker of content provided by others, or holding providers liable for attempts to eliminate objectionable content. Just a year after the CDA’s enactment, the Supreme Court struck down the criminal provisions.

  2. What is Section 230?

    Buckingham, Doolittle & Burroughs, LLC | Andrew Stebbins | December 21, 2023

    The Internet makes it easier than ever for people to connect around the world, share ideas and information, and have their voices heard, whether they are single individuals with limited resources or massive corporations with money and influence. This connectivity is facilitated by innumerable platforms and websites that host user-generated content, such as social media platforms (like Facebook, TikTok, and Instagram), websites that allow individuals to create their own blogs (like WordPress, Weebly, or Medium), and other applications or websites that host user-generated content (like Google reviews, Wikipedia articles, or Change.org petitions).

    However, it is widely believed that these websites, and the degree of free speech they provide, would likely not be possible had Congress not enacted Section 230 (47 U.S.C. § 230). Indeed, the enactment of Section 230 in 1996 is largely credited with creating the Internet as we know it by facilitating innovation and promoting the open exchange of ideas.

    You may wonder: how did a single federal law accomplish this? The answer is simple: by limiting tech companies’ legal liability. Specifically, under Section 230, owners of websites and platforms, as well as moderators of those platforms, are immune from lawsuits arising from third-party publications and from decisions by the websites or platforms to moderate content in a manner they see fit. For example, if someone is the victim of a false and defamatory statement published on Facebook, Section 230 means the victim can sue only the Facebook user who posted the statement, not Facebook. Likewise, Facebook could not be sued because it decides to remove certain problematic posts under its terms of service but fails to remove other problematic posts. By ensuring that tech companies would not be “pu…

  3. Section 230 Immunity Protects Yelp from Injunction Order to Remove Defamatory Posts

    K&L Gates LLP | Elisa D'Amico | July 5, 2018

    Hassell v. Bird, 2018 WL 3213933 (Cal. Sup. Ct. July 2, 2018).[1] Both the superior court and the Court of Appeal had ordered Yelp — a nonparty — to remove reviews that the court had determined to be defamatory. The plurality decision, penned by Chief Justice Cantil-Sakauye, reversed, holding that Yelp was protected against this sort of “removal order” by the Communications Decency Act of 1996 (47 U.S.C. § 230 (“Section 230”)). This decision is critical to all websites and online platforms, from media and review sites to social media.

  4. Executive Order Directed to Section 230 to Increase Regulatory Scrutiny of Online Services

    Wilson Sonsini Goodrich & Rosati | Lauren Gallo White | June 4, 2020

    On May 28, 2020, President Trump signed an "Executive Order on Preventing Online Censorship" directed at Section 230 of the Communications Decency Act (47 U.S.C. § 230(c)). Section 230 has long afforded protections to interactive computer services against litigation over their hosting and moderation of online content.

  5. A New Filter For Section 230: Snapchat Court Joins Lawmakers In Chipping Away At Social Media Immunity

    Vinson & Elkins LLP | Jessica Heim | May 20, 2021

    Section 230(c)(1) of the Communications Decency Act (codified at 47 U.S.C. § 230 (“Section 230”)) has long been credited for the boom of user-generated content on the internet — the crux of the social media that has driven the online environment for decades. Section 230 grants immunity to companies that provide platforms for user content, essentially stating that the companies cannot be held liable for the content their users publish.

  6. Second Circuit Affirms Broad Immunity for Online Providers to Remove Third-Party Content from Their Websites

    WilmerHale | Ari Holtzblatt | March 25, 2021

    Earlier this month, in Domen v. Vimeo, Inc.,[1] a panel of the U.S. Court of Appeals for the Second Circuit held that a relatively unused subpart of Section 230 of the Communications Decency Act (CDA)—namely, 47 U.S.C. § 230(c)(2)(A)—immunized an online platform (Vimeo) from a lawsuit brought by users who complained that the platform had wrongfully deleted their content and banned them from using the platform. The result of this ruling—termination at the motion-to-dismiss stage of a lawsuit against an online platform based on its decisions, actions, or inactions relating to the moderation of third-party content—is fully in line with rulings from legions of courts across the country that have applied Section 230 as a source of very broad immunities for online platforms.[2] But the ruling is groundbreaking in one respect: it is the first reported case in nearly 20 years in which an appellate court has held that § 230(c)(2)(A)—which prohibits holding online platforms liable for “any action voluntarily taken in good faith” to block or remove material that the platform “considers to be . . . objectionable”—can and should operate to bar such claims at the threshold pleading stage.

  7. Client Alert: A No-Decision Decision: The Supreme Court Dodges Section 230

    Jenner & Block | June 7, 2023

    On May 18, the Supreme Court issued a much-anticipated decision in Gonzalez v. Google LLC,[1] the first case in which the Supreme Court has considered the contours of Section 230 of the Communications Decency Act, 47 U.S.C. § 230, known as the “twenty-six words that created the internet.”[2] The Court declined to address Section 230’s applicability to YouTube’s recommendation algorithm, leaving the current Section 230 protections for algorithmically generated recommendations—as decided by lower courts—in place.

    Section 230 provides internet platforms immunity from claims that “treat [them] as the publisher or speaker of” third-party content.[3] This broad protection allows social media and other platforms to function and make content-moderation decisions without the threat of liability for user-generated content. However, critics have alleged that it removes accountability from platforms, and lawsuits such as Gonzalez have arisen in attempts to narrow the law’s scope.

    Gonzalez, along with its companion case Twitter, Inc. v. Taamneh,[4] arose out of a series of terrorist attacks by ISIS that resulted in injuries and deaths. The plaintiffs—relatives of the victims—sued Google and Twitter under the Anti-Terrorism Act.

  8. Online Communications and Content: How Section 230 Reform Has Catapulted into Relevancy

    Faegre Drinker Biddle & Reath LLP | Matthew Rubin | October 27, 2020

    Throughout much of 2020, Members of Congress, the Trump Administration, and even Associate Justice Clarence Thomas have argued for reevaluating Section 230 of the Communications Decency Act (47 U.S.C. § 230) and reining in the liability protections afforded to “interactive computer services” (including, for example, search engines and social media companies). In recent weeks, particularly throughout the 2020 presidential election cycle and leading into the November polls, perceived threats to political free speech online have further inflamed the dialogue around these “Section 230 protections” in the nation’s capital and beyond.

  9. Grindr and Armslist Cases Reaffirm Core Protections for User-Generated Content

    Davis Wright Tremaine LLP | James Rosenfeld | September 27, 2019

    In Daniel v. Armslist, LLC, the Wisconsin Supreme Court reversed a decision finding that a website permitting gun advertisements could be responsible for deaths and injuries caused by a person who obtained a gun through such an ad. 926 N.W.2d 710 (Wis. Sup. Ct., Apr. 30, 2019).

    Both decisions interpret Section 230, which provides, in part, that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” 47 U.S.C. § 230(c)(1). Courts have broadly interpreted the immunity to protect websites from any state-law claims based on content provided by third parties. But the immunity has been embattled lately, with online sex trafficking, election-rigging, “fake news,” and hate speech making headlines, and with courts and Congress acting to circumscribe it. See, e.g., Pub. L. No. 115-164, 132 Stat. 1253 (2018) (Allow States and Victims to Fight Online Sex Trafficking Act of 2017) (adopting a statute permitting liability for ads posted to websites that result in sex trafficking, in reaction to decisions finding the website Backpage.com immune under Section 230); HomeAway.com, Inc. v. City of Santa Monica, 918 F.3d 676 (9th Cir. 2019) (Section 230 does not preempt an ordinance prohibiting short-term rental websites from permitting bookings of allegedly unregistered properties); Doe v. Internet Brands, Inc., 824 F.3d 846 (9th Cir. 2016) (website responsible for failure to warn user of third parties…