Words May Have Consequences, Even If Actions Don't
Sometimes, the things you say can get you into more trouble than the things you do. Estate of Bride v. YOLO Technologies, Inc. is a good example. In Bride, the Ninth Circuit denied the defendant immunity for claims based on its representations about content moderation on its Snapchat extension, but extended immunity to claims based on the design of the extension itself.
The upshot is this: companies need to be more careful about the representations they make about their content moderation, but they are on safer ground with regard to the actual apps they create (including apps that hinge on anonymity).
Legal Background
When the internet was still a novelty, Section 230 of the Communications Decency Act was Congress’s attempt to steer this new technology between a rock and a hard place. Traditionally, publishers were liable for a piece of writing if the publisher edited or moderated the content. So, when internet forums and message boards entered the scene, the owners were stuck with a dilemma. If they attempted to moderate content, they would potentially face significant liability for getting involved. But if they did nothing, their websites would be flooded by trolls and bad faith actors.
Section 230 created a middle ground. It allowed website owners "to perform some editing on user-generated content without thereby becoming liable for all defamatory or otherwise unlawful messages they didn't edit or delete." Fair Hous. Council of San Fernando Valley v. Roommates.Com, LLC, 521 F.3d 1157, 1163 (9th Cir. 2008) (en banc). Put another way, moderation became possible without automatically opening the doors to liability.
Under Section 230, broad immunity has three requirements: (1) the defendant must be a provider or user of an interactive computer service; (2) the plaintiff must seek to treat the defendant, under a state law cause of action, as a publisher or speaker (3) of information provided by another information content provider.
Only the second prong is at issue in this appeal. To determine whether the plaintiff seeks to treat the defendant as a publisher or speaker under this prong, the Ninth Circuit applies a two-part test concerning the platform's duty. First, the court asks whether the duty at issue stems from the platform's status as a publisher (in which case the claim is barred by Section 230) or from some other obligation (like a contract). Second, the court asks what that duty requires. If it requires content moderation, then Section 230 bars the claim. Put simply, this analysis is meant to avoid extending Section 230 immunity too far.
Factual Background
The plaintiffs are three children and the estate of a fourth child, all of whom were users of Snapchat, a social media app popular with teenagers. Each child downloaded an app created by the defendant (called the YOLO extension). This app worked within Snapchat and allowed users to post questions or polls, to which others could anonymously respond. As an apparent safeguard against anonymity, YOLO represented that it would ban users for inappropriate usage of the app and would unmask the identity of users who sent harassing messages.
The four children were relentlessly bullied on Snapchat through YOLO. One of the children tried to unmask his bullies; after repeated failed attempts to do so, he killed himself. Another child's parent reached out to YOLO in various ways in an attempt to unmask the bullies, but her attempts were also unsuccessful. Ultimately, the plaintiffs filed suit for products liability, misrepresentation, unjust enrichment, and other state tort claims based on misrepresentation.
The Decision
The panel determined that YOLO was not immune from the misrepresentation claims, but was immune from the products liability claims.
For the misrepresentation claims, the panel concluded that the plaintiffs sought to hold YOLO liable for its promises and representations that it would unmask and ban specific users, not for its failure to take specific moderation actions. The court recognized that YOLO may have specific defenses related to the enforcement of those promises, but declined to rule that Section 230 categorically prohibited the plaintiffs' claims. According to the panel, YOLO's representations to users about unmasking or banning users created a distinct legal duty.
In contrast, the court concluded that YOLO was immune from the products liability claims about the dangerousness of YOLO's app. At base, these claims attempted to hold YOLO liable for users' speech or for YOLO's decision to publish that speech. The panel distinguished Lemmon v. Snap, Inc., 995 F.3d 1085 (9th Cir. 2021), in which the Ninth Circuit denied immunity because Snapchat's filter showing a user's current speed incentivized dangerous behavior outside the app. Here, by contrast, YOLO only encouraged the sharing of messages between users on the app. The panel also refused to hold that the feature of anonymity made the app inherently dangerous, as anonymity is a cornerstone of much internet speech. As in the prior case of Dyroff v. Ultimate Software Group, 934 F.3d 1093 (9th Cir. 2019), the defendant here merely created a "blank text box," which users then filled in. Section 230 applies to such situations.
Takeaway
There is an inherent tension in the court's denial of Section 230 immunity. The court noted that platforms are immune from claims that hinge on moderation decisions, yet the plaintiffs' claims here hinge in part on YOLO's failure to moderate users. To be sure, the panel recognized that the claims stem from a duty created by YOLO's representations about what it would do to moderate, not from the moderation itself, which is a distinct duty. But this creates a blurry line between moderation itself and statements about the platform.
For example, if a platform advertises itself as "safe" or "respectful," is that a representation that creates a duty to moderate to ensure safety or respectfulness? Or does the representation need to be more specific, like YOLO's promise to unmask certain users? This is a question that will need to be fleshed out over time, but I'm confident that the court does not want to create a situation where platforms are liable for general claims about their platforms. We'll just have to wait to see how those lines will be drawn.