Death by Filter: Examining Snapchat’s Liability

Posted on May 31, 2021

Authored by Ritika Acharya*


Introduction

In a ruling dated 4th May, 2021, the U.S. Court of Appeals for the Ninth Circuit (“the Panel”) set in motion the idea that social media platforms can be held liable for building or enabling features so inherently dangerous to their users that the product is essentially defective. The ruling arose out of a lawsuit filed against Snap Inc. (the owner of the smartphone social media application Snapchat) by the parents (“Parents”) of two boys who had died in a high-speed car accident. Snapchat allows its users to take photos or videos (colloquially known as “snaps”) and share them with other Snapchat users. This article analyzes the ruling and its impact on the safe harbor exemptions against intermediary liability.

Background

It may seem rather absurd to imagine that a ‘filter’ on social media could be fatal. In this case, however, the Parents submitted before the Panel that Snapchat encouraged their sons to drive at dangerous speeds and caused the boys’ deaths through the negligent design of its speed filter (“Speed Filter”), which let users record their real-life speed and rewarded them with “trophies, streaks, and social recognitions” for crossing the 100 miles per hour mark. Snap Inc. has since taken down the filter.

The Parents alleged that Landen Brown, one of the boys, had taken out his phone to document how fast they were going on the Speed Filter, which showed that the car was being driven at 123 miles per hour. Moments later, the car ran off the road and crashed into a tree, killing both boys instantly. The Parents claimed that Snap Inc. breached its duty to exercise due care in supplying products that do not present an unreasonable risk of injury or harm to the public.

Snap Inc. sought to claim immunity under Section 230(c)(1) of the Communications Decency Act (“CDA”), which states, “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” The safe harbor exemption under this provision aims to shield platforms such as social media companies from liability for third-party content. In other words, it is a protection against liability for something someone else has posted on the platform. Citing this provision, Snap Inc. argued that Landen’s snap was third-party content for which it could not be held liable under Section 230(c)(1).

The United States District Court, where the Parents initially filed the suit, dismissed the lawsuit in light of the safe harbor exemption. On appeal, however, the Panel reversed the lower court’s decision to dismiss the case and remanded the matter for further proceedings.

Ruling of the Panel

To determine whether Section 230(c)(1) applied to immunize Snap Inc. from the plaintiffs’ claims, the Panel applied the three-prong test set forth in Barnes v. Yahoo!, Inc., under which Section 230(c)(1) only protects from liability: (1) a provider or user of an interactive computer service; (2) whom a plaintiff seeks to treat, under a state law cause of action, as a publisher or speaker; (3) of information provided by another information content provider.

With respect to the first prong, the Panel noted that the parties did not dispute that Snap Inc. was a provider of an “interactive computer service”. Section 230(f)(2) of the CDA states, “interactive computer service means any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions.”

In line with this definition, the Panel observed that the Snapchat application permits its users to share photos and videos through Snapchat’s servers and the internet, and thus necessarily enables computer access by multiple users to a computer server. Snap Inc., as the creator, owner, and operator of Snapchat, is therefore a “provider” of an interactive computer service.

With respect to the second prong, the Panel held that the Parents’ claim did not treat Snap Inc. as a “publisher or speaker” because the claim was founded on Snap Inc.’s design of Snapchat. The Parents’ lawsuit treated Snap Inc. as a products manufacturer rather than a “publisher or speaker of third-party information”, by accusing Snap Inc. of negligently designing its product, i.e. Snapchat, with a defect, i.e. the interplay between Snapchat’s reward system and its Speed Filter. The incentive system within Snapchat, which encouraged its users to pursue certain unknown achievements and rewards, worked in tandem with the Speed Filter to entice young Snapchat users to drive at speeds exceeding 100 mph, making such a design unreasonable and negligent. The duty to design a reasonably safe product was fully independent of Snap Inc.’s role in publishing third-party information.

With respect to the third prong, the Panel held that the Parents had not relied on “information provided by another information content provider.” The Panel noted that Snap Inc. was being sued for the predictable consequences of designing Snapchat in a way that allegedly encourages dangerous behaviour. Snap Inc. knew or should have known that many of its users were drivers of, or passengers in, cars driven at speeds of 100 mph or more because they wanted to use Snapchat to capture a mobile photo or video showing them hitting 100 mph and then share the snap with their friends.

Accordingly, the Panel concluded that Snap Inc. did not enjoy immunity from the suit under Section 230(c)(1) and allowed the Parents’ claim to proceed. On remand, the district court will go into the merits of the case to determine whether there was a causal link between Snapchat’s Speed Filter and the car accident that resulted in the boys’ deaths.

Analysis and Implications

Traditionally, social media companies have managed to exempt themselves from liability through statutory safe harbor provisions. In the same vein as Section 230(c)(1) of the CDA, Section 79 of the Information Technology Act, 2000 (“IT Act”) exempts intermediaries from liability for third-party information hosted by them, provided the intermediary does not initiate the transmission, select the receiver of the transmission, or modify the information contained in the transmission. Social media entities often rely on this section to escape liability for third-party content in India.

The present ruling of the Panel may change how lawsuits against social media companies are adjudicated in India. It may open the door to holding a social media company accountable as a products manufacturer rather than as an internet intermediary. If Snap Inc. is ultimately held liable for the negligent design of its Speed Filter, it may set a notable precedent for making social media companies liable for the features they enable or endorse on their platforms. A common products liability tort could become the loophole in the safe harbor provisions that exposes social media to civil liability.

To the detriment of Snap Inc., a defeat in this case would add momentum to the growing number of lawsuits against it. In the second week of May, another lawsuit was filed against Snap Inc., seeking to hold the company responsible for a teenager’s suicide triggered by the bullying he had been subjected to on Yolo and LMK, two anonymous messaging apps integrated with Snapchat. Carson Bride, the 16-year-old, had been receiving anonymous messages for months, including sexual comments and taunts over specific incidents. He figured that the messages had to be from people he was familiar with, but the anonymity of the apps made it impossible for him to know the identity of the person(s) behind them. He was unable to reply to the taunts because the apps were designed in such a way that replies made the original message public; not wanting to risk revealing his humiliation to the world, Carson refrained from responding. The suit, filed by Carson’s mother, alleges that both messaging apps let users send messages anonymously, thereby facilitating cyberbullying to such a degree that the apps should be considered dangerous. Yolo has been taken off the App Store and the Google Play Store, but LMK is still available for download on both. The Bride family has sought damages on behalf of all 92 million Snapchat users, and for the two apps to be banned from the market until they can prove they have effective safeguards in place.

The Panel in the Snapchat Speed Filter case has opened up the possibility of social media being held liable if a specific integration proves dangerous, by asserting that the duty of care requires social media companies to foresee the reasonable uses and misuses of their products. The final judgement in this case is therefore likely to influence the outcome of the teen cyberbullying case. It may also lead other social media companies, such as Facebook, to overhaul some of their existing features, like Instagram’s anonymous question sticker, which is very similar to Yolo and LMK.


*Ritika Acharya is a Researcher at IntellecTech Law who takes a keen interest in technology law. She is also a law student at Maharashtra National Law University (MNLU) Mumbai, with a passion for reading and writing.
