Courts Consider Limitations of Online Platforms’ Immunity for Third-Party Content

A recent appellate decision may signal a narrowing of the scope of the federal law that online platforms have invoked for over 20 years to avoid liability for content third parties create. The case is noteworthy not only for digital media companies but also for any business that hosts user-generated content online.

Section 230 of the Communications Decency Act protects owners of websites and apps from liability for content generated by others (with some exceptions). Section 230 provides that "[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." Congress enacted the statute in the Internet's early days to encourage the Internet's growth by freeing providers from the fear of liability for failing to police users' postings.

In HomeAway.com, Inc. v. City of Santa Monica, Airbnb and HomeAway challenged an ordinance prohibiting them from completing bookings for properties not registered with the City of Santa Monica. The court held that the ordinance did not run afoul of Section 230 because it did not focus on what users posted, but only on financial transactions. In other words, the platforms still could host users' listings of unregistered properties as long as they did not allow bookings of those properties. Thus, the court found the ordinance limited only internal booking transactions, not the public-facing content with which Section 230 is concerned.

The court also rejected the platforms' argument that the ordinance effectively compelled them to remove unregistered listings, since the alternative would be listing properties that users could not book (which would surely frustrate users and render the platforms' services inefficient, if not useless). The court drew a technical distinction: the ordinance did not actually require the platforms to remove unregistered listings, regardless of whether removal might be the most practical means of compliance. The court declined to consider whether alternatives to removal were viable or made business sense.

The court's narrow construction of Section 230 immunity departs from past expansive interpretations that took into account the broader policies behind Section 230 – i.e., ensuring that online intermediaries are not liable for content that users can generate in volumes and at speeds that make it impossible (or at least prohibitively expensive) to keep up. Regardless of technicalities, the ordinance shifts the burden of compliance with City registration requirements for rental properties from the property owners themselves to platforms like HomeAway and Airbnb.

Further, contrary to Section 230's goal of incentivizing innovation, the ruling appears to disfavor services offering "one-stop shops" for browsing listings and booking rentals in favor of sites with reduced functionality (e.g., Craigslist) where users would need to transact directly with owners of listed properties. One can imagine the potential consequences if other courts follow suit. To provide one example, could a service allowing users to book restaurant reservations online be required to make sure the restaurant's liquor license is in order before each booking?

Other appellate courts, however, continue to interpret Section 230 broadly. In Herrick v. Grindr LLC, for example, the court dismissed claims against the owner of a dating app for failing to remove fake profiles posted by the plaintiff's ex-boyfriend impersonating the plaintiff. Unlike the court in the Santa Monica case, the Herrick court did not concern itself with whether, despite not being required to monitor for false profiles at the posting stage, Grindr could be required to act at some later time – i.e., when someone uses the app's internal features to express interest in the imposter. The court also rejected misrepresentation claims against Grindr for allegedly implying it would remove illicit posts. Although a site might be held liable notwithstanding Section 230 if it represents to users that it will monitor or remove third-party content, the court noted that the app's terms of service disclaimed any obligation to remove content.

Commentators have speculated that courts or Congress may roll back Section 230's protections now that some online platforms arguably have the resources and technology to screen user-generated content. Headlines regarding, for example, cyber-bullying and "fake news" have led some to opine that it is time to stop "coddling" online platforms as compared to their non-digital counterparts. Time will tell whether the Santa Monica decision foreshadows such a trend. In any event, businesses would be wise to consult with counsel regarding how to structure their platforms, and draft terms of service, to retain Section 230's protection in the face of increased scrutiny.

FVLD publishes updates on legal issues and summaries of legal topics for its clients and friends. They are merely information and do not constitute legal advice. We welcome comments or questions.
© 2023 Funkhouser Vegosen Liebman & Dunn Ltd. All rights reserved.