In recent years, there has been a flood of criticism directed at websites that publish objectionable content from third parties. Examples include social media sites that allow “fake news” and false political ads and websites that host proponents of mass shootings and white supremacy. But there are everyday examples of the damage done by people posting defamatory content. While such publishers generally have been shielded from liability for nearly 25 years, there is a push to change those rules.
Section 230 of the Communications Decency Act (“CDA”) was passed in 1996 to address the liability of ISPs and websites that hosted third-party content. Prior to the CDA, they could be held liable for publishing third-party defamatory material under the theory that a publisher has knowledge, opportunity, and ability to exercise editorial control over the defamatory statement. However, ISPs and websites argued that they lacked editorial control over posts and could therefore not be held liable for the content of those posts. The perverse result was that sites that did try to screen their material for defamation were exposed to liability more often than websites that did not vet content at all.
The CDA addressed this issue in part to encourage online publishers to moderate content. The law states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” “Interactive computer service” refers to any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server. Essentially, this has been interpreted to apply to blogs, listservs, forums, social media, apps, and even sites that allow user reviews, like Amazon. As a result of the CDA, such parties can avoid liability for what users post even if they moderate posts according to their own terms of service. Thus, Facebook can remove posts and ban users who violate its terms of service, but it cannot be sued for defamation by parties who were harmed by posts that were not removed.
There are exceptions under the CDA. Its immunity does not extend to criminal law, intellectual property law (in cases of copyright or trademark infringement), state laws consistent with the CDA, or sex trafficking laws, so sites can still face liability under those laws.
Both liberals and conservatives are seeking changes in the CDA, for different reasons. Some groups feel websites must do a better job of moderating to eliminate objectionable content, while others feel that moderation interferes with free speech and may be biased against certain groups. While various proposals are debated in Congress, the Department of Justice (DOJ) held a public workshop in Washington, D.C. on Feb. 19, 2020, to discuss the CDA, its impact on the American people and business community, and whether improvements to the law should be made. The Justice Department emphasized that the meeting was not focused on actively generating policy, but rather on aligning public concerns with the goals of private companies, likely intending to help eventual Congressional action move smoothly. The DOJ will continue holding private sessions and roundtables with stakeholders to further discuss the issues and build toward a consensus.
If you provide an internet service that allows users to post or upload their own content, you must monitor legislative developments closely. Speak to experienced counsel about changes you may need to make to avoid liability in the event the CDA is amended.