
With the tragic events in Texas and New York, we are once again thrown into a national conversation about how to address domestic terrorism. At least seven principal issues come up in the wake of these events: public health concerns reflected in firearm laws; the effective applicability of “red flag” laws; mental illness; the loneliness and feelings of dispossession on the part of the young white men who predominate in the demographics of shooters; ideologies and theories based on white supremacy; the safety of Black, brown and Indigenous communities, as well as of our children; and, not least, the role that the internet and social media play in this complicated mix. For the sake of brevity, let us focus on the last point.

Content-moderation policies and practices have risen to the fore of debate in the United States. Suffice it to say that neither Democrats nor Republicans have hit the right mark, as evidenced by discussions that veer into extremes and then stall as matters of new law in Congress. Were it not for the partisan divide that afflicts the U.S., it would not have to be that difficult. A simple and clear law should be available for discussion among citizens and for ready acceptance on the part of social media companies, notwithstanding their knee-jerk allergy to any form of government regulation. How about this draft?

  • All platforms must comply with existing First Amendment law, including the reporting of posts that pose a “clear and present danger” to the health and safety of an individual or the community, whether the threat is to persons or to physical property.
  • All platforms must have a clearly identified and functional link on the homepage of their business to which users can report illegal activity.
  • All provisions of this law are in keeping with section 230 of the Communications Decency Act of 1996.

Three main points illuminate these provisions. The first is the “clear and present danger” exception to First Amendment protection. That exception is settled law, along with federal and state prohibitions against child pornography and obscene materials. Apart from this proposed law, private companies are not bound by the First Amendment, but neither are they exempt from existing law. This provision makes that point crystal clear.

The second, in keeping with that point, is the obligation to give users a way to report such activity, which offers the community an opportunity to participate in communication with social media companies. The sense of helplessness that many experience in the face of security lapses and privacy violations can begin to be addressed with this basic and essential ability to lodge a complaint through a streamlined channel.
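To make the proposal concrete, here is a minimal sketch of the kind of intake endpoint that a “report illegal activity” link on a homepage might point to. It is purely illustrative: the framework (Flask), the route, the field names and the in-memory queue are my assumptions, not any platform’s actual API or any language in the draft law.

```python
# Hypothetical sketch of a "report illegal activity" intake endpoint.
# All names here (route, fields, storage) are illustrative assumptions.
from datetime import datetime, timezone

from flask import Flask, jsonify, request

app = Flask(__name__)
reports = []  # stand-in for a real case-management queue


@app.route("/report", methods=["POST"])
def file_report():
    data = request.get_json(silent=True) or {}
    content_url = data.get("content_url", "").strip()
    description = data.get("description", "").strip()

    # Reject empty submissions so reviewers are not flooded with blanks.
    if not content_url or not description:
        return jsonify({"error": "content_url and description are required"}), 400

    report = {
        "id": len(reports) + 1,
        "content_url": content_url,
        "description": description,
        "received_at": datetime.now(timezone.utc).isoformat(),
        "status": "pending_review",  # a human reviewer decides what follows
    }
    reports.append(report)

    # Acknowledge receipt; escalation (for example, to law enforcement when a
    # post poses a "clear and present danger") would happen downstream.
    return jsonify({"id": report["id"], "status": report["status"]}), 201


if __name__ == "__main__":
    app.run()
```

The point of the sketch is simply that the technical burden is small: a single public form or endpoint, an acknowledgment to the user and a queue for human review would satisfy the reporting provision.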

Moreover, this method has antecedents. The Digital Millennium Copyright Act of 1998 established a similar path for content owners to notify internet service providers of potential infringements. It would be quite simple for lawmakers to take a page out of the DMCA book. Third, nothing in these provisions obviates section 230 of the CDA. In fact, one could argue that because these rules merely require social media companies to comply with existing law, the proposed law is extraneous. Indeed, the same could be said about the Stop Enabling Sex Traffickers Act and the Allow States and Victims to Fight Online Sex Trafficking Act. Congress passed those laws in 2018 nonetheless, indicating that in some cases it is important to enshrine in law that which should be obvious but is not, or that which, like content moderation, is so caught up in culture-war discourse that a clear statement seems to elude lawmakers as well as law enforcement.

By the same token, nothing in this proposed law inhibits a platform from establishing its own policy for content management. The law is the floor of expectations set by our government, below which no company is entitled to go without redress. Policy, a word stemming from the ancient Greek term for citizen, establishes the higher level of expectations that the platform sets for users of its community. Once one hits the Accept button on a site’s terms and conditions, the user has effectively become a member of that community.

Much remains to be done in the United States to address domestic terrorism. Let’s begin with the low-hanging fruit: a commonsense law on content moderation grounded in the laws we already have in place.
