The Supreme Court is currently reviewing the cases of Gonzalez vs. Google and Twitter vs. Taamneh to determine whether YouTube and Twitter are liable for terrorism-related content hosted on their platforms.
Of course it’s abhorrent that terrorists use YouTube and Twitter to recruit and plan their activities. But those sites are used by millions (and in YouTube’s case, billions) of people, and they host billions of pieces of content, most of which have nothing to do with terrorism. Because of that, the law says YouTube and Twitter are not responsible for bad actors on their platforms. Here’s how the plaintiffs in Gonzalez vs. Google and Twitter vs. Taamneh are attempting to change that.
What is Section 230?
Section 230 preserves a free and open internet. In 1996, just as the then-new internet was gaining widespread acceptance, Congress committed to supporting that growth in Section 230 of the Communications Decency Act.
In fewer than 800 words, Section 230 recognizes that the internet and the services on it give Americans access to a “diversity of political discourse…cultural development [and] intellectual activity.” It states that the internet should remain free from government regulation so that it, and free speech, can flourish. Services like YouTube and Twitter are free to moderate user content and speech according to their own guidelines.
Why are YouTube and Twitter in hot water?
Supreme Court cases Gonzalez vs. Google and Twitter vs. Taamneh allege that YouTube and Twitter should be liable for aiding and abetting terrorism because they recommended terrorism-related content (in the case of Gonzalez vs. Google) and hosted terrorism-related content (in the case of Twitter vs. Taamneh).
As of now, YouTube and Twitter are protected from that liability by a provision of Section 230 that states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
Basically, you are responsible for what you do online. Services like YouTube and Twitter cannot be held responsible for the content posted to their platforms, and neither can fellow users. In other words: when someone posts something hateful online, the speaker is responsible, not the service that hosts the post.
Gonzalez vs. Google and Twitter vs. Taamneh allege that YouTube and Twitter should not be protected under Section 230 and are liable for promoting terrorism-related content, not just hosting it.
What does the Supreme Court have to decide?
The Supreme Court must parse Section 230 piece by piece, down to the individual word, to determine whether it protects YouTube and Twitter in these cases.
Justices have quibbled over the definition of “aiding and abetting” and whether either platform could be considered to have aided and abetted terrorist organizations. They have also discussed whether YouTube’s recommendation algorithm and the platform’s suggestions for what to “watch next” amount to an endorsement of a piece of content or are merely a “neutral” tool for cataloging YouTube’s massive library.
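To make the “neutral tool” framing concrete, here is a purely hypothetical sketch in Python. It is not YouTube’s actual system; the Video fields, neutral_score, and recommend_next are invented for illustration. The point is that a ranking step like this applies the same arithmetic to every candidate video without ever inspecting what the video is about, which is the core of the “neutral” argument.

```python
# Hypothetical, content-agnostic ranking step. NOT YouTube's actual
# algorithm: candidates are scored by engagement signals alone, with
# no awareness of the video's subject matter.

from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    watch_time_hours: float    # aggregate engagement signal
    click_through_rate: float  # fraction of impressions clicked

def neutral_score(video: Video) -> float:
    """Score a candidate using only engagement statistics.

    The function never looks at the video's content or topic: the
    same formula is applied to a cooking tutorial and to extremist
    propaganda alike.
    """
    return video.watch_time_hours * video.click_through_rate

def recommend_next(candidates: list[Video], k: int = 3) -> list[Video]:
    """Return the k highest-scoring candidates as 'watch next' picks."""
    return sorted(candidates, key=neutral_score, reverse=True)[:k]
```

The plaintiffs’ counterargument is that even a topic-blind formula like this one still amplifies whatever content happens to drive engagement, so “neutral” math can still function as a recommendation.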
The Supreme Court is also considering the long-term implications of its decision. Should it find YouTube and Twitter liable, and thereby open to regulation parts of big tech that have previously been left untouched? Or would that expose every internet service to liability and inevitably overwhelm the courts with thousands, if not millions, of new lawsuits?
And what about free speech? Would finding YouTube and Twitter liable stifle a free and open internet and put individuals at risk for legal action every time they share a video or post in an online forum? Or would it be better to hold YouTube, Twitter, and other open platforms responsible for any terrorism-related activity on their sites?
What would the internet look like if Twitter and YouTube became responsible for the content on their sites?
The internet as we know it was shaped in the image of free speech. Making platforms responsible for what is said or hosted on their sites would open them to countless lawsuits. It would also mean that you, as a user, could be held liable for anything you say on those platforms that upsets somebody enough to pursue legal action under an amended Section 230.
To avoid being buried in legal fees, platforms would likely resort to significant, if not complete, censorship to restrict how individuals interact online. That could hinder innovation and communication, and generally make the world a much smaller place.