President Trump and his enablers in government and right-wing media will shoulder the blame for Wednesday’s insurrection at the US Capitol, but internet platforms–Facebook, Instagram, YouTube, and Twitter, in particular–have played a fomenting and facilitating role that no one should overlook.

In their relentless pursuit of engagement and profits, these platforms created algorithms that amplify hate speech, disinformation, and conspiracy theories. This harmful content is particularly engaging and serves as the lubricant for businesses as profitable as they are influential. These platforms also enforce their terms of service in ways that favor extreme speech and behavior, predominantly right-wing extremism.

Since 2015, when Trump announced his presidential campaign, the relationship between internet platforms and the political right has been increasingly symbiotic. The business choices of internet platforms have enabled an explosion not only of white supremacy, but also of Covid denial and antivax extremism, which have variously undermined the nation’s pandemic response, nearly sabotaged the presidential election, and played a foundational role in the violence at the Capitol. A huge industry has evolved atop the platform giants to raise money from, and sell products to, people in thrall to extreme ideas.

The platforms hide behind the First Amendment to justify their policies, claiming that they do not want to be arbiters of truth. There are two flaws in this argument. First, no thoughtful critic wants any platform to act as a censor. Second, the algorithmic amplification of extreme content is a business choice made in pursuit of profit; eliminating it would reduce the harm from hate speech, disinformation, and conspiracy theories without any limitation on free speech. Renee DiResta of the Stanford Internet Observatory made this point in a WIRED essay when she noted “Free Speech Is Not the Same As Free Reach.”

Until this insurrection, many policymakers and pundits had dismissed the rising tide of online extremism, believing it to be safely contained and therefore harmless. Their lack of concern allowed extremism’s audience and intensity to multiply.

Because internet platforms play a dominant role in our national conversation, extremism cultivated online seeped into the real world. We saw evidence earlier this year when white supremacists occupied the Michigan state capitol and then rioted in Minneapolis, Louisville, Portland, and Kenosha after the murder of George Floyd. Internet platforms, Facebook in particular, were central to organizing those violent acts, as they were to the events in Washington, DC yesterday. Journalists have found police officers participating in Facebook Groups devoted to a variety of right-wing extremist ideas, which may explain why police departments in some cities have not taken the threat of right-wing extremism seriously. Press reports and online videos have shown police officers standing by as insurrectionists broke the law, or even taking selfies with them.

The violence on January 6 followed a rally where the president incited the crowd to march to Capitol Hill and “show strength.” The rally was organized and livestreamed on every major internet platform, which also amplified photos and videos posted during the day. Twitter and Facebook both allowed Trump to post an inflammatory video about the mob violence and took it down only after a tsunami of negative feedback. Twitter suspended Trump’s account for 12 hours, and Facebook suspended it indefinitely–likely due to pressure from employees and policymakers–but irreversible damage had been done.

The scale of internet platforms is such that their mistakes can undermine democracy, public health, and public safety even in countries as large as the United States. Facebook’s own research revealed that 64 percent of the time a person joins an extremist Facebook Group, they do so because the platform recommended it. Facebook has also acknowledged that pages and Groups associated with QAnon extremism had at least three million members; by its own 64 percent figure, that means Facebook’s recommendations helped radicalize roughly two million of them. Over the past six months, QAnon subsumed MAGA and the antivax movement, with a major assist from the platforms and policies of Facebook, YouTube, Instagram, and Twitter. The recording of President Trump’s recent conversation with Georgia Secretary of State Brad Raffensperger confirmed that he has joined his followers in embracing QAnon and its conspiracy theories.

Congress and law enforcement must decide what to do about the unprecedented insurrection in Washington. President Trump and elements of right-wing media must pay. So, too, must internet platforms. They have prioritized their own profits and prerogatives over democracy and the public health and safety of the people who use their products. It is no exaggeration to say that internet platforms, as well as new technologies like artificial intelligence and smart devices, are unsafe. They are very often created by people who have no incentive to anticipate, much less prevent, harms. As things stand, the incentives have encouraged the development of a predatory ecosystem, with platforms, users, and politicians alike in on the grift.
