If there is a singular moment that defines YouTube’s intentional opacity and the lack of accountability this facilitates, perhaps it was in 2018, when Google (and therefore YouTube) provided the most limited data set of the three companies to the independent researchers tasked by the Senate Select Committee on Intelligence with preparing reports analyzing the nature and extent of Russian interference in the 2016 US election. Our collective lack of insight into what is happening on the platform in the four years since has been an ongoing echo of that moment.

And yet, by and large, YouTube’s game plan of giving less to scrutinize has worked. Why?

In part, the problem is practical and technical. It is much harder—and more time-consuming—to search and analyze audio and video content than it is text. In part, it is an audience problem: The people who write about and research platforms tend to live on Twitter (and, to a lesser extent, Facebook). Perhaps the problem is also a product of unconscious bias, with academics and journalists over-indexing on the importance of the written word. It’s certainly a generational problem: Users of YouTube, and of other video-focused platforms like TikTok or Twitch, tend to be younger. Fundamentally, it’s also a storytelling problem: It’s simply harder to write a captivating story about a platform’s failure to take action or release a policy than it is to write about a platform that does. That is, until the results of failing to have a policy become all too clear, as they have for YouTube since Election Day. I am guilty of all these biases, and they are evident in my work too. But to solve the challenges posed by content moderation and its governance, the focus must extend beyond the problems that are easier to write about. Opacity should not be so rewarded.

The YouTube problem is not just a problem with YouTube. It’s also indicative of a broader truth: In general, researchers, lawmakers, and journalists focus on the problems that are most visible and tractable, even if they are not necessarily the only important ones. As more content moves from the biggest “mainstream” platforms to smaller ones—perhaps precisely because they have more lax content moderation standards—this will be an increasingly common challenge. Likewise, as platforms and users create more “private” or “disappearing” content, it will be harder to track. But social media will still have all the usual problems—hate speech, disinformation, misinformation, incitement to violence—that always exist where people create content online; they will simply be harder to see.

This is not a call for a swath of new policies banning any and all false political content (whatever that would mean). In general, I favor intermediate measures like aggressive labeling, de-amplification, and increased friction for users sharing it further. But most of all, I favor platforms taking responsibility for the role they play in our information ecosystem, thinking ahead, being transparent, explaining their content moderation choices, and showing how they have been enforced. Clear policies, announced in advance, are an important part of platform governance: Content moderation must not only be done, it must be seen to be legitimate and understood.

YouTube ultimately did append a small label to videos about election results stating that “The AP has called the Presidential race for Joe Biden.” Whether or not this is adequate, YouTube’s failure to announce in advance that it planned to do so (as other platforms did) is inexplicable. This ad hoc approach creates the opening for speculation that its actions are influenced by political outcomes, rather than objective criteria it laid out beforehand. YouTube’s role in modern public discourse is important enough that it needs to do better than complacent reassurances that “our systems are generally working as intended.”
