He was in the midst of those ruminations in January 2018 when Sheryl Sandberg forwarded him an email from one of her college friends. Noah Feldman was a Harvard law professor who had been thinking about Facebook’s problems through the lens of early constitutional law. He had just finished reading a book on James Madison when, on a visit to California, he took a bike ride around the Stanford campus. The idea came to him that Facebook’s toughest calls might best be handled by a quasi-judicial unit with autonomous powers. “This is a strange thing,” admits Feldman of the Oversight Board. “I mean, Facebook is not a country, and this body looks sort of like the court.”

He sent a brief description to Sandberg, who urged him to write up a proposal. When Zuckerberg saw it, he summoned Feldman to a meeting. “Unbeknownst to me, he had been thinking for a long time about devolution of power away from Facebook,” Feldman says. Zuckerberg ultimately hired Feldman as a consultant, and the project was put in motion.

“Mark had been seeking input from a lot of different places,” says Andy O’Connell, a director on Facebook’s Global Public Policy team. “Noah’s idea was actually implementable, and other ideas were not. And it was the most detailed proposal.” (Still, many in and out of Facebook claim to have thought of it. “I can’t tell you how many people have said, ‘Glad you’re running with my idea,’” says Zoe Darmé, a manager on the project.)

By the spring of 2018, Zuckerberg was sharing his excitement about the idea with others. In an April interview that year, he told me about brainstorming a Supreme Court–like entity whose members wouldn’t work for Facebook but would have binding authority. “I think [it] would help make people feel like the process was more impartial on judging what content should be on the service and what’s not,” he told me.

Leading the project were two relative newcomers to Facebook, Brent Harris and Heather Moore. Facebook had hired Harris, an expert in international regulation, as its director of governance and global affairs late in 2017. Since he had worked on adjudicating the BP oil spill in the Gulf of Mexico, he was well placed to deal with the gushers of offensive content on Facebook’s platform. Soon after the March 2018 Cambridge Analytica scandal broke, he began focusing on the board, joined by the newly hired Moore, an attorney with an extensive background in procedural justice. She headed the effort to write the board’s charter and bylaws. “[The concept] was to bring together a group of people outside of these walls with the expertise, knowledge, and purpose to really make consequential decisions in a way that was more democratic than what was currently happening inside the company,” she says.

In keeping with the theme of independence, the project leaders sought guidance from outside experts through a dense series of meetings, workshops, and conferences, and ran simulations of board deliberations. All told, Facebook consulted with more than 2,200 people from 88 countries.

Last year Facebook ran a series of 20 workshops, in locations including Singapore, Menlo Park, Brazil, Berlin, Nairobi, and New York City, to gather feedback from activists, politicians, nonprofit groups, and even a few journalists. By the time of the New York workshop I attended, Facebook had tentatively drafted a charter and sketched out suggestions for the bylaws that would dictate the group’s operations. But in our two-day discussion, everything was up for grabs.

One of the longest discussions involved precedent. Facebook handles millions of appeals every year on its content decisions. The board will handle an infinitesimal slice of those, maybe 25 or 30 in its first year—and Facebook is obliged to respect its decisions only in those individual cases. For instance, in our workshop we simulated a board discussion about a Facebook decision to take down a post in which a female comedian claimed that “all men are scum.” Facebook considered it hate speech and removed it, and a public controversy ensued. If the board overruled Facebook, the post would be restored. But restoring a single post doesn’t address the underlying problem: Facebook’s Community Standards were too inflexible, treating hate speech the same whether it was directed jokingly at a powerful group or aimed harshly at a vulnerable minority.
