The moment I first realized that everything had changed for Facebook was right after the 2016 US presidential election, with one of the first of many Zuckerbergian mea culpas. Not that first post-election post, his horribly disingenuous dodge that improbably asserted Facebook could not have influenced the election. This, despite a Facebook political advertising sales force, now numbering probably in the hundreds, that had spent the past year claiming the contrary to every candidate with a marketing budget. No, it was Zuck’s second post, more circumspect and clearly more scripted, that described a concrete series of steps to counteract the influence he’d previously declared nonexistent. There, buried in the reassuring lingo of corporate comms-speak (“easy reporting”, “disrupting fake news economics”), lay some hidden bombs, or perhaps for the company, land mines. Not only would Facebook deign to rely on outside third-party fact-checkers, a sort of Snopes.com-ification of Facebook; it would also consult with newspapers (!) on how to fact-check content itself.
To anyone (like this former Facebook employee) steeped in the company’s usual MO, this was astonishing. For the past two decades of consumer internet life, the great media intermediators had hidden behind what I’ll call the Algorithmic Pass. This was the not-altogether-wrong assertion that their companies merely optimized around user demand—providing the needy user whatever they wanted, by whatever metric—and were completely agnostic to truth, aesthetics, or political virtue. To every public clamor or brouhaha (and there were many), the answer was always, “It’s just math,” and they’d point at the roomful of geeks, replete with Nerf guns and beanbag chairs, as proof.
Antonio García Martínez (@antoniogm) was the first ads targeting product manager on the Facebook Ads team, and author of the memoir Chaos Monkeys: Obscene Fortune and Random Failure in Silicon Valley. He wrote about the internet in Cuba in WIRED’s July issue.
More than a mere corporate cover-your-ass maneuver, the Algorithmic Pass heralded a monumental shift in how modern, media-saturated humans learned about the world. No longer would handpicked mandarins at recognized media establishments—the editors and curators of our literary and political world—anoint one or another piece of content with the always malleable imprimatur of “true” or even “good.” No. Any piece of content, however brilliant or vile, that sparked an escalating chain reaction of user engagement would receive instantaneous, worldwide distribution. Having “gone viral” became a greater trophy than appearing “above the fold” (now a ludicrous concept). Vox populi, vox culturae.
And then the 2016 election happened.
Suddenly we’re all rescinding Facebook’s Algorithmic Pass, hounding the uncharacteristically beleaguered company to take some responsibility for what appears on its blue-framed pages. What’s most ironic about the hubbub is this: people fear Facebook’s power, so they ask Facebook to take on even more power by taking a very direct hand in what appears there, rather than a very second-order mathematical one. As Facebook’s power grows and our trust erodes, we somehow overcompensate by rushing to entrust them with even more.
Contemplate this unsettling vision: Mark Zuckerberg, or more likely one of his deputies, sitting in the equivalent of the afternoon editorial meeting at The New York Times, where the day’s news—which stories will appear, and which won’t—is decided: this news source discarded as fake or spammy, this one included and effectively boosted in the newsfeed. As much as I grew to admire some of the company’s culture as an employee, I realize as well as anyone how they can (and do) descend into groupthink and biases of various flavors. Do we really want Zuck as global news editor, versus a disinterested algorithm that merely optimizes toward some objective and picks the day’s news winners and losers? The editor is dead; long live the editor, only now with editor-in-chief Zuckerberg.
Oddly enough, it’s a job he and the company don’t want. “We’re a technology company, not a media company,” has been the constant refrain, along with invocations of the Algorithmic Pass, for engineering-centric companies like Facebook. MOVE FAST AND BREAK THINGS and DONE IS BETTER THAN PERFECT were the Facebook mantras (as immortalized on their many in-office posters), not ALL THE NEWS THAT’S FIT TO PRINT and DEMOCRACY DIES IN DARKNESS.
And it shows.
Around 2015, as Facebook’s Trending Topics product dithered in embarrassing irrelevance (a shameless rip of Twitter’s Trending feature, it appears on the right-hand side during most Facebook sessions), the company stooped to hiring humans—HUMANS!—to fix its deficient software product. Within 18 months or so, all had been fired and the human effort shuttered, but not before, in a rare breach of Facebook’s typically ironclad OPSEC, some of them spilled the beans about how horrible working at Facebook had been, with some even suggesting they’d been pushed to bias the news. A half-trillion-dollar company armed with some of the best technical minds in the world couldn’t manage a dozen or so wet-behind-the-ears journalism grads, something the Sacramento Bee manages annually without much ado. That’s how good Facebook is at being a media company.
But if there’s anything I grew to respect while working at Facebook, it was the company’s unnatural ability to pivot in a completely new direction and iterate rapidly toward excellence there, no matter how foreign the territory. With the feds breathing down its neck (Facebook is testifying before Congress this week) and Zuckerberg issuing public apologies during the Jewish Day of Atonement, the company has been shaken like nothing I’ve ever seen as an employee or outside observer. If the world wants Facebook as editor, it’ll sure get it, for better or worse.
What’s that mean in practice? From the company’s hints, it will involve the aforementioned third-party fact-checking services, a sort of Snopes-ification of the Facebook experience. Based on both that and user input, content will first be conspicuously flagged as false and then effectively disappeared from newsfeed distribution, as porn or other terms-of-service-violating content is now. In addition, based on its short-lived experiments in human editing around Trending Topics, Facebook will almost certainly draw up a list of acceptable news outlets of passable truthiness, boosting their distribution at the expense of second-tier (or no-tier) content producers.
There’ll be some clear downsides, though.
The death-by-algorithm of the media gatekeepers meant that many new voices rose to the fore that would never have jumped through the arbitrary hoops of conventional publication. XKCD, The Oatmeal, Stratechery, Slate Star Codex, Ribbonfarm, Wait But Why—all those weird but clever bloggers and cartoonists who joked, scribbled, or illustrated their way to online fame, viral post after viral post. The next crop of those will find it very hard to hustle themselves an audience. The lone, nonconforming online genius may just be muted along with that Russian political ad farm. Your byline isn’t on Slate or The Washington Post? Too bad, lone content creator.
Which brings us to the other ironic thing about all of this: In order to preserve our political democracy, which elevates the most popular among us (though perhaps not the finest) to power, we’ll seemingly abandon a total democracy of thought, which does the same for ideas. You can judge a people by how much freedom they can tolerate without destroying themselves. It seems the power for anyone to go viral and attain a global audience, through articulate reasoning or just clickbait-y libel, was just a bit too much freedom for us to bear.