When WIRED introduced Facebook to its online readers in 2004, four months after Mark Zuckerberg launched the site with a few friends out of his Harvard dorm room, the first order of business was explaining the poke. “On Thefacebook, poking is a way of saying ‘hi’ to would-be contacts, a method to strike up a conversation without adding the person as a friend,” went the post. “And there’s quite a bit of poking going on.” From there, the story went on to describe the latest social network sweeping college campuses. All 34 of them.
If one phrase is going to be repeated ad nauseam around the 15th anniversary of Facebook’s creation, it’s that a lot has changed. The company has expanded from an exclusive platform for American college students to one of the biggest, most powerful communication and advertising companies in the world—a one-stop shop for sharing photos, consuming news, messaging friends, buying and selling goods—and, in some countries, essentially the internet itself. It employs tens of thousands of people, has more than 2 billion users, and makes even more billions of dollars.
Looking back at 15 years of WIRED’s Facebook coverage may strike some as a myopic or self-serving exercise (Can I interest you in these stories from our archive?!), but it carries some useful reminders. Events of the past few years have led to calls for more ethical tech—for engineers and designers to more fully anticipate the range of impacts their products could have on society, intended or not, and think about how their tools might be used for harm as well as for good. (The tech press and users would do well to consider those things, too.)
It certainly wasn’t clear from the outset that Facebook would become the force it is today—even if Zuckerberg did end weekly meetings by chanting “Domination.” Facebook was just one dainty wildflower in a vast garden of social networks, and every day it seemed like a new one popped up. Tribes, Flickr, Orkut, Bebo. None of them were making money. They didn’t seem to have that much staying power, either. SixDegrees.com had come and gone; Friendster was already giving way to MySpace.
In fact, it was only after News Corp acquired MySpace in 2005 that Facebook had its first mention in the IRL printed pages of WIRED magazine. In contrast to cool teen hangout MySpace—which News Corp hoped to mine for insights into social media virality—Facebook, WIRED wrote, “avoids out-of-control content like an STD.” So, yes, things have changed!
As Facebook’s user base ticked up by the thousands and then millions, the questions WIRED asked about the company changed. “What is this thing?” morphed pretty quickly into “But will it make money and (by implication) survive?” The answer to that one turned out to be yes.
Mark Zuckerberg was always fairly upfront about his desire to get people to share their personal information, lots of it, on the platform he controlled. “Facebook has always emphasized two qualities that tend to be undervalued online: authenticity and identity,” contributing writer Fred Vogelstein wrote in an October 2007 profile. “Users are encouraged to post personal information—colleges attended, workplaces, email addresses. Facebook also emphasizes honesty: Because users typically can view profiles only of people they’re linked to, and they can’t link to them unless both partners confirm the relationship, there’s little point in creating a fake identity.” (That, of course, would not always be the case.)
Early on in the pages of WIRED, a portrait emerged of a young CEO determined to radically reshape the concept of privacy in the digital age, no matter how much pushback he got from the public. Take, for example, Facebook’s rollout of the News Feed in 2006. Users hated it, protesting en masse and threatening boycotts. “The easiest thing for Zuckerberg to do was simply dismantle News Feed,” Vogelstein recounted. “But he refused. News Feed was not just any feature. It was the infrastructure to undergird the social graph. So, three days after the feature launched, he posted a 485-word open letter to his users, apologizing for the surprise and explaining how they could opt out of News Feed if they wished. The tactic worked; the controversy ended as quickly as it began, with no real impact on user growth.”
It was no great leap to tie the company’s focus on sharing to its business goals. Two years later, Vogelstein described Facebook’s plans to sell targeted advertising across the web, just like its rival Google. “But unlike with AdSense,” he wrote, “Facebook’s ads could be exquisitely tailored to their targets. ‘No one out there has the data that we have,’ says COO [Sheryl] Sandberg.” As long as Facebook was free to use—and its leaders promised it always would be—it would sink or swim on advertising. WIRED’s Jargon Watch column would eventually coin the phrase “privacy zuckering”: “v. Creating intentionally confusing privacy policies—à la Mark Zuckerberg—to sucker users of social networking sites like Facebook into exposing valuable personal information.”
As Facebook kept tweaking privacy settings and profile features to encourage—or just unilaterally make—more information public, a debate over its tactics played out online, including on WIRED. On May 7, 2010, Ryan Singel wrote, “Facebook’s Gone Rogue; It’s Time for an Open Alternative,” while later that month Fred Vogelstein asked, “What if the Facebook (Un)Privacy Revolution Is a Good Thing?” At the time the risks were largely framed as ones to Facebook’s growth: Make your ads too creepily relevant to users, and they might freak out and walk away. Overstep enough, the theory went, and the free market would work its magic.
In a WIRED cover story that same month, Steven Levy put Zuckerberg at the vanguard of a new generation of hackers, heir apparent to the likes of Bill Gates. “Like Gates,” Levy wrote, “Zuckerberg is often accused of turning his back on hacker ideals, because he refuses to allow other sites to access the information that Facebook users contribute. But Zuckerberg says that the truth is just the opposite; his company piggybacks—and builds—on the free flow of information. ‘I never wanted to have information that other people didn’t have,’ he says. ‘I just thought it should all be more available. From everything I read, that’s a very core part of hacker culture. Like “information wants to be free” and all that.’”
Indeed, some of the earliest concerns about Facebook expressed in the pages of WIRED were about what the social network was doing to the web, rather than to the world. That fall, the magazine declared the open web to be dead, thanks in part to closed platforms like Facebook. This wasn’t the first time that WIRED said RIP to browsing as we knew it—we also sounded the death knell in 1997—but PointCast didn’t quite have the same takeoff velocity as Zuckerberg’s rocket.
“Facebook became a parallel world to the Web, an experience that was vastly different and arguably more fulfilling and compelling and that consumed the time previously spent idly drifting from site to site,” wrote Michael Wolff, in a piece placing blame for the demise “on them.” (Chris Anderson, then editor-in-chief of WIRED, had a companion piece in the same issue arguing that the blame fell “on us.”) “Even more to the point, Facebook founder Mark Zuckerberg possessed a clear vision of empire: one in which the developers who built applications on top of the platform that his company owned and controlled would always be subservient to the platform itself. It was, all of a sudden, not just a radical displacement but also an extraordinary concentration of power.”
By 2012, Facebook had become so entrenched in our lives that it seemed inescapable. Zuckerberg was already talking about the platform as infrastructure, and that analogy took hold. Discussing successful campaigns against some tech companies’ increasingly intrusive terms of service, columnist Anil Dash invoked the specter of 20th century utility regulation. “It won’t be long before some eager lawmaker sees political value in writing laws to rein these companies in,” he wrote. “It’s up to us—the users and the press—to save them from that.”
Everyone was talking about the power of network effects, and the comparisons between Zuckerberg and Gates took on another dimension. “Troublesome corporate behavior is easier to swallow when there are other choices out there, when you have the option to take your business to another store down the street,” wrote Steven Johnson in a June 2012 story about the “Facebook Juggernaut.” “But when one company owns the whole street, each little transgression is amplified.”
And yet, the inherent goodness of Facebook’s stated mission—“to make the world more open and connected,” as Zuckerberg wrote before the company went public that year—still went largely unquestioned. “A more open and connected world? You’d have to be some kind of cynic or misanthrope to object to such a laudable goal,” Johnson wrote.
The power of social media connections was not much in doubt by this point, not after they helped usher along the Arab Spring and Occupy movements, not to mention smaller waves of “self-organized, hyper-networked revolts.” Their initial success helped support the insistence of companies like Facebook and Twitter that their products were forces of progress. Many observers (although certainly not all) would be slower to realize how authoritarian governments, terrorist groups, and other bad actors could make use of the same tools.
And so Facebook kept connecting people. It launched Internet.org to bring more people from developing countries online—and, added bonus, onto Facebook. But Facebook also wanted to connect you no matter where you were—using other apps, on your mobile phone, in your chat apps, at the gym. In Rooms, at one point. News Feed, which people hated so much at its launch, became a place where users spent more time than ever.
As Facebook became the de facto information portal for millions, then billions of people, how it shaped those connections carried more and more weight. The company was constantly adjusting its News Feed algorithm in search of a better user experience (and more user engagement), to better surface the posts it thought you wanted to see, encouraging you to keep sharing and scrolling and commenting and liking.
“Every tweak to the technology that powers the News Feed has consequences for the people and businesses that attempt to harness it to win people’s attention,” Jessi Hempel wrote in 2016, on the News Feed’s 10th anniversary. “Along with this power comes a growing tension over how decisions get made about what information belongs in that feed.”
There were early signs of just how distorting this sort of system could be when taken to its extreme. Back in 2014, Mat Honan liked everything. Literally. As an experiment, he decided to like every single thing that came across his Facebook feed, no matter what he actually felt about it. The transformation was swift. “As day one rolled into day two, I began to dread dropping in on Facebook,” Honan wrote. “It had become a temple of provocation. My News Feed had not only drifted further and further right, it had oddly also drifted further and further left—a digest of bipartisan extremism.”
Facebook was famously laissez faire about the content on its platform. “As Facebook’s engineers and managers constantly explain, the company is nonjudgmental about what’s in anyone’s News Feed — as long as it makes the user happy,” Steven Levy wrote in 2015.
By the end of 2015, 63% of Americans were getting their news from Facebook. And then people started running for president. WIRED covered the 2016 election cycle more closely than ever before, because technology was a bigger part of the story than ever. “People have often asked me why a tech publication is writing about politics,” Issie Lapowsky, our senior writer covering national affairs, wrote the day before the vote. “It’s a fair question. But considering that email servers, Russian hackers, Twitter trolls, and WikiLeaks now have a prominent role in our electoral system, the more pertinent question seems to me: How could we not?”
Less than 48 hours later, as Trump celebrated his Electoral College victory, people wondered with growing alarm just how much of a role the internet had played. When it came to Facebook, unintended consequences like echo chambers and fake news were popular topics of discussion in the days following the election, eventually to be joined by nefarious Russian trolls. But WIRED was also clear that part of Facebook’s power in the election was how the platform worked exactly as planned—ads were purchased, by Trump’s campaign and by his supporters, and those ads were exquisitely tailored to their targets.
Welcome to the “Is Facebook destroying democracy?” portion of this particular timeline. People were pissed, and nothing the company or its CEO said or did seemed to help much.
“Over the past two and a half years, Facebook’s integrity as a place that ‘helps you connect and share with the people in your life’ has been all but laid to waste—as it has served as a clearing-house for propaganda, disinformation, fake news, and fraud accounts,” Ideas columnist Virginia Heffernan wrote in the November 2017 issue. “More serious still: Facebook may not just have been vulnerable to information warfare; it may have been complicit.”
Facebook’s social mission to connect the world was no longer a defense against any of the company’s oversteps. All of a sudden, it might have been the problem.
“This idea that more speech—more participation, more connection—constitutes the highest, most unalloyed good is a common refrain in the tech industry. But a historian would recognize this belief as a fallacy on its face,” wrote Zeynep Tufekci in the February 2018 issue, which was dedicated to free speech. “Facebook doesn’t just connect democracy-loving Egyptian dissidents and fans of the videogame Civilization; it brings together white supremacists, who can now assemble far more effectively. It helps connect the efforts of radical Buddhist monks in Myanmar, who now have much more potent tools for spreading incitement to ethnic cleansing—fueling the fastest-growing refugee crisis in the world.”
WIRED’s March 2018 cover showed a photo illustration of Zuckerberg looking bruised and battered, meant to convey the damage to the CEO’s reputation after a two-year period we hyperbolically (or not) described as “hell.” As editor in chief Nicholas Thompson and Fred Vogelstein wrote, the story of Facebook now was “of a company, and a CEO, whose techno-optimism has been crushed as they’ve learned the myriad ways their platform can be used for ill. Of an election that shocked Facebook, even as its fallout put the company under siege. Of a series of external threats, defensive internal calculations, and false starts that delayed Facebook’s reckoning with its impact on global affairs and its users’ minds. And—in the tale’s final chapters—of the company’s earnest attempt to redeem itself.”
That tale is nowhere near finished. Since the 2016 election, Facebook has come out with a slew of proposed solutions to its various problems: war rooms to protect elections, artificial intelligence to take down rule-breaking posts, fact-checking partners to tamp down the spread of fake news, and partnerships with researchers and law enforcement to spot foreign manipulation on its platform. Last May, Zuckerberg told Steven Levy it would take “three years” to fix Facebook, although what a “fixed Facebook” looks like and how the world is supposed to measure that is still unclear.
At the same time, it seemed like a new scandal was exploding in Facebook’s face nearly every week in 2018—a pattern that has continued into the new year. “Facebook has certainly changed, but it’s hardly fixed,” wrote Lapowsky, looking back at the company’s turbulent year. After what felt like the thousandth scandalous revelation—and subsequent apology and pledge to do better—Vogelstein asked, “Why Should Anyone Believe Facebook Anymore?” Mark Zuckerberg’s social network is still bigger and more powerful than ever. But, to put a spin on an old trope, with great power comes great accountability.
WIRED has covered a lot of new ground about Facebook over the past 15 years. But some lessons of the Facebook era have been out there all along. A year before WIRED ever mentioned Facebook, and months before Zuckerberg flipped the switch on his site in Cambridge, the magazine published a special June 2003 issue, guest edited by Rem Koolhaas, as “a catalog of emerging spaces, the seeds of the coming culture.”
“Whether we’re considering contagious diseases, cultural fads, or trends in the stock market, we need to start thinking in terms of networks,” one entry read. “Sometimes they help us, and sometimes they hurt us—being connected can be good or bad. But either way, networks are always there. And when not just you but anyone can be connected to anyone else on earth in just six steps, what goes around comes around—faster than you think.”