If an app on Facebook behaved the way Facebook has been behaving, Facebook would probably have shut it down by now.
Tuesday’s scathing TechCrunch investigation all but guarantees it. The report found that Facebook has been paying people as young as 13 years old to download an app that grants Facebook access to users’ entire phone and web history, including encrypted activity and private messages and emails. The app, called Research, allows Facebook to see how people’s friends, who have not consented to having their data collected, interact with those users, too.
Facebook says the app was purely for market research. Explained another way: The app allowed Facebook to spot competitive threats on the horizon to help it retain its unprecedented power. Facebook has used another app, called Onavo, to collect similar information; for example, data from Onavo alerted Facebook to the growing popularity of the messaging app WhatsApp before the company acquired it in 2014.
“I think it speaks to the growth-at-any-cost mentality of the company,” says Ashkan Soltani, who served as chief technologist to the Federal Trade Commission during its 2011 investigation of Facebook.
Facebook didn’t respond to WIRED’s request for comment.
At a time when Facebook is under the microscope for violating its users’ privacy, such techniques are bold enough. But what makes the operation even more brazen is that Facebook continued running the program, which launched in 2016 and was sometimes called Atlas, even after Apple banned Onavo from the App Store less than six months ago. Apple said it would no longer allow developers to collect information from other third-party apps.
Apparently undeterred, Facebook created a workaround for the Research app. It circumvented Apple’s vetting process using a technical loophole that is only intended for apps Facebook distributes to its own employees. That allowed Facebook to ingest everything its users, teens included, did on their phones. While kids under the age of 17 had to receive parental consent to participate, the disclosure form analyzed by TechCrunch minimized the extent of what could be done with all that data. “There are no known risks associated with the project,” it read. Facebook told TechCrunch only 5 percent of the app’s users were teens.
Still, even the solicitations adults would have received about the app weren’t entirely forthcoming. When users referred their friends to the app, for which they could also get paid, the email they received encouraged them to “Install it and forget it,” making the act of giving away unlimited access to their private communications sound as harmless as setting up a Ronco Rotisserie.
The Research app is just the latest example of Facebook’s doublespeak. In public and even under oath, executives like Mark Zuckerberg and Sheryl Sandberg have spent at least a year—if not their entire careers—promising to do better by their users. But in private, evidence abounds that the company continues to flout every rule and attempt at oversight placed before it. They’ve promised to protect user privacy by cutting off developer access to data while continuing to give it away to corporate giants and major advertisers. They’ve vowed to investigate foreign interference in elections, all while withholding information about the extent of that interference on Facebook. They’ve launched efforts to make their ads more transparent, while crippling external efforts by organizations like ProPublica to pull back the curtain even further.
Even as privacy hounds and antitrust watchdogs at the FTC and on Capitol Hill sniff and scratch at Facebook’s door, the social media giant, apparently high on hubris, just keeps tossing them red meat. If Facebook has learned anything from the last two years of public and regulatory scrutiny, it has a funny way of showing it.
In a tweet, security researcher Will Strafach, who helped TechCrunch with its story, said Facebook’s actions were “the most defiant behavior I have EVER seen by an App Store developer.”
After TechCrunch’s story published, Facebook shut down the iOS version of the app, but kept its Android counterpart running. In a statement to TechCrunch, a Facebook spokesperson said the story ignored “key facts.” “Despite early reports, there was nothing ‘secret’ about this; it was literally called the Facebook Research App,” the spokesperson said. “It wasn’t ‘spying’ as all of the people who signed up to participate went through a clear on-boarding process asking for their permission and were paid to participate.”
Apple has revoked Facebook’s access to its so-called Enterprise Developer Program, which allows Facebook to disseminate apps to test internally with its employees. “Facebook has been using their membership to distribute a data-collecting app to consumers, which is a clear breach of their agreement with Apple,” an Apple spokesperson said in a statement. “Any developer using their enterprise certificates to distribute apps to consumers will have their certificates revoked, which is what we did in this case to protect our users and their data.” Apple didn’t respond to questions about whether Facebook’s broader family of apps, including Instagram and WhatsApp, would be affected.
It’s hard to imagine Facebook wouldn’t have done the same, if not more, to another developer that so clearly crossed the line. After the political consulting firm Cambridge Analytica was accused of violating Facebook’s rules by harvesting and retaining data on tens of millions of users without their knowledge, Facebook banned nearly every app the company had ever touched, including some unaffiliated research apps that were associated with the University of Cambridge.
What Facebook has done with the Research app isn’t so different, says Soltani. Like Cambridge Analytica, which hired an academic named Aleksandr Kogan to do the work through his company, Facebook used third-party companies to recruit people to the app. And, like the Cambridge Analytica app, which collected data on users’ friends, the Research app had access to its users’ communications with other people who had no idea the app was snooping on them.
“Even if users’ consent was sufficient, everyone they communicated with did not consent,” says Soltani.
By Facebook’s standards, Apple is letting Facebook off easy.
That’s just what Facebook seems to be counting on. For all of the talk about cracking down on the company these last few years, Facebook has faced few penalties that aren’t purely reputational. Later today, the company is expected to report record profits. It’s little wonder executives there would believe they could get away with unscrupulous behavior.
The TechCrunch story is not the first scandal for Facebook within the last week. It’s not even the first scandal involving minors. A recent investigation by Reveal showed how Facebook knowingly duped kids into racking up monster fees on their parents’ credit cards while playing Facebook games. Staffers debated whether to stop it by requiring kids to reenter credit card numbers before they could spend money, but opted against it because doing so would threaten Facebook’s revenue. They fought efforts by parents seeking a refund, and even had a name for this gambit: “friendly fraud.” Facebook has since said that it updated its policies in 2016 to “provide dedicated resources for refund requests related to purchases made by minors on Facebook.” But the fact that Facebook willingly engaged in activity it defined itself as fraud suggests just how far the company will go to grow.
On Tuesday, Democratic senators Richard Blumenthal of Connecticut and Ed Markey of Massachusetts wrote to Facebook, demanding answers about the gaming issue. Now, both senators are voicing similar concerns about the Research app, particularly its targeting of children as young as 13.
“It is inherently manipulative to offer teens money in exchange for their personal information when younger users don’t have a clear understanding how much data they’re handing over and how sensitive it is,” Markey said in a statement, emphasizing his intention to reintroduce a bill called the Do Not Track Kids Act. “Congress also needs to pass legislation that updates children’s online privacy rules for the 21st century.”
Blumenthal, meanwhile, called the story an “astonishing example of Facebook’s complete disregard for data privacy and eagerness to engage in anti-competitive behavior.” Indeed, the scandal could amplify calls to break Facebook up, giving critics a shining example of how the company uses its market dominance to figure out which competitors to copy or crush.
“Facebook continues to demonstrate its eagerness to look over everyone’s shoulder and watch everything they do in order to make money,” Blumenthal wrote. “Mark Zuckerberg’s empty promises are not enough. The FTC needs to step up to the plate, and the Onavo app should be part of its investigation.”
In 2011, Facebook signed a consent decree with the FTC, barring the company from making “deceptive privacy claims.” The FTC declined to comment on whether these new revelations about the Research app will have any bearing on its ongoing investigation into whether Facebook violated the decree. Making that determination, Soltani says, will take a careful reading of the app’s disclosures. But if the Washington Post’s reporting turns out to be true, Facebook may soon face a “record-setting” fine from the Commission. Of course, as Soltani points out, even a fine of $100 million would constitute “a couple minutes of operating time” for Facebook.
Still, it would be the first real test of whether money’s the thing that could teach Facebook the lesson it has failed to learn on its own.