Facebook is in the eye of the storm.
Politicians from both sides of the aisle are livid after a trove of leaked internal documents revealed that Facebook has long known, from its own research, the severe harm its apps can cause.
Consumers are angry after being knocked off the company’s services for six hours on Monday, the longest outage in 13 years.
And investors, who are typically the last to jump ship, are speaking with their wallets, pushing the stock price down 12% in the past three weeks, while the Nasdaq has fallen just 4.5% over that stretch.
It’s a flurry of outrage similar to what hit Facebook in March 2018, when reports surfaced that Cambridge Analytica improperly accessed the data of 87 million Facebook members and used it to target ads for Donald Trump in the 2016 presidential election.
That scandal marked a huge black eye for Facebook and sparked scrutiny over its lobbying efforts, calls for a breakup of the company, multiple antitrust probes and ultimately a record $5 billion fine from the Federal Trade Commission.
But Facebook’s business kept humming along, and the site didn’t change much. Ahead of the 2020 presidential election, misinformation still flourished. And over the course of the Covid-19 pandemic, anti-vaxxers and anti-maskers have run wild, with Facebook’s algorithms often helping to spread the most outlandish conspiracy theories.
The latest crisis stems from reporting by The Wall Street Journal, showing that Facebook clearly understands the addictive nature of its products, and uses that knowledge to make even more money off its users. In particular, Facebook knows its Instagram service can be detrimental to the health of teenagers.
“Facebook is just like Big Tobacco, pushing a product that they know is harmful to the health of young people, pushing it to them early, all so Facebook can make money,” Sen. Ed Markey, D-Mass., said in a hearing last week held by the Senate Commerce subcommittee on consumer protection.
Facebook’s witness at the hearing was Antigone Davis, the company’s global head of safety. She was called on to address a series of stories from the Journal titled “The Facebook Files,” which was based on internal documents provided by a whistleblower.
On Sunday, the whistleblower revealed herself ahead of an interview on the CBS program “60 Minutes” as Frances Haugen, a former product manager at the company. Before leaving Facebook in May, Haugen made copies of at least 209 slides of internal company research.
The public outcry sparked by the Journal’s series ultimately led Facebook to pause its plans to develop Instagram Kids, a version of the app aimed at children 12 and younger.
However, Facebook hasn’t committed to permanently ending its Instagram Kids effort. In a recent chat with his Instagram followers, Facebook’s Instagram chief, Adam Mosseri, said he would allow his young children to use Instagram if there was a customized version of the product for that age group.
A different tone on the Hill
While Facebook hearings have become almost routine on Capitol Hill, the events of this week were very different from those in the past.
On Tuesday, after her appearance on “60 Minutes,” Haugen testified before the same subcommittee that hosted Davis. Haugen was scathing in her criticism of her former employer, telling lawmakers that the company consistently prioritizes profits over user health and safety and that it steers users toward high-engagement posts that are often known to be harmful.
Senators called out Facebook CEO Mark Zuckerberg for failing to answer their questions and for doing nothing to address the public since the start of the Journal’s series of reports. Following the hearing, subcommittee Chair Sen. Richard Blumenthal, D-Conn., said it was premature to consider subpoenaing Zuckerberg, adding that he should appear before Congress voluntarily.
“He has a public responsibility to answer these questions,” Blumenthal said.
Zuckerberg finally addressed the issue Tuesday evening in a Facebook post, rejecting claims made by Haugen and the Journal.
“At the heart of these accusations is this idea that we prioritize profit over safety and well-being,” Zuckerberg wrote. “That’s just not true.”
He added that it’s illogical to claim Facebook intentionally pushes users toward content that makes them angry.
“We make money from ads, and advertisers consistently tell us they don’t want their ads next to harmful or angry content,” he wrote.
As if Facebook wasn’t experiencing enough stress, on Monday the company suffered its worst service outage since 2008.
The outage was caused by “configuration changes on the backbone routers” and knocked out the company’s numerous services, including Facebook, Instagram and WhatsApp, for six hours.
Facebook’s own work tools were also knocked offline, and company employees and contractors were unable to get on the system. One employee told CNBC that some workers convened on an impromptu Discord server in order to communicate with one another because Facebook’s internal communication tools were offline. An Instagram employee told CNBC that some employees said the outage was karma for the recent whistleblower ordeal.
With advertisers unable to reach consumers for the bulk of the workday, Facebook may have lost between $110 million and $120 million in ad revenue, according to an estimate provided by Morningstar. That would represent just over 0.4% of the revenue Facebook generated in the fourth quarter a year ago.
Investors pushed the stock down almost 5% on Monday, adding to a recent slump. Even after bouncing back 2% on Tuesday, the stock is 12% below where it was trading on Sept. 13, just before the Journal started publishing its series.
Facebook could very well rebound as it has in the past and continue its ascent. But each time there’s a crisis, investors have a little more reason to question the sustainability of the business model.
“We have battled negative headline after negative headline before,” CNBC’s Jim Cramer wrote Monday afternoon to members of his investing club. “However, this latest story strikes us as different. The culture needs to change at Facebook, and if they cannot fix themselves then we would expect to see even more calls, louder calls, for increased regulation on the platform, and regulation is never good for business.”
A Facebook spokesperson didn’t respond to a request for comment.