At this point, it isn’t exactly surprising that social media platforms like Facebook can have negative effects on society. For years, journalists, politicians, social scientists — and even biologists and ecologists — have been raising concerns about the influence Facebook has on our collective well-being. And Facebook has always defended itself by insisting that it is a net good to society because of how it brings people together.
But a new series of reports from the Wall Street Journal, “The Facebook Files,” provides damning evidence that Facebook has studied and long known that its products cause measurable, real-world harm — including to teenagers’ mental health — and then stifled that research while denying and downplaying that harm to the public. The revelations could represent a turning point for the company, and they strengthen the case that a growing chorus of lawmakers and regulators has been making for breaking up Facebook or otherwise severely limiting its power as a social media giant.
Already, the Journal’s reporting has prompted consequences for Facebook: A bipartisan Senate committee is investigating Instagram’s impact on teenagers, and a group of legislators led by Sen. Ed Markey (D-MA) is calling for Facebook to halt all development of its Instagram for Kids product for children under 13, which BuzzFeed News first revealed the company was developing in March.
“We are in touch with a Facebook whistleblower and will use every resource at our disposal to investigate what Facebook knew and when they knew it — including seeking further documents and pursuing witness testimony,” read a joint statement from Sens. Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN) on Tuesday. “The Wall Street Journal’s blockbuster reporting may only be the tip of the iceberg.”
It’s unclear how much these efforts will affect Facebook’s policy decisions and bottom line. The investigations are in their early stages, and it’s too soon to say whether they will directly lead to any new laws or other regulation.
Instagram’s head of public policy wrote in a company blog post on Tuesday that the Journal’s reporting “focuses on a limited set of findings and casts them in a negative light,” and that the fact that Instagram did internal research on the matter demonstrates its “commitment to understanding complex and difficult issues young people may struggle with.”
Rep. Rodgers: Do you agree too much time in front of screens, passively consuming content, is harmful to children’s mental health?
Mark Zuckerberg: Congresswoman, the research that I have seen on this suggests that if people are using computers and social —
Rep. Rodgers: Could you answer yes or no? I am sorry. Could you use yes or no?
Mark Zuckerberg: I don’t think that the research is conclusive on that. But I can summarize what I have learned, if that is helpful.
Zuckerberg went on to say, “overall, the research that we have seen is that using social apps to connect with other people can have positive mental health benefits and well-being benefits by helping people feel more connected and less lonely.”
He did not mention any of the negative effects of Instagram that his own team had found over the past three years, including that in its own study of teenage users, 32 percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse.
When Rep. Rodgers and other Republicans followed up with Facebook and asked about the company’s internal research on the effects of its products on mental health, the company did not share the Instagram research results, according to Bloomberg. Nor did it share them with Sen. Ed Markey, whose office asked Facebook in April to provide any internal research on the matter, according to letters provided by Markey’s office to Recode.
“This is such a profound issue for kids and teens,” said Jim Steyer, CEO and founder of the nonprofit organization Common Sense Media, which promotes safe technology and media for children and families. “The fact that Facebook has known the research, done the research, and then hid it … it’s quite mind-boggling,” he told Recode.
Other damning findings from the Journal’s reporting include a discovery that the company has a VIP program that allows celebrities and politicians to break its rules, and that in 2018, Facebook tweaked its algorithm in a way that encouraged people to share angrier content. In each case, Facebook’s own employees found systematic proof of serious issues, but when they warned executives — including Mark Zuckerberg — about it, they were largely ignored.
For years, Facebook’s main line of defense against criticism of any negative impacts its products might cause has been that social media, like other technological innovations, can cause some harm — but that the good outweighs the bad.
In a recent interview with my colleague Peter Kafka on the Recode Media podcast, Instagram head Adam Mosseri pointed to the way that social media has helped social justice movements like Black Lives Matter and Me Too. And he compared Facebook to the invention of the automobile.
“Cars have positive or negative outcomes. We understand that. We know that more people die than would otherwise because of car accidents,” said Mosseri. “But by and large, cars create way more value in the world than they destroyed. And I think social media is similar.”
It’s undeniable that social media can facilitate social change. It can also be a useful way for people to keep in touch with their friends and family — and indeed, as Zuckerberg told Congress, it can help people feel less lonely.
But, at some point, the question is whether the public will accept that rationale as an excuse for the company to have free rein to experiment on our collective well-being, measure the harm, and keep the public in the dark about what it learns — all while it continues to rake in record revenue of nearly $30 billion a quarter.