This article was originally published at Observer.com, where Marco is editor-in-chief.

In 1995, 15 years before founders Kevin Systrom and Mike Krieger would obtain seed funding for the app that would become Instagram, a meta-study of the mortality of anorexia nervosa was published in the American Journal of Psychiatry. The findings were alarming. The study showed that the mortality rate associated with anorexia nervosa was more than 12 times higher than the annual death rate for females 15-24 years old in the general population, and the risk of suicide more than 200 times higher. In the decades that followed, more research was conducted. The conclusions were similar. Eating disorders, including anorexia, bulimia, and EDNOS (eating disorder not otherwise specified), were not only deadly but had a range of mortality rates that, at the high end, were comparable to those associated with cocaine abuse. A meta-study of all-cause mortality in mental disorders conducted in 2014 found that anorexia nervosa specifically was associated with a higher mortality rate than alcohol use disorder. Only opioid use was significantly more deadly.

It’s important to keep this context in mind when reading the internal Facebook research documents published by the Wall Street Journal on September 29. In a presentation titled “Teen Girls Body Image and Social Comparison on Instagram — An Exploratory Study in the US,” researchers at Facebook mapped out — in colorful diagrams and branded charts — the “downward spiral” that is both triggered and “exacerbated” by use of the Instagram platform. “Once in a spiral,” the document reads, “teens work through a series of emotions that in many ways mimic stages of grief.”


The stages of grief are presented as an ouroboros of brightly hued arrows pulled from the Instagram brand color palette. “Bargaining” is a deep royal purple. “Insecurity” is a lovely cornflower blue followed immediately by the bright kelly green of “Dysmorphia.” The result, according to Facebook, is that “aspects of Instagram exacerbate each other to create a perfect storm.”

What’s inside the storm? Facebook researchers concluded that “mental health outcomes related to this can be severe.” Below this headline, in bright red text, is a list of the outcomes. The first is “eating disorders.”

To say it is alarming to see the use of a product explicitly connected to a category of disorders with mortality rates comparable to those of cocaine abuse would be an understatement. But eating disorders are primarily associated with — and disproportionately affect — women and girls. They are not treated with the same seriousness as substance use disorders. It is difficult to imagine a researcher at a tech company presenting a rainbow-colored “downward spiral” that ends with amphetamine abuse before moving on to design recommendations that include implementing more “fun” photo filters and experimenting with “mindfulness breaks.”

Renee Engeln is a professor at Northwestern, where she runs the university’s Body and Media Lab. Engeln studies the same relationships between social media, mental health and body image that Facebook is addressing in the leaked report. I sent her the report and called shortly after.


“We’ve known all this forever,” she said immediately. “They’ve known this forever, too.”

She told me, based on viewing the report, that Facebook is underestimating both the severity of eating disorders and how widespread eating-disordered behavior is. At the same time, Engeln said Facebook was also missing the larger point — the effect that Instagram has on its users. “You don’t have to have an eating disorder for it to matter,” Engeln said. “When a whole generation of girls spends a significant amount of time hating what they see in a mirror, that’s a mental health issue, even if they don’t meet the diagnostic criteria for a mental disorder.”

Facebook claims that the leaked research has been mischaracterized by the Journal, and has responded by publishing and annotating two internal presentations about the toxicity of Instagram. “This type of research is designed to inform internal conversations and the documents were created for and used by people who understood the limitations of the research,” reads the update to a statement attributed to Pratiti Raychoudhury, Vice President, Head of Research for Instagram.

In a Senate hearing on September 30, Sen. Richard Blumenthal of Connecticut read from documents provided to his office by a whistleblower that contradicted Facebook’s soft-pedaling of its own internal research. “Substantial evidence suggests that experiences on Instagram or Facebook make body dissatisfaction worse, particularly viewing attractive images of others, viewing filtered images, posting selfies and viewing content with certain hashtags,” Blumenthal quoted.

Testifying at the same hearing, Facebook’s global head of safety Antigone Davis said that the company believes that Instagram helps more teens than it harms, but added that the research led to “numerous” changes that include “a dedicated reporting flow for eating disorder content.”

Engeln rejected Facebook’s argument that Instagram was sometimes a positive experience for young people. “The fact that the platform can provide positive and negative experiences isn’t interesting. That’s just typical. When I see people downplay a report like this, I want to know how many people produced the report. How many people were in the meeting when it was presented? I want you to add up all those hours, and how much those people are paid, and then tell me you didn’t think it was a big deal.”

Facebook has, in recent days, paused its initiative to develop “Instagram Kids,” a version of the app for users under 13. “This will give us time to work with parents, experts, policymakers and regulators, to listen to their concerns, and to demonstrate the value and importance of this project for younger teens online today,” wrote Adam Mosseri, Head of Instagram.

Mental health experts were not called out specifically in Mosseri’s list. Whether or not Facebook’s internal research is accurate, the manner in which the conclusions were presented demonstrates an attitude that seems out of step with the seriousness of the findings. And there isn’t a lot of evidence to suggest that its internal researchers are either wrong or unqualified to study the problem.

“I know there are scientists working at Facebook and Instagram,” said Engeln. “We have people who’ve gotten PhDs from our department who work there. I know they have good scientists, so I know that they knew this stuff already.”

A study conducted by Engeln’s lab, along with researchers from UCLA and the University of Oxford, showed that Instagram is potentially more harmful to self-image than Facebook’s other products. Engeln and her team found that using Instagram (but not Facebook) led to a significant decrease in body satisfaction after only seven minutes of use.

“They just messed around on their own Instagram account for seven minutes,” Engeln said. “And that was enough.”

I asked Engeln what social media companies could do to improve mental health outcomes for their users. She was not optimistic that Facebook would implement changes without intervention.

“I don’t trust social media companies to do anything to minimize the harm to young people. I think it is not in their best interest to do so. I think what they’re most interested in doing is minimizing harm to their reputation so that they can continue to make lots of money and garner lots of social influence and power. And I’m sorry if that’s obnoxious, but you can quote me on that.”

Meg Marco (@meghann on Twitter) is editor in chief at The Observer, published by New York-based Observer Media. This piece was originally published there.