Leaked Documents Show Facebook Put Profit Before Public Good

Nov. 8, 2021 — A leaked trove of documents from inside Facebook reveals that the social media giant’s internal research uncovered a host of problems on the platform related to public health and other issues, but the company did virtually nothing about them.

The files were leaked by a whistleblower, former Facebook employee Frances Haugen, who shared tens of thousands of documents with the Securities and Exchange Commission, Congress, and a consortium of news organizations. She has since testified before the Senate Commerce Subcommittee on Consumer Protection and European lawmakers.

Amplifying ‘Anti-Vaxxers’ and Other Misinformation

President Joe Biden caused a stir in July when he said that, because of rampant misinformation about the COVID-19 vaccine, social media platforms like Facebook are “killing people — I mean they’re really, look, the only pandemic we have is among the unvaccinated,” he said. “And they’re killing people.”

While he was compelled to walk back the statement, the leaked papers suggest he wasn’t necessarily wrong.

According to the papers, in March, when the White House was preparing a $1.5 billion campaign against vaccine misinformation, some Facebook employees thought they had found a way to counter those lies on the platform and, at the same time, prioritize legitimate sources like the World Health Organization.

“Given these results, I’m assuming we’re hoping to launch ASAP,” one employee wrote.

But Facebook ignored some of the suggestions, and executives dragged their heels implementing others. Another proposal, aimed at curbing anti-vaccine comments, was also ignored.

“Why would you not remove comments? Because engagement is the only thing that matters,” Imran Ahmed, CEO of the Center for Countering Digital Hate, an internet watchdog group, told The Associated Press. “It drives attention and attention equals eyeballs and eyeballs equal ad revenue.”

Facebook’s algorithms, which determine the content you see in your feed, also help spread misinformation.

“It’s not like the anti-vax contingent was created by Facebook,” says Dean Schillinger, MD, director of the Health Communications Research Program at the University of California, San Francisco. “The algorithm said, ‘OK, let’s find certain people with certain political beliefs and let’s link them to anti-vaxxers,’” amplifying the misinformation. “That is certainly something that’s novel.”

As if that weren’t enough, it appears Facebook may have misled Congress about the company’s understanding of how COVID misinformation spread on the platform. In July, two top House Democrats wrote to Facebook CEO Mark Zuckerberg requesting details about how many users had seen COVID misinformation and how much money the company made from those posts.

“At this time, we have nothing to share in response to the questions you have raised, outside of what Mark has said publicly,” the company said in response.

But the leaked papers show that by that time, Facebook’s researchers had run multiple studies on COVID misinformation and produced large internal reports. Employees were able to calculate the number of views garnered by a widely shared piece of misinformation. Yet the company didn’t acknowledge that to Congress.

Keeping this information secret was an enormous missed opportunity to ensure science-backed information reached the general public, says Sherry Pagoto, PhD, director of the UConn Center for mHealth and Social Media.

“We know how misinformation spreads, so how can we think more about disseminating good information?” she says. “They have all kinds of data on the characteristics of messages that go far. How can we use what they know in the field of health communication to come up with a plan?”

In an emailed statement, a spokesperson for Meta (in the midst of the uproar, Facebook announced a new corporate name) said, “There’s no silver bullet to fighting misinformation, which is why we take a comprehensive approach, which includes removing more than 20 million pieces of content that break our COVID misinformation policies, permanently banning thousands of repeat offenders from our services, connecting more than 2 billion people to reliable information about COVID-19 and vaccines, and partnering with independent fact-checkers.”

Ignoring Instagram’s Effect on Vulnerable Teens’ Mental Health

Combating misinformation isn’t the only way Facebook and its subsidiaries could have acted to protect public health. The company was also aware of its negative impact on young people’s mental health but publicly denied it.

Instagram, which is owned by Facebook, is extremely popular among teenage girls. But the photo-sharing app exposes them repeatedly to images of idealized bodies and faces, which can lead to negative self-comparisons and pressure to look perfect.

Pro-eating disorder content is also widely available on the platform. For years, social science and mental health researchers have been studying social media’s effect on mental health, particularly for adolescents. Studies have found links between Instagram use and depression, anxiety, low self-esteem, and eating disorders.

The Facebook papers revealed what Instagram researchers called a “teen mental health deep dive.” And there were serious problems: The internal research showed that the platform made body image issues worse for 1 in 3 teenage girls, and 14% of teenage boys said Instagram made them feel worse about themselves. The data linked use of the app with anxiety and depression. And among teens who reported thoughts of suicide, 6% of American users and 13% of British ones traced that impulse directly to Instagram.

Jean Twenge, PhD, author of iGen: Why Today’s Super-Connected Kids Are Growing Up Less Rebellious, More Tolerant, Less Happy–and Completely Unprepared for Adulthood, has been studying social media’s effects on young people for nearly a decade.

“I was not surprised that Facebook was finding social media could have significant links to depression and self-harm. The academic research has been showing that for years,” she says. “I was surprised how in-depth their research was into exactly the mindset of teen girls using Instagram. Their research really built on what we already knew.”

As with Facebook’s findings on misinformation, the company publicly downplayed Instagram’s negative effects, including in comments to Congress, and did little to adjust teen users’ experience on the app.

“I think that given what they knew about Instagram and mental health, it certainly would’ve been the right thing to do to make changes to the platform,” Twenge says.

In their email, the Meta spokesperson said, “Our research doesn’t conclude that Instagram is inherently bad for teens. While some teens told us Instagram made them feel worse when they were struggling with issues like loneliness, anxiety, and sadness, more teens told us that Instagram made them feel better when experiencing these same issues.”

A Responsibility to the Public Good?

While Facebook users may be shocked to learn how often the company put profits ahead of its customers’ health, those who study public health are anything but.

“This is not a problem in any way unique to social media platforms,” Schillinger says. “Corporate entities frequently pursue policies that engage the public to participate in activities, to purchase or consume products, to implement behaviors that are unhealthy to themselves or others or the planet. … Do you think Facebook is acting differently than any other company in that space?”

Which is where the potential for regulation comes in. Haugen, the whistleblower, has called for it, as have many lawmakers in the wake of her revelations.

“Large organizations that have influence and access to lots of people need to be accountable to the well-being of that population, just as a matter of principle,” says sociologist Damon Centola, PhD, author of Change: How to Make Big Things Happen.

He likens the explosion of social media to the history of television, which has been regulated in numerous ways for decades.

“I think that provides us with a parallel of social media and the capacity of the medium to influence the population,” he says. “It seems to me that organizations can’t get away with saying they won’t take public welfare into account.”

The so-called Facebook Papers are most damning, some experts say, because the company’s defense claims its research was meant only for product development and therefore doesn’t prove anything.

That defense disregards all the peer-reviewed papers, published in respected journals, that reinforce the findings of the internal research. Taken together, the two bodies of research leave little room for doubt, and little doubt that something needs to change.

“Think of it like environmental polluting,” Centola says. “Companies can know they’re polluting, but they can also say it didn’t actually matter, it didn’t cause any harm. But then you get the documentation saying no, that has huge effects. That’s when it really does matter.”

Social Media as a Force for Good

But there is one potential upside of the Facebook papers, according to the experts: It’s clear that the company knows a lot about how to spread messages effectively. With enough pressure, Facebook and other social media platforms might now begin to use those insights in a positive direction.

“Facebook should be developing a strong collaboration with trustworthy entities to develop content that’s both true and promotes public health, while also engaging and algorithmically driven,” Schillinger says. “If we can use the platform and the reach and the [artificial intelligence] Facebook has for health-promoting content, the sky’s the limit.”

And efforts like that could be on the horizon.

“We’re focused on building new features to help people struggling with negative social comparison or negative body image,” the Meta spokesperson wrote in the email. “We’re also continuing to look for opportunities to work with more partners to publish independent studies in this area, and we’re working through how we can allow external researchers more access to our data in a way that respects people’s privacy.”

Which is not to say that Facebook will voluntarily put public health ahead of the company’s need to make money without legislation forcing it to do so.

“I do think Facebook is interested in making their platform better for users. But their first interest is always going to be having as many users as possible spending as much time as possible on the platform,” Twenge says. “Those two desires are often at cross-purposes.”
