“Facebook is the new cigarettes”

Richard A Meyer
Sep 18, 2021

This week, The Wall Street Journal published a bombshell investigation into how Facebook responds to the flaws in its platform. The four-part report, which is largely based on internal documents, suggests that the company often plays down what it knows about these problems. According to The Journal, at least some of the documents have been turned over to the S.E.C. and Congress by a whistle-blower. Below is a summary of articles from The New York Times and The Wall Street Journal; they paint a very disturbing picture.

Researchers inside Instagram, which is owned by Facebook, have been studying for years how its photo-sharing app affects millions of young users. In public, Facebook has consistently played down the app’s negative effects, including in comments to Congress, and hasn’t made its research public or available to academics or lawmakers who have asked for it. In response, Facebook says the negative effects aren’t widespread, that the mental-health research is valuable and that some of the harmful aspects aren’t easy to address.

Mr. Zuckerberg resisted some fixes proposed by his team, the documents show, because he worried they would lead people to interact with Facebook less. Facebook, in response, says any algorithm can promote objectionable or harmful content and that the company is doing its best to mitigate the problem.

Scores of Facebook documents reviewed by The Wall Street Journal show employees raising alarms about how its platforms are used in developing countries, where its user base is huge and expanding.

For more than a month, Facebook researchers warned that comments on vaccine-related posts (often factual posts of the sort Facebook sought to promote) were filled with antivaccine rhetoric aimed at undermining their message, internal documents reviewed by The Wall Street Journal show.

Some Facebook officials have become concerned that Mr. Zuckerberg or Chief Operating Officer Sheryl Sandberg may face questions from lawmakers about how their past public statements on these issues square with the company’s internal assessments, according to people familiar with the matter.

The Covid-19 mess in particular strikes at the heart of Facebook’s problem: its users create the content, and their comments, posts, and videos are hard to control given how Facebook built and runs its platform, which operates in ways fundamentally different from a company shaping its product or a publisher curating stories.

“We’re focused on outcomes, and the data shows that for people in the U.S. on Facebook, vaccine hesitancy has declined by about 50% since January, and acceptance is high,” Facebook spokesman Aaron Simpson said in a statement.

“The internal narrative is that the platform is by and large good,” said Brian Boland, a former Facebook vice president who managed business relationships and left late last year in part because he said the company wasn’t forthcoming enough about its problems.

“Through this crisis, one of my top priorities is making sure that you see accurate and authoritative information across all of our apps,” Mr. Zuckerberg wrote in an accompanying post that said Facebook had removed hundreds of thousands of false claims around Covid.

Facebook went further and discussed what more it could do to tamp down borderline posts that came just short of violating its rules, many of which it labeled as false but didn’t remove, the documents show.

In August 2020, a report by the advocacy group Avaaz concluded that the top 10 producers of what the group called “health misinformation” were garnering almost four times as many estimated views on Facebook as the top 10 sources of authoritative information.

“I think that if someone is pointing out a case where a vaccine caused harm, or that they’re worried about it, that’s a difficult thing to say, from my perspective, that you shouldn’t be allowed to express at all,” Mr. Zuckerberg said in a September interview with Axios on HBO.

“I randomly sampled all English-language comments from the past two weeks containing Covid-19-related and vaccine-related phrases,” the researcher wrote early this year, adding that based on his assessment of 110 comments, about two-thirds “were anti-vax.”

(Source: Wall Street Journal)

The company said it would now remove a much longer list of false vaccine claims than before, including claims that vaccines aren’t effective or that it is safer to get the disease than to be vaccinated, rather than simply labeling them as false.

Given the research showing a small number of posters and commenters were responsible for a large amount of antivaccine content, Facebook slashed the number of comments a person could make on posts from authoritative health sources to 13 per hour from 300, according to an April 2 internal memo.

Facebook’s Mr. Rosen said in a public post in July that the company wasn’t responsible for vaccine hesitancy in the U.S., and that it was helping promote vaccines. He cited a survey that showed vaccine acceptance by Facebook users in the U.S. had risen 10 to 15 percentage points since January, and said it had removed or reduced the visibility of more than 185 million pieces of debunked or false Covid content.

At a gathering of Facebook’s leadership in and around Menlo Park early this month, some officials discussed whether Facebook has gotten too big, with too much data flowing to manage all of its content, said people familiar with the gathering.

Among other findings:

Instagram’s own research shows risks to teenagers’ mental health. The service, which is owned by Facebook, has been studying its effect on young users for three years. “We make body image issues worse for one in three teen girls,” read one slide in an internal presentation, according to The Journal. Senators Richard Blumenthal and Marsha Blackburn said that they would launch an inquiry into the research, which Instagram defended in a blog post.

Facebook knows its algorithm rewards outrage. In 2018, the company made changes to its algorithm that it said would encourage interactions among families and friends. But internal research found that publishers and political parties responded by creating content that produced a lot of discussion — often because it was sensational and divisive. As the internal research put it: “Misinformation, toxicity, and violent content are inordinately prevalent among reshares.”

Facebook has been slow to stop drug cartels and human traffickers from using its platform. Internal documents reviewed by The Journal revealed that Facebook employees had flagged criminal use of the platform in some countries but received a weak response from the company.

Even when Zuckerberg tries, loudly, to do the right thing, The Journal’s reporting shows how the platform he built is used to undermine his efforts, as we’ve seen with anti-vaccination misinformation.

What’s most revealing is the persistence of the tired old we’re-so-sorry, we’ll-do-better excuses that its executives trot out when the company is called out for its destructive products.

At this point, it’s probably best for Facebook executives to say nothing, since every time they speak they trip all over themselves with weird analogies — which are often centered on the idea that humanity sucked before Facebook.

Originally published at https://www.newmediaandmarketing.com on September 18, 2021.
