Be careful who you trust, gentle internet user.
If Facebook's Wednesday report on "Widely Viewed Content" is to be believed, the most-shared links appearing in people's News Feeds during the second quarter of 2021 covered topics like football, hemp, charitable donations, and recipes. You may want to hold off on placing too much trust in Facebook, though.
The social media company reportedly buried a similar report that it prepared for 2021's first three months, according to a Friday story from the New York Times. You can probably guess at the reason: Facebook wasn't thrilled with the results!
The earlier report, which the Times reviewed in full alongside internal executive emails that were shared with the paper, found that the most-viewed link of Q1 was an article whose headline suggested the death of a "healthy" Florida doctor could have been caused by a COVID-19 vaccine. (All of the COVID vaccines currently approved for emergency use in the U.S. are "safe and effective," the Centers for Disease Control has stated.)
The article in question was accurately reported, noting only that it was possible the vaccine was a contributing factor in the doctor's death. A subsequent update to the story noted that the medical examiner's report found no conclusive evidence one way or another. While the article itself doesn't cross a line into misinformation, the loose connection it draws between the vaccine and a death is the sort of thing that anti-vax believers have weaponized on social media platforms.
The same buried report also revealed that the Facebook page for a far-right-aligned website that has peddled conspiracy theories and misinformation was the 19th most popular page on the platform during the opening months of 2021. It's enough to make a reasonable person think that maybe, just maybe, President Joe Biden had a point back in July.
The report was headed toward public release until executives stepped in, worried it would create a public relations nightmare. That group included Facebook chief marketing officer Alex Schultz; Facebook told the paper that Schultz initially supported the report's release but eventually came around to holding it back. The report never saw the light of day.
Facebook spokesperson Andy Stone had a whole entire quiet part to say out loud to the Times in response: "We considered making the report public earlier, but since we knew the attention it would garner, exactly as we saw this week, there were fixes to the system we wanted to make."
There's no further explanation in the story for Facebook's thinking here. Mashable reached out with some follow-up questions, seeking clarification on exactly what "fixes" were needed, and what the company's leaders think the decision to hold back an apparently unfavorable report says about Facebook's stated efforts to be more transparent. The company hasn't yet replied.
The Q2 report, meanwhile, has been widely panned for presenting what critics consider to be an inaccurate portrayal of what people are seeing on the platform. A report from The Washington Post characterized the Q2 report as being "part of a broader push by Facebook to block or discredit independent research about harmful content on its platform, offering its own carefully selected data and statistics instead."
We saw one such situation unfold recently when Facebook moved to pull the plug on NYU's Ad Observatory project, an effort that was conceived to monitor and analyze how politicians spend ad money on the platform. The NYU project employed a browser extension that the program's participants, all of whom had to voluntarily opt in, could install to help gather data on ad placement.
Facebook threatened to shut the project down in the weeks ahead of the 2020 election in the U.S., and it eventually acted on those threats roughly halfway into 2021. The problem, as many critics pointed out, is Facebook leaned on flawed and easily undermined arguments to justify its decision.
"It's like ExxonMobil releasing their own study on climate change," a former Facebook employee told the Post in its story about the Q2 report. "It's something to counter the independent research and media coverage that tells a different story."
Facebook does indeed think very highly of this report, and how it reflects the company's efforts to provide the world with an honest picture of what people are seeing most on the platform. As Guy Rosen, vice president of integrity, said in a statement provided for that Post story: "This is another step on a long journey we've undertaken to be, by far, the most transparent platform on the internet."
That is a staggering sentence to read in the context of this new Times report, which makes it clear that Facebook moved to hide an earlier, more damning report from public view and then fix its processes, in ways that aren't currently clear, to ensure that future reports didn't end up in the same spot.
Stone took to Twitter on Saturday to address the buried report and the flak Facebook is taking for how the whole situation played out, pointing first to the article about the doctor as an example of how hard it can be to define misinformation. That's a discussion worth having, but it's a bit of a sidestep away from the real issue: Facebook buried an unfavorable report.
Stone doesn't get to that until seven tweets into the thread, noting "we ended up holding it because there were key fixes to the system we wanted to make," echoing what he earlier told the Times. He doesn't take it any further than that, though. And that's a problem. What was wrong with the Q1 report that necessitated a fix, beyond it containing info that Facebook didn't like? What exactly did those fixes consist of? Facebook hasn't answered those questions.
Following Stone's thread, Facebook ended up releasing the Q1 report in PDF form on Sunday, as The Verge reported.
In one sense, it's hardly shocking or unexpected for a company, public or private, to place its own interests ahead of anyone else's. But at the same time, it's difficult to swallow Facebook's flowery statements about transparency and serving the public good when, as we've now learned, it's very obviously doing exactly the opposite.
UPDATE: Aug. 22, 2021, 10:21 a.m. EDT Added details about Andy Stone's thread and additional context relating to the buried report.
UPDATE: Aug. 22, 2021, 12:19 p.m. EDT Added a link to the Q1 report PDF, which Facebook released on Aug. 22.