Facebook is under fire over allegations that it censors conservative viewpoints, and the company’s internal chat boards show employees taking sides over how the platform should be run. The issue raises questions about privacy and transparency that will likely reverberate in Congress as lawmakers seek to regulate social-media giants like Facebook.
In June 2020, as America was shaken by demonstrations over the killing of George Floyd at the hands of a Minneapolis police officer, a Facebook employee wrote on the company’s internal racial-justice message board: “Get Breitbart off of News Tab.”
Facebook’s News Tab is a feature that collects and promotes stories from a variety of sources. “Minneapolis Mayhem: Riots in Masks,” “Massive Looting, Buildings in Flames, Bonfires!” and “BLM Protesters Pummel Police Cars on 101!” were among the Breitbart headlines included in the employee’s post.
According to written conversations on Facebook’s office communication system reviewed by The Wall Street Journal, the employee said the stories were “emblematic of a concerted effort at Breitbart and similarly hyperpartisan sources (none of which belong in News Tab) to paint Black Americans and Black-led movements in a very negative light.” Many other workers echoed the sentiment.
A company researcher noted in the same conversation that any action aimed at removing Breitbart, a right-wing outlet popular with supporters of former President Donald Trump, could run into internal obstacles because of the potential political fallout. “At best,” the researcher wrote, “it would be a very tough policy conversation.”
Breitbart News remained on Facebook’s News Tab. A spokeswoman for the tech giant said the company bases its decisions on the specific content Breitbart publishes on Facebook, not on the Breitbart site as a whole, and that the material on Facebook met the company’s requirements, including compliance with its policies against misinformation and hate speech.
Many Republicans, including former President Donald Trump, claim that Facebook discriminates against conservatives. The records examined by the Journal don’t settle whether bias influences the company’s overall decisions. They do show that employees and their bosses debated whether and how to restrain right-wing publishers, with more senior employees often acting as a check on agitation from the ranks. The documents reviewed by the Journal, which don’t capture all of the employee messaging, didn’t mention equivalent debates over left-wing publications.
Other documents show that Facebook’s management has been so focused on avoiding charges of bias that political considerations are often at the center of its decision making.
Facebook employees have repeatedly pressed the company to act against far-right sites, as a large number of internal message-board posts show. In many instances they framed their arguments around Facebook’s enforcement of its own rules, claiming that Facebook gives right-wing publishers a pass to avoid public backlash. “We’re afraid of political blowback if we follow our standards without exceptions,” one employee wrote in an internal message.
A rally against police brutality and racism in Minneapolis in June 2020.
KEREM YUCEL/Getty Images/Agence France-Presse
According to the documents, Facebook employees paid particular attention to Breitbart, criticizing the company for promoting the site’s content on News Tab and for helping it sell ads. They also claimed that Facebook gave Breitbart and other conservative publishers preferential treatment, letting them escape penalties for spreading misinformation or hate speech.
According to data from research firm NewsWhip, right-wing sites are consistently among the best-performing publishers on the platform in terms of engagement. That is one reason some on the left criticize Facebook, arguing that its algorithms reward far-right content.
Facebook claims that it applies its policies equally and does not consider politics when making decisions.
Facebook spokesperson Andy Stone stated, “We make adjustments to minimize problematic or low-quality material to enhance people’s experiences on the network, not because of a page’s political point of view.” “When it comes to modifications that would have an influence on public pages, such as publishers, we always examine the effects of the proposed change before implementing it.”
The Facebook Files is a Wall Street Journal series based on a review, earlier this year, of a cache of internal documents and data. The internal conversations offer a rare glimpse into Facebook’s struggles to manage the products and systems at the core of its business success.
According to Pew Research Center, Facebook is one of the most important channels for publishers, with more than a third of Americans saying they regularly get news from the site.
In May 2016, the tech site Gizmodo reported that Facebook’s “Trending Topics” team routinely suppressed conservative content. Facebook denied the charges, but the resulting controversy fueled accusations of bias from Republicans that haven’t stopped since.
Some internal documents make employees’ hostility toward conservative media plain. In 2018, a Facebook developer left the company after claiming on a message board that the corporation was intolerant of conservatives. Some Facebook workers chastised him for appearing on Tucker Carlson’s Fox News show, saying the network was “so notorious and biased it can’t even call itself a news station,” according to message-board archives. Several staffers referred to Mr. Carlson as a “white nationalist” and “political hack” who “looks like a Golden Retriever who has been tricked out of a stockpile of goodies on a regular basis.”
In an interview, Mr. Carlson said, “Any dog analogy is praise as far as I’m concerned.”
Fox News didn’t respond to a request for comment. Fox Corp., which owns Fox News, and News Corp, which owns The Wall Street Journal, share common ownership.
In several of the documents reviewed by the Journal, employees debated whether Facebook was enforcing its policies evenly across the political spectrum. According to posts from internal Facebook discussion boards, they claimed the company was allowing conservative sites to skirt its fact-checking rules, publish untrustworthy and inflammatory content, and jeopardize the tech giant’s relationships with advertisers.
‘Extraordinary circumstances’
In late 2020, a member of Facebook’s integrity team, which works to reduce harmful behavior on the platform, sent a farewell note to co-workers saying that Breitbart was undermining the company’s efforts to fight hate speech.
“We create specific exceptions to our written policy for them,” the worker said, “and we even officially encourage them by identifying them as trusted partners in our main products.”
[Chart: Ranking trust. In a Facebook survey of several dozen news publishers in the United States and the United Kingdom, Breitbart was rated the least trusted news site, as well as the lowest quality. The chart plots user-survey trust ratings against Facebook’s internal trust rating, with Breitbart the least trusted of all publishers shown.]
Breitbart is included in News Tab, which launched in 2019. The product’s main layer features curated news from outlets such as The Wall Street Journal, the New York Times and the Washington Post, which are paid for their content. Breitbart sits in a second, unpaid layer of content intended to serve news based on a user’s interests.
Facebook has said that sites featured in News Tab must focus on quality news reporting, and that those that spread what it deems misinformation or violate its public community standards will be removed.
Asked about Breitbart’s inclusion in an interview at the time of the launch, Facebook CEO Mark Zuckerberg said the goal was for News Tab to include a diverse range of viewpoints.
As Mr. Floyd’s death on May 25, 2020, heightened political tensions across the country, one staffer wrote in the racial-justice chat that he understood “factual progressive and conservative leaning news organizations” both needed to be represented, but said that could be accomplished without including Breitbart.
A senior researcher noted in the conversation that removing Breitbart from News Tab over how it framed news events, such as the demonstrations after Mr. Floyd’s death, would be a problem for Facebook because “news framing is not a norm by which we approach journalistic integrity.”
If the company removed sites whose trust and quality scores were declining, he said, Breitbart might get caught in the net. But he doubted the company would do so for every publisher whose scores had dropped. “I can also tell you that two years ago, we observed a reduction in confidence in CNN: would we adopt the same strategy for them as well?” he wrote.
He said Breitbart had been hurt by algorithm changes that rewarded all content deemed trustworthy, changes he said were defensible inside Facebook because they applied to every publisher and could be tied to a stated goal of improving the user experience.
According to a chart from the survey reviewed by the Journal, Breitbart was the least trusted of the several dozen news sites Facebook examined in the United States and Great Britain in August 2019. The survey, which also rated news sources on quality, likewise classified Breitbart as “low quality.”
A Breitbart spokesperson said its content is significantly more accurate, and more popular with Facebook’s own users, than that of the mainstream news rivals Facebook pays for content.
Demonstrators in Huntington Beach, Calif., rallying in support of Donald Trump and Breitbart in March 2017.
EUGENE GARCIA/EPA/Shutterstock
Advertisers and employees who work in ad sales have also criticized Facebook’s ties to Breitbart. In 2018, a staffer on the Facebook Audience Network, a collection of third-party publishers for which Facebook sells advertising, argued that Breitbart should be removed from the network.
In an internal document, the individual stated, “My point is that enabling Breitbart to monetize via us is, in fact, a political statement.” “It’s a willingness to embrace radical, ugly, and often fake news that is used to spread fear, racism, and bigotry.”
Advertisers began shunning Breitbart after the 2016 election; the site relished needling the left with anti-PC language and a brand of nationalism that critics called racist. Even if an advertiser didn’t intend to advertise on Breitbart, its ads could appear there through the automated ad system.
Many advertisers tried to avoid appearing on Breitbart by using a Facebook Audience Network feature that let them block specific websites, but the approach wasn’t working, according to the employee.
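The documents don’t describe how the block-list feature worked under the hood. Purely as an illustration, a per-advertiser check of this kind could amount to matching each candidate placement’s domain against the advertiser’s blocked list, which also hints at why content reached through other domains could slip past it. The sketch below is a rough assumption in that spirit; the function names, domains and logic are hypothetical, not Facebook’s actual Audience Network code.

```python
from urllib.parse import urlparse

# Hypothetical sketch of a per-advertiser publisher block list: advertisers
# list domains where their ads should not appear, and the placement step
# checks each candidate URL against that list. Names and logic are
# illustrative assumptions, not Facebook's actual system.

def normalize_domain(url: str) -> str:
    """Reduce a placement URL to a bare domain for matching."""
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

def is_blocked(placement_url: str, blocked_domains: set[str]) -> bool:
    """Return True if the placement falls on, or under, a blocked domain."""
    domain = normalize_domain(placement_url)
    return any(domain == d or domain.endswith("." + d) for d in blocked_domains)

# An advertiser blocks breitbart.com: a direct placement is caught, but the
# same content reached through another domain slips past the list, the kind
# of gap that pushes advertisers to want blocking at the platform level.
advertiser_block_list = {"breitbart.com"}
print(is_blocked("https://www.breitbart.com/politics/some-story/", advertiser_block_list))   # True
print(is_blocked("https://mirror-site.example/politics/some-story/", advertiser_block_list)) # False
```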
“Breitbart attempts to get around every restriction we put in place, so we have to block at the platform level,” the employee wrote, quoting one unnamed advertiser’s complaint.
A product management director responded. “On a personal level, what you say resonates with me,” the director wrote. “That being said, and most importantly, while making choices, we must depend on our beliefs and policies.”
A Breitbart spokesperson said it was incorrect that the site worked around restrictions.
Breitbart stayed in the Facebook Audience Network until the spring of 2020, when it was withdrawn along with all other mobile web publishers.
Targeting ‘hyperposters’
After the 2016 election, Facebook took steps to curb the spread of misinformation in users’ feeds. A program dubbed “Sparing Sharing” targeted “hyperposters,” users who post very frequently. It reduced the number of people who saw their posts, because data showed these users shared a disproportionate amount of false and inflammatory content.
Facebook adopted the program despite a warning from Joel Kaplan, Facebook’s global head of public policy and a former deputy chief of staff to President George W. Bush, against rushing the effort. Mr. Zuckerberg approved the move but asked that its impact be scaled back.
Another tool, known as “Informed Engagement,” reduced the reach of posts that people were more likely to share if they hadn’t read them first.
Together, the two changes shifted the mix of news content users saw toward more mainstream, less volatile fare.
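The documents describe what the two tools did rather than how they were built. As a loose sketch only, their combined effect can be pictured as downranking factors applied to a post’s baseline distribution score; the field names, thresholds and multipliers below are assumptions for illustration, not Facebook’s actual ranking parameters.

```python
from dataclasses import dataclass

# Loose, hypothetical model of the two demotions described above. The field
# names, thresholds and multipliers are illustrative assumptions, not
# Facebook's actual ranking parameters.

@dataclass
class PostSignals:
    base_score: float              # ranking score before any integrity demotion
    author_posts_per_day: float    # how often the author posts ("hyperposters" post very often)
    read_before_share_rate: float  # fraction of sharers who opened the link before sharing

HYPERPOSTER_THRESHOLD = 30.0       # assumed cutoff for "posts very often"
SPARING_SHARING_FACTOR = 0.5       # assumed demotion applied to hyperposters' posts
INFORMED_ENGAGEMENT_FACTOR = 0.7   # assumed demotion for posts shared largely unread

def demoted_score(p: PostSignals) -> float:
    """Apply 'Sparing Sharing'-style and 'Informed Engagement'-style demotions."""
    score = p.base_score
    if p.author_posts_per_day > HYPERPOSTER_THRESHOLD:
        score *= SPARING_SHARING_FACTOR       # fewer people see a hyperposter's post
    if p.read_before_share_rate < 0.2:
        score *= INFORMED_ENGAGEMENT_FACTOR   # unread-but-shared posts reach fewer people
    return score

# A post from a very frequent poster that is mostly shared without being read
# ends up far below its baseline distribution score.
print(demoted_score(PostSignals(base_score=100.0,
                                author_posts_per_day=80,
                                read_before_share_rate=0.1)))  # 35.0
```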
In 2019, according to the documents reviewed by the Journal, Facebook data scientists analyzed the effect of the two tools on dozens of publishers, broken down by ideology.
The study, labeled a “political ideology analysis,” found that the company had been suppressing the traffic of major far-right publishers, even though that was not its goal, according to the documents. “Very conservative” sites would benefit the most if the tools were removed, the documents say, with Breitbart’s traffic rising by 20%, the Washington Times’ by 18%, the Western Journal’s by 16%, and the Epoch Times’ by 11%.
[Chart: Political ideology analysis. When Facebook studied its ‘Sparing Sharing’ and ‘Informed Engagement’ tools, which were intended to combat misinformation, it found the two had a stronger effect on far-right publishers. The chart plots each publisher’s expected audience growth without the tools in place against its political ideology as classified by Facebook, with circles sized by the publisher’s audience. Note: Audience size is measured by VPV, or Viewport Views, an internal Facebook metric that tracks how many times content is viewed. Source: ‘Sparing Sharing + Informed Engagement Removal Political Ideology Analysis,’ an internal document.]
According to the documents and people familiar with the matter, the research was meant to prepare Facebook for the backlash, including charges of bias, that could follow if it ended the two efforts. In an internal document, one of the researchers wrote, “We might face severe blowback for having ‘experimented’ with distribution at the cost of conservative publishers.”
The Informed Engagement initiative was discontinued, while Sparing Sharing remained in place.
As Facebook faced mounting accusations of censoring conservative viewpoints, Mr. Kaplan ordered further research into how enforcement measures were applied to outlets of different ideologies, according to one of the people familiar with the situation.
“The fact that Facebook has effectively censored our material is not news to us,” stated a Breitbart spokesman. “However, we’ve been hammering our establishment news rivals in terms of interaction for years, so picture what would happen if Facebook treated Breitbart in the same way that other major news publishers are handled.”
Listing examples
According to the documents, in 2020 a Facebook engineer compiled a list of cases that he said showed Facebook routinely declined to enforce its own content-moderation standards against major far-right outlets and personalities such as Breitbart, Charlie Kirk, PragerU, and Diamond and Silk.
In July 2020, Mr. Trump tweeted a Breitbart video claiming that masks weren’t needed against Covid-19 and that there was a cure for the disease involving the antimalarial drug hydroxychloroquine. The video, which included a live news conference, had been viewed millions of times before Breitbart and social-media companies such as Facebook took it down.
Under Facebook’s fact-checking policies, pages can be penalized if they accumulate too many “strikes,” meaning they published content that third-party fact-checkers rated false. Two strikes within 90 days makes a page a “repeat offender,” which can result in a suspension of its ability to post; additional strikes can bring reduced distribution and lost ad revenue.
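As a simple illustration of the strike arithmetic described above (two strikes inside a rolling 90-day window triggering “repeat offender” status), the check might look like the sketch below; the names and structure are assumptions, not Facebook’s actual enforcement system.

```python
from datetime import date, timedelta

# Hypothetical sketch of the strike rule described above: a page counts as a
# "repeat offender" if it collects two or more fact-check strikes within a
# rolling 90-day window. Names and structure are assumptions for illustration.

REPEAT_OFFENDER_STRIKES = 2
WINDOW = timedelta(days=90)

def is_repeat_offender(strike_dates: list[date], as_of: date) -> bool:
    """Count strikes in the 90 days up to `as_of` and compare to the threshold."""
    recent = [d for d in strike_dates if as_of - WINDOW <= d <= as_of]
    return len(recent) >= REPEAT_OFFENDER_STRIKES

# One strike in the window, as described for the Breitbart video, stays below
# the repeat-offender threshold; a second strike within 90 days would cross it.
print(is_repeat_offender([date(2020, 7, 28)], as_of=date(2020, 8, 15)))                    # False
print(is_repeat_offender([date(2020, 6, 1), date(2020, 7, 28)], as_of=date(2020, 8, 15)))  # True
```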
According to internal communications documenting the discussion, Mr. Zuckerberg said Breitbart wasn’t penalized for the video because it was the outlet’s only strike in a 90-day period.
According to the engineer’s list of examples, the content creators were “managed partners,” part of a program in which Facebook assigns corporate handlers to popular accounts. A side benefit for these users, the records say, was that their Facebook contacts helped them escape penalties for fact-checking strikes.
Senior Facebook officials, including the policy and public-relations teams, would evaluate a strike and decide whether to reverse the penalty.
According to a Facebook spokeswoman, such inquiries come from both political parties as well as from major news organizations.
The engineer added in an internal note that his assessment was based in part on a queue of three dozen escalations he had come across, the great majority of which were filed on behalf of conservative content creators. A summary of the engineer’s findings was posted to an internal message board.
Social-media influencers Lynnette ‘Diamond’ Hardaway, center, and Rochelle ‘Silk’ Richardson, right, spoke with Mr. Trump at a rally in March 2020.
Al Drago/Bloomberg News
In one case he cited, third-party fact-checkers rated as “false” a post on the Facebook page of pro-Trump influencers Diamond and Silk that said: “How the hell is allocating 25 million dollars in order to give a raised [sic] to house members who don’t give a damn about Americans going to help stimulate America’s economy?” When fact-checkers labeled the post “false,” a Facebook employee working on the partner program noted that the pair “has not hesitated to go public about their concerns about purported [anti-]conservative prejudice on Facebook.”
According to the posted summary and escalation documents, Diamond and Silk were able to persuade the third-party fact-checker to downgrade the rating to “Partly False,” and, with the help of the managed-partner escalation process, all of their strikes were removed.
The memo summarizing the engineer’s examples was previously reported by BuzzFeed News.
A Facebook representative said the employee who noted Diamond and Silk’s readiness to complain about bias was simply gathering information to pass along to decision makers, not weighing in on how to handle the situation.
Facebook employees asked higher-ups to address the allegations, according to the chat discussions reviewed by the Journal.
One said, “We seem to be giving hate-speech policy advice and penalty reduction services to certain partners.” Another said, “Leadership is afraid of being accused of prejudice.”
According to the chat logs, Facebook managers joined the discussion to explain fact-checking procedures and how the managed-partner program works, but they didn’t address the questions about bias.
—Design by Andrew Levinson. Photos treated with a color filter.
Keach Hagey and Jeff Horwitz may be reached at [email protected] and [email protected], respectively.
Copyright 2021 Dow Jones & Company, Inc. All Rights Reserved.