
Joe Biden’s Fight With Facebook Is Just Beginning


One post links to a video of a purported doctor alleging 62% of his patients have developed microscopic blood clots after being vaccinated. Another cites an alleged physician claiming humans already have built-in immunity to COVID-19, and therefore don’t need to get vaccinated. A third, shared hundreds of thousands of times, claims that vaccinations are “magnetizing” people because they contain microchips that are used to track them.

False content like this, circulating relentlessly through public Facebook posts, private groups, and Facebook-owned Instagram, caused President Joe Biden to angrily accuse the social media giant on July 16 of “killing people” by not confronting online COVID-19 misinformation. Facebook quickly fired back, accusing the White House of using the company as a scapegoat for its own shortcomings in responding to the pandemic, and highlighting what it says are unprecedented efforts to provide people with accurate information. Biden walked back his comment on Monday, clarifying that he believes the people spreading such misinformation on Facebook are ultimately responsible for the nation’s rising infections and deaths among the unvaccinated.

While the fight over Facebook’s responsibility to curb dangerous misinformation and conspiracy theories on its platform isn’t new, the latest back-and-forth marked a striking escalation in tensions between Biden’s Administration and the social media company, signaling that reining in Facebook may become a key policy priority in Washington. But the finger-pointing over who is to blame for rampant vaccine misinformation also reveals the difficulty of actually preventing these potentially deadly falsehoods from spreading over social media amid cries of censorship, legislative challenges and judicial roadblocks.

“I think it’s probably both a harbinger of more fights to come and also a byproduct of the way frustrations have grown over the last three years,” Jesse Lehrich, the co-founder of Accountable Tech, an advocacy group that has long criticized Facebook’s moderation failures, says of the recent escalation. “On the one hand, you want to work with the platforms to the extent you can to get them to be as helpful as possible. But on the other hand, if nothing is changing, it makes sense to take [them] head-on.”

As concerns within the White House have risen, senior administration officials have regularly been in contact with Facebook executives about its efforts to stem COVID-19 vaccine misinformation, according to White House Press Secretary Jen Psaki. The White House has repeatedly flagged “problematic posts” that contain disinformation and pushed Facebook for more transparency in its data on who this COVID-19 disinformation is reaching, Psaki said on July 15.

But so far, the White House has indicated this outreach to Facebook hasn’t led to significant results. “We know they have taken some steps to address misinformation, but much, much more has to be done,” said Psaki. “And we can’t wait longer for them to take aggressive action because it’s costing people their lives.”

Officials are frustrated in particular with the company’s reluctance to take aggressive action against the so-called “Disinformation Dozen,” a group of 12 accounts responsible for more than 65% of all anti-vaccine content on Facebook-owned platforms, according to an analysis by the Center for Countering Digital Hate released in March. While 35 accounts tied to the “Disinformation Dozen” have been shut down, at least 62 others linked to them, with a total of 8.4 million followers, are still active, the Center for Countering Digital Hate said on Friday.

And while Facebook has repeatedly insisted the data on its pandemic measures is available to the public, it has been less transparent about the spread of misinformation on its platform, including how many users have seen false claims about COVID-19 and vaccinations. Analysts and misinformation researchers have been pressing the company to release such metrics for years, arguing that Facebook’s refusal to do so is hindering possible solutions. “I guess I’m left with a simple question: How many people have seen COVID vaccine misinformation on Facebook?” Rob Flaherty, the White House’s director of digital strategy, asked in a tweet on July 16.

While only Facebook knows the full scale of the problem, independent researchers’ findings have been striking. The viral claim pushed by anti-vaccine activists earlier this year—that vaccines will “magnetize” those who take them—ran rampant on Facebook and Instagram. Of a sampling of 77 high-performing posts that garnered more than 632,000 views in total, 71% did not receive a fact-checking label from the social media company, according to data shared with TIME by Avaaz, a nonprofit that tracks online disinformation. The group’s analysis also highlighted a major disparity between the treatment of English and Spanish-language misinformation: 97% of the Spanish posts spreading the false claims did not receive a fact-checking label, compared to 55% in English.

Health experts’ dire warnings about rising COVID-19 cases and widespread vaccine hesitancy are “clearly not enough for Facebook and Instagram to crack down at-scale on the infodemic that is fueling distrust and fear of vaccines in the U.S.,” says Rebecca Lenn, a senior advisor at Avaaz. “It should be well known by now that we can no longer count on social media platforms to regulate themselves and protect users against harmful anti-vax lies.”

Facebook has defended its efforts. “At a time when COVID-19 cases are rising in America, the Biden Administration has chosen to blame a handful of American social media companies,” Guy Rosen, the company’s VP of Integrity, wrote in an online post on Saturday. The company has been touting figures that it says show vaccine acceptance among Facebook users in the U.S. has increased by 10 to 15 percentage points since January, citing a survey conducted through Carnegie Mellon University and the University of Maryland.

“Facebook is not the reason this goal was missed,” Rosen said, referring to Biden’s vow to have 70% of the country vaccinated by July 4. “As a company, we have devoted unprecedented resources to the fight against the pandemic, pointing people to reliable information and helping them find and schedule vaccinations.”


Facebook’s wide reach has long been ideal for anyone willing to exploit the algorithm to promote controversial or conspiratorial views. Facebook CEO Mark Zuckerberg found that out himself in 2016, when a photo of his infant daughter at the doctor was flooded with anti-vaccination comments. Three years later, as lawmakers grew increasingly alarmed at declining vaccination rates and surges in measles cases, they demanded that Facebook take action to stop the promotion of health conspiracies. Facing political pressure, Facebook announced in 2019 that it would take action on ads promoting vaccine misinformation and remove options that allowed advertisers to target users interested in “vaccine controversies.”

That didn’t stop health misinformation from exploding during the pandemic. One analysis by Avaaz found that content from 10 “superspreader” sites that shared false and misleading claims about the virus had almost four times as many views on Facebook as content from top health institutions, such as the World Health Organization and the Centers for Disease Control and Prevention.

The Biden Administration is aware of how entrenched the problem is. In a July 15 report, Surgeon General Vivek Murthy lays out recommendations urging medical professionals, teachers, journalists and everyday Americans to take action to combat health misinformation in their communities. It also asks technology platforms—which it doesn’t name—to “assess the benefits and harms of products and platforms and take responsibility for addressing the harms,” evaluate the effectiveness of their internal policies, and give researchers access to data that might help stem viral falsehoods.

Even though some of Biden’s top advisers, like his Coronavirus Response Coordinator Jeff Zients, have previous professional ties to Facebook, the President has said he would crack down on the company. In April 2019, he told the Associated Press he was willing to break up the social media platform, calling the idea “something we should take a really hard look at.” During his interview with the New York Times editorial board in January 2020, he called for the revocation of Section 230 of the Communications Decency Act, which shields social media companies from being held liable for harmful content. Once elected, he nominated Lina Khan, a vocal critic of Big Tech, to lead the Federal Trade Commission, and installed Tim Wu, another critic of the technology industry, on the National Economic Council. On July 9, Biden signed an executive order directing federal agencies to take action against consolidation in a variety of industries, including technology.

Still, the government has struggled to convince federal courts that behemoths like Facebook are monopolies that warrant such oversight. The FTC had filed a lawsuit in December under the Trump Administration seeking to break up Facebook, deeming it a monopoly in the market for personal social networking services that violates antitrust laws. A federal judge dismissed the lawsuit last month, ruling that the categorization was vague and that the FTC had not provided sufficient evidence showing Facebook is a monopoly. The judge did offer the FTC a 30-day window to refile. The FTC did not respond to a request for comment about whether it would take that option.

Passing new antitrust legislation is also an uphill battle, even though it is an issue that commands bipartisan support among lawmakers. Less than a week before the FTC case was dismissed, the House Judiciary Committee passed a package of six antitrust bills, including one that would empower federal regulators to break up Facebook, Amazon, Google and Apple. Though some lawmakers are using recent events to make a renewed push for its passage, House Majority Leader Steny Hoyer, who controls the House floor schedule, has said the bills are not yet ready for a vote, which indicates there may not be enough votes for the package to pass the lower chamber. And even if the package manages to pass the House, it undoubtedly will face challenges in the evenly divided Senate.

Even Facebook’s critics acknowledge that while the company is an easy target for bipartisan ire, ultimately the problem requires a “whole-of-society effort,” as Murthy wrote in his report. What is perhaps more difficult—and more crucial—is countering the main amplifiers of that misinformation, which is increasingly coming from conservative media personalities and even some lawmakers, says Renee DiResta, a technical research manager at Stanford Internet Observatory who has studied vaccine misinformation on social media. Georgia Republican Rep. Marjorie Taylor Greene, for example, has encouraged people to “just say no” to vaccination, and compared vaccination encouragement to Jews being forced to wear a yellow Star of David during the Holocaust. Fox News’ Tucker Carlson has analogized the concept of a vaccine passport to the segregation laws of the Jim Crow era.

Social media “is just one channel where people get information,” DiResta says. “Influential media personalities and even some politicians with very large platforms have tried to politicize and undermine confidence in the vaccines to their large audiences.”

Ultimately, the stand-off between the President and the world’s largest social media company, which made global headlines for 72 hours over the weekend, seemed to end with no new commitments from either the company or the Administration to take concrete steps to curb the dangerous impact of vaccination lies. But Lehrich says that, in order for the Biden Administration to succeed, other branches of government will need to act as well. “There is only so much that can be done from the bully pulpit as far as taking on these companies,” he says. “I don’t think Biden and Biden alone will or should be the ones that fix these broader problems.”


Write to Alana Abramson at Alana.Abramson@time.com and Vera Bergengruen at vera.bergengruen@time.com