A House hearing on Wednesday, which included new testimony from Facebook whistleblower Frances Haugen, laid bare the deep partisan divisions that continue to hamper any legislative or regulatory reforms to hold tech platforms accountable for how they amplify dangerous content.
“Facebook wants you to get caught up in a long, drawn out debate over the minutiae of different legislative approaches,” Haugen, who worked on the company’s civic integrity team and leaked tens of thousands of internal company documents, told a House Energy and Commerce subcommittee. “Please don’t fall into that trap. Time is of the essence… You have a once-in-a-generation opportunity to create new rules for our online world.”
But Wednesday’s hearing—at which lawmakers waved copies of George Orwell’s 1984, heatedly disagreed on the very definition of misinformation, and discussed their own “victimization” at the hands of Big Tech censorship—was a reminder of why a slew of recent legislative proposals to rein in social media companies are unlikely to go anywhere. While lawmakers from both parties support tougher internet regulations, and seem to agree that tech companies should be held responsible for business decisions that impact how online content is amplified, their underlying reasons diverge sharply.
The recent debate has focused on targeted reforms to Section 230 of the 1996 Communications Decency Act, which gives online platforms immunity from liability for content posted by third parties. While it has come under fire from both sides of the aisle, Democrats want to amend Section 230 to hold social media platforms accountable for the lack of moderation of hate speech and disinformation, while Republicans have alleged that it allows for censorship of conservative views.
“The Internet has grown substantially since 1996, and it is clear Big Tech has abused this power granted to them by Congress,” Rep. Bob Latta, an Ohio Republican and a ranking member on the subcommittee, said on Wednesday. “They censor conservative voices and use algorithms to suppress content that does not fit their narrative.”
In Wednesday’s hearing, Democrats cited four tech reform proposals. The Justice Against Malicious Algorithms Act, which was introduced by senior House Democrats after Haugen first testified on Capitol Hill last month, would amend Section 230 by making social media platforms liable when they “knowingly or recklessly” use algorithms to recommend content that leads to physical or “severe emotional” harm. The legislation would apply only to platforms with more than 5 million unique monthly visitors, and would exclude web hosting services. Democrats also laid out the SAFE TECH Act, which would remove Section 230 protections for ads and other paid content; the Civil Rights Modernization Act, which focuses on targeted ads that violate civil rights laws; and the Protecting Americans from Dangerous Algorithms Act, which would strip Section 230 protections from platforms “if their algorithms amplify misinformation that leads to offline violence,” such as civil rights abuses and international terrorism.
“Social media platforms like Facebook continue to actively amplify content that endangers our families, promotes conspiracy theories, and incites extremism to generate more clicks and ad dollars,” said Rep. Frank Pallone, the New Jersey Democrat who serves as the committee’s chair. This echoed Haugen’s testimony in October, in which she cited internal research showing that Facebook takes action against just 3 to 5% of hate speech and 0.6% of content inciting violence.
None of these bills has support from Republican lawmakers, although some have proposed their own legislation to hold platforms liable. Republican proposals have focused on preserving constitutionally protected speech while carving out exceptions. These include a “Bad Samaritan” carve-out removing liability protections from companies that “knowingly promote, solicit, or facilitate illegal activity,” a carve-out for companies with direct or indirect connections to the Chinese Communist Party, and carve-outs for cyberbullying, terrorism and doxxing. “Rather than censor and silence speech, the answer should be more speech,” said Rep. Cathy McMorris Rodgers, a Republican from Washington. “That’s the American way. Big Tech should not be the arbiters of truth.”
While there has been broader bipartisan support for legislation narrowly targeting child exploitation, cyberbullying and terrorist groups, Wednesday’s hearing showed how the politically sensitive debate over misinformation, free speech and bias is likely to derail more comprehensive efforts.
Haugen cautioned lawmakers to be aware of unintended impacts from legislative reforms, referring to a 2018 law that carved out an exemption from Section 230 for sex trafficking and prostitution that advocates say ended up harming sex workers. “I encourage you to move forward with your eyes open to the consequences of reform,” Haugen said. “I encourage you to talk to human rights advocates who can help provide context on how the last reform of 230 had dramatic impacts on the safety of some of the most vulnerable people in our society, but has been rarely used for its original purpose.”
In his own testimony before Congress in March, Facebook CEO Mark Zuckerberg claimed the company’s hate speech removal policy “has long been the broadest and most aggressive in the industry.” While Zuckerberg said he supports some reforms to Section 230, he proposed that Congress require platforms to have “adequate systems to address unlawful content” in place in order to keep their legal protections. Critics have said this approach would favor large companies like Facebook over smaller start-ups and companies that might find it harder to meet the requirement.
The company, which recently renamed itself Meta, has declined to comment on specific legislative proposals.
“Facebook wants you to have analysis paralysis, to get stuck on false choices and not act here,” Haugen told lawmakers on Wednesday. Despite the renewed bipartisan push for action in the aftermath of her revelations, for now it seems Congress may still be stuck.