Facebook Welcomes Regulations, Specifically Those That Hurt Its Competition
October 13, 2021
Nick Clegg, Facebook's head of global affairs and communications, appeared on CNN's State of the Union Sunday after a harrowing week for the company. Last week a whistleblower, Frances Haugen, testified before the Senate on a number of topics relating to Facebook's lack of transparency and its products' potentially deleterious effects on users. But Clegg's answer to a question about Section 230, the clause within the Communications Decency Act that generally shields platforms from liability for user-generated content posted to their sites, was perplexing.
Facebook's Nick Clegg says Section 230 of the Communications Decency Act should be changed: "My suggestion would be to make that protection, which is afforded to online companies like Facebook, contingent on them applying… their policies as they're supposed to." #CNNSOTU pic.twitter.com/CiJ2gB2UAn
— State of the Union (@CNNSotu) October 10, 2021
When asked by host Dana Bash if he supported "amending Section 230" in order to "hold companies like [Facebook] liable" for certain posts made on their sites, Clegg responded that he did, and recommended "mak[ing] that protection…contingent on them applying…their policies as they're supposed to, and if they fail to do that, they would then have that liability protection removed."
What, exactly, does that mean in practice? "You tell me, because it makes no sense to me," says Jeff Kosseff, a cybersecurity law professor at the U.S. Naval Academy and author of The Twenty-Six Words That Created the Internet, a book about the history and application of Section 230.
This was not Facebook's first foray into offering ideas about how the government ought to regulate it. For months, Facebook has blanketed the airwaves with ads bemoaning that "there hasn't been a major update to Internet regulations in 25 years." On a dedicated webpage, it lists specifics: new standards for transparency, privacy, and data portability, as well as "thoughtful updates" to Section 230, "to make content moderation systems more transparent."
While this sounds magnanimous—a social media juggernaut currently in the hot seat, offering ideas on how best it can be tamed—don't believe the hype.
Lately, Section 230 has been in the crosshairs of both political parties, though for different reasons: Republicans feel that the social media giants censor too much content, while Democrats feel that they do not censor enough. Any fine-tuning of the law would almost certainly fail to pass such an evenly divided Congress. Besides, Section 230 was last amended in 2018, with the passage of the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA), which has ultimately done more harm than good.
Any potential repeal or revision, however, would likely only serve to insulate Facebook as the world's most popular social media site, as well as to discourage competitors.
"Section 230 was vital to Facebook's creation, and its growth," Kosseff explains, "but now that it's a trillion-dollar company, Section 230 is perhaps a bit less important to Facebook, but it is far more important to smaller sites. Facebook can handle defending a bunch of defamation cases on the merits much more than a site like Yelp or Glassdoor."
Yelp, for example, states in its content moderation section that it forbids "hate speech, bigotry, racism, or similarly harmful language," but that it doesn't "typically take sides in factual disputes." Under a robust Section 230, Yelp can maintain that stance without worrying about being sued over content that users post to the site. Without it, the company would risk defamation suits and takedown demands from businesses that receive negative reviews; since even the most frivolous lawsuit takes time and money to fight, Yelp could become completely unreliable if businesses were able to pick and choose their own reviews.
Kosseff is not sold on Facebook's regulatory push, saying that "for Facebook to suddenly be the spokesperson for what the standard should be for Section 230 protections is kind of laughable." And indeed, in a situation where Congress decides to either repeal Section 230, or to establish criteria that a site must meet to qualify for its protections, it is worth considering that Facebook would exert an outsized influence in drafting them. Such is the nature of regulatory capture, in which regulatory agencies end up serving the interests of the firms they are supposed to be overseeing.