Meta started removing COVID-19 misinformation early in the pandemic, but it's now wondering if it should take a gentler approach. The Facebook owner has asked the Oversight Board for advice on whether it should continue its existing coronavirus policies now that the pandemic has "evolved." The company provided multiple options for the Board's consideration, ranging from the status quo to significantly softer approaches.
The social media giant suggested that it might temporarily stop the immediate removal of false COVID-19 claims and instead limit their distribution, submit them to independent fact-checkers or apply labels steering users toward accurate information. Meta was also willing to continue removing at least some misinformation, but said it would stop pulling content once it no longer represented an "imminent risk of harm." The Board would provide guidance on how Meta should make that decision.
Global Affairs President Nick Clegg characterized the advice request as an attempt to strike a balance between "free expression" and safety. The Board's decision would not only help shape that balance, but would aid Meta in responding to future health crises. Clegg noted that Meta had removed 25 million instances of bogus COVID-19 content since the pandemic began, and that it now had resources including its own virus information center as well as guidance from public health authorities.
The Board is also tackling multiple potentially important cases in other areas. A transgender and non-binary couple is appealing Instagram's decision to remove two images of (covered-up) nudity despite some moderators determining that the content didn't violate the site's pornography policies. Meta stood behind its decisions to remove the posts, but the couple said the company didn't provide an adequate answer and shouldn't censor transgender bodies at a time when trans rights and healthcare are under threat.
Another dispute challenges Instagram's decision to remove a video featuring a snippet of Chinx (OS)'s drill track "Secrets Not Safe" after UK law enforcement claimed the rap song's lyrics (referencing a past shooting) could promote real-world harm. A fourth case, meanwhile, concerns an appeal from a Latvian user who allegedly promoted violence with a post accusing Russia of fascism and referencing a poem that called on people to kill fascists.
While all of the cases could have a significant effect on Meta's policies, the possible changes to the firm's COVID-19 misinformation response may draw the most attention. Critics have repeatedly argued that Meta wasn't doing enough to fight misinformation, pointing to evidence that people who lean heavily on Facebook for news are more likely to believe false claims about vaccines and the coronavirus. Meta's request for advice cuts against that criticism, and could stoke fears that misinformation will spread more freely if enforcement is relaxed.