A group of Democratic senators led by Sen. Amy Klobuchar, D-Minn., wrote to the CEOs of Facebook, Twitter, Google and Google’s subsidiary YouTube on Thursday asking the companies to crack down on vaccine misinformation and make their efforts more transparent.
As the pandemic rages on, Klobuchar, joined by Sens. Tammy Baldwin, D-Wis., and Gary Peters, D-Mich., told the CEOs that it is “vital” for Americans to get accurate information about the coronavirus vaccines.
“While we understand that your companies have implemented policies regarding the removal of vaccine-related misinformation and dedicated resources to stop the spread of misinformation, we believe more must be done,” the senators wrote. “It is imperative that you be transparent about the amount of harmful misinformation that appears on your platforms and the effectiveness of your efforts to remove this content, so that public health organizations and experts can respond appropriately.”
The senators added that platforms must enforce their policies to limit exposure to misinformation and should actively promote reliable information to users.
The companies already have policies in place to remove misinformation and elevate reliable sources, but reporting throughout the pandemic has revealed that measures to crack down on dangerous and inaccurate messages often come after many users have already seen them.
Platforms have also faced conflicting pressure from Democrats and Republicans in Congress on how they should approach content moderation overall. Democrats tend to push for the companies to take more drastic action to eliminate misinformation and hate speech alike from their services, while some Republicans worry that such efforts would disproportionately target conservative speech due to alleged bias of moderators and algorithms.
The senators asked the companies to respond to a series of questions touching on both transparency and enforcement around misinformation on their platforms.
On transparency, the senators asked how much coronavirus- and vaccine-related misinformation has been reported and removed each day, on average, since the beginning of the pandemic. They also want to know how long it takes the platforms to remove messages that are marked false and whether they take action against accounts responsible for high levels of misinformation.
The senators asked companies that have exceptions to their policies for politicians — such as Twitter, which exempts world leaders from some of its harshest penalties but still reduces distribution for messages that violate policies — whether vaccine-related misinformation is also exempt from moderation policies when posted by such users.
Finally, they asked if the platforms will work with public health groups to promote vaccination and how they will make sure communities disproportionately impacted by the pandemic receive accurate information.
This is not the first time lawmakers have called on the companies to crack down on misinformation. Klobuchar, for example, previously urged the platforms to take a strong stance on misinformation around voting in the lead-up to the 2020 election.
A Facebook spokesperson pointed to the company’s December policy update that said ads promoting the vaccine and how to safely access it would be allowed, while content seeking to exploit the pandemic for commercial gain would continue to be prohibited. Ads or organic posts claiming to sell the vaccine would be rejected, Facebook added, as would false claims about the vaccines that have been debunked by health experts on the platform. The company said it would take some time for its systems and teams to become trained on the new policies.
A Google spokesperson pointed to a December blog post that detailed how it would surface accurate vaccine information for users. Google said that when users search for Covid-19 vaccines, it would provide a list of authorized vaccines available in a user’s area, with information on each. Google has also given $250 million in ad grants to governments so they can run public service announcements related to the virus. It has also invested several million dollars in fact-checking initiatives related to virus information, Google said.
YouTube also has policies against Covid-19 anti-vaccine content and said it will continuously update them as needed. It has also taken steps to make reliable sources appear high in search results and limit recommendations of so-called borderline content that approaches prohibited behavior.
Twitter did not immediately respond to a request for comment.