At least 1 million videos have been removed for dangerous COVID-19 misinformation since February 2020, according to YouTube’s Chief Product Officer Neal Mohan.
Mohan shared the statistic in a blog post outlining how the company handles misinformation on its platform.
“Misinformation has moved from the periphery to the center,” he wrote. “It is no longer confined to the closed-off worlds of Holocaust deniers or 9/11 truthers; it now pervades every aspect of society, sometimes tearing through communities at breakneck speed.”
At the same time, the YouTube executive claimed that “bad content” makes up a small percentage of all YouTube content.
“Bad content represents only a tiny percentage of YouTube’s billions of videos (about .16-.18 percent of total views turn out to be content that violates our policies),” Mohan wrote.
He went on to say that YouTube removes nearly 10 million videos every quarter, “the majority of which don’t even reach 10 views.”
Facebook recently made a similar argument about its platform’s content. Last week, the social network released a report claiming that memes and other non-political content are the most popular posts. In response to criticism regarding its handling of COVID-19 and vaccine misinformation, the company has argued that vaccine misinformation is not representative of the type of content most users see.
During the pandemic, both Facebook and YouTube have come under fire for their policies regarding health misinformation. Both platforms have well over a billion users, meaning that even a small percentage of violating content can reach an enormous audience.
And, so far, neither platform has disclosed any information about how vaccine and health misinformation spreads or how many users are exposed to it.
Mohan also stated that removing misinformation is only one aspect of the company’s strategy.
YouTube is also working on “increasing the availability of information from reliable sources and reducing the spread of videos containing harmful misinformation.”