YouTube said on Wednesday it had removed more than a million videos containing "dangerous disinformation about the coronavirus" since the start of the pandemic, at a time when social networks stand accused of contributing to the spread of misleading ideas about Covid-19 and vaccines.
Google’s video platform has defended its content moderation techniques, emphasizing the priority given to legitimate and reliable sources, such as the World Health Organization (WHO).
"We remove nearly 10 million videos per quarter, the majority of which do not even reach ten views," said Neal Mohan, the site's chief product officer, in a statement released Wednesday.
But "if we only look at what we remove, we miss the mountains of content that people actually see," he insisted, noting that "in all, between 0.16% and 0.18% of views are of content that breaks our rules."
"For Covid-19, we rely on the consensus of experts from health organizations (…). But in most other cases, misinformation is harder to assess."
The issue of disinformation around Covid-19 and vaccines has grown to such proportions that in July, US President Joe Biden went so far as to say that Facebook and other platforms were "killing" people by letting false information about Covid-19 vaccination circulate.
The heads of Facebook, Twitter and Google have been summoned several times to answer questions from US lawmakers, notably on the subject of content moderation.
SL (with MAP)