This story is part of War in Ukraine, CNET’s coverage of events there and of the wider effects on the world.
YouTube has apparently removed dozens of videos from Russian Media Monitor, a channel run by Daily Beast columnist Julia Davis that spotlights and translates Russian TV perspectives on the country’s invasion of Ukraine.
Davis complained on Twitter on Monday that YouTube had removed 60 videos, and she shared screenshots of three YouTube notices telling her the videos violated the site’s community guidelines. CNET verified that those three videos are no longer available on YouTube.
The YouTube channel has 12,000 subscribers, but Davis also has 390,000 followers on Twitter, where her bio says, “I watch Russian state TV, so you don’t have to.” Her videos typically show figures on Russian TV speaking on the war in Ukraine.
One video posted to Twitter but apparently removed from YouTube features Vladimir Solovyov, who the US State Department says “may be the most energetic Kremlin propagandist around today.” In the video, he complains of missing stockpiles of Russian military equipment and says people were excited by recent attacks on civilian infrastructure across Ukraine because they showed Russia still has some supplies.
The situation shows the difficulties of content moderation in a world where social media can be used both to shed light on difficult subjects and to spread disinformation. With 2 billion monthly users and tight ties to the world’s most powerful internet search engine, YouTube is one of the most influential sources of online information on Earth.
YouTube can offer outsiders insight into Russian thinking that can inform assessments of the country’s attitudes about the war the same way journalism can. But video clips of Russian propagandists also could run afoul of YouTube’s effort to “prohibit content denying, minimizing or trivializing well-documented violent events.”
YouTube and Davis didn’t immediately respond to requests for comment.
Some fans of Davis’ work sought to reverse the YouTube move. “What part of ‘community guidelines’ could @JuliaDavisNews possibly be violating by bringing transparency to the statements by Russian propagandists? @YouTube should restore her videos,” tweeted Hans Kristensen, director of the Nuclear Information Project at the Federation of American Scientists.
Since Russia’s February invasion of Ukraine, YouTube has not only intensified its policies related to misinformation about the war, but it has also ratcheted up enforcement of them. Misinformation also is a problem on TikTok, the new darling of social media, and has been a major issue at Meta’s Facebook and Instagram.
Because YouTube’s policy enforcement takes place on such a massive scale, the company must rely on machine learning to sniff out much of the bad stuff to pull down — and sometimes good content gets improperly removed along with it. The nuance is particularly tricky in cases involving multiple languages and in videos that debunk, rather than promote, controversial claims.
YouTube — like Facebook, Twitter, Reddit and many other internet companies that give users a platform to post their own content — has grappled with how to balance freedom of expression with effective policing of the worst material posted there. Over the years, YouTube has faced difficulty with different kinds of misinformation; conspiracy theories; discrimination; hate and harassment; child abuse and exploitation; and videos of mass murder, all at an unprecedented global scale. Critics of YouTube argue that the company’s content moderation and safety efforts still fall short too often.
CNET’s Imad Khan contributed to this report.