Tougher standards: We’ll soon be applying tougher treatment to videos that aren’t illegal but have been flagged by users as potential violations of our policies on hate speech and violent extremism. If we find that these videos don’t violate our policies but contain controversial religious or supremacist content, they will be placed in a limited state. The videos will remain on YouTube behind an interstitial, won’t be recommended, won’t be monetized, and won’t have key features including comments, suggested videos, and likes. We’ll begin to roll this new treatment out to videos on desktop versions of YouTube in the coming weeks, and will bring it to mobile experiences soon thereafter. These new approaches entail significant new internal tools and processes, and will take time to fully implement.
But it's not just content creators who will be impacted: anyone who merely searches for keywords that YouTube deems 'questionable', for whatever reason, will be promptly redirected to propaganda videos intended to "directly confront and debunk" whatever 'questionable' content that user was looking for.
Early intervention and expanding counter-extremism work: We’ve started rolling out features from Jigsaw’s Redirect Method to YouTube. When people search for sensitive keywords on YouTube, they will be redirected towards a playlist of curated YouTube videos that directly confront and debunk violent extremist messages. We also continue to amplify YouTube voices speaking out against hate and radicalization through our YouTube Creators for Change program. Just last week, the U.K. chapter of Creators for Change, Internet Citizens, hosted a two-day workshop for 13-18 year-olds to help them find a positive sense of belonging online and learn skills on how to participate safely and responsibly on the internet. We also pledged to expand the program’s reach to 20,000 more teens across the U.K.
So who will be responsible for deciding which content qualifies as "controversial" and/or "questionable"? Well, as the Daily Caller points out, that responsibility will fall upon 'impartial' groups like the Anti-Defamation League, which recently published a list of "alt-right" and "alt-lite" YouTubers yet failed to highlight extreme leftist organizations like Antifa... must have just been an oversight.
According to YouTube, the system, while largely automated, will mix in human reviews in the form of its already established “Trusted Flagger” volunteer program that works with over 15 institutions to deal with extremist content, including the Anti-Defamation League.
The ADL recently released a list naming members of the “alt-right” and the “alt-lite,” the latter of which included controversial YouTube personalities like Gavin McInnes, Mike Cernovich, and Brittany Pettibone. Curiously, the ADL is selective in what it chooses to label as “extremism.” It does not have violent far-left ideologies like Antifa and militant leftist organizations like Redneck Revolt on its radar.
It’s worth noting that the “Trusted Flagger” system was later transformed into the much maligned “YouTube Heroes” program, which invited the public to help moderate content. It was heavily criticized for giving social justice activists the power to manipulate the platform.
Despite the apparent focus on targeting extremism, YouTube’s announcement includes the company’s efforts to artificially promote videos through its “Creators for Change” program, which in YouTube’s own words pushes creators who are “using their voices to speak out against hate speech, xenophobia, and extremism.”
Not surprisingly, these moves to censor content creators while shoving propaganda videos down the throats of users have been blasted by conservatives online who feel they've been targeted.
“If a video doesn’t break YouTube’s terms of service then they absolutely SHOULD NOT be attempting to dampen the reach of the video any further,” said YouTuber Annand “Bunty King” Virk, who raised his concerns with The Daily Caller. “Who determines what’s passable and what isn’t? At what point do we finally realize that saying the right thing isn’t always about saying what people want to hear?”
“By these standards, if YouTube existed previous to the Emancipation Act, they’d be censoring videos criticizing slave owners, since being anti-slavery wasn’t popular… at all,” he added. “The popular opinion isn’t always the right opinion.”
“No one can really say who’s going to be impacted by this new road map, and that’s the point isn’t it? If their policies and terms of service aren’t there to help guide creators anymore, then why even have them? So really, anyone could be at risk without even knowing it,” he said.
“I have no problem with YouTube cracking down on terrorist recruitment videos and the like,” clarified Undoomed. “What I don’t understand is how such videos could’ve possibly been considered acceptable under the extant TOS and policies.”
“I think there is a high probability of collateral damage with this new attitude,” he said. “Some people could conceivably consider skeptics and anti-SJWs ‘extremists,’ while all we are doing is arguing for a little common sense, and of course for freedom of speech as demanded by the Constitution.”
“My suspicion is that ‘trusted flaggers’ is just a code word for the ‘usual suspects,’ i.e. the same type of radical left-wing reactionaries that have reshaped Twitter into an Orwellian nightmare,” he concluded.