YouTube is at it again. This time they're raising the requirements channels on their platform must meet to earn money from ads that run before and during their videos. In April 2017, they began requiring channels to have a minimum of 10,000 lifetime views to qualify for monetization. Now they've upped that to a threshold of 4,000 hours of watch time within the past 12 months. 4,000 hours is a lot. On top of that, they're also requiring YouTubers to have at least 1,000 subscribers. YouTube explained the change to the criteria in a blog post:

They will allow us to significantly improve our ability to identify creators who contribute positively to the community and help drive more ad revenue to them (and away from bad actors). These higher standards will also help us prevent potentially inappropriate videos from monetizing which can hurt revenue for everyone.

This is bad news for smaller channels that might not have that kind of audience but still play by YouTube's rules and earn through their videos. A Mumbai-based composer and music producer who runs a handful of YouTube channels explained why this poses a difficult challenge for "new" content creators:

Previously, it was possible to earn at least enough to cover the cost of your own DIY video projects over time. The gap between YouTube’s earlier requirements and the new ones is massive. Garnering 4,000 hours of watch time is a whole different ball game than trying to build an audience organically without specializing in video production and publishing. For myself and my colleagues, that means shelving some upcoming projects, because we’ll now need to find other ways to fund them.


But what is this really about? YouTube is looking out for its own interests. The platform (unfortunately) hosted a ton of disturbing content last year, some of which included videos depicting violent imagery featuring children's cartoon characters. It also lost millions of dollars in revenue as numerous major brands boycotted YouTube for running their ads alongside racist and homophobic content.

That said, is this the best way to fix what's ultimately broken with YouTube? Like I've said about Facebook, this is a massive platform, so how can they police every piece of content that goes up? What this change does, though, is punish the smaller creators, and that's completely unfair. I know, I know. You're probably thinking life is unfair. But when it comes to social media, you want to punish the people who abuse the system, not everyone who wants to use it.


Let's put this into perspective for a minute. Last year, PewDiePie published a video that showed two shirtless men laughing as they held up a banner that read "Death to All Jews". This year, Logan Paul posted a clip of a dead body hanging from a tree in Japan. But in both of these instances, it was the creators themselves who took down the content after the backlash, not YouTube. Which makes you wonder whether these new rules will help in the first place.

Again, what this will do is shield YouTube from a public relations nightmare the next time someone like Logan Paul posts this kind of thing. But it hurts the community, and it doesn't actually address the real problem with a platform of this scale. Isn't this something AI could help with? Maybe not fix, but certainly alleviate some of the issues.
