Facebook had problems with its algorithm when it came to people posting inappropriate content.  Now YouTube is in the same boat.  Interestingly enough, both are coming up with the same solution.  Earlier in the week, YouTube announced that it will now manually review uploads from accounts that are part of its Google Preferred ad tier, which lets brands place advertisements in videos from the top five percent of YouTube creators.  Until now, Google's (ergo YouTube's) solution had been to take down offensive channels and tweak its advertiser-friendly guidelines in order to give brands more control over where their ads show up.  This new manual review aims to take that one step further.

The reason for this?  Many of their top stars – like PewDiePie and Logan Paul – have been making the news lately.  PewDiePie for his racist remarks, and Logan Paul for his disgusting video of a corpse hanging in the Suicide Forest.  But there have been other issues too, like the family channel built around abusive pranks on kids.  Which is why YouTube is under fire a bit, and I agree that it should be.  It isn't keeping a watchful eye on the type of content being put on the site.  This is the same thing that we've seen with Facebook, so this manual review kind of makes sense.

What does this mean?  Well, to start, YouTube will be relying less on its algorithms to catch these content creators.  Facebook and Twitter have both decided that they need to hire more people as they try to crack down on the bots and troll accounts that plague their sites.  The internet is full of trolls.  And while an algorithm is good at catching some things, there is no way that it can keep up with the volume of all the things that can go wrong.

So why does Logan Paul's vlog channel still live on the platform?  Well, I don't think YouTube can just arbitrarily take down all of his videos on account of one or two being offensive.  But it did halt the original projects that he was working on for YouTube Red, which is its paid ad-free streaming service.  It also terminated his Google Preferred ad deal.  This means he won't be able to earn as much money, although he's still able to monetize some of his content.  To put this into perspective, he was making about $12.5 million thanks to the Preferred ad deal, his Maverick apparel line and sponsored posts on social media.


The decision to do this was likely a tough one for YouTube.  Think about how many millions of people were watching his channel.  But let's think about that from an influence perspective for a moment.  People should be scared by the amount of influence that some of these YouTube stars have over impressionable teenagers.  But that's not all: YouTube is also implementing stricter requirements for its Partner Program, which allows smaller channels to earn money by placing ads in their videos.  The higher bar should help filter out offensive content.

Is this enough?  The short answer is – we don't know.  At some point, we will certainly find out; these kinds of things don't go unnoticed.  I do applaud YouTube (and Google) for putting stricter guidelines in place to address the issue.  I'm not convinced that it will work, but I will give them the benefit of the doubt to see if and how it might.
