YouTube is running into some issues with advertisers pulling out.  Why exactly?  HP and Mars (to name two) are pulling their ads from YouTube after reports revealed that their ads were running alongside videos in which children were being exploited (whether deliberately or innocently).  In addition, these videos were loaded with pedophilic comments.  In many cases, the companies have vowed not to return to advertising with YouTube until appropriate safeguards are in place.  And you can’t really blame them.  This, unfortunately, is a growing trend with social media and networking platforms.  We saw something similar with Facebook Live, where people were able to live stream some pretty horrific content before Facebook realized there was an issue with the service.

YouTube has already been taking down these videos and disabling ads on other clips, and says it is working urgently to fix the situation.  YouTube has stressed that it is clamping down on videos that might give “cause for concern” even if the content is illegal.  But if the content is illegal, it shouldn’t be allowed on YouTube in the first place.  If the content is legal but gives “cause for concern”, then sure, I can see YouTube having to draw a line somewhere.  But if it’s illegal, that should be pretty cut and dried, no?

Unfortunately, this move is coming too late for many advertisers.  They don’t want their ads appearing next to videos or content this horrible, and you can’t blame them.  There were also previous issues with videos promoting hate speech and extremism, and some companies started taking action because YouTube took too long to respond.  Like with all the other issues we’ve been seeing online, YouTube depends primarily on algorithmic filtering.  To YouTube’s credit, it also uses trusted flaggers and reports from authorities.  But clearly, that isn’t enough to keep questionable or illegal videos from slipping through.
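
To make that last point a little more concrete, here is a rough sketch (in Python, with entirely hypothetical names and thresholds) of how a platform might combine an automated classifier score with trusted-flagger and user reports to decide whether a video gets removed, demonetized, or escalated to human review.  This is just an illustration of the general approach, not YouTube’s actual system.

```python
from dataclasses import dataclass

# Hypothetical thresholds -- real platforms tune these constantly.
AUTO_REMOVE_SCORE = 0.95         # classifier is nearly certain the video violates policy
DEMONETIZE_SCORE = 0.60          # not certain, but too risky to run ads against
TRUSTED_REPORTS_FOR_REVIEW = 1   # any trusted-flagger report forces human review

@dataclass
class VideoSignals:
    classifier_score: float   # 0.0-1.0 output from an automated content classifier
    trusted_reports: int      # reports from trusted flaggers / authorities
    user_reports: int         # ordinary user reports

def moderation_action(signals: VideoSignals) -> str:
    """Pick an action for a video based on combined signals (illustrative only)."""
    if signals.classifier_score >= AUTO_REMOVE_SCORE:
        return "remove_and_review"          # take it down now, confirm with humans after
    if signals.trusted_reports >= TRUSTED_REPORTS_FOR_REVIEW:
        return "human_review"               # trusted flaggers skip the automated queue
    if signals.classifier_score >= DEMONETIZE_SCORE or signals.user_reports > 10:
        return "demonetize_pending_review"  # pull ads while humans catch up
    return "no_action"

print(moderation_action(VideoSignals(classifier_score=0.7, trusted_reports=0, user_reports=3)))
# -> demonetize_pending_review
```

The weak link, of course, is everything below the top threshold: a video that the classifier only half-trusts can sit in a review queue (still monetized, still recommended) until a human gets to it, which is exactly the gap advertisers are reacting to.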

In addition, there has been a growing trend of content on YouTube that claims to be “kid friendly” but is far from it.  There have been reports of content creators whose videos are incredibly questionable, to the point that some feel they border on child abuse.  I have also heard about videos featuring a popular children’s character who ends up getting killed or having something violent happen to them.  Parents see the character and assume the content is appropriate for their child, only to find out afterward that it wasn’t appropriate at all.

How is YouTube going to combat this?  Well, in reference to the latter issue, Google intends to make changes to enforcement that “will take shape over the weeks and months ahead as we work to tackle this evolving challenge”.  Google has already terminated more than 50 channels and removed thousands of videos over the past couple of weeks under the new guidelines.  But will that be enough?  The problem right now is that anyone can upload content whenever and however they want, and there aren’t enough safeguards in place to stop some of this from happening.  These tech giants need to come up with a better way of monitoring this kind of content, or they are going to start losing business and even users.