Facebook is really trying to put fake news in its rear-view mirror, and that makes sense to me. Fake news isn’t a harmless phenomenon; it has caused a great deal of damage over the last few years. That’s why I applaud Facebook (even though it doesn’t always seem that way) for trying to rid itself of this scourge. That said, they still have something to worry about in this realm, and more specifically, Infowars. If you’re not familiar with the name, it’s a far-right website that promotes conspiracy theories. For example, the claim that the Sandy Hook school shooting was staged came from Infowars.

That’s why it was so strange (and confusing) when Facebook said it wasn’t going to ban a site like Infowars, even though it acknowledges that the site spreads fake news and conspiracy theories. While I would normally condemn Facebook for this kind of decision, I don’t think I can. Why? Because Mark Zuckerberg has gone on the record with his reason for allowing it, and, honestly, it aligns with how I feel about this particular topic.

I’ve written posts about hate speech before, and my stance is usually that while it’s egregious, it can’t simply be censored, because everyone has a right to their own thoughts and opinions. This is similar to what Zuckerberg is saying. He points to the fact that he himself is Jewish and that there are people who don’t believe the Holocaust actually happened, yet he doesn’t think his platform should limit someone’s right to hold that opinion. And he’s right. The platform itself should simply be careful not to let those thoughts and opinions be pushed to the extreme.

Where I don’t agree is with how Facebook wants to deal with the situation. An offensive or deliberately inaccurate post can stay up, but Facebook may downgrade it so that its algorithms show it to fewer people. Further, Zuckerberg believes that Facebook doesn’t have a responsibility to take it down. This is a hard one because, on one hand, I do believe they have a responsibility in this arena, but on the other hand, what if I think something is offensive and someone else doesn’t? Who makes that judgment call? That’s essentially how Zuckerberg sees it as well: he doesn’t want to be the one to decide what can or can’t be put online. That reluctance, though, might get in the way of his quest to curb fake news.

That said, Facebook’s policy is still evolving. I think there needs to be some kind of limit so that real-world harm doesn’t come about as a direct result of this misinformation. That’s not to say all misinformation causes damage, but when it does lead to harm, it’s a genuine problem. Zuckerberg says Facebook feels a deep sense of responsibility to try to fix the problem, and while I agree, I just wonder whether anyone can believe him at this point. Is it too little, too late?