One of Facebook’s selling features to advertisers has been the ability to use the company’s vast amounts of data to target users based on almost any personal characteristic. So, for example, if you join a Facebook Group for the baseball team you play on, that information is, in theory, made available to advertisers, and you start seeing ads for baseball gloves. What’s interesting, though, is that you can also target people for all the wrong reasons. On Thursday, it was revealed that you could target people using anti-Semitic phrases like “Jew hater” and “how to burn Jews”. It was also revealed that Facebook enabled targeting for other hateful groups, like the “Ku-Klux-Klan”.
I honestly feel like all I’ve been doing lately is writing about hate, whether it’s direct hate, discrimination, or something indirect like this. How does this happen, exactly? Facebook advertisers use automated ad-buying software that can target users based on specific information those users have added to their profiles. Users can enter whatever they want in profile categories like field of study, school, job title or company. Facebook’s algorithm then surfaces these labels when ad buyers (or journalists) go looking for them.
In this particular case, users were entering things like “Jew hater” under field of study, which meant it showed up in the targeting search results and was an actual option for ad buyers. Did your jaw drop just there? Facebook issued a statement saying that it would remove the inappropriate categories, adding that the company “has more work to do” when it comes to preventing this kind of targeting. I do want to call Facebook out on this particular issue. Not because I think they were doing this intentionally.
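To make the mechanism concrete, here is a minimal sketch of how a pipeline like the one described above could go wrong. This is not Facebook’s actual code or API; the function names, data shapes, and the blocklist guardrail are all my own assumptions, meant only to illustrate how unreviewed free-text profile fields can become targeting categories, and how even a crude review step changes the outcome.

```python
# Hypothetical sketch: free-text profile fields surfacing as ad-targeting
# categories. Not Facebook's real system; names and logic are illustrative.
from collections import Counter


def build_targeting_categories(profiles, min_users=1):
    """Surface any self-reported 'field of study' value as a targeting
    category once enough users share it. Note: no content review at all."""
    counts = Counter(
        p.get("field_of_study", "").strip().lower() for p in profiles
    )
    counts.pop("", None)  # ignore users who left the field blank
    return {label for label, n in counts.items() if n >= min_users}


def build_targeting_categories_with_review(profiles, blocklist, min_users=1):
    """Same pipeline, but with a simple guardrail: drop any label that
    contains a blocked term before it ever reaches ad buyers."""
    raw = build_targeting_categories(profiles, min_users)
    return {
        label for label in raw
        if not any(term in label for term in blocklist)
    }


# Example: one hateful self-reported entry (placeholder text used here)
# slips straight through the unreviewed pipeline.
profiles = [
    {"field_of_study": "History"},
    {"field_of_study": "history"},
    {"field_of_study": "offensive slur"},
]
print(build_targeting_categories(profiles))
# The guardrail version filters it out before ad buyers can see it.
print(build_targeting_categories_with_review(profiles, blocklist={"slur"}))
```

The point of the sketch is that the bug isn’t malice; it’s the absence of any review step between user input and the ad tool, which is exactly the gap Facebook says it now plans to close.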
I certainly don’t think that’s the issue here. But what is a concern is that this isn’t the first time Facebook hasn’t done enough to prevent something. Or to stop something. Or even to manage something in a way that doesn’t expose people to hate, or even to crimes. Maybe the company has grown so big, so fast, that it doesn’t have the ability to keep up with these things? That’s kind of a lame excuse. They can keep up enough to roll out updates and improve all kinds of features. But not this one? It’s been a bad year for Facebook’s algorithms. The company realized this spring that its News Feed algorithm had actually been used to help spread misinformation during last year’s U.S. Presidential election. Need I say more?
In addition to that, Facebook has admitted that “inauthentic accounts” from Russia bought $100,000 worth of political advertising during that same election. The accounts were able to make the purchases because algorithms, not humans, were approving and facilitating the transactions. And those are only the ads that we know about. There could be more. And when it comes to this President, I believe we haven’t seen the whole story. Maybe this is a dream, or even a fantasy, but I believe there is something he’s done so horrendous that it would immediately get him impeached. There is definitely more to this story.
Again, I don’t necessarily blame Facebook for this. I do blame them for choosing technology over humans, time and time again. I’m a supporter of technology doing things instead of humans, but when it continually yields the same results, you have to wonder. And in this case, what we’re seeing is pretty hateful. I will leave you with a statement made by Rob Leathern, Product Management Director at Facebook:
“We don’t allow hate speech on Facebook. Our community standards strictly prohibit attacking people based on their protected characteristics, including religion, and we prohibit advertisers from discriminating against people based on religion and other attributes. However, there are times where content is surfaced on our platform that violates our standards. In this case, we’ve removed the associated targeting fields in question. We know we have more work to do, so we’re also building new guardrails in our product and review processes to prevent other issues like this from happening in the future.”