Facebook is reminding users to check their privacy settings to determine where their data is going. While this might seem like goodwill on Facebook’s part, it isn’t enough, in my opinion. That said, Facebook posted the following:
“Last week showed how much more work we need to do to enforce our policies and help people understand how Facebook works and the choices they have over the data. We’ve heard loud and clear that privacy settings and other important tools are too hard to find and that we must do more to keep people informed.”
Cambridge Analytica isn’t mentioned by name in the post, but that’s what this is all about. What’s interesting is how they’re spinning this one. As someone who has seen (and done) her fair share of “spinning” communication to make it sound better than it actually is, I can read between the lines. I think most people can, as it’s kind of blatant. They’re hearing that privacy settings are too hard to figure out? Seriously? Are you just hearing this now? This didn’t come up during any of your design/development stages? I mean, it’s not that big of a leap to think that people would want to know how to find this information. Especially since Facebook was the only one who really knew what information these apps were collecting.
So where can you find this? On desktop, the privacy shortcuts selection in the dropdown takes you to a privacy checkup, which acts more like an FAQ than an actual menu for changing any settings. In the mobile app, though, you’ll have quick access to making your account more secure (adding two-factor authentication and controlling who sees your posts, among other tools). The big addition to the privacy center, though, is the ability to delete anything you’ve shared or reacted to, along with your search history. Facebook also made it easier to inspect how you’re advertised to, with options for changing how you’re tracked and targeted with ads across the greater web.
As you are probably aware, I am not a big fan of Facebook. I haven’t been for a while, and they keep giving me reasons not to be. That said, I wonder if I’m being too hard on them. Is this kind of update “good enough” in terms of what users want or even need? Often, I try to play devil’s advocate because I think I should be fair and not biased in my assessment. But I don’t think I can be when it comes to Facebook. I don’t know that I’m being biased, because they’re making the case themselves. This whole Cambridge Analytica data breach shows that they don’t understand their own privacy rules.
It also shows that they don’t understand their users. Or, worse, they know what users want, but they don’t care. How can you operate a service for billions of people worldwide and have huge gaps in your own policies? The argument could be made that they don’t care – that the rules are vague to their advantage. I work on a lot of policy in my job. There are times when policy is deliberately written so that it can be left up to interpretation. But there are times when we can’t do that – when we have to be more specific, or bad things happen.
Again, I feel like I’m being too hard on Facebook all the time, but like I said – they’re doing it to themselves. They’re putting themselves in this situation, and I think these gestures on their part are kind of bullsh*t. I think I’ve used that word twice in posts over the last few days, and both times it was about Facebook. I don’t know how else to describe this whole thing, but I think that summarizes it nicely.