Senators Mark Warner (D-Virginia) and Deb Fischer (R-Nebraska) have introduced legislation to ban “dark pattern” tactics, which are designed to trick users into handing over access to their data. Before I get into what that means, I’d like to point out that this bill is being proposed by a Democrat and a Republican. Why is that important? It shows that politicians can still cross the aisle and work with members of the other party in an attempt to protect citizens.
What are dark patterns exactly? The term was first popularized by the website darkpatterns.org, which catalogs everything from UI elements to technical tricks designed to lure users into taking actions they might not otherwise agree to. For example, a website might present users with in-app purchase buttons or data-sharing agreements designed to look like different, more mundane functions.
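To make the idea concrete, here is a minimal sketch of a “disguised consent” dark pattern, contrasted with an honest version. Everything here is hypothetical and illustrative; the function and field names are my own, not taken from the bill or from any real site.

```python
# Hypothetical sketch: the same data-sharing checkbox, built two ways.
# The dark-pattern variant hides the real consequence behind mundane
# wording and pre-checks the box, so inaction becomes "consent".

def build_signup_form(dark_pattern: bool) -> dict:
    """Return a form config for a signup page's data-sharing option."""
    if dark_pattern:
        return {
            "share_data_with_partners": {
                # Vague, friendly label that obscures what is actually agreed to.
                "label": "Keep my experience personalized",
                # Opt-out default: the user must notice and uncheck it.
                "default": True,
            }
        }
    return {
        "share_data_with_partners": {
            # Explicit label describing the actual action.
            "label": "Share my data with third-party partners",
            # Opt-in default: nothing is shared without an affirmative choice.
            "default": False,
        }
    }
```

The dark-pattern version never lies outright; it relies on wording and defaults, which is exactly why this kind of design is hard to regulate with blunt rules.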
The bill, known as the Deceptive Experiences to Online Users Reduction (DETOUR) Act, doesn’t distinguish between mobile apps and a desktop browsing experience. But the legislation does make it illegal for large, public online services with more than 100 million monthly active users to “design, modify, or manipulate a user interface with the purpose or substantial effect of obscuring, subverting, or impairing user autonomy, decision-making, or choice to obtain consent or user data.” In addition, the bill requires that these services cannot “subdivide or segment” users into groups for “the purposes of behavioral or psychological experiments” without first getting their informed consent. The bill also states that these companies cannot target anyone under the age of 13 “with the purpose or substantial effect of cultivating compulsive usage, including video auto-play functions initiated without the consent of a user”.
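The auto-play provision, as I read it, boils down to a gating rule: don’t keep the content flowing at under-13s, and don’t keep it flowing at anyone who hasn’t affirmatively opted in. A rough sketch of that logic, under my own reading of the bill (the function and return values are hypothetical):

```python
# Hypothetical sketch of a playback policy consistent with my reading of
# the DETOUR Act's auto-play language: gate auto-play on both age and an
# explicit opt-in, rather than treating it as the default behavior.

def playback_policy(age: int, opted_into_autoplay: bool) -> str:
    """Decide what happens when a video ends."""
    if age < 13:
        # Under-13 users can't be targeted with compulsion-cultivating
        # features like unprompted auto-play at all.
        return "pause_and_prompt"
    if not opted_into_autoplay:
        # Adults who never consented get a deliberate choice, not a default.
        return "pause_and_prompt"
    return "autoplay_next"
```

The interesting part is how little code this is: the engineering change is trivial, which suggests the bill is really targeting incentives, not implementation difficulty.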
Let me start by saying this bill is oddly specific. While I understand that companies engage in these dark patterns, the specificity makes me wonder if there have been incidents we haven’t heard about. Sure, we’ve all heard about what Facebook has done over the last couple of years, but there’s something about this that sounds different.
If this were to become law, these companies would have to create their own independent review boards to approve behavioral or psychological experiments, as well as create professional standards bodies that would work in coordination with the Federal Trade Commission. That doesn’t sound like a big deal at first, but consider how many companies would need such boards. This is on top of the growing sentiment that certain tech giants should face more regulation.
I guess my question is: where does it stop? There is nothing wrong with certain regulations, and maybe this one is completely necessary. But at what point is it too much? Every single social networking company would have to hire its own independent review board, which will certainly create jobs, but to what end? I think there’s a better way of doing this, because what are the chances that a review board will say no to the company that employs it? Of course it’s going to sign off on running an experiment on people, especially if its members have some kind of stake in the company. All that said, the bill is moving forward, but whether or not it will become law is up in the air.