
British MPs Are Proposing Regulations for Tech Giants


If you look at what happened with Facebook and Cambridge Analytica, then yes, the British Parliament is correct when it says that tech giants can’t be trusted to do the right thing and therefore need to be regulated. This is something that I’ve been suggesting for quite some time now. But is that really the only way? Are there any tech giants who seem able to “do the right thing”? And what is the right thing? Based on the above example, the right thing is simply not sharing data that they shouldn’t be. Sure, Facebook would say that it’s not that cut and dried, and maybe it isn’t. But given what happened with Facebook, can we trust tech giants with our data?

The Digital, Culture, Media and Sport Committee in the U.K. said that there are three areas in which legislation is needed. The first is preventing the spread of fake news, which the committee deems a “huge threat to democracy”. The second is protecting user data from misuse. And the last is ensuring that large tech companies don’t use their stature to take advantage of smaller businesses, which are effectively dependent on technology companies to reach their customers.

Is this an accurate assessment of what is really going on? Completely. In fact, the Canadian government recently put a team in place for the 2019 federal election to ensure that misinformation isn’t spread, especially if a social networking platform like Facebook or Twitter gets hacked – again. So yes, this is definitely a real thing. Perhaps Members of Parliament in Britain and Canada don’t want to see what happened during the 2016 U.S. presidential election happen in their countries?

While these rules would apply to all tech giants, according to Reuters, Facebook is the one coming under fire the most:

https://www.youtube.com/watch?v=ggE2iaRb1CY

In a damning report that singled out Facebook chief executive Mark Zuckerberg for what it said was a failure of leadership and personal responsibility, the UK parliament’s Digital, Culture, Media and Sport Committee said the companies had proved ineffective in stopping harmful content and disinformation on their platforms.

“The guiding principle of the ‘move fast and break things’ culture often seems to be that it is better to apologize than ask permission,” committee chairman Damian Collins said.

“We need a radical shift in the balance of power between the platforms and the people.”

In addition, CEO Mark Zuckerberg was called out by name by committee chairman Damian Collins:

He refused to appear three times before British lawmakers, a stance that showed “contempt” toward parliament and the members of nine legislatures from around the world, the committee said […]

“Mark Zuckerberg continually fails to show the levels of leadership and personal responsibility that should be expected from someone who sits at the top of one of the world’s biggest companies.”

And he’s not wrong. But should all social networking (or tech) companies suffer because of Facebook’s mistakes? While I don’t necessarily agree with punishing all for the mistakes of a few, in this case that’s the only way to stop the few (or the one) from perpetuating bad business practices. What will be interesting is whether or not this kind of regulation actually makes it to the United States any time soon.