The FBI is still concerned about phone encryption, and maybe not in the way you would think. The FBI wants less encryption on your phone, whereas you, as a consumer, should want more (or at least the status quo). During a cybersecurity conference in New York, the current director, Christopher Wray, indicated that the agency had failed to access the content of 7,775 devices, even though it had proper warrants for them. That is more than half of the phones the bureau tried to access during the 2017 fiscal year. What does Wray say about this? Well, according to him, encryption is a “major public safety issue”. Seriously? This is his idea of a “major” public safety issue? Don’t even get me started on this one.
Wray is now urging the private sector to work with the government to find a way forward with less encryption. Don’t get me wrong, I am not in favor of crime, but what about the innocent people out there? Wray insists that the FBI isn’t interested in looking at everyone’s devices, just the ones owned by suspects. In my opinion, though, this is a slippery slope. It’s also pretty much the same position that former director James Comey took during his tenure. You may recall that the FBI asked tech giants to create a “back door” into their software and phones to allow the FBI access.
Apple CEO Tim Cook said that the request was “chilling” and had “dangerous” implications. His warning was that companies wouldn’t be able to control how that back door was actually used. And he’s not wrong, which only reinforces the idea that this is a slippery slope.
The big challenge, according to the FBI, is that the phone companies can’t unlock the phones because they don’t hold the encryption key. The Justice Department went to court in 2016 to force Apple to devise a way to help it gain access to a dead attacker’s iPhone after the mass shooting in San Bernardino, California. That battle ended when the FBI paid a third party to hack the phone. You read that correctly: they paid a company to hack the phone. So why isn’t this always an option? I’m not promoting hacking either, but if that’s your workaround for one particular case, why not employ the same method permanently?
The Trump administration has hinted that it might need to take more aggressive steps if tech companies can’t come up with “responsible encryption” that would give law enforcement agencies access after a warrant is obtained. As an example of a possible compromise, Wray cited a case from New York several years ago. Four major banks, he said, were using a chat messaging platform called Symphony, which was marketed as offering “guaranteed data deletion.” State financial regulators became concerned that the platform would hamper investigations of Wall Street.
The response? “The four banks reached an agreement with the regulators to ensure responsible use of Symphony. They agreed to keep a copy of their communications sent through the app for seven years and to store duplicate copies of their encryption keys with independent custodians not controlled by the banks,” Wray said. “So in the end, the data was secure — still encrypted, but also accessible to regulators.”
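To make that custodial arrangement concrete, here is a minimal sketch of key escrow, the pattern Wray is describing: the data stays encrypted, but a duplicate of the key is parked with a third party who can hand it to a regulator with lawful access. All of the names here are hypothetical, and XOR with a random pad stands in for a real cipher purely for illustration; an actual deployment would use vetted cryptography, not this.

```python
import secrets

def encrypt(data: bytes, key: bytes) -> bytes:
    # One-time-pad style XOR; the key must be at least as long as the data.
    # This is a stand-in for a real cipher, used only to illustrate escrow.
    return bytes(d ^ k for d, k in zip(data, key))

decrypt = encrypt  # XOR is its own inverse

message = b"wire transfer logs"
key = secrets.token_bytes(len(message))

# The bank keeps only the ciphertext...
bank_ciphertext = encrypt(message, key)

# ...while a duplicate of the key is escrowed with an independent
# custodian the bank does not control.
custodian_key_copy = key

# A regulator with lawful access retrieves the escrowed key from the
# custodian; nothing about the cipher itself is weakened.
recovered = decrypt(bank_ciphertext, custodian_key_copy)
print(recovered)  # b'wire transfer logs'
```

The design point worth noticing is that the access path runs through the custodian, not through a weakened algorithm, which is exactly why the escrowed key copies become such an attractive target in their own right.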
Aren’t there liability concerns for Apple, for example, in having this information stored somewhere for seven years, or whatever length of time is required? This has the potential to create a ton of data breaches that could be extremely catastrophic. Again, I’m not siding with crime or terrorism, but I think we need to understand what less encryption means, or could look like. How many of those 7,775 phones were owned by actual terrorists, or even criminals for that matter? Is the technology the problem? Or is the problem a singular view of how to execute a law? I’m not making any generalizations here, but I don’t think this is as cut and dried as Wray is making it out to be.