Around the time of the Snowden revelations, I remember sitting in a board meeting at some startup I won’t name, trying to convince (prominent VC and C-level) folks that the right thing for customers was for the company to encrypt their private data, give the customers the keys, and retain no ability to decrypt without permission. We could do this technically and still pull off the company mission.
Fast forward a few years, and Apple and others are doing exactly what I proposed, on a much larger scale. Was I prescient? Not really. It was about understanding what customers want. Companies will inevitably need to build what customers want, or lose out to someone who will. And anyone really listening would have come to the same conclusion.
I remember the board members at the time looking at me like I was nuts. One called me a dumb hippy. Another smart and well-respected fellow said there was no way the government would let that happen, i.e., if they wanted a back door, they’d get it.
Coming back to the present, the government now has fewer back doors than it did back then, in part thanks to Snowden and even more so due to an active pro-security community rooting these things out. Many security flaws (intentional or otherwise) have been exposed, serving to protect private information and private citizens.
But after an attack like the unconscionable events in Paris, which purportedly involved an encrypted cell phone, some people think we will all shift back to the 9/11 mindset. The opportunists, at least, hope this will be their chance to push through some more PATRIOT Act-style nonsense.
I don’t think it will. There is zero public evidence of any terrorist plot ever being foiled by decrypting communications. And more importantly, there is no customer value to be gained from doing so.
On the first point, we can look back to WWII, when Turing and company broke Enigma. That's a great example of a government cracking encryption for social good, right?
Had the Germans known the British and US forces had a back door into their favorite encrypted chat app (of the day), they would not have relied on said app, or would have layered more and different encryption on top. Bletchley Park had to painstakingly avoid passing critical intel to the Allies, costing many lives in the process, just to ensure the Germans had no proof their communications were vulnerable.
So, by extension, if Western governments succeed in getting back doors put into popular chat apps, doing so will accomplish little in solving or preventing terrorism. The terrorists will just do something else, including something low-tech.
As with DRM, the only people harmed by inane government policies on encryption, guns, or whatnot are ordinary law-abiding folks.
[By the way, compare and contrast this excellent linked article today with the crappy one on TerraServer I blogged about yesterday to see the vast range in journalistic quality we have online.]