What Does the Secure Data Encryption on Your Mobile Devices Really Mean for Security?

If Apple and Google design their mobile devices so that the government is stalled by their secure data encryption, then the bad guys win. At least that seems to be what a host of current and former law enforcement, anti-terrorism and intelligence officials believe.

With the release of iOS 8 and its newest range of smartphones and tablets, Apple has implemented device-based encryption in such a way that it has no means of complying with a government order to hand over the content on a mobile device, and this encryption is now the default setting on iPhones.
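The core idea behind this kind of device-based encryption can be sketched roughly: the key protecting the data is derived from the user's passcode, which the vendor never sees, so the vendor has nothing to hand over. The following is a minimal, hypothetical illustration in Python using PBKDF2; Apple's actual scheme is far more elaborate, entangling the passcode with keys fused into the device's hardware.

```python
import hashlib
import os

# Hypothetical sketch, NOT Apple's actual implementation.
# A random salt is stored on the device; it is useless without the passcode.
salt = os.urandom(16)

# The data-encryption key is derived from the user's passcode.
# The vendor never sees the passcode, so it can never re-derive this key;
# handing over the device's storage yields only ciphertext.
key = hashlib.pbkdf2_hmac("sha256", b"user-passcode", salt, 100_000)

# Only the correct passcode reproduces the key; a guess does not.
same = hashlib.pbkdf2_hmac("sha256", b"user-passcode", salt, 100_000)
wrong = hashlib.pbkdf2_hmac("sha256", b"guess", salt, 100_000)
print(key == same, key == wrong)  # → True False
```

The deliberately slow key-derivation step (100,000 iterations here) is what makes brute-forcing passcodes expensive, which is exactly the property that frustrates anyone without the passcode, law enforcement included.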

Google has also moved to replicate this on its Android devices. Predictably, this has led to outrage from the law enforcement community, based on the assumption that it will make it possible for criminals, drug dealers and terrorists to hide their actions and avoid convictions.

I won't go into the whole debate over law enforcement's needs and reduced privacy versus freedom and privacy rights in combating the evils of today. But there are some key issues that stand out within this debate.

First, I'm not clear on how the moves by Apple and Google change things much. For years now it has been very easy to enable secure data encryption on these and other devices. Any criminal who knew how to install an app on their device could have put these protections in place already.

Really, the main change is that this encryption is now the default. So, while experienced bad guys have had this level of data protection for a while, now everyone will automatically have their data protected if they lose their phone. Which makes me wonder: is the real objection from these law enforcement officials that they now won't be able to easily access the data of regular people? Or perhaps it just makes it harder to catch the really dumb criminals.

The other problem with this debate is one that is of special concern to businesses and IT professionals. Namely, the so-called security and encryption products that people have been using for years, which had backdoors for government, could arguably face false advertising charges. There is no way that a system can have backdoors and be considered secure in any way.

Imagine you wanted to protect your most valuable possessions, so you decide to purchase the most secure safe that money can buy. The safe manufacturer talks you through all of the impressive features that make it impossible for even the most skilled thieves to get through. And then you notice a little USB port on the side of the safe. You ask, "What's that for?" and the safe manufacturer reluctantly replies, "That makes it possible for law enforcement to access the safe. You know, just in case criminals were using the safe."

Would you purchase a safe that not only made it easy for government officials to gain access, but also let in any other intruder with the know-how to open it?

That's what these backdoor options are: holes in your security. There have already been clear examples of criminal hackers and malicious foreign powers using backdoors to break into business systems and steal data from enterprises. Companies like Apple and Google have realised that they can't sell secure products with built-in holes in their security to businesses in the US, and especially abroad.

Essentially, it comes down to just one choice: Device makers can either follow Apple's model and say, "Look, this is a secure system and we can't sell a secure system with a backdoor in it", or they may have to give up on offering any security whatsoever.

This article was written by Jim Rapoza, Senior Research Analyst and Editorial Director, Aberdeen Group.