26 February 2016

Why is Apple confusing the difference between hacking and encryption?

By David Waywell

You might have been following the story of the iPhone that belonged to Syed Rizwan Farook, the gunman in the San Bernardino shootings. The phone is refusing to give up its secrets to the FBI who, obviously, have 14 good reasons to be interested in its contents. Only, they don’t have the four-digit PIN needed to access the phone and, rather inconveniently, the phone might be set to delete its data if the PIN is entered incorrectly ten times. Any single guess has only a 1 in 10,000 chance of being right, so their natural approach would be to start at 0000 and work their way up to 9999; the so-called ‘brute force’ method.
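To see why the delete function matters so much, here is a minimal sketch (in Python, purely for illustration; the secret code and the check_pin function are invented stand-ins for whatever the phone really does) of what a brute force attack looks like once nothing is stopping you from guessing:

```python
# A toy brute-force attack on a four-digit PIN (illustrative only).
# check_pin() is a hypothetical stand-in for the phone's own check,
# and "7391" is simply a made-up secret.
def check_pin(guess, secret="7391"):
    return guess == secret

def brute_force():
    for n in range(10_000):          # every code from 0000 to 9999
        guess = f"{n:04d}"           # keep the leading zeros
        if check_pin(guess):
            return guess
    return None

print(brute_force())                 # finds the PIN in at most 10,000 tries
```

With nothing deleting the data, a computer runs through all 10,000 possibilities in the blink of an eye; the auto-delete after ten failures is the only thing standing in the way.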

That’s why the FBI have turned to Apple, who they’ve asked to produce a new ‘hacked’ version of the phone’s operating system (called ‘iOS’) with the delete function disabled. Apple are refusing to comply and have turned the debate into a matter of civil liberties. That, in turn, is generating quite a bit of media coverage of the story, with some quite intelligent people saying some rather odd things.

The main oddity is that political commentators keep confusing their technologies, assuming that ‘hacking’ is the same as ‘encryption’. They’re not. ‘Hacking’, in the current context, means ‘bypassing the security’ of a device. Sometimes that method of bypassing the security is through what’s called a ‘backdoor’. The most famous (albeit simplified) backdoor was in the film War Games, where the computer programmer coded the name of his dead son into the system so anybody typing ‘Joshua’ could gain access.

Encryption, on the other hand, is something quite different and has nothing to do with the FBI’s request. It’s just convenient to think it has because American authorities have long been waiting for an opportunity to raise the problem of strong cryptography. Governments around the world want tech companies to provide easy methods of breaking encrypted communications, and that method would be through what is called a ‘trapdoor’.

So there are ‘backdoors’ and ‘trapdoors’. Confused? Don’t be. Let’s explain them a little further using an analogy.

You have a house with a front door and a back door. The front door is very secure. The back door, however, just needs a kick in the right place. If you know where to kick it, you can ‘exploit’ that back door and get into the house. A ‘backdoor’ means an easy way to get inside the system. What you find inside that system is a very different matter…

Now, imagine that inside that house, you have a diary. In order to stop people reading the diary, you write each day’s entry in a different language. You are fluent in thousands of languages so that on no two days do you use the same language. Even if authorities could recognise one day’s language, they would not know the next. Let us imagine, however, that the government owns a form of Rosetta Stone, on which the secrets of all languages are written. With that stone, every diary entry could be unlocked. That Rosetta Stone would be the ‘trapdoor’ to your encryption.

This is to explain both in the simplest terms, but the point should be obvious: we’re talking about two different areas of technology. To ‘hack’ the San Bernardino phone involves the relatively simple task of removing the instruction that tells the phone to delete its data after ten failed PINs have been entered. To return to our analogy: imagine the lock to your front door is wired to your burglar alarm. Anybody ‘picking’ the lock would trigger the alarm. Now let’s say the FBI wanted to get in without triggering the alarm. They would obviously ask the electrician who installed the alarm to cut the cable running from the lock to the alarm, or to cut the power to the alarm. That, in effect, is a ‘hack’. It is also what the FBI are asking Apple to do with the San Bernardino phone. They haven’t even asked for the phone to be unlocked, merely for a way of stopping it from deleting its data whilst the FBI break the PIN using a brute force ‘attack’.
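To make that concrete, here is a toy model (in Python, and emphatically nothing to do with Apple’s actual code) of the kind of check the FBI wants removed:

```python
# A toy version of an auto-wipe check (not Apple's real logic).
MAX_FAILED_ATTEMPTS = 10

def wipe_all_data():
    print("data erased")                       # stand-in for destroying the data

def try_pin(guess, secret, failed_so_far):
    """Return ('unlocked' or 'locked', updated failure count)."""
    if guess == secret:
        return "unlocked", 0
    failed_so_far += 1
    if failed_so_far >= MAX_FAILED_ATTEMPTS:   # this check, and the wipe below,
        wipe_all_data()                        # are what a 'hacked' iOS would omit
    return "locked", failed_so_far
```

The ‘hack’ the FBI is asking for amounts to shipping a version of the software with that wipe branch taken out, so that the brute force attack described above can run to completion.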

‘Backdoors’ and ‘trapdoors’ sound similar, but I hope you now have a better understanding of ‘backdoors’. ‘Trapdoors’, on the other hand, are more complicated because they involve cryptography and, in order to understand cryptography, we need to go a little beyond the diary analogy we used earlier. We need to acknowledge that computer cryptography usually involves maths.

Now, before your eyes glaze over, this isn’t going to be about the complicated equations behind real cryptography. I want to explain it in terms that even I can understand. Encryption relies on mathematical functions that computers find easy to calculate one way but almost impossible to do in reverse. What do I mean by that? Well, 10 multiplied by 10 is obviously 100. But how would you work backwards and tell me which two numbers (called factors) I used to reach 100? Was it 2×50, 4×25, 10×10, 0.5×200, or 0.05×2000? Allow fractions and there are an infinite number of possible factor pairs for even the simplest numbers, and if the two factors actually used are hidden deep in that infinity of possibilities (real systems use the product of two enormous prime numbers), then even the most powerful computers could take decades, or forever, to find the answer.
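A toy example (in Python, with numbers far too small for real cryptography) shows the lopsidedness: multiplying two primes is a single instruction, while recovering them from the product means trying candidate after candidate:

```python
# Easy one way, slow the other (illustrative only - real keys use primes
# hundreds of digits long, not seven).
def multiply(p, q):
    return p * q                         # the easy direction: instant

def factor(n):
    """Recover a factor pair by trial division - the hard direction."""
    candidate = 2
    while candidate * candidate <= n:
        if n % candidate == 0:
            return candidate, n // candidate
        candidate += 1
    return n, 1                          # n itself is prime

p, q = 999_983, 1_000_003                # two smallish primes
n = multiply(p, q)                       # done immediately
print(factor(n))                         # already needs about a million checks
```

Scale those primes up to the sizes real systems use and the trial-and-error direction stops being merely slow and becomes, for all practical purposes, impossible.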

Governments, naturally, don’t like this truth about numbers. It’s why they want the industry to provide ways of figuring out these ‘key’ numbers without going to the trouble of checking every combination. They want, in other words, their own ‘trapdoor’ inside the maths. Yet herein lies the reality: mathematics doesn’t play politics. The only way to work out the square root of 7921 is to work out the square root of 7921, or to know that the answer is 89. There is no shortcut, yet governments are essentially demanding one.

This is the deeper argument going on between governments and tech companies. Governments want us all to use ‘weak’ forms of encryption (let’s say, only using small numbers that government computers can crack quite quickly) to protect our data but, sadly for them, they cannot simply ‘outlaw’ big numbers. The equations are already out there. Strong encryption algorithms can be written by any competent programmer, and third-party software will always be available to encrypt data in a way that lies beyond the power of any computer to decrypt. If people wish their secrets to remain locked forever, there is little governments can do but criminalise the ownership or sale of strong encryption. The worst-case scenario is also the reality: terrorists could quite easily write their own algorithms to encode their messages behind strong cryptography.
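To illustrate how little effort ‘strong’ encryption now takes for an ordinary user, here is a minimal sketch using the freely available third-party Python package ‘cryptography’ (one of many such tools; the message is, of course, invented):

```python
# Off-the-shelf strong encryption in a handful of lines.
# Requires: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()              # only whoever holds this can decrypt
cipher = Fernet(key)

token = cipher.encrypt(b"meet at the usual place")
print(token)                             # gibberish to anyone without the key
print(cipher.decrypt(token))             # b'meet at the usual place'
```

Nothing about this depends on Apple, and no law aimed at handset makers would make that token any easier to read.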

Governments might well be fighting an unwinnable battle over cryptography. The FBI, on the other hand, have a stronger case when it comes to the San Bernardino phone. Disabling the delete function is the sort of thing Apple’s engineers do every time they launch a new version of their operating system, with new features added or unpopular features taken away. In this case, they would simply be taking away a function that is unpopular with one user: the FBI. Indeed, I’d be extremely surprised if such a reprogramming task couldn’t be done (if it’s not already being done, as an academic challenge) by hackers across the globe. Hacking communities live for such bragging rights, and it would not be the first time that the Apple operating system has been hacked, as anybody who uses a jailbroken Apple product, freed from the quite severe restrictions Apple places on what can or cannot run on its hardware, can attest. What such a ‘hack’ is not, however, is an attack on Apple’s cryptography (the way they encode data). It’s worth repeating: none of this involves providing the mathematical (and mythical) ‘trapdoor’ that would allow data to be decrypted instantly.

As to Apple: only a cynic would argue that their current standoff with the FBI is excellent marketing that panders to public perceptions of a ‘friendly’ tech giant. I am, unfortunately, such a cynic. Much of Apple’s argument makes little technical sense. Perhaps they wish to be seen as the customer’s champion, affirming their liberal credentials as well as their status as a manufacturer of premium (though highly overpriced) products. It adds to the mythology of Apple, but their arguments sound fairly weak from a technical point of view, and weaker still regarding the wider issue of privacy. Allowing the FBI to unlock this one phone would not allow the FBI to unlock every phone (unless the hacked operating system somehow left Apple headquarters) and, even if it did, this is not where the battle over secrecy will really be fought. Strong encryption already lies in the hands of users, and those who wish to protect their data will do so, irrespective of what Apple or governments think, say, or do.

David Waywell writes and cartoons at The Spine.