5 July 2015

CapX Reviews: The Spook In The Machine


Keeping secrets, sending secrets, stealing secrets: it’s a very ancient trade. The business of intercepting and deciphering communications has been going on for as long as people have had brains enough to profit from knowing more than their enemies. And today, as this bleakly entertaining new book from Gordon Corera reminds us, the branch of intelligence known as signals intelligence is now conducted on an industrial scale. The dream of the East German Stasi – that everyone should be spied on, all of the time – is close to becoming a nightmarish and universal reality.

Spies are odd. But they are different sorts of odd. The secret agent kind of spy is one variety, but it’s the other kind of spook that Gordon Corera’s Intercept introduces us to: the eccentric, intellectual, socially challenged code-cracking kind of spy, the geek branch of the service. They have been part of the state’s surveillance effort for centuries, and an organised branch for a century or more. It was 101 years ago that the First World War ‘decipherers’ of German cables first gathered in Room 40 of the Admiralty Building, and as one observer said, “a rummier set of fellows I never came across in all my born days.”

There are many ways of using data gathered from enemies – enemies real or imagined – but they can be reduced to two main streams. One is cryptanalysis, the deciphering of coded communications. Code books – the physical keys that allow messages to be encoded and decoded – have been around for centuries, and the cryptanalyst’s task was once to work out the contents of the absent code book (while the job of the secret agent was to get hold of the code book itself).

Even before the First World War ended, cryptography took a big leap forward, when a German engineer invented a machine that he called Enigma. The German Army was slow on the uptake, but businesses did see a use for a coding machine that did not rely on code books, and that made encrypted data astonishingly difficult to crack (as Enigma evolved, its potential developed from millions, to billions, to trillions of possible combinations). The machine age of coding had arrived, and it would require a matching machine intelligence to break those codes.
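Corera’s “millions, to billions, to trillions” can be made concrete with a back-of-the-envelope count. The figures below are a standard illustration for the wartime three-rotor Army Enigma with a ten-cable plugboard – an assumption of this sketch, not a detail taken from the book:

```python
from math import factorial, perm

def plugboard_pairings(pairs=10, letters=26):
    """Ways to wire `pairs` plugboard cables among `letters` letters:
    26! / ((26 - 2p)! * p! * 2^p)."""
    return factorial(letters) // (
        factorial(letters - 2 * pairs) * factorial(pairs) * 2 ** pairs
    )

rotor_orders = perm(5, 3)       # choose and order 3 rotors from a box of 5: 60
rotor_positions = 26 ** 3       # starting position of each rotor: 17,576
keys = rotor_orders * rotor_positions * plugboard_pairings()
print(f"{keys:,}")              # roughly 1.6 x 10^20 possible daily keys
```

Most of that astronomical number comes from the humble plugboard, not the rotors – which is why the Bletchley bombes were designed to attack the rotor settings and sidestep the plugboard wherever possible.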

The story of the World War Two intelligence community at Bletchley Park has been well documented, although some of the central players are still little known. Tommy Flowers, for example, a partly self-taught Cockney engineer who built Colossus, the first electronic computer capable of switching fast enough to cycle through the immense calculations needed to unravel the German teleprinter ciphers. Or Bill Tutte, a chemist who ‘used to stare at the wall for months on end’, and who by sheer power of imagination reconstructed the workings of the Lorenz machine – the device the Germans called the Geheimschreiber, the ‘secret writer’ – without ever having seen one. Or John Tiltman, an intuitive Bletchley codebreaker who worked best standing upright at his desk, apparently comatose (‘his best thinking took place just below full consciousness’). It was Tiltman who formulated one of the great insights of practical cryptography: the more complex a system, the more likely its users were to make a fatal mistake when using it.

That leads on to the great paradox of cryptography. The more power one side in the struggle acquires, the more power that puts in the hands of the other side. This is a rule that shapes the entire world of signals intelligence, or SigInt. The Enigma machine, for example, was more sophisticated by an order of magnitude than anything that had come before it. But sophisticated meant complex, and in a complex system the operator will eventually make an error that opens a door for the code-breaker. The rule goes further: the Enigma machine was so complex that the German code-setters believed that it was uncrackable. One of the challenges of SigInt is not letting the enemy know you have broken the code, something that is very difficult to do if you want to act on the intelligence acquired. Yet even in the face of compelling evidence that Enigma had been compromised, the Nazis refused to believe that their unique machine could be decoded. Signals intelligence is a world where confidence breeds failure.

The Second World War marked the high point of Western cryptography. During the Cold War the successes of Bletchley were not repeated. The Russians turned out to be much better coders than the Germans, and intelligence work turned to more traditional methods, such as physical surveillance (first through high-level air reconnaissance, and then satellites), and human intelligence, the world of agents and double-agents. But the nature of data and communications was changing, and that brought to the fore the second stream of signals intelligence work.

To interpret the meaning of a message it is not always necessary to decode it. It may not even be necessary to look at the content at all. It may be more valuable to look at the frequency or the source of the communications, the statistical shape of the data. This is called traffic analysis, and it goes right back to the early days of modern signals intelligence. In the First World War, for example, the imminence and location of German air attacks were tracked simply by monitoring Zeppelin call signs broadcast to their own anti-aircraft guns. The communications were in unbroken code, but by tracking the frequency and recipients of the calls the raids could be predicted in advance. Fast forward to today, and traffic analysis has become infinitely more significant.
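The Zeppelin example can be sketched in a few lines of Python. The call signs, hours, and threshold below are invented for illustration; the point is that only metadata – who transmitted, and when – is ever consulted, never the unreadable content:

```python
from collections import Counter

def flag_activity_spikes(intercepts, threshold=3):
    """Traffic analysis without decryption: count intercepted
    transmissions per hour and flag any hour whose volume crosses
    a threshold -- a sudden burst suggests an operation is under way."""
    per_hour = Counter(hour for _, hour in intercepts)
    return sorted(h for h, n in per_hour.items() if n >= threshold)

# Hypothetical intercept log: (call sign, hour of interception).
# Every message body is still in unbroken code.
log = [
    ("L30", 20), ("L31", 20),                            # routine chatter
    ("L30", 22), ("L31", 22), ("L33", 22), ("L34", 22),  # burst of traffic
]
print(flag_activity_spikes(log))  # -> [22]
```

Scaled up from a handful of airship call signs to billions of internet users, this counting-and-clustering approach is essentially what the review means by looking for “shapes in the white noise”.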

The volume of communications data is now mind-boggling. In the 1990s alone, says Corera, the number of internet users grew from four million to over 360 million. Last year the number was three billion. The day is near when most humans will be online. Even while these communications can be and are interrogated by machine for key written words, or images, or spoken words (automated voice recognition has now been around for more than half a century), it is already impossible to evaluate fully the content of all that the world says online. Instead, the spooks look for patterns, clusters, chains, for shapes in the white noise. In modern data intelligence volume is value. Traffic analysis is the surveillance response to total connectedness.

And here again that rule of power balance applies. One of the forces that created the internet was the desire to build a ‘distributed’ communications system – a system with no centre, and therefore resilient. That distribution makes the internet intrinsically resistant to attack, and to that extent more secure; but because a distributed system has an almost unlimited number of entry points, it is also intrinsically insecure. The rule seems to be that every security enhancement creates a corresponding new insecurity.

The lesson of Intercept is that secret information is power, and that there is no end to the struggle to capture it and control it. The other side of that contest is the shaping of information and the exercise of power through disinformation, which is today evident in much of the world. But that is a different story.

Intercept: The Secret History of Computers and Spies. Gordon Corera, Weidenfeld & Nicolson, RRP £20

Richard Walker is a journalist and communications advisor to financial companies.