Apple Steps Back on Security
The fallback for law enforcement agencies has always been the place where files are stored: all the best encryption within end-to-end communications will not stop unencrypted files at rest from being examined. But when users encrypt their data in the Cloud and hold their own keys, that is when the nightmare begins for the investigators.
The rise of cybersecurity on the Internet
Let’s pinpoint the start of cybersecurity on the Internet to the 1970s. This decade saw the rise of the Lucifer cipher and saw banks properly protect their communications. Lucifer led on to the 56-bit DES encryption method, whose short key size led many to suspect that it had been crippled at the demand of law enforcement agencies. But an even greater threat to these agencies was evolving: public key encryption.
The rise of public key encryption started in the mid-1970s, when Whitfield Diffie and Martin Hellman first defined a method that allowed two parties to secure their communications by agreeing a shared key over an open channel: the Diffie-Hellman key exchange. Then, around a year later, Rivest, Shamir and Adleman presented the RSA method, in which a server could sign a hash of data with its private key and anyone could verify that signature with the associated public key. For almost the first time, we could digitally verify that we were connecting to a valid system. But RSA could not only sign data; it could also encrypt data with a public key, so that only the holder of the private key could decrypt it. It was a nightmare come true for law enforcement agencies.
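To make those two building blocks concrete, here is a minimal sketch using the third-party Python cryptography library (an assumption on my part, and purely illustrative rather than how any particular product implements these methods). It performs a Diffie-Hellman-style key exchange, using the modern X25519 curve rather than the original modular-arithmetic form, and then signs, verifies, encrypts and decrypts with RSA:

```python
# pip install cryptography   (third-party library; assumed available)
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# Diffie-Hellman-style key exchange (X25519 variant).
# Alice and Bob each generate a key pair, swap public keys, and derive
# the same shared secret without ever sending that secret on the wire.
alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()
alice_shared = alice_priv.exchange(bob_priv.public_key())
bob_shared = bob_priv.exchange(alice_priv.public_key())
assert alice_shared == bob_shared   # both ends now hold the same secret

# RSA: sign with the private key, verify with the public key.
server_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
message = b"hello from a (hopefully) valid server"
signature = server_key.sign(
    message,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)
# verify() raises InvalidSignature if the message or signature was tampered with.
server_key.public_key().verify(
    signature,
    message,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)

# RSA: encrypt with the public key, decrypt with the private key.
secret = b"only the private-key holder can read this"
ciphertext = server_key.public_key().encrypt(
    secret,
    padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None),
)
plaintext = server_key.decrypt(
    ciphertext,
    padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None),
)
assert plaintext == secret
```

Note that the shared secret is derived independently at each end and never travels across the network, and only the holder of the RSA private key can produce the signature or read the ciphertext.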
What was magical about these methods was that data could be encrypted with keys created for every single session, generated and stored on the user’s own device. Devices could even pick the keys they wanted, along with their sizes and security levels. The days of crippled security were fading fast. While the first versions of SSL were weakened by demands for limits on this security, SSL eventually evolved into something that could not be controlled. But files could still be viewed on user devices, so this was not yet a major problem for investigators.
Then, in 2001, the AES method was standardized by NIST, along with the newly defined SHA-256 hashing method, and we basically had all the core security methods in place. None of this pleased law enforcement agencies. For them, the rise of cryptography removed the opportunities they had had in the past to mass-harvest information from phone calls or from the postal service. For the first time in history, citizens were free from spying by both those who protect nations and those who attack citizens. The Wild West years of the early Internet, where little could be trusted, subsided, and we now have systems which take encryption from one service on a device to another service on another device: end-to-end encryption.
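Both of those primitives are now only a few lines of code away. The sketch below, again assuming the Python cryptography library plus the standard hashlib module, hashes a message with SHA-256 and then encrypts it with 256-bit AES in GCM mode; it is illustrative only:

```python
import hashlib
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

message = b"files at rest can be protected with a symmetric key"

# SHA-256: a fixed-length fingerprint of the data.
print("SHA-256:", hashlib.sha256(message).hexdigest())

# AES-256-GCM: authenticated symmetric encryption with a random 256-bit key.
key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)   # a nonce must never be reused with the same key
ciphertext = AESGCM(key).encrypt(nonce, message, None)
assert AESGCM(key).decrypt(nonce, ciphertext, None) == message
```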
End-to-end encryption
For some, end-to-end encryption was the final nail in the coffin for those who wish to monitor the tracks of citizens. But end-to-end encryption only covers data in motion, and law enforcement agencies could still peek at data at rest, where the data is actually stored. Once both data in motion and data at rest were encrypted, the door was effectively closed on peeking at data.
And so, companies such as Apple advanced new methods which protected data at rest, where all of a citizen’s data could be encrypted in the Cloud without Apple ever holding the encryption key to view any part of it. For this, they created the Advanced Data Protection service.
This service protects things like citizens’ photos, iCloud Drive, and wallet passes. For almost the first time, we had near-perfect security, with five decades of advancement finally coming together. We now have end-to-end encryption in apps such as WhatsApp and Signal, and Apple provides secure data storage.
But some governments around the world saw the rise of privacy as a threat to their security agencies, as the use of encryption for file storage and over-the-air communication would mean that they could not monitor their citizens for threats against society. It is, and always will be, a lose-lose situation on both sides. And so, many governments have been calling for a back door in cryptography, so that a “good guy” could get access to citizen data and communications but a “bad guy” could not. Unfortunately, that’s not the way encryption works: backdoors are a bad thing and difficult to hide.
So the UK government has put pressure on Apple to provide it with a backdoor into their secure systems. For this, Apple would have to either provide a magic key to open up encrypted communications and file storage, or drop their Advanced Data Protection system and leave files open to investigation.
Apple stepping back
It would have been a difficult choice for Apple, but they have decided to drop their Advanced Data Protection system for UK users rather than go down the nightmare route of a backdoor in their systems. Imagine if a terrorist had stored their files in iCloud, and law enforcement agencies had requested those files. Well, Apple would have to hold their hands up and say that they didn’t have the encryption keys to access them, as the keys were held by the user.
I trust Apple and believe they have some of the best security around. When was the last time you heard of someone getting malware on an Apple system? They support a proper Secure Enclave and are advancing a privacy-aware cloud infrastructure for machine learning. They have also brought forward homomorphic encryption applications. Of all the big tech companies, Apple leads the way in supporting the privacy and security of its users.
Conclusions
I feel sorry for Apple, as they have been painted into a corner. From a cybersecurity point of view, it is disappointing that Apple has been forced to step back on the Advanced Data Protection tool, as it was a great advancement in overcoming large-scale data breaches. And, like it or not, there is no magic wand that stops a bad actor from using something that a good actor has access to. Basically, if you leave your front door key under the mat, you have no guarantee that someone else won’t find the key and use it.
We have advanced cybersecurity over the past few decades and now use end-to-end encryption in the way we should have done from the start of the Internet. Of course, there are no winners in this, and society must find ways to protect itself from bad people, but opening up the whole of iCloud seems like a disaster waiting to happen.
The door is now open for other, more agile companies to support enhanced security and privacy, as the large tech companies seem to be applying the brakes to some of their security advancements.