When a company provides online storage that’s secure against everyone but its clients, it causes law enforcement agencies (LEAs) some headaches, and I sympathise when they’re dealing with serious crimes like child exploitation, identity theft and other evils that screw with the lives of ordinary people. But I also know that a determined LEA will get someone’s data one way or another, and there’s already some decent research into technical methods of recovering forensic evidence from online storage.
I strongly believe it’s actually in all our interests to ensure invasions of privacy never become routine, and such an undertaking should be difficult enough to force LEAs and governments to focus on the most serious criminals instead of the more trivial stuff like copyright infringement.
The fact governments want easy access to people’s online storage, and the recent revisions to the ‘Patriot’ Act, present us with several problems:
1. It’s basically unauthorised access without the users’ knowledge, which defeats any security a service might have had.
2. As with encryption, a backdoor in a supposedly secure service is very likely to be exploited by others. I’ll probably cover this in more depth later, and it ties in with the first paragraph of this post.
3. Sometimes police get bribed, as we found out during the News International scandal, so there’s no guarantee of confidentiality.
4. It could be counter-productive, as more serious criminals would use strong encryption before sharing anything sensitive online, thereby making evidence much harder to recover.
5. It wouldn’t do the cloud computing market much good, as companies need the trust of potential customers/clients. A lot of people refuse to store anything online precisely because of privacy concerns.
Ethics, Trust and the Law
The one big factor here is trust. Since most of us aren’t running our own networks or writing our own software, we trust numerous parties to do it for us – those storing our data, the authorities that issue server certificates, the network administrators in our workplaces, and sometimes the agencies that gather information for whatever reason: basically anyone responsible for the technicalities of getting information from A to B and storing it. This means IT professionals are in a position of trust, to a much greater extent than politicians, who get away with just about anything.

It’s this asymmetry, plus some lobbying by big business, that causes the divide between legislation and our ethics to grow larger by the month. One of the most direct examples I always refer to is the Digital Economy Act, and thanks to cheap but undervalued publications like Micro Mart, which covered the affair better than the more professional journals, some of us were very clued up on the dirty politics leading up to it. A group of us sat in the pub vowing to protect our present and future clients against a law we had reason to believe was passed through bribery. Some of the larger ISPs, including Talk Talk, took a similar but less militant position.
I believe everyone has a right to privacy and security, even more so today. The phrase ‘nothing to hide, nothing to fear’ is endlessly repeated even though it’s a myth: everyone does indeed have something to hide, even if it’s just their bank balances or medical records. A mature, civilised and progressive society would also accept, as a fact of life, that the majority of ordinary people have embarrassing little secrets and flaws in their character. This matters because most personal computers, even smartphones, will on closer investigation reveal practically everything about the owner and his/her personality, and the ability to gather this kind of information is too dangerous in the wrong hands, simply because of the massive potential for misuse, not to mention the incompetence that leads to data being leaked. The degree of control it would give politicians is also incompatible with a healthy democracy. I think I’ve already covered this in my rantings against the National Identity Register.
Protecting people’s right to privacy and security, in the context of this post, depends on the three ‘pillars’ of information security – Confidentiality, Integrity and Availability (CIA). If any of those is compromised, whether by governments or criminal hackers, a system cannot be secure, no matter what claims are made to the contrary. It sounds obvious, but it shows how the CIA model applies in real life, and how much society relies on it to function.
A Zero Knowledge Backup System
Now onto a solution to our concerns about the security of cloud computing: SpiderOak is an online storage service and, like any US-based company of its type, has to comply with the revised ‘Patriot’ Act, which means it could be ordered to hand over client data. The fundamental difference is that SpiderOak is, if its executives are to be believed, quite incapable of compromising its users’ privacy. It operates what’s known as a zero knowledge backup system, and has no way of accessing the content, determining which files belong to a specific client, or recovering the encryption keys to accounts. The best SpiderOak Inc. could do, faced with an order, and assuming it could identify the account in question through server logs and ISPs, is hand over heavily encrypted blocks. I suspect technology capable of brute-forcing this level of encryption already exists, but it’s currently in very few hands and used only in very exceptional circumstances.
Providing the user has taken a few client-side precautions and the information provided by SpiderOak is correct, this in theory makes SpiderOak more secure, and certainly much safer, than most storage providers.
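The essence of such a zero knowledge design can be sketched in a few lines of code. This is a toy illustration of the architecture only, not SpiderOak’s actual implementation: the key is derived from the user’s passphrase on the client, everything is encrypted before upload, and the server only ever stores opaque blocks. The XOR keystream here is a stand-in for real authenticated encryption (a production system would use something like AES-GCM), so don’t use this for actual secrets.

```python
# Toy sketch of a zero knowledge backup client (illustrative only).
# All cryptography happens client-side; the server never sees the
# passphrase, the derived key, or any plaintext.
import hashlib
import secrets

def derive_key(passphrase: bytes, salt: bytes) -> bytes:
    # Key derivation runs on the client; the passphrase never leaves it.
    return hashlib.pbkdf2_hmac("sha256", passphrase, salt, 200_000)

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Counter-mode keystream from SHA-256 -- a placeholder for a real cipher.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes):
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce, ct

def decrypt(key: bytes, nonce: bytes, ct: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))

# Client side: derive the key and encrypt before anything is uploaded.
salt = secrets.token_bytes(16)
key = derive_key(b"correct horse battery staple", salt)
nonce, blob = encrypt(key, b"my private diary entry")

# The server stores only (salt, nonce, blob) -- heavily encrypted blocks.
# Handed a court order, that's all the provider could produce.
assert b"diary" not in blob

# Only a client holding the passphrase can recover the plaintext.
recovered = decrypt(derive_key(b"correct horse battery staple", salt), nonce, blob)
assert recovered == b"my private diary entry"
```

The point of the sketch is the trust boundary: because key derivation and encryption happen entirely client-side, the provider’s honesty only matters for availability, not confidentiality.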
Compare this with the DropBox service, where anything uploaded now may as well be public. The same could be said of a number of other services. DropBox formerly claimed that not even its employees were able to access its clients’ data; there were apparently no backdoors that allowed this, or so we were led to believe.
But when the ‘Patriot’ Act and the DropBox privacy statement were revised, it turned out such a method of accessing the data did indeed exist, and that the encryption on whatever account the government wanted to access could be removed – which makes the encryption massively flawed, if not useless. On the positive side, it serves as a warning that flawed and badly implemented encryption does exist, and as a lesson to look under the surface of the marketing hype. I had read the policy and noticed the change at the time, but since I didn’t have the original I couldn’t post anything; Erik Sherman at BNET covered it in greater depth (At DropBox, Even We Can’t See Your Data).