Former MSc Student
Thesis: A Theoretical Model of Externalities in Anonymity Decisions
Supervisor: Dr Ian Brown
Completed: 2012
Confession time: Both my grandmother and I routinely fail to keep our home computer and web-browser security up to date.
Are we bad ‘netizens’ if we leave the keys in the ignition of our unlocked computer? Is it reasonable to complain about Anonymous without ensuring that my laziness (her inability) isn’t giving them easy ammunition for their cyber-dissent? Is our activity criminally negligent, and if so why can’t we be punished for it?
Last year, authorities in Spain arrested three suspects for allegedly downloading and booting up a program (called the Low Orbit Ion Cannon, or LOIC) that lets the suspects’ computers purposely participate in an online distributed denial of service (DDoS) attack. In plain English: they were arrested for downloading a software program that let their computers help take a website offline.
My computer, my grandma’s computer, your computer, or all of our computers may also have been participating in the same attack, even though we took no deliberate action to participate, simply because of our normal, oblivious, online behavior.
We still walk free, and this seems fair. But is there a point where our negligence becomes too egregious not to punish? How clear is the legal line between these suspects’ actions and our own?
I decide how to use my computer (when to update the security, which websites to visit, which programs to download) based on a simple risk-benefit calculation. I balance the expected benefits (the time and effort my laziness saves) against the sensitivity of the personal information I put at risk, and how much I value keeping my physical machines in working order. Yet in my internal calculation I never really consider the negative impacts of my actions on you, such as opening up my computer to a program that turns it into one of many ‘bots’ on a (potentially criminally involved) ‘botnet’.
This situation is a classic instance of what economics terms an externality: a collection of private actors can cause serious inefficiencies by not considering the external effects of their actions on others. One traditional economic prescription for this inefficiency is property rights, at least when negotiation costs are small (for the interested, another is a Pigouvian tax): creating a “right” for me to be lazy, or a “right” for the victim to demand reparations when my laziness harms them.
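The externality logic above can be sketched numerically. In this toy model (every number is an illustrative assumption, not an empirical estimate), I choose a level of security effort weighing only my private costs and benefits, while a hypothetical social planner would also count the expected harm my infected machine does to others. A Pigouvian tax equal to that external cost makes my private choice coincide with the social optimum:

```python
# Toy model of the security-negligence externality.
# All numbers below are illustrative assumptions, not empirical estimates.

def private_net_benefit(effort: int) -> float:
    """What I weigh privately: time saved by skimping on security,
    minus the expected loss to my own data and machine."""
    time_saved = 10 - effort                  # laziness benefit
    my_expected_loss = 8 * (1 - effort / 10)  # risk to my own information
    return time_saved - my_expected_loss

def external_cost(effort: int) -> float:
    """What I ignore: expected harm to others if my machine joins a botnet."""
    return 20 * (1 - effort / 10)

def social_net_benefit(effort: int) -> float:
    """What a social planner would weigh: my benefit minus everyone's harm."""
    return private_net_benefit(effort) - external_cost(effort)

efforts = range(11)  # hours of security effort: 0 (fully lazy) to 10 (vigilant)

my_choice = max(efforts, key=private_net_benefit)      # ignores the externality
social_optimum = max(efforts, key=social_net_benefit)  # internalizes it

# A Pigouvian tax equal to the external cost aligns my private incentive
# with the social optimum:
taxed_choice = max(efforts, key=lambda e: private_net_benefit(e) - external_cost(e))

print(my_choice, social_optimum, taxed_choice)  # 0 10 10
```

With these made-up numbers I rationally choose zero effort, because my private benefit falls as effort rises; once the harm to others is charged back to me, full vigilance becomes my own best choice. The real policy question in the rest of this post is who, if anyone, gets to levy that charge.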
Unfortunately, the world, and the digital world especially, is rarely so simple. Negotiating with the entire world is not cheap (at least not with current technology), and rights are hard to enforce evenly in such a complex, changing environment.
We should have a right to not be attacked by those who deliberately use their computers as tools in cyber-crimes. We should have the right to be protected from liability when our computers are hijacked through vulnerabilities even the best cyber security experts of the day do not know about (e.g. a Zero-Day attack). But should we have the right to leave our potentially dangerous computing tools lying around for anyone to pick up?
If I locked up my gun in my closet’s safe with a gun-lock, but it was stolen and used in a crime, my liability would be low. I followed the culturally accepted best practices for gun-ownership safety. But what if I left it out on the counter but locked the front door? If I left it in the driveway? If I put up a flier advertising an unlocked, unprotected gun lying in my driveway just waiting to be incorporated into your crime?
Gun rights have a much richer history in my country (the USA) than digital rights, so many of these questions have been discussed publicly regarding tort law and negligent firearm storage. The fact that we do not know our rights when you replace ‘gun’ with ‘computer’ is a problem.
The suspects in Spain were arrested for downloading and operating a computer program. Would they have been arrested if they introduced vulnerabilities into their computer, advertised their computer’s weaknesses in a chatroom, and asked someone to take over a portion of their computer from them?
If the rule of law is to be upheld, there should be a point up to which I am allowed to be negligent, but past which you are allowed to demand my vigilance. Currently some activities are legal or illegal depending on whom you ask. We should compromise on a reasonable balance of laziness and vigilance, but we should no longer abide ambiguity in this area.
It is possible that the law could attempt a fix by relying on the ‘intent’ of the actor: my grandmother (for example) did not mean to take down the site; cyber-criminals did. The problem is that plausible deniability always lies just a link away for the computer savvy (“How could I have known that the link would exploit a vulnerability in my browser, officer?”). My grandma relies on deniability in her defense; why shouldn’t they have access to the same defense?
Currently we have to trust that law enforcement officials can tell the difference between my grandmother and cyber-criminals. But if this culture of negligence continues to be ignored, legal and illegal activities will eventually devolve into indiscernible shades of grey.