“There’s no fresh start in today’s world. Any twelve-year-old with a cell phone could find out what you did. Everything we do is collated and quantified. Everything sticks.”
– Selina Kyle (Catwoman), The Dark Knight Rises
As I noted in my previous post, Anne Hathaway’s character in The Dark Knight Rises shows us the value of providing individuals with the ability to rehabilitate their online reputations and learn from their mistakes. Intel is honored to promote Matt Ivester’s book LOL…OMG (click here to download a free copy from January 25th – January 29th) to put practical tools in the hands of students so they can still exercise the freedom to explore, take risks and innovate.
Making mistakes is a critical component of innovation. John Stuart Mill wrote convincingly about the hazards of a culture in which liberty is curtailed to the point where individuals slide into self-protective conformity out of fear of the tyranny of the majority. Jeremy Bentham described the coercive power of the Panopticon (a building plan for an institution that allows a guard to watch all of the inmates without their knowing whether they are being watched) as “a new mode of obtaining power of mind over mind, in a quantity hitherto without example.” Michel Foucault took the Panopticon one step further in Discipline and Punish, extending the concept to all social spheres. Foucault raised important questions about what it means to punish individuals for their bad acts, and about the role of public humiliation in chilling future bad behavior. We know we can chill bad behavior by observing individuals, but how much positive behavior will observation also chill?
I have written in this blog of the need to separate the Right to Privacy from the Right to Steal or the Right to Hack. We need to allow individuals a sphere of privacy in which to try new ideas, concepts and business ventures without an undue fear of the consequences of a mistake. We need what I call the “Right to Fail”. This Right to Fail should protect individuals so they can challenge themselves and attempt what others say is impossible. This concept of protecting failure has been at the heart of Silicon Valley for decades, and is the foundation of our modern innovation economy. In fact, preserving individuals’ ability to take risks, while protecting them from undue consequences, has existed for millennia. For example, there are concepts of debt forgiveness in the Old Testament. Debt forgiveness has involved both punishment (debtors’ prisons in Europe) and rehabilitation (allowing an individual to build back a good credit rating over time).
Modern bankruptcy laws have increased their focus on allowing individuals to make limited mistakes and then rehabilitate themselves. These laws do not offer a “Clean Slate” program to wipe away all record of failure, but they do mitigate the harm to the individual or company from having taken a risk. However, these laws have also been modified over time so they do not allow individuals to use them as cover for bad acts (e.g., the Bankruptcy Abuse Prevention and Consumer Protection Act (BAPCPA) of 2005). Similarly, we need a system in privacy law that will allow the Catwoman a reasonable chance to start a new life, without entirely shielding her from the consequences of her bad decisions. Creating a Right to Fail that allows for rehabilitation, but still adequately encourages personal responsibility and good judgment, is at the root of the discussion around the EU Right to be Forgotten proposal. It appears more discussion is warranted, as optimizing for the Right to Fail, while not creating a Right to Steal, is an enterprise requiring great nuance, precision and adjustment over time.
The Panopticon and the history of bankruptcy laws provide useful lenses through which to analyze the European Commission proposal of a Right to be Forgotten. The task in drafting a Right to be Forgotten should be to allow individuals to escape from information about them that should never have been made public (the embarrassing photo — the LOL…OMG problem) or that is no longer relevant (the position taken in a university term paper decades ago), while still allowing individuals to know important information about people with whom they engage (credit card fraud databases). This exercise may in the end be more about obscurity than forgetting, as Woodrow Hartzog and Evan Selinger point out in their excellent article in the Atlantic.
Would Anne Hathaway’s Catwoman really need a “Clean Slate” if there were a mechanism to ensure major search engines would not display the evidence of her misdeeds? If so, who should be trusted to decide what level of obscurity is healthy for society? Who should review each request?
The proposed General Data Protection Regulation (the Regulation) attempts to break new ground on this issue. Section 3 of the Regulation covers Rectification and Erasure. Article 16 covers the rectification of inaccurate information, while Article 17 proposes provisions on the “Right to be forgotten and to erasure.” The current EU Data Protection Directive (Directive), which the Regulation would replace, also attempted to address the question of what is an appropriate amount of forgetting. In the Directive these issues were handled under Section V, “The Data Subject’s Right of Access to Data”, which includes Article 12(b), requiring “as appropriate the rectification, erasure or blocking of data the processing of which does not comply with the provisions of this Directive, in particular because of the incomplete or inaccurate nature of the data”.
The Right to be Forgotten in the Regulation is a significant expansion on the access and deletion language in the current Directive, and is potentially closer to Batman’s Clean Slate program. Under the Regulation’s language, individuals would not only have the ability to withdraw consent for information they had previously provided (it is unclear how this would work in the many instances where a service provider has agreed to provide a service based on the consent to provide the information), but also to demand deletion of information about them that was provided by a third party. This demand for deletion can be made for a number of reasons, but arguably the most important is Article 17(1)(a): that “the data are no longer necessary in relation to the purposes for which they were collected or otherwise processed.”
The Dark Knight Rises helps us see why this construction of the Right to be Forgotten will be difficult to implement. If the Catwoman objects to internet blog postings about her prior criminal convictions, one of the arguments she could make is that such stories are no longer necessary, and she should be able to have them “forgotten”. She would argue that the stories are old, and the information is no longer necessary or relevant to inform the public. The Regulation would put the onus on the Data Controller to determine whether this information is still necessary. The Controller would have an exception under Article 17 (3)(a) to exercise the right of free expression, but this would only apply to data processed “solely for journalistic purposes or the purpose of artistic or literary expression in order to reconcile the right to the protection of the personal data with the rules governing freedom of expression.” The Regulation allows for the European Commission to create additional rules, but this does not seem to be an area where detailed implementing regulations and/or delegated acts will provide predictability or clarity.
Let’s say Selina Kyle objects to the accessibility of news stories about crimes she committed while she was under the age of 18. She complains to both the newspaper websites and the search engines, saying she wants the information “forgotten” (deleted from the websites) or at least “obscured” (not shown as results in web searches). Many countries have juvenile justice systems that seal records of crimes committed by children. It seems to follow that some system of online reputation rehabilitation is consistent with the values behind those juvenile justice procedures. However, society needs to remember some misdeeds to make certain similar events do not reoccur. The digital memory becomes our collective conscience. Additionally, it is difficult to understand how complete forgetting would even be possible. In their paper “The Right to be Forgotten – between expectations and practice”, the European Network and Information Security Agency (ENISA) has expressed concerns about whether the Right to be Forgotten proposal can be technically implemented.
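Obscuring, unlike complete forgetting, does have an established technical footing. If a newspaper agreed to keep a story online but make it harder to find, it could ask search engines not to index the page. A minimal sketch using the widely supported robots meta tag — note that compliance by crawlers is voluntary, and the page itself remains reachable by anyone with a direct link:

```html
<!-- Placed in the <head> of the archived story's page: asks compliant
     search engine crawlers not to include this page in search results. -->
<meta name="robots" content="noindex">
```

Major search engines honor this convention and will drop the page from their results over time, which delivers roughly the “obscured but not deleted” outcome described above — though it does nothing about copies of the story already made by aggregators or other sites.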
When do we have a Duty to Remember which outweighs the Right to be Forgotten? Who should make the decision? Should all the records be deleted or obscured? Is deleting all records even possible? What are the free speech implications?
Some argue the language in the existing Directive is better. As noted above, the Directive allows requests for deletion of data whose processing “does not comply with the provisions of the Directive”. Article 6(1)(c) of the Directive provides that personal data must be “adequate, relevant and not excessive in relation to the purposes for which they are collected and/or further processed.” Because of this (and other language in the Directive), many have argued for some time that the existing Directive requires a “reasonable” level of deletion for information that would have a “disproportionate” impact on the individual to whom the personal data relates.
As we need to optimize for the Right to Fail and Freedom of Expression at the same time, this type of flexible reasonableness standard may be the best legislation can offer. This standard would allow the Catwoman to make her case that her bad acts are behind her, and that databases which have profiled her as a felon should now be modified. Such a flexible, high-level access and deletion obligation would still be difficult for companies to interpret and implement, and interpreting the standard will be equally challenging for courts and regulators. Nor will it solve the question of how individuals can request deletion from other websites and organizations that have subsequently obtained access to the data (e.g., information aggregators). However, the Directive’s access and deletion requirements would at least provide a flexible, principle-based method to optimize for both rehabilitation and punishment.
Regardless of the legal obligations, there are practical steps individuals can take to obscure information on the internet. Matt Ivester describes many of them in LOL…OMG, helping teenagers understand the risks to their online reputation and how they can protect themselves. Commercial services like Reputation.com also provide opportunities to obscure information on the internet. These types of educational efforts and services give individuals opportunities to protect their Right to Fail. Still, these systems of self-help are imperfect, and depend in large part on individuals having the knowledge and resources to use them. The current debate around the Right to be Forgotten asks important questions about how to provide all individuals with more control over their online reputation. The Right to be Forgotten seems at times a wonderful aspiration, but a troublesome obligation.
Sadly, in the real world it does not appear Selina Kyle and Bruce Wayne will live happily ever after, as the length of her illustrious career as the Catwoman argues convincingly against a “clean slate.” However, the practical steps described above may be enough to empower many individuals to achieve enough obscurity to remedy the LOL…OMG problem, and also preserve their Right to Fail.