Clouded data: Privacy and the promise of encryption

We Need to Think Data Protection Beyond Privacy

2020

Turbo-digitization after Covid-19 will advance algorithmic social selection and the biopolitical shift of digital capitalism. To mitigate these risks, we must address the social implications of anonymous mass data.

"I am Spartacus" – Privacy-Enhancing Technologies and Privacy as a Public Good

The paper introduces an approach to privacy enhancing technologies that sees privacy not merely as an individual right, but as a public good. This understanding of privacy has recently gained ground in the debate on appropriate legal protection for privacy in an online environment. The jurisprudential idea that privacy is a public good and prerequisite for a functioning democracy also entails that its protection should not be left exclusively to the individual whose privacy is infringed. This idea finds its correspondence in our approach to privacy protection through obfuscation, where everybody in a group takes a small privacy risk to protect the anonymity of fellow group members. We show how these ideas can be computationally realised in an Investigative Data Acquisition Platform (IDAP). IDAP is an efficient symmetric Private Information Retrieval (PIR) protocol optimised for the specific purpose of facilitating public authorities' enquiries for evidence.
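The abstract does not reproduce IDAP's construction, but the underlying idea of Private Information Retrieval can be illustrated with the textbook two-server XOR scheme, in which a client fetches one record without either of two non-colluding servers learning which index was requested. The following is a minimal sketch under that assumption; the function names and bit-array database are illustrative and are not the paper's actual protocol:

```python
import secrets

def server_answer(db_bits, query_indices):
    """A server XORs together the database bits at the queried positions."""
    ans = 0
    for j in query_indices:
        ans ^= db_bits[j]
    return ans

def pir_fetch(db_bits, i):
    """Two-server XOR PIR: recover db_bits[i] without either server
    seeing anything but a uniformly random index set."""
    n = len(db_bits)
    # Random subset S: each index is included independently with prob. 1/2.
    s = {j for j in range(n) if secrets.randbits(1)}
    s_prime = s ^ {i}          # symmetric difference: toggle membership of i
    a1 = server_answer(db_bits, s)        # sent to server 1
    a2 = server_answer(db_bits, s_prime)  # sent to server 2
    # XOR of both answers cancels every index except i.
    return a1 ^ a2
```

Each server individually sees only a uniformly random set of indices, so the query leaks nothing about i on its own; the "Spartacus"-style obfuscation discussed in the paper pursues the same end socially, with group members sharing a small risk to hide each other's queries.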

Blockchain, GDPR, and fantasies of data sovereignty

Law, Innovation and Technology, 2020

Like the European Union's General Data Protection Regulation (GDPR), the broader, mainstream emergence of blockchain technology in the present moment of what I call data dysphoria is no accident. It is in part a reaction to data dysphoria and in part an exploitation of it, a duality underpinned by the tantalizing promise of the prosumer 'taking control' of their data and establishing sovereignty over it. Blockchain and GDPR alike aim to resolve 'problem'/'solution' matrices with deep roots in a wide variety of global economic, political, social, legal and cultural contexts. This article explores the problem of achieving resolution based on innovation and technology by offering an account of the rise of blockchain and the implementation of GDPR within a psycho-political framework, one in which fantasies of taking control are predominant yet highly contestable actualities in the lives of technology users.

Personal data are political. A feminist view on privacy and big data.

Recerca. Revista de pensament i anàlisi, 2019

The second-wave feminist critique of privacy defies the liberal opposition between the public-political and the private-personal. Feminist thinkers such as Hanisch, Young or Fraser note that, according to this liberal conception, public institutions often keep asymmetric power relations between private agents away from political discussion and action. The resulting subordination of some agents to others tends, therefore, to be naturalised and redefined as a «personal problem». Drawing on these contributions, this article reviews the social and political implications of big data exploitation and questions whether personal data protection must remain a matter of «privacy self-management». It aims to show that feminist political theory can decidedly help to identify and tackle the root causes of what I call «data domination».

The social, cultural, epistemological and technical basis of the concept of 'private' data

2012

In July 2008, the UK Information Commissioner launched a review of EU Directive 95/46/EC on the basis that: "European data protection law is increasingly seen as out of date, bureaucratic and excessively prescriptive. It is showing its age and is failing to meet new challenges to privacy, such as the transfer of personal details across international borders and the huge growth in personal information online. It is high time the law is reviewed and updated for the modern world." 1

Legal practitioners such as Bergkamp have expressed a similar sense of dissatisfaction with the current legislative approach: "Data Protection as currently conceived by the EU is a fallacy. It is a shotgun remedy against an incompletely conceptualised problem. It is an emotional, rather than rational reaction to feelings of discomfort with expanding data flows. The EU regime is not supported by any empirical data on privacy risks and demand…A future EU privacy program should focus on actual harms and apply targeted remedies." 2

Accordingly, this thesis critiques key concepts of existing data protection legislation, namely 'personal' and 'sensitive' data, in order to explore whether current data protection laws can simply be amended and supplemented to manage privacy in the information society. The findings from empirical research will demonstrate that a more radical change in EU law and policy is required to effectively address privacy in the digital economy. To this end, proposed definitions of data privacy and private data were developed and tested through semi-structured interviews with privacy and data protection experts. The expert responses indicate that Bergkamp et al. 3 have indeed identified a potential future direction for privacy and data protection, but that further research is required to develop a coherent definition of privacy protection based on managing risks to personal data and harm from the misuse of such information.

Balancing Privacy with Legitimate Surveillance and Lawful Data Access

IEEE Cloud Computing, 2015

Legislators and policymakers must be mindful of the difficulties facing any society that embraces new technological capacity without putting in place appropriate regulatory mechanisms and legal regimes. The overview provided here reviews these themes in the context of cloud technology.

Redesigning or Redefining Privacy

Snowden's revelations of 2013 shifted attention to the societal implications of surveillance practices, and in particular to privacy. This editorial reflects on key concepts and research questions raised in the issue. How can privacy be defined? Can it be designed? Considering such developments, this editorial asks whether the public's attitudes to the sharing of data have moved towards 'nothing to hide, nothing to fear' arguments, and whether greater awareness and corporate transparency are possible. Even if corporate surveillance does not operate through overt coercion, it is argued that it nonetheless results in self-regulation and subjugation to neoliberal rationality. Since telecoms and social media companies generally work hand in hand with the state, and the boundaries between legal and practical standpoints overlap to a great extent, how can privacy be safeguarded for citizens? And how can it be safeguarded where, as interviewee Mark Andrejevic suggests, the 'accountability' of data holders is a growing imperative? Contributions to this issue suggest that detailed attention to legal frameworks, encryption practices, definitions of the surveilled subject and the history of such scrutiny may hold some of the answers.