Big Data and Governance

Big Data Governance Needs More Collective Responsibility: The Role of Harm Mitigation in the Governance of Data Use in Medicine and Beyond

Medical Law Review, 2019

Harms arising from digital data use in the big data context are often systemic and cannot always be captured by linear cause and effect. Individual data subjects and third parties can bear the main downstream costs arising from increasingly complex forms of data use, without being able to trace the exact data flows. Because current regulatory frameworks do not adequately address this situation, we propose a move towards harm mitigation tools to complement existing legal remedies. In this article, we make a normative and practical case for why individuals should be offered support in such contexts and how harm mitigation tools can achieve this. We put forward the idea of 'Harm Mitigation Bodies' (HMBs), which people could turn to when they feel they have been harmed by data use but do not qualify for legal remedies, or where existing legal remedies do not address their specific circumstances. HMBs would help to obtain a better understanding of the nature, severity, and frequency of harms arising from both lawful and unlawful data use, and they could also provide financial support in some cases. We set out the role and form of these HMBs for the first time in this article.

Mustafa's case

Mustafa loves good coffee. In his free time, he often browses high-end coffee machines that he cannot currently afford but is saving for. One day, travelling to a friend's wedding abroad, he happens to sit next to another friend on the plane. When Mustafa complains about how much he paid for his ticket, it turns out that his friend paid less than half of what he paid. Mustafa googles possible reasons for this and concludes that it must be related to his browsing expensive coffee machines and equipment. He is very angry about this and complains to the airline, who send him a lukewarm apology that refers to dynamic and personalised pricing models. Mustafa feels that this is unfair but does not challenge it, because pursuing the matter would cost him time and money.
Paula's case

After years of trying to conceive, Paula is pregnant. Five months into the pregnancy she suffers a miscarriage. Paula and her partner are heartbroken. For months after the end of her pregnancy, Paula keeps receiving advertisements from shops specialising in maternity and infant products and services, congratulating her on the 'milestones' of her supposed baby. This is an immensely aggravating and distressing experience for Paula and her partner. Paula's partner calls the companies that send these advertisements, demanding that they erase the couple's names from their databases. He also demands to know where they got Paula's contact details in the first place, but he does not receive any answers. Paula suspects that one of her doctors passed on her details to retailers, but she cannot prove it.

Regulation of Big Data: Perspectives on strategy, policy, law and privacy

Health and Technology, 2017

This article encapsulates selected themes from the Australian Data to Decisions Cooperative Research Centre's Law and Policy program. It is the result of a discussion on the regulation of Big Data, focusing especially on privacy and data protection strategies. It presents four complementary perspectives stemming from governance, law, ethics, and computer science. Big, Linked, and Open Data constitute complex phenomena whose economic and political dimensions require a plurality of instruments to enhance and protect citizens' rights. Conclusions are offered at the end to foster a more general discussion.

Data protection in a big data society. Ideas for a future regulation

Digital Investigation, 2015

Big data society has changed the traditional forms of data analysis and created a new predictive approach to knowledge and investigation. In this light, it is necessary to consider the impact of this new paradigm on the traditional notion of data protection and its regulation. Focussing on the individual and communal dimensions of data use, encompassing digital investigations, the authors outline the challenges that big data poses for individual informational self-determination, reasonable suspicion, and collective interests. The article therefore suggests some innovative proposals that may update the existing data protection legal framework and help make it responsive to the present algorithmic society.

Tracing Big Data Imaginaries through Public Policy: The Case of the European Commission

2018

Across the globe, the notion of Big Data has received much attention, not only in technology and business circles but also among political authorities. Public officials in Europe, the U.S., and beyond have formulated Big Data strategies intended to steer I(C)T development towards certain goals and aspirations. Drawing on official European Commission documents and using the notion of sociotechnical imaginaries as a sensitising concept, this chapter investigates the values, beliefs, and interests that guide European policymakers' Big Data rhetoric. It argues that while the Commission's embrace of a strong free-market position can be partly explained in terms of vexing economic, institutional, and epistemic challenges, its push for Big Data solutions threatens to undermine democratic rights and principles as well as efforts towards responsible research and innovation. The chapter concludes with recommendations for further research, emphasising the need for cross-disciplinary dialogue and scholarship.

Risks and Legal Protections in the World of Big-Data

Journal of Health Law, 2018

The development of large electronic data sets, whether from electronic health records, health registries, or large-scale gene-environment interaction studies, offers unparalleled, innovative opportunities to learn more about human health and disease. However, because these data may be used in unexpected ways without the knowledge or consent of the individuals whose data are being used, they also raise critical concerns about protecting individuals against risks. Traditional approaches to protecting research participants and patients may not address new or heightened risks in the "big data" era. We conducted legal research to elucidate the web of legal protections afforded to research participants in genomic research, including laws governing human subjects research, privacy, consent, discrimination, and the use of research participants' genetic information, with broader implications for data use in other research or health care settings. This research revealed substantial regulatory activity and variation across the 50 states that may fill known gaps in federal protections. For example, some states go further than the Genetic Information Nondiscrimination Act (GINA) by extending genetic antidiscrimination statutes to life and disability insurers or to employers with fewer than 15 employees. In addition, states often explicitly provide remedies, such as statutory damages, attorneys' fees, and costs, that can facilitate the enforcement of legal rights not afforded in federal laws. This analysis can inform approaches to data privacy law in the United States and beyond to provide appropriate protections as health systems and scientists seek to harness the promise of big data.

The UK Reform of Data Protection: Impact on Data Subjects, Harm Prevention, and Regulatory Probity

2022

In September 2021, the UK government released a set of proposed reforms to its data protection regime for public consultation. The reforms are part of a broader national strategy, which aims to incentivise data-driven innovation and make the UK an international “data hub”. In this article, we argue that taken together, the proposed reforms risk (1) undermining the data subjects’ rights that were ensured with the adoption of the EU GDPR into UK law; (2) introducing an accountability framework that is inadequate to address harm prevention; and (3) eroding the regulatory probity of the Information Commissioner’s Office (ICO). We also comment on the analysis of the expected impact of the reform, discussing the negative impact for both public and private stakeholders, especially in light of the “Brussels effect” and growing international compliance with the EU GDPR.

The ethics of big data as a public good: which public? Whose good?

International development and humanitarian organizations are increasingly calling for digital data to be treated as a public good because of its value in supplementing scarce national statistics and informing interventions, including in emergencies. In response to this claim, a 'responsible data' movement has evolved to discuss guidelines and frameworks that will establish ethical principles for data sharing. However, this movement is not gaining traction with those who hold the highest-value data, particularly mobile network operators, who are proving reluctant to make data collected in low- and middle-income countries accessible through intermediaries. This paper evaluates how the argument for 'data as a public good' fits with the corporate reality of big data, exploring existing models for data sharing. I draw on the idea of corporate data as an ecosystem involving often conflicting rights, duties, and claims, in comparison to the utilitarian claim that data's humanitarian value makes sharing it imperative. I assess the power dynamics implied by the idea of data as a public good, and how differing incentives lead actors to adopt particular ethical positions with regard to the use of data.

Impossible, unknowable, accountable: Dramas and dilemmas of data law

Social Studies of Science, 2019

On May 25, 2018, the European Union’s General Data Protection Regulation (GDPR) came into force. EU citizens are granted more control over personal data while companies and organizations are charged with increased responsibility enshrined in broad principles like transparency and accountability. Given the scope of the regulation, which aims to harmonize data practices across 28 member states with different concerns about data collection, the GDPR has significant consequences for individuals in the EU and globally. While the GDPR is primarily intended to regulate tech companies, it also has important implications for data use in scientific research. Drawing on ethnographic fieldwork with researchers, lawyers and legal scholars in Sweden, I argue that the GDPR’s flexible accountability principle effectively encourages researchers to reflect on their ethical responsibility but can also become a source of anxiety and produce unexpected results. Many researchers I spoke with expressed profound uncertainty about ‘impossible’ legal requirements for research data use. Despite the availability of legal texts and interpretations, I suggest we should take researchers’ concerns about ‘unknowable’ data law seriously. Many researchers’ sense of legal ambiguity led them to rethink their data practices and themselves as ethical subjects through an orientation to what they imagined as the ‘real people behind the data’, variously formulated as a Swedish population desiring data use for social benefit or a transnational public eager for research results. The intentions attributed to people, populations and publics – whom researchers only encountered in the abstract form of data – lent ethical weight to various and sometimes conflicting decisions about data security and sharing. Ultimately, researchers’ anxieties about their inability to discern the desires of the ‘real people’ lent new appeal to solutions, however flawed, that promised to alleviate the ethical burden of personal data.