Safeguarding student privacy in an age of analytics

The privacy paradox and its implications for learning analytics

Proceedings of the Tenth International Conference on Learning Analytics & Knowledge, 2020

Learning analytics promises to support adaptive learning in higher education. However, the associated issues around privacy protection, especially their implications for students as data subjects, have been a hurdle to wide-scale adoption. In light of this, we set out to understand student expectations of privacy issues related to learning analytics and to identify gaps between what students desire and what they expect to happen or choose to do in reality when it comes to privacy protection. To this end, an investigation was carried out in a UK higher education institution using a survey (N=674) and six focus groups (26 students). The study highlights a number of key implications for learning analytics research and practice: (1) purpose, access, and anonymity are key benchmarks of ethics and privacy integrity; (2) transparency and communication are key levers for learning analytics adoption; and (3) information asymmetry can impede active participation of students in learning analytics.

“We're being tracked at all times”: Student perspectives of their privacy in relation to learning analytics in higher education

Journal of the Association for Information Science and Technology

Higher education institutions are continuing to develop their capacity for learning analytics (LA), a sociotechnical data mining and analytic practice. Institutions rarely inform their students about LA practices, and there are significant privacy concerns. Without a clear student voice in the design of LA, institutions put themselves in an ethical grey area. To help fill this gap in practice and add to the growing literature on students' privacy perspectives, this study reports findings from over 100 interviews with undergraduate students at eight United States higher education institutions. Findings demonstrate that students lacked awareness of educational data mining and analytic practices, as well as the data on which they rely. Students see potential in LA, but they presented nuanced arguments about when and with whom data should be shared; they also expressed why informed consent was valuable and necessary. The study uncovered previously unknown perspectives on institutional trust, as well as what actions might violate that trust. Institutions must balance their desire to implement LA with their obligation to educate students about their analytic practices and to treat them as partners in the design of analytic strategies reliant on student data, in order to protect their intellectual privacy.

Learning Analytics in action: ethics and privacy issues in the classroom

The field of learning analytics (LA) is working on the definition of frameworks that structure the legal and ethical issues that stakeholders have to take into account regarding LA solutions. While current efforts in this direction focus on institutional and development aspects, this paper reflects on small-scale, classroom-oriented approaches that aim to support teachers in their practice. This reflection is based on three studies in which we applied our teacher-led learning analytics approach in higher education and primary school contexts. We describe the ethical issues that emerged in these learning scenarios and discuss them along three dimensions: the overall learning analytics approach, the particular learning analytics solution adopted, and the educational contexts where the analytics are applied.

Learning analytics at the intersections of student trust, disclosure and benefit

2019

Evidence suggests that individuals are often willing to exchange personal data for (real or perceived) benefits. Such an exchange may be affected by their trust in a particular context and by their (real or perceived) control over their data. Students remain concerned about the scope and detail of surveillance of their learning behavior, their privacy, their control over what data are collected, the purpose of the collection, and the implications of any analysis. Questions arise as to the extent to which students are aware of the benefits and risks inherent in the exchange of their data, and whether they are willing to exchange personal data for more effective and supported learning experiences. This study reports on the views of entry-level students at the Open University (OU) in 2018. The primary aim is to explore differences between students' stated attitudes to privacy and their online behaviors, and whether these same attitudes extend to their university's uses of their (personal) data. The analysis indicates, inter alia, that there is no obvious relationship between how often students are online, or their awareness of and concerns about privacy issues in online contexts, and what they actually do to protect themselves. Significantly, though, the findings indicate that students overwhelmingly have an inherent trust in their university to use their data appropriately and ethically. Based on the findings, we outline a number of issues for consideration by higher education institutions, such as the need for transparency (of purpose and scope), the provision of some element of student control, and an acknowledgment of the exchange value of information in the privacy calculus.

Privacy and data protection in learning analytics should be motivated by an educational maxim—towards a proposal

Research and Practice in Technology Enhanced Learning

Privacy and data protection are major stumbling blocks for a data-driven educational future. Privacy policies are based on legal regulations, which in turn get their justification from political, cultural, economic and other kinds of discourses. Applied to learning analytics, do these policies also need a pedagogical grounding? This paper is based on an actual conundrum encountered while developing a technical specification on privacy and data protection for learning analytics for an international standardisation organisation. Legal arguments vary considerably around the world, and seeking ontological arguments for privacy does not necessarily lead to universal agreement on how to safeguard the learner in the face of new data-driven practices in education. It might be easier to build consensus around educational values, but is it possible to do so? This paper explores the legal and cultural contexts that make it a challenge to define universal principles for privacy and data protection. If universal principles are out of reach, could consent be the point of departure for assuring privacy? In education, this is not necessarily the case, as consent will be balanced against organisations' legitimate interests and contract. The different justifications for privacy, the legal obligation to separate analysis from intervention, and the way learning and teaching work make it necessary to argue data privacy from a pedagogical perspective. The paper concludes with three principles that are proposed to inform an educational maxim for privacy and data protection in learning analytics.

Ethical and privacy principles for learning analytics

The massive adoption of technology in learning processes comes with an equally large capacity to track learners. Learning analytics aims at using the collected information to understand and improve the quality of a learning experience. The privacy and ethical issues that emerge in this context are tightly interconnected with other aspects such as trust, accountability and transparency. In this paper, a set of principles is identified to narrow the scope of the discussion and point to pragmatic approaches that help design and research learning experiences where important ethical and privacy issues are considered.

Introduction (privacy in learning environments): The use of information and communication technology has significantly changed how learning experiences are conceived and deployed. The widespread use of various digital devices together with cloud computing allows for learning scenarios not previously considered. Students are now able to access a myriad of learning resources, interact with applications focusing on a specific topic, enhance their experience in virtual environments, augment reality and connect with others through social networks. The progress of technology evolves together with the capacity to record the events occurring in a learning environment. Every interaction and resource accessed can be captured and stored. As a consequence, learning scenarios can now be analyzed using big-data analytics techniques. Although the use of new technology is shaping the way we learn, a more significant change may derive from the use of big-data analytics (Siemens & Long, 2011).

Privacy Requirements for Learning Analytics – from Policies to Technical Solutions

2016

This paper is a first exploration of how privacy requirements could influence the design of each of the processes in a learning analytics framework model under development. The international standardisation subcommittee ISO/IEC JTC 1/SC36 is working on a reference model specifying a learning analytics process loop of six processes: Learning Activity, Data Collection, Data Storing and Processing, Analyzing, Visualization, and Feedback Actions. According to the framework, all of these processes are influenced by privacy policies.
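
As an illustration only (not taken from the paper), the sketch below shows one way the six processes of such a loop could each be gated by privacy requirements before they run. The process names follow the abstract; the PrivacyPolicy class and the example requirement names ("informed_consent", "anonymisation", "purpose_limitation") are assumptions.

```python
from enum import Enum, auto


class LAProcess(Enum):
    """The six processes of the SC36 learning analytics process loop, as listed in the abstract."""
    LEARNING_ACTIVITY = auto()
    DATA_COLLECTION = auto()
    DATA_STORING_AND_PROCESSING = auto()
    ANALYZING = auto()
    VISUALIZATION = auto()
    FEEDBACK_ACTIONS = auto()


class PrivacyPolicy:
    """Hypothetical policy object: maps each process to requirements that must hold before it runs."""

    def __init__(self, requirements):
        # requirements: dict mapping LAProcess -> list of requirement names
        self.requirements = requirements

    def check(self, process, satisfied):
        """Return the requirements for `process` that are not yet satisfied."""
        return [r for r in self.requirements.get(process, []) if r not in satisfied]


# Example: data collection requires consent; analysis requires anonymisation and purpose limitation.
policy = PrivacyPolicy({
    LAProcess.DATA_COLLECTION: ["informed_consent"],
    LAProcess.ANALYZING: ["anonymisation", "purpose_limitation"],
})

missing = policy.check(LAProcess.ANALYZING, satisfied={"anonymisation"})
print(missing)  # ['purpose_limitation'] -> this process should not proceed yet
```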

Guest editorial: Ethics and privacy in learning analytics

Journal of Learning Analytics, 2016

The European Learning Analytics Community Exchange (LACE) project is responsible for an ongoing series of workshops on ethics and privacy in learning analytics (EP4LA), which have driven and transformed activity in these areas. Some of this activity has been brought together with other work in the papers that make up this special issue. These papers cover the creation and development of ethical frameworks, as well as tools and approaches that can be used to address issues of ethics and privacy. This editorial suggests that it is worth taking time to consider the often intertwined issues of ethics, data protection, and privacy separately. The challenges mentioned within the special issue are summarised in a table of 22 challenges, which is used to identify the values that underpin work in this area. Nine ethical goals are suggested as the editors' interpretation of the unstated values that lie behind the challenges raised in the special issue.

Ethical and privacy issues in the application of learning analytics

Proceedings of the Fifth International Conference on Learning Analytics And Knowledge - LAK '15, 2015

The large-scale production, collection, aggregation, and processing of information from various learning platforms and online environments have led to ethical and privacy concerns regarding potential harm to individuals and society. In the past, these types of concern have affected areas as diverse as computer science, legal studies, and surveillance studies. Within a European consortium that brings together the EU project LACE, the SURF SIG Learning Analytics, the Apereo Foundation, and the EATEL SIG dataTEL, we aim to understand the issues with greater clarity and to find ways of overcoming the issues and research challenges related to ethical and privacy aspects of learning analytics practice. This interactive workshop aims to raise awareness of major ethics and privacy issues. It will also be used to develop practical solutions to advance the application of learning analytics technologies.

Privacy in Learning Analytics – Implications for System Architecture

2015

This paper explores the field of ICT standardisation related to learning analytics, a new class of technologies being introduced to schools, universities, and further education as a consequence of increased access to data from learning activities. Learning analytics has implications for how the individual manages data and knowledge about herself and her learning, highlighting issues of privacy, ownership of data, and consent to share and use data; issues that have not yet been fully discussed in the field of learning technology development in general, and in the standardisation of learning technologies in particular. What do these issues mean for standardisation and the design of LA architectures? Based on requirements of open architecture, transparency and trust, and ownership and consent, this paper proposes a search architecture for learning analytics based on open and linked data. The proposed middle layer highlights dynamic usage agreements and student agency and represents an alternative a...
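
Purely as a sketch of what a "dynamic usage agreement" in such a middle layer could look like, and not as the paper's own design, the example below models a student-controlled, revocable agreement that downstream components would have to consult before using data. The UsageAgreement data model, field names, and example values are all assumptions introduced for illustration.

```python
from dataclasses import dataclass, field
from datetime import datetime


@dataclass
class UsageAgreement:
    """Hypothetical dynamic usage agreement held in an LA middle layer.

    The student (data subject) grants a named consumer access to named data
    categories for a stated purpose, and can revoke the grant at any time.
    """
    student_id: str
    consumer: str            # e.g. "course-dashboard"
    data_categories: list    # e.g. ["clickstream", "assessment-scores"]
    purpose: str             # e.g. "formative feedback"
    granted_at: datetime = field(default_factory=datetime.utcnow)
    revoked: bool = False

    def revoke(self):
        """Student agency: withdraw the agreement; consumers must stop using the data."""
        self.revoked = True

    def permits(self, consumer, category, purpose):
        """Check whether a specific use is still covered by this agreement."""
        return (not self.revoked
                and consumer == self.consumer
                and category in self.data_categories
                and purpose == self.purpose)


# Example: the dashboard may read clickstream data for feedback until the student revokes.
agreement = UsageAgreement("s-001", "course-dashboard",
                           ["clickstream"], "formative feedback")
print(agreement.permits("course-dashboard", "clickstream", "formative feedback"))  # True
agreement.revoke()
print(agreement.permits("course-dashboard", "clickstream", "formative feedback"))  # False
```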