Operationalising transparency as an integral value of Learning Analytics systems: from ethical and data protection to technical design requirements
Related papers
Ethical and privacy issues in the design of learning analytics applications
Proceedings of the Sixth International Conference on Learning Analytics & Knowledge - LAK '16, 2016
Issues related to ethics and privacy have become a major stumbling block in the large-scale application of learning analytics technologies. Recently, the learning analytics community at large has more actively addressed the EP4LA issues, and we are now starting to see learning analytics solutions that are designed not as an afterthought, but with these issues in mind. The 2nd EP4LA@LAK16 workshop will bring the discussion on ethics and privacy for learning analytics to the next level, helping to build an agenda for the organizational and technical design of LA solutions that addresses the different processes of a learning analytics workflow.
HCI International 2022 Posters, 2022
Learning analytics (LA) is an emerging field of science due to its great potential to better understand, support and improve the learning and teaching process. Many higher education institutions (HEIs) have already included LA in their digitalization strategies. This process was further accelerated during the COVID-19 pandemic, when HEIs transitioned from face-to-face learning environments to hybrid and e-learning environments and relied entirely on technology to continue operating. Undoubtedly, there was never a time when so much student data was collected, analysed, and reported, which brings numerous ethical and data protection concerns to the forefront. For example, a critical issue when implementing LA is to determine which data should be processed to fulfil pedagogical purposes while ensuring that LA complies with ethical principles and data protection law such as the European General Data Protection Regulation (GDPR). This article contributes to the discussion on how to design LA applications that are not only useful and innovative but also trustworthy and enable higher education learners to make data-informed decisions about their learning process. For that purpose, we first present the idea and methodology behind the development of our interdisciplinary Criteria Catalogue for trustworthy LA applications intended for students. The Criteria Catalogue is a new normative framework that supports students in assessing the trustworthiness of LA applications. It consists of seven defined Core Areas (i.e. autonomy, protection, respect, non-discrimination, responsibility and accountability, transparency, and privacy and good data governance), with corresponding criteria and indicators. Next, we apply this normative framework to learning diaries as a specific LA application.
Our goal is to demonstrate how ethical and legal aspects could be translated into specific recommendations and design implications that should accompany the whole lifecycle of LA applications.
The influence of data protection and privacy frameworks on the design of learning analytics systems
Proceedings of the Seventh International Learning Analytics & Knowledge Conference, 2017
Learning analytics open up a complex landscape of privacy and policy issues, which will influence how learning analytics systems and practices are designed. Research and development is governed by regulations for data storage and management, and by research ethics. Consequently, when moving solutions out of the research labs, implementers meet constraints defined in national laws and justified in privacy frameworks. This paper explores how the OECD, APEC and EU privacy frameworks seek to regulate data privacy, with significant implications for the discourse of learning and, ultimately, an impact on the design of tools, architectures and practices that are now on the drawing board. A detailed list of requirements for learning analytics systems is developed, based on the new legal requirements defined in the European General Data Protection Regulation, which from 2018 will be enforced as European law. The paper also gives an initial account of how the privacy discourse in Europe, Japan, South Korea and China is developing, and reflects upon the possible impact of the different privacy frameworks on the design of LA privacy solutions in these countries. This research contributes to knowledge of how concerns about privacy and data protection related to educational data can drive a discourse on new approaches to privacy engineering based on the principles of Privacy by Design. For the LAK community, this study represents the first attempt to conceptualise the issues of privacy and learning analytics in a cross-cultural context. The paper concludes with a plan to follow up this research on privacy policies and learning analytics systems development with a new international study. CCS Concepts: • Security and privacy → Privacy protections • General and reference → Design • Security and privacy → Social aspects of security and privacy • Applied computing → E-learning
Learning analytics at the intersections of student trust, disclosure and benefit
2019
Evidence suggests that individuals are often willing to exchange personal data for (real or perceived) benefits. Such an exchange may be affected by their trust in a particular context and their (real or perceived) control over their data. Students remain concerned about the scope and detail of surveillance of their learning behavior, their privacy, their control over what data are collected, the purpose of the collection, and the implications of any analysis. Questions arise as to the extent to which students are aware of the benefits and risks inherent in the exchange of their data, and whether they are willing to exchange personal data for more effective and supported learning experiences. This study reports on the views of entry-level students at the Open University (OU) in 2018. The primary aim is to explore differences between students' stated attitudes to privacy and their online behaviors, and whether these same attitudes extend to their university's uses of their (personal) data. The analysis indicates, inter alia, that there is no obvious relationship between how often students are online, or their awareness of and concerns about privacy issues in online contexts, and what they actually do to protect themselves. Significantly, though, the findings indicate that students overwhelmingly have an inherent trust in their university to use their data appropriately and ethically. Based on the findings, we outline a number of issues for consideration by higher education institutions, such as the need for transparency (of purpose and scope), the provision of some element of student control, and an acknowledgment of the exchange value of information in the nexus of the privacy calculus. CCS Concepts: • Social and professional topics → Privacy policies • Security and privacy → Privacy protections
Research and Practice in Technology Enhanced Learning
Privacy and data protection are major stumbling blocks for a data-driven educational future. Privacy policies are based on legal regulations, which in turn get their justification from political, cultural, economic and other kinds of discourses. Applied to learning analytics, do these policies also need a pedagogical grounding? This paper is based on an actual conundrum in developing a technical specification on privacy and data protection for learning analytics for an international standardisation organisation. Legal arguments vary considerably around the world, and seeking ontological arguments for privacy does not necessarily lead to universal acclaim for safeguarding the learner who meets the new data-driven practices in education. Perhaps it would be easier to build consensus around educational values, but is it possible to do so? This paper explores the legal and cultural contexts that make it a challenge to define universal principles for privacy and data protection. If not universal principles, could consent be the point of departure for assuring privacy? In education, this is not necessarily the case, as consent will be balanced against organisations' legitimate interests and contract. The different justifications for privacy, the legal obligation to separate analysis from intervention, and the way learning and teaching work make it necessary to argue data privacy from a pedagogical perspective. The paper concludes with three principles that are proposed to inform an educational maxim for privacy and data protection in learning analytics.
Ethical and privacy principles for learning analytics
The massive adoption of technology in learning processes comes with an equally large capacity to track learners. Learning analytics aims at using the collected information to understand and improve the quality of a learning experience. The privacy and ethical issues that emerge in this context are tightly interconnected with other aspects such as trust, accountability and transparency. In this paper, a set of principles is identified to narrow the scope of the discussion and point to pragmatic approaches that help design and research learning experiences in which important ethical and privacy issues are considered. The use of information and communication technology has significantly changed how learning experiences are conceived and deployed. The widespread use of various digital devices together with cloud computing allows for learning scenarios not previously considered. Students are now able to access a myriad of learning resources, interact with applications focusing on a specific topic, enhance their experience through virtual environments and augmented reality, and connect with others through social networks. The progress of technology evolves together with the capacity to record the events occurring in a learning environment. Every interaction and resource accessed can be captured and stored. As a consequence, learning scenarios can now be analyzed using big-data analytics techniques. Although the use of new technology is shaping the way we learn, a more significant change may derive from the use of big-data analytics (Siemens & Long, 2011).
Journal of Learning Analytics, 2016
Studies have shown that issues of privacy, control of data, and trust are essential to the implementation of learning analytics systems. If these issues are not addressed appropriately, systems will tend to collapse due to a legitimacy crisis, or they will not be implemented in the first place due to resistance from learners, their parents, or their teachers. This paper asks what it means to give priority to privacy in terms of data exchange and application design, and offers a conceptual tool, a Learning Analytics Design Space model, to ease requirement solicitation and design for new learning analytics solutions. The paper argues the case for privacy-driven design as an essential part of learning analytics systems development. A simple model defining a solution as the intersection of an approach, a barrier, and a concern is extended with a process focussing on design justifications to allow for incremental development of solutions. This research is exploratory in nature, and furthe...
Privacy in Learning Analytics – Implications for System Architecture
2015
This paper explores the field of ICT standardisation related to learning analytics, a new class of technologies being introduced to schools, universities and further education as a consequence of increased access to data from learning activities. Learning analytics has implications for how the individual manages data and knowledge about herself and her learning, highlighting issues of privacy, ownership of data, and consent to share and use data: issues that have not yet been fully discussed in the field of learning technology development in general, and in the standardisation of learning technologies in particular. What do these issues mean for the standardisation and design of LA architectures? Based on requirements of open architecture, transparency and trust, and ownership and consent, this paper proposes a search architecture for learning analytics based on open and linked data. The proposed middle layer highlights dynamic usage agreements and student agency and represents an alternative a...
More Than Figures on Your Laptop: (Dis)trustful Implementation of Learning Analytics
Journal of Learning Analytics, 2021
The adoption of learning analytics (LA) in complex educational systems is woven into sociocultural and technical challenges that have induced distrust in data and difficulties in scaling LA. This paper presents a study that investigated areas of distrust and threats to trustworthy LA through a series of consultations with teaching staff and students at a large UK university. Surveys and focus groups were conducted to explore participant expectations of LA. The observed distrust is broadly attributed to three areas: the subjective nature of numbers, the fear of power diminution, and approaches to design and implementation of LA. The paper highlights areas to maintain existing trust with policy procedures and areas to cultivate trust by engaging with tensions arising from the social process of LA.
Students' Information Privacy Concerns in Learning Analytics: Towards a Model Development
ArXiv, 2021
The widespread interest in learning analytics (LA) is associated with increased availability of and access to student data, where students' actions are monitored, recorded, stored and analyzed. The availability and analysis of such data is argued to be crucial for improved learning and teaching. Yet these data can be exposed to misuse, for example for commercial purposes, resulting in information privacy concerns (IPC) among students, who are the key stakeholders and data subjects in the LA context. The main objective of this study is to propose a theoretical model to understand the IPC of students in relation to LA. We explore the concept of IPC as a central construct between its two antecedents, perceived privacy vulnerability and perceived privacy control, and its consequences, trusting beliefs and self-disclosure behavior. Although these relationships have been investigated in other contexts, this study aims to offer mainly theoretical insights on how these...