Privacy in Learning Analytics – Implications for System Architecture
Related papers
An exploration of standardisation options for the new field of learning analytics
2015
This paper explores the field of ICT standardisation related to learning analytics, a new class of technologies being introduced to schools, universities and further education as a consequence of increased access to data from learning activities. Issues related to requirements for data sharing are registered, and the challenges of privacy, ownership of data, and consent are identified as major obstacles to successful implementation of learning analytics. What does this mean for standardisation and design of LA architectures? Based on the requirements of open architecture, transparency and trust, and ownership and consent, this paper proposes a search architecture for learning analytics based on open and linked data. The proposed middle layer highlights dynamic usage agreements and end-user agency, and represents an alternative approach to the LA architectures now being developed in international standardisation fora. Keywords—learning analytics; data sharing; interoperability; privacy; data o...
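To make the idea of such a middle layer concrete, the sketch below shows one possible way a dynamic usage agreement could gate data requests before they reach open and linked data sources. This is a minimal illustration under assumed names (UsageAgreement, DataRequest, UsageAgreementLayer); it is not the architecture specified in the paper.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: class and field names are assumptions,
# not the paper's specification of the middle layer.

@dataclass
class UsageAgreement:
    """A dynamic agreement the learner can revise at any time."""
    learner_id: str
    allowed_purposes: set = field(default_factory=set)   # e.g. {"feedback"}
    allowed_consumers: set = field(default_factory=set)  # e.g. {"my_teacher"}

@dataclass
class DataRequest:
    learner_id: str
    consumer: str
    purpose: str
    query: str  # e.g. a query against open/linked data sources

class UsageAgreementLayer:
    """Middle layer: every request is checked against the learner's current agreement."""

    def __init__(self) -> None:
        self._agreements: dict[str, UsageAgreement] = {}

    def register(self, agreement: UsageAgreement) -> None:
        self._agreements[agreement.learner_id] = agreement

    def authorise(self, request: DataRequest) -> bool:
        agreement = self._agreements.get(request.learner_id)
        if agreement is None:
            return False  # no agreement in place, no data sharing
        return (request.purpose in agreement.allowed_purposes
                and request.consumer in agreement.allowed_consumers)
```

The point of the sketch is the design choice it encodes: the agreement is data owned by the learner and consulted at request time, so end-user agency is preserved even as consumers and purposes change.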
The influence of data protection and privacy frameworks on the design of learning analytics systems
Proceedings of the Seventh International Learning Analytics & Knowledge Conference, 2017
Learning analytics open up a complex landscape of privacy and policy issues, which will influence how learning analytics systems and practices are designed. Research and development is governed by regulations for data storage and management, and by research ethics. Consequently, when moving solutions out of the research labs, implementers meet constraints defined in national laws and justified in privacy frameworks. This paper explores how the OECD, APEC and EU privacy frameworks seek to regulate data privacy, with significant implications for the discourse of learning and, ultimately, an impact on the design of tools, architectures and practices that are now on the drawing board. A detailed list of requirements for learning analytics systems is developed, based on the new legal requirements defined in the European General Data Protection Regulation, which from 2018 will be enforced as European law. The paper also gives an initial account of how the privacy discourse in Europe, Japan, South Korea and China is developing, and reflects upon the possible impact of the different privacy frameworks on the design of LA privacy solutions in these countries. This research contributes to knowledge of how concerns about privacy and data protection related to educational data can drive a discourse on new approaches to privacy engineering based on the principles of Privacy by Design. For the LAK community, this study represents the first attempt to conceptualise the issues of privacy and learning analytics in a cross-cultural context. The paper concludes with a plan to follow up this research on privacy policies and learning analytics systems development with a new international study. CCS Concepts: • Security and privacy → Privacy protections • General and reference → Design • Security and privacy → Social aspects of security and privacy • Applied computing → E-learning
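One way to picture a GDPR-derived requirements list for an LA system is as a checklist keyed to the GDPR's Article 5 principles. The principle names below are from the regulation itself, but the example requirement texts and the audit helper are illustrative assumptions, not the paper's actual catalogue.

```python
# Minimal sketch, assuming requirements are grouped under the GDPR Article 5
# principles; the requirement strings are illustrative, not the paper's list.
GDPR_LA_REQUIREMENTS = {
    "lawfulness, fairness and transparency": [
        "inform learners which data are collected and why",
    ],
    "purpose limitation": [
        "use learning data only for the stated educational purpose",
    ],
    "data minimisation": [
        "collect only the attributes the analysis actually needs",
    ],
    "accuracy": [
        "let learners inspect and correct data held about them",
    ],
    "storage limitation": [
        "delete or anonymise data once the course or analysis ends",
    ],
    "integrity and confidentiality": [
        "encrypt learner records in transit and at rest",
    ],
    "accountability": [
        "log who accessed which learner data, and for what purpose",
    ],
}

def audit(system_capabilities: set) -> list:
    """Return the requirements the system does not yet claim to meet."""
    return [req
            for reqs in GDPR_LA_REQUIREMENTS.values()
            for req in reqs
            if req not in system_capabilities]
```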
Journal of Learning Analytics, 2016
Studies have shown that issues of privacy, control of data, and trust are essential to the implementation of learning analytics systems. If these issues are not addressed appropriately, systems will tend to collapse due to a legitimacy crisis, or they will not be implemented in the first place due to resistance from learners, their parents, or their teachers. This paper asks what it means to give priority to privacy in terms of data exchange and application design, and offers a conceptual tool, a Learning Analytics Design Space model, to ease requirements solicitation and design for new learning analytics solutions. The paper argues the case for privacy-driven design as an essential part of learning analytics systems development. A simple model defining a solution as the intersection of an approach, a barrier, and a concern is extended with a process focussing on design justifications to allow for an incremental development of solutions. This research is exploratory in nature, and furthe...
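Read as a data model, the design space described above could be sketched roughly as follows. The field names and example values are assumptions for illustration, not the paper's own notation; the only structure taken from the abstract is that a solution sits at the intersection of an approach, a barrier, and a concern, and accumulates design justifications incrementally.

```python
from dataclasses import dataclass, field

@dataclass
class Solution:
    """A point in the Learning Analytics Design Space (illustrative sketch)."""
    approach: str                 # e.g. "anonymisation"
    barrier: str                  # e.g. "lack of trust in data sharing"
    concern: str                  # e.g. "learner privacy"
    justifications: list = field(default_factory=list)

    def justify(self, reason: str) -> None:
        """Record an incremental design justification for this solution."""
        self.justifications.append(reason)

# Hypothetical usage: one candidate solution and its first justification
s = Solution(approach="anonymisation",
             barrier="lack of trust in data sharing",
             concern="learner privacy")
s.justify("pseudonymous identifiers allow analysis without exposing identities")
```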
Research and Practice in Technology Enhanced Learning
Privacy and data protection are major stumbling blocks for a data-driven educational future. Privacy policies are based on legal regulations, which in turn get their justification from political, cultural, economic and other kinds of discourses. Applied to learning analytics, do these policies also need a pedagogical grounding? This paper is based on an actual conundrum in developing a technical specification on privacy and data protection for learning analytics for an international standardisation organisation. Legal arguments vary greatly around the world, and seeking ontological arguments for privacy does not necessarily lead to universal acclaim for safeguarding the learner who meets the new data-driven practices in education. Maybe it would be easier to build consensus around educational values, but is it possible to do so? This paper explores the legal and cultural contexts that make it a challenge to define universal principles for privacy and data protection. If not universal principles, could consent be the point of departure for assuring privacy? In education, this is not necessarily the case, as consent will be balanced against organisations' legitimate interests and contract. The different justifications for privacy, the legal obligation to separate analysis from intervention, and the way learning and teaching work make it necessary to argue data privacy from a pedagogical perspective. The paper concludes with three principles that are proposed to inform an educational maxim for privacy and data protection in learning analytics.
Privacy Requirements for Learning Analytics – from Policies to Technical Solutions
2016
This paper is a first exploration of how privacy requirements could influence the design of each of the processes in a learning analytics framework model under development. The International Organization for Standardization, through ISO/IEC JTC 1/SC36, is working on a reference model specifying a learning analytics process loop of six processes: Learning Activity, Data Collection, Data Storing and Processing, Analyzing, Visualization, and Feedback Actions. All these processes are, according to the framework, influenced by privacy policies.
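The loop and its privacy hooks could be sketched as below. The six process names come from the reference model cited above; the policy representation and the gate function are illustrative assumptions, not part of the ISO/IEC draft.

```python
from enum import Enum

class LAProcess(Enum):
    """The six processes of the learning analytics process loop."""
    LEARNING_ACTIVITY = "Learning Activity"
    DATA_COLLECTION = "Data Collection"
    DATA_STORING_AND_PROCESSING = "Data Storing and Processing"
    ANALYZING = "Analyzing"
    VISUALIZATION = "Visualization"
    FEEDBACK_ACTIONS = "Feedback Actions"

def privacy_gate(process: LAProcess, policy: dict) -> bool:
    """Illustrative check: a process only runs if the policy permits it."""
    return policy.get(process.value, False)

def run_loop(policy: dict) -> None:
    for process in LAProcess:
        if not privacy_gate(process, policy):
            print(f"{process.value}: blocked by privacy policy")
            continue
        print(f"{process.value}: running")

# Example: a policy that forbids, say, visualisation of individual-level data
run_loop({p.value: True for p in LAProcess} | {"Visualization": False})
```

The sketch reflects the framework's claim that privacy policies cut across every stage of the loop rather than being a concern of data collection alone.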
Ethical and privacy issues in the design of learning analytics applications
Proceedings of the Sixth International Conference on Learning Analytics & Knowledge - LAK '16, 2016
Issues related to ethics and privacy have become a major stumbling block in the application of learning analytics technologies on a large scale. Recently, the learning analytics community at large has more actively addressed the EP4LA issues, and we are now starting to see learning analytics solutions that are designed not as an afterthought, but with these issues in mind. The 2nd EP4LA@LAK16 workshop will bring the discussion on ethics and privacy for learning analytics to the next level, helping to build an agenda for the organizational and technical design of LA solutions, addressing the different processes of a learning analytics workflow.
LEA in Private: A Privacy and Data Protection Framework for a Learning Analytics Toolbox
Journal of Learning Analytics, 2016
To find a balance between learning analytics research and individual privacy, learning analytics initiatives need to appropriately address ethical, privacy and data protection issues and comply with relevant legal regulations. A range of general guidelines, model codes, and principles for handling ethical issues and for appropriate data and privacy protection exist, which may serve the consideration of these topics in a learning analytics context. The importance and significance of data security and protection are also reflected in national and international laws and directives, where data protection is usually considered a fundamental right. Existing guidelines, approaches and relevant regulations served as a basis for elaborating a comprehensive privacy and data protection framework for the LEA's BOX project. It comprises a set of eight principles to derive implications for ensuring an ethical treatment of personal data in a learning analytics platform and its services. The priv...
The privacy paradox and its implications for learning analytics
Proceedings of the Tenth International Conference on Learning Analytics & Knowledge, 2020
Learning analytics promises to support adaptive learning in higher education. However, the associated issues around privacy protection, especially their implications for students as data subjects, have been a hurdle to wide-scale adoption. In light of this, we set out to understand student expectations of privacy issues related to learning analytics and to identify gaps between what students desire and what they expect to happen or choose to do in reality when it comes to privacy protection. To this end, an investigation was carried out in a UK higher education institution using a survey (N=674) and six focus groups (26 students). The study highlights a number of key implications for learning analytics research and practice: (1) purpose, access, and anonymity are key benchmarks of ethics and privacy integrity; (2) transparency and communication are key levers for learning analytics adoption; and (3) information asymmetry can impede active participation of students in learning analytics. CCS Concepts: • Applied computing → Computer-assisted instruction • Human-centered computing → Empirical studies in HCI
Learning and Collaboration Technologies, 2023
With the rising complexity of technology and its introduction into educational settings, the question of trusting and designing trustworthy learning analytics (LA) systems has gained importance. Transparency is one of the values that can contribute to enhancing an LA system's trustworthiness. It has been included and discussed as a separate core value or principle in many ethical frameworks for LA. Even though these frameworks provide valuable contributions, they are mostly limited to the conceptual level. Defining what transparency entails in the context of LA is an important aspect; nevertheless, the translation and operationalisation of such abstract concepts into technology should be equally considered. In this paper, we focus on the question of how transparency can be translated into concrete design requirements in order to enhance the trustworthiness of LA systems. First, we present a normative framework in the form of an interdisciplinary Criteria Catalogue for trustworthy LA, which consists of seven core areas, including transparency. Second, we demonstrate how transparency can be translated and operationalised into more specific and low-level elements, using the example of the Learners' Corner LA dashboard developed within the project "Learning Analytics – Students in Focus". Third, we share the results of a study conducted to better understand students' information needs in relation to LA tools and to evaluate our design choices for the introduction of three quick information buttons within the Learners' Corner.
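A rough way to picture the step from the core value "transparency" to concrete dashboard elements is a mapping from transparency criteria to requirement texts that can back an information button. The criteria and requirement strings below are assumptions for illustration; they are not the project's actual Criteria Catalogue or the Learners' Corner design specification.

```python
# Illustrative sketch only: entries are assumptions, not the real catalogue.
TRANSPARENCY_REQUIREMENTS = {
    "data transparency": [
        "show which data sources feed each dashboard widget",
    ],
    "algorithmic transparency": [
        "explain in plain language how each indicator is computed",
    ],
    "purpose transparency": [
        "state why the indicator is shown and how it supports learning",
    ],
}

def info_button_text(widget: str, criterion: str) -> str:
    """Compose the text behind a 'quick information' button for a widget."""
    requirements = TRANSPARENCY_REQUIREMENTS.get(criterion, [])
    return f"{widget}: " + "; ".join(requirements)

# Hypothetical usage for a planning widget
print(info_button_text("Study planner", "algorithmic transparency"))
```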
Ethical and privacy principles for learning analytics
Abstract
The massive adoption of technology in learning processes comes with an equally large capacity to track learners. Learning analytics aims at using the collected information to understand and improve the quality of a learning experience. The privacy and ethical issues that emerge in this context are tightly interconnected with other aspects such as trust, accountability and transparency. In this paper, a set of principles is identified to narrow the scope of the discussion and point to pragmatic approaches to help design and research learning experiences where important ethical and privacy issues are considered.
Introduction: privacy in learning environments
The use of information and communication technology has significantly changed how learning experiences are conceived and deployed. The widespread use of various digital devices together with cloud computing allows for learning scenarios not previously considered. Students are now able to access a myriad of learning resources, interact with applications focusing on a specific topic, enhance their experience in virtual environments and augmented reality, and connect with others through social networks. The progress of technology evolves together with the capacity to record the events occurring in a learning environment. Every interaction and resource accessed can be captured and stored. As a consequence, learning scenarios can now be analyzed using big-data analytics techniques. Although the use of new technology is shaping the way we learn, a more significant change may derive from the use of big-data analytics (Siemens & Long, 2011).