Student privacy self-management
Related papers
Journal of the Association for Information Science and Technology
Higher education institutions are continuing to develop their capacity for learning analytics (LA), which is a sociotechnical data mining and analytic practice. Institutions rarely inform their students about LA practices, and significant privacy concerns exist. Without a clear student voice in the design of LA, institutions put themselves in an ethical grey area. To help fill this gap in practice and add to the growing literature on students' privacy perspectives, this study reports findings from over 100 interviews with undergraduate students at eight United States higher education institutions. Findings demonstrate that students lacked awareness of educational data mining and analytic practices, as well as the data on which they rely. Students see potential in LA, but they presented nuanced arguments about when and with whom data should be shared; they also expressed why informed consent was valuable and necessary. The study uncovered perspectives on institutional trust that were heretofore unknown, as well as what actions might violate that trust. Institutions must balance their desire to implement LA with their obligation to educate students about their analytic practices and treat them as partners in the design of analytic strategies reliant on student data in order to protect their intellectual privacy.
Safeguarding student privacy in an age of analytics
Educational Technology Research and Development
In this special issue, we have collected three responses from scholars who are currently doing work in learning analytics to react to, better explain, and expound upon Ifenthaler and Schumacher's findings in their own settings as researchers and practitioners.
The privacy paradox and its implications for learning analytics
Proceedings of the Tenth International Conference on Learning Analytics & Knowledge, 2020
Learning analytics promises to support adaptive learning in higher education. However, the associated issues around privacy protection, especially their implications for students as data subjects, have been a hurdle to wide-scale adoption. In light of this, we set out to understand student expectations of privacy issues related to learning analytics and to identify gaps between what students desire and what they expect to happen or choose to do in reality when it comes to privacy protection. To this end, an investigation was carried out in a UK higher education institution using a survey (N=674) and six focus groups (26 students). The study highlights a number of key implications for learning analytics research and practice: (1) purpose, access, and anonymity are key benchmarks of ethics and privacy integrity; (2) transparency and communication are key levers for learning analytics adoption; and (3) information asymmetry can impede active participation of students in learning analytics. CCS CONCEPTS • Applied computing → Computer-assisted instruction; • Human-centered computing → Empirical studies in HCI.
Learning analytics at the intersections of student trust, disclosure and benefit
2019
Evidence suggests that individuals are often willing to exchange personal data for (real or perceived) benefits. Such an exchange may be impacted by their trust in a particular context and their (real or perceived) control over their data. Students remain concerned about the scope and detail of surveillance of their learning behavior, their privacy, their control over what data are collected, the purpose of the collection, and the implications of any analysis. Questions arise as to the extent to which students are aware of the benefits and risks inherent in the exchange of their data, and whether they are willing to exchange personal data for more effective and supported learning experiences. This study reports on the views of entry-level students at the Open University (OU) in 2018. The primary aim is to explore differences between stated attitudes to privacy and their online behaviors, and whether these same attitudes extend to their university's uses of their (personal) data. The analysis indicates, inter alia, that there is no obvious relationship between how often students are online, or their awareness of and concerns about privacy issues in online contexts, and what they actually do to protect themselves. Significantly, though, the findings indicate that students overwhelmingly have an inherent trust in their university to use their data appropriately and ethically. Based on the findings, we outline a number of issues for consideration by higher education institutions, such as the need for transparency (of purpose and scope), the provision of some element of student control, and an acknowledgment of the exchange value of information in the nexus of the privacy calculus. CCS CONCEPTS • Social and professional topics → Privacy policies; • Security and privacy → Privacy protections
Changing Policies Concerning Student Privacy and Ethics in Online Education
International Journal of Information and Education Technology, 2016
While the growth in global distance and online education has blossomed, especially with the arrival of massive open online courses (MOOCs), the same technological infrastructure permits unprecedented access to knowledge about students and their behaviors. This knowledge extends far beyond scores on tests to include the measurement of noncognitive factors such as persistence, and intrusive metadata such as geolocation information. Moreover, the growth of the Internet of Things (e.g., via smartphones and RFID chips) is rapidly compounding the problem of intrusive data collection. In this paper, we review some of the policy challenges facing student privacy in online learning.
Who Is Reading Whom Now: Privacy in Education from Books to MOOCs
The arrival of new technologies in schools and classrooms around the nation has been met with a mixture of enthusiasm and anxiety. Education technologies ("ed tech") present tremendous opportunities: they allow schools to tailor programs to individual students; make education more collaborative and engaging through social media, gamification, and interactive content; and facilitate access to education for anyone with an Internet connection in remote parts of the world. At the same time, the combination of enhanced data collection with highly sensitive information about children and teens presents grave privacy risks. Indeed, in a recent report, the White House identified privacy in education as a flashpoint for big data policy concerns. This Article is the most comprehensive study to date of the policy issues and privacy concerns arising from the surge of ed tech innovation. It surveys the burgeoning market of ed tech solutions, which range from free Android and iPhone apps to comprehensive learning management systems and digitized curricula delivered via the Internet. It discusses the deployment of big data analytics by education institutions to enhance student performance, evaluate teachers, improve education techniques, customize programs, and better leverage scarce resources to optimize education results. This Article seeks to untangle ed tech privacy concerns from the broader policy debates surrounding standardization, the Common Core, longitudinal data systems, and the role of business in education. It unpacks the meaning of commercial data uses in schools, distinguishing between behavioral advertising to children and providing comprehensive, optimized education solutions to students.
* Jules Polonetsky is Co-chair and Executive Director of the Future of Privacy Forum.
The unbearable lightness of consent: Mapping MOOC providers' response to consent
While many strategies for protecting personal privacy have relied on regulatory frameworks, consent, and anonymizing data, such approaches are not always effective. Frameworks and Terms and Conditions often lag behind user behaviour and advances in technology and software; consent can be provisional and fragile; and the anonymization of data may impede personalized learning. This paper reports on a dialogical multi-case study methodology of four Massive Open Online Course (MOOC) providers from different geopolitical and regulatory contexts. It explores how the providers (1) define 'personal data' and whether they acknowledge a category of 'special' or 'sensitive' data; (2) address the issue and scope of student consent (and define that scope); and (3) use student data in order to inform pedagogy and/or adapt the learning experience to personalise the context or to increase student retention and success rates. This study found that large amounts of personal data continue to be collected for purposes seemingly unrelated to the delivery and support of courses. The capacity for users to withdraw or withhold consent for the collection of certain categories of data, such as sensitive personal data, remains severely constrained. This paper proposes that user consent at the time of registration should be reconsidered, and that there is a particular need for consent when sensitive personal data are used to personalize learning, or for purposes outside the original intention of obtaining consent.
User Consent in MOOCs – Micro, Meso, and Macro Perspectives
2018
While many strategies for protecting personal privacy rely on regulatory frameworks, consent, and anonymizing data, they are not always effective. Terms and Conditions often lag behind advances in technology, software, and user behaviours, and consent to use data for a range of unclear purposes may be provided unwittingly. As the commercial market for (student) data expands, so does the number of brokers who move, share, and sell data across continents and legislative environments. This paper reviews four Massive Open Online Course (MOOC) providers from different geopolitical and regulatory contexts. It explores how consent to collect and use data is described to potential users, and how that consent applies at micro, meso, and macro levels. This paper proposes a need for greater transparency around the implications of users granting consent at the point of registration. Further, it highlights that though MOOC providers have a responsibility to make clear the potential uses and sharing of user data, users themselves should also be more aware and consider how meaningful student agency can be achieved.
Don't Forget the Fine Print: MOOCs and Student Privacy
New Directions for Institutional Research, 2016
This chapter presents issues related to the legal privacy rights of MOOC enrollees and considers the role of institutional values and missions; it concludes with recommendations for enhancing student privacy.