A Taxonomy for Writing Analytics

Writing Analytics: Methodological and Conceptual Developments

The Journal of Writing Analytics, 2018

Welcome to Volume 2 of The Journal of Writing Analytics. As the scholars in this issue demonstrate, writing analytics is emerging as a vibrant field of study. As editors, we are encouraged to see important methodological and conceptual developments that deepen and strengthen the field in significant ways. This issue contains seven research articles, two research notes, and a special section featuring research presented at a U.S. educational measurement conference. As was the case with Volume 1 in 2017, our 2018 authors advance a remarkable range of research. And, as was the case last year, this year’s authors continue to come from diverse fields, each advancing a focused interest. We begin by introducing the research of our colleagues and then turn to a reflection on the developments we see in their work. For Volume 2, see https://journals.colostate.edu/analytics/issue/view/13

Writing Analytics: Broadening the Community

The Journal of Writing Analytics, 2019

Welcome to Volume 3 of The Journal of Writing Analytics. This issue contains two invited articles, four research articles, six research notes, and a special section featuring research presented at a U.S. humanities conference. As was the case with Volume 2 in 2018, our 2019 authors continue to advance a remarkable range of research. And, as now seems to be the norm for our journal, the authors continue to come from diverse fields. We begin by introducing the research of our colleagues and then turn to a reflection on the developments we see in their work.

Reflecting on Reflective Writing Analytics: Assessment Challenges and Iterative Evaluation of a Prototype Tool

When used effectively, reflective writing tasks can deepen learners' understanding of key concepts, help them critically appraise their developing professional identity, and build qualities for lifelong learning. As such, reflective writing is attracting substantial interest from universities concerned with experiential learning, reflective practice, and developing a holistic conception of the learner. However, reflective writing is for many students a novel genre to compose in, and tutors may be inexperienced in its assessment. While these conditions set a challenging context for automated solutions, natural language processing may also help address the challenge of providing real-time, formative feedback on draft writing. This paper reports progress in designing a writing analytics application, detailing the methodology by which informally expressed rubrics are modelled as formal rhetorical patterns, a capability delivered by a novel web application. The application has been refined through iterative evaluation on an independently human-annotated corpus, showing improvements from the first to the second version. We conclude by discussing the reasons why classifying reflective writing has proven complex, and reflect on the design processes enabling work across disciplinary boundaries to develop the prototype to its current state.
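To make the idea of recasting an informally expressed rubric criterion as a formal rhetorical pattern concrete, here is a minimal, hypothetical sketch in Python. The criterion, the regular expression, and every function name are illustrative assumptions; the actual application models such patterns far more richly than a single regex.

```python
import re

# Hypothetical example: an informal rubric criterion such as "the writer
# names a challenge they faced" recast as a formal rhetorical pattern --
# here just a lexical pattern over sentences.
CHALLENGE_PATTERN = re.compile(
    r"\b(I|we)\b.*\b(struggled|found it (hard|difficult)|was unsure|challenged)\b",
    re.IGNORECASE,
)

def sentences(text: str) -> list[str]:
    """Naive sentence splitter, sufficient for this sketch."""
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]

def label_challenge_sentences(text: str) -> list[tuple[str, bool]]:
    """Label each sentence as matching (or not) the 'challenge' pattern."""
    return [(s, bool(CHALLENGE_PATTERN.search(s))) for s in sentences(text)]

if __name__ == "__main__":
    draft = ("I struggled to apply the theory on placement. "
             "Next time I will plan the session earlier.")
    for sentence, is_challenge in label_challenge_sentences(draft):
        print(is_challenge, "-", sentence)
```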

Considering Consequences in Writing Analytics: Humanistic Inquiry and Empirical Research in The Journal of Writing Assessment

The Journal of Writing Analytics, 2019

Consideration of the intersections of humanistic and empirical traditions of research is important, especially now, with the recent emphasis on fairness and the consequences of score use. Humanistic research traditions can enhance research perspectives within the emerging field of writing analytics. Using two case studies from The Journal of Writing Assessment (JWA), this article explores the ways humanistic traditions facilitate the framework of localism developed by JWA. Such a perspective provides a bridge between the technically focused concerns for validity and reliability and the complex social contexts, diverse backgrounds, and lived experiences of the students and faculty who occupy these educational settings. For writing analytics to live up to its potential, its practices and scholarship need to meet high technical standards as well as attend to diverse and socially situated assessment concerns.

AcaWriter: A Learning Analytics Tool for Formative Feedback on Academic Writing

Journal of Writing Research

Written communication is an important skill across academia, the workplace, and civic participation. Effective writing incorporates instantiations of particular text structures, or rhetorical moves, that communicate intent to the reader. These rhetorical moves are important across a range of academic styles of writing, including essays and research abstracts, as well as in forms of writing in which one reflects on learning gained through experience. However, learning how to effectively instantiate and use these rhetorical moves is a challenge. Moreover, educators often struggle to provide feedback supporting this learning, particularly at scale. Where effective support is provided, the techniques can be hard to share beyond single implementation sites. We address these challenges through the open-source AcaWriter tool, which provides feedback on rhetorical moves, with a design that allows feedback customization for specific contexts. We introduce three example implementations in which we have customized the tool and evaluated it with regard to user perceptions and its impact on student writing. We discuss the tool's general theoretical background and provide a detailed technical account. We conclude with four recommendations that emphasize the potential of collaborative approaches in building, sharing, and evaluating writing tools in research and practice.
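As an illustration of how feedback on detected rhetorical moves might be customized for a specific context, the following sketch maps moves to per-assignment feedback messages. This is not AcaWriter's actual API: the keyword cues, move names, and message tables are invented for the example, and real move detection uses far richer parsing.

```python
# Illustrative sketch: swap the feedback configuration to customise the
# messages a writer receives for the same detected rhetorical moves.
MOVE_CUES = {
    "background": ("research has shown", "it is known that"),
    "contribution": ("in this paper we", "we propose", "we present"),
}

ESSAY_FEEDBACK = {
    "background": "Good: you situate your argument in prior work.",
    "contribution": "Consider stating your own position more explicitly.",
}

ABSTRACT_FEEDBACK = {
    "background": "Keep background brief in an abstract.",
    "contribution": "Good: the contribution move is present.",
}

def detect_moves(text: str) -> set[str]:
    """Stub detector: a move counts as present if any cue phrase appears."""
    lowered = text.lower()
    return {move for move, cues in MOVE_CUES.items()
            if any(cue in lowered for cue in cues)}

def feedback(text: str, messages: dict[str, str]) -> list[str]:
    """Return context-specific notes for found moves, plus prompts for missing ones."""
    found = detect_moves(text)
    notes = [messages[m] for m in sorted(found)]
    notes += [f"No '{m}' move detected yet." for m in sorted(set(messages) - found)]
    return notes

if __name__ == "__main__":
    draft = "Research has shown that feedback matters. In this paper we present a tool."
    print(feedback(draft, ABSTRACT_FEEDBACK))
```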

Reflective writing analytics for actionable feedback

Proceedings of the Seventh International Learning Analytics & Knowledge Conference on - LAK '17, 2017

Reflective writing can provide a powerful way for students to integrate professional experience and academic learning. However, writing reflectively requires high quality actionable feedback, which is time-consuming to provide at scale. This paper reports progress on the design, implementation, and validation of a Reflective Writing Analytics platform to provide actionable feedback within a tertiary authentic assessment context. The contributions are: (1) a new conceptual framework for reflective writing; (2) a computational approach to modelling reflective writing, deriving analytics, and providing feedback; (3) the pedagogical and user experience rationale for platform design decisions; and (4) a pilot in a student learning context, with preliminary data on educator and student acceptance, and the extent to which we can evidence that the software provided actionable feedback for reflective writing.
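The sketch below illustrates, under invented labels, how sentence-level detections of reflective elements could be aggregated into document-level analytics and turned into a single actionable next step for the writer. It is not the platform's actual model; the element names and messages are assumptions for demonstration.

```python
from collections import Counter

# Hypothetical reflective elements a sentence classifier might assign.
REFLECTIVE_ELEMENTS = ("context", "challenge", "change")

NEXT_STEP = {
    "context": "Describe the situation you are reflecting on.",
    "challenge": "Name what you found difficult or surprising.",
    "change": "Say what you will do differently next time.",
}

def analytics(sentence_labels: list[str]) -> Counter:
    """Count how often each reflective element appears across sentences."""
    return Counter(label for label in sentence_labels if label in REFLECTIVE_ELEMENTS)

def actionable_feedback(sentence_labels: list[str]) -> str:
    """Suggest the first reflective element still missing from the draft."""
    counts = analytics(sentence_labels)
    for element in REFLECTIVE_ELEMENTS:
        if counts[element] == 0:
            return NEXT_STEP[element]
    return "All reflective elements present; consider deepening the 'change' section."

if __name__ == "__main__":
    # Labels would come from an upstream classifier; here they are given directly.
    print(actionable_feedback(["context", "context", "challenge"]))
```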

Exploring Writing Analytics and Postsecondary Success Indicators

Grantee Submission, 2019

Writing is a challenge and a potential obstacle for students in U.S. 4-year postsecondary institutions who lack prerequisite writing skills. Building on Anonymous, we collected authentic coursework writing from students enrolled at six 4-year colleges, extracted natural language processing (NLP) writing features (analytics), and examined relationships between the analytics and college grade point average (GPA). Consistent with Anonymous, findings suggest that NLP writing analytics may contribute to college GPA prediction. The implication is that real-time NLP writing analytics derived from students' authentic coursework writing could be leveraged to efficiently track success and flag potential obstacles during students' college careers.
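As a toy illustration of relating NLP writing features to GPA, the following sketch computes two simple features and a Pearson correlation on made-up data. The paper's actual feature set and modelling approach are not specified here, so every feature, value, and function name below is an assumption for demonstration only.

```python
import re
from math import sqrt

def mean_sentence_length(text: str) -> float:
    """Average number of words per sentence (toy feature)."""
    sents = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    return sum(len(s.split()) for s in sents) / max(len(sents), 1)

def type_token_ratio(text: str) -> float:
    """Lexical diversity: distinct words divided by total words (toy feature)."""
    tokens = re.findall(r"[a-zA-Z']+", text.lower())
    return len(set(tokens)) / max(len(tokens), 1)

def pearson(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation between a feature vector and an outcome vector."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

if __name__ == "__main__":
    essays = ["Short words. Small ideas. Repeated words words words.",
              "The argument develops gradually, weighing competing evidence before concluding.",
              "A claim is made and then supported with varied, concrete examples."]
    gpas = [2.4, 3.6, 3.2]  # invented values for the sketch
    feature = [type_token_ratio(e) for e in essays]
    print(round(pearson(feature, gpas), 3))
```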

Writing Analytics for Epistemic Features of Student Writing

2016

Literacy, encompassing the ability to produce written outputs from the reading of multiple sources, is a key learning goal. Selecting information, and evaluating and integrating claims from potentially competing documents, is a complex literacy task. Prior research exploring differing behaviours and their association with constructs such as epistemic cognition has used ‘multiple document processing’ (MDP) tasks. Using this model, 270 paired participants wrote a review of a document. Reports were assessed using a rubric associated with features of complex literacy behaviours. This paper focuses on the conceptual and empirical associations between those rubric marks and textual features of the reports, as captured by a set of natural language processing (NLP) indicators. Findings indicate the potential of NLP indicators for providing feedback regarding the writing of such outputs, demonstrating clear relationships both across rubric facets and between rubric facets and specific NLP indicators.

Learning analytics to improve writing skills for young children – a holistic approach

Journal of Research in Innovative Teaching & Learning

Purpose – Because of the important role of orthography in society, the IDeRBlog project presented in this paper created a web-based tool to motivate pupils to write texts as well as to read and comment on texts written by fellow students. In addition, IDeRBlog aims to improve students' German orthography skills and supports teachers and parents with training materials for their students.

Design/methodology/approach – With the aid of learning analytics, the submitted text is analyzed and specific feedback is given to the students so that they can try to correct the misspelled words themselves. The teachers as well as the parents benefit from the analysis and the exercises suggested by the system.

Findings – A recent study showed the effectiveness of the system in the form of an improvement of the students' orthographic skills: over a period of four months, 70 percent of the students achieved a significant reduction in their spelling mistakes.

Originality/value – IDeRBlog is an innovative approach to improving orthography skills, combining blogging and new media with writing and practice.
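A minimal sketch of the feedback idea described above: misspelled words are flagged rather than auto-corrected, so pupils can attempt the correction themselves. The tiny word list and all names are hypothetical stand-ins for IDeRBlog's actual German lexicon and analysis.

```python
import re

# Toy lexicon standing in for a real German dictionary.
KNOWN_WORDS = {"der", "hund", "läuft", "im", "garten", "und", "bellt"}

def flag_misspellings(text: str) -> list[str]:
    """Return words not found in the lexicon, preserving their order."""
    words = re.findall(r"[\wäöüß]+", text.lower())
    return [w for w in words if w not in KNOWN_WORDS]

def feedback(text: str) -> str:
    """Point to the flagged words without revealing the correct spelling."""
    flagged = flag_misspellings(text)
    if not flagged:
        return "No spelling issues found."
    return "Please look again at: " + ", ".join(flagged)

if __name__ == "__main__":
    print(feedback("Der Hund leuft im Garten und belt."))
```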