Evolution of Instructor Response? Analysis of Five Years of Feedback to Students

Assessing peer and instructor response to writing: A corpus analysis from an expert survey

Assessing Writing, 2017

Over the past 30 years, considerable scholarship has critically examined the nature of instructor response on written assignments in the context of higher education (see Straub, 2006). However, as Haswell (2005) has noted, less is currently known about the nature of peer response, especially as it compares with instructor response. In this study, we critically examine some of the properties of instructor and peer response to student writing. Using the results of an expert survey that provided a lexically-based index of high-quality response, we evaluate a corpus of nearly 50,000 peer responses produced at a four-year public university. Combined with the results of this survey, a large-scale automated content analysis shows first that instructors have adopted some of the field's lexical estimation of high-quality response, and second that student peer response reflects the early acquisition of this lexical estimation, although at a further remove from their instructors. The results suggest promising directions for the parallel improvement of both instructor and peer response.
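
For readers who want a concrete sense of what a lexically-based index can look like in practice, the sketch below scores comments against a small term list. The lexicon, comments, and scoring rule are all invented for illustration; the study's actual index was derived from its expert survey.

```python
# Illustrative sketch only: score comments against a lexicon of terms
# associated with high-quality response. The lexicon below is invented;
# the study's actual index came from its expert survey.
import re

QUALITY_LEXICON = {"revise", "develop", "evidence", "audience", "thesis", "clarify"}

def lexical_quality_score(comment: str) -> float:
    """Fraction of a comment's word tokens found in the lexicon."""
    tokens = re.findall(r"[a-z']+", comment.lower())
    if not tokens:
        return 0.0
    return sum(t in QUALITY_LEXICON for t in tokens) / len(tokens)

corpus = [
    "Great job!",
    "Develop your thesis so the evidence speaks to your audience.",
]
for comment in corpus:
    print(f"{lexical_quality_score(comment):.2f}  {comment}")
```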

Writing Analytics: Methodological and Conceptual Developments

The Journal of Writing Analytics

Welcome to Volume 2 of The Journal of Writing Analytics. As the scholars in this issue demonstrate, writing analytics is emerging as a vibrant field of study. As editors, we are encouraged to see important methodological and conceptual developments becoming apparent that serve to deepen and strengthen the field in significant ways. This issue contains seven research articles, two research notes, and a special section featuring research presented at a US educational measurement conference. As was the case with Volume 1 in 2017, our 2018 authors advance a remarkable range of research. And, as was the case last year, this year's authors continue to come from diverse fields, each advancing a focused interest. We begin by introducing the research of our colleagues and then turn to a reflection on the developments we see in their work.

1.0 Research Articles

Of the seven research articles in Volume 2, the first five are grouped according to studies of undergraduate student writing in the US. The two that conclude this section are devoted to the use of technology to assure fairness in the ways we gather and interpret information. In "Evolution of Instructor Response? Analysis of Five Years of Feedback to Students," Susan Lang contributes to the new field of analytics by focusing on the use of text mining techniques to support the work of writing program administrators. Undertaken over five years (10 semesters), from August 2012 through May 2017, her study presents findings from 17,534 samples of undergraduate student writing and 141,659 discrete comments by instructors on that writing in order to address a fundamental pedagogical issue: how best to support graduate teaching assistants in their ongoing professional development. Using ProSuite, an integrated collection of text analytics tools, Lang found that instructors over time incorporated a principled, consistent response vocabulary in their feedback on student writing. This study is especially significant in
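
ProSuite's internals are not detailed in this editorial, so the sketch below is only a hypothetical illustration of how one might quantify the consistency of a response vocabulary across semesters, using Jaccard overlap between each semester's most frequent comment terms; all data and term lists are invented.

```python
# Hypothetical sketch: quantify semester-to-semester consistency of
# instructors' response vocabulary as the Jaccard overlap between each
# semester's most frequent comment terms. Not ProSuite's method; the
# comment data below are invented.
from collections import Counter

def top_terms(comments, k=50):
    counts = Counter(word for c in comments for word in c.lower().split())
    return {word for word, _ in counts.most_common(k)}

semesters = {
    "Fall 2012":   ["develop your thesis", "cite your evidence"],
    "Spring 2013": ["develop the thesis further", "add supporting evidence"],
}
vocab = {s: top_terms(cs) for s, cs in semesters.items()}

a, b = vocab["Fall 2012"], vocab["Spring 2013"]
jaccard = len(a & b) / len(a | b)
print(f"Vocabulary overlap: {jaccard:.2f}")
```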

AcaWriter: A Learning Analytics Tool for Formative Feedback on Academic Writing

Journal of Writing Research

Written communication is an important skill across academia, the workplace, and civic participation. Effective writing incorporates instantiations of particular text structures (rhetorical moves) that communicate intent to the reader. These rhetorical moves are important across a range of academic styles of writing, including essays and research abstracts, as well as in forms of writing in which one reflects on learning gained through experience. However, learning how to effectively instantiate and use these rhetorical moves is a challenge. Moreover, educators often struggle to provide feedback supporting this learning, particularly at scale. Where effective support is provided, the techniques can be hard to share beyond single implementation sites. We address these challenges through the open-source AcaWriter tool, which provides feedback on rhetorical moves, with a design that allows feedback customization for specific contexts. We introduce three example implementations in which we have customized the tool and evaluated it with regard to user perceptions and its impact on student writing. We discuss the tool's general theoretical background and provide a detailed technical account. We conclude with four recommendations that emphasize the potential of collaborative approaches in building, sharing, and evaluating writing tools in research and practice.
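
As a rough illustration of the general technique, the toy detector below flags two rhetorical moves with hand-written patterns. The regexes and move labels are invented examples, not AcaWriter's actual rules; its parser is considerably more sophisticated.

```python
# Toy illustration of pattern-based rhetorical move detection, the general
# technique AcaWriter builds on. These regexes are invented examples, not
# AcaWriter's actual rules (its parser is considerably more sophisticated).
import re

MOVE_PATTERNS = {
    "Background":   [r"\bis an important\b", r"\bin recent years\b"],
    "Contribution": [r"\bwe (present|propose|introduce|address)\b",
                     r"\bthis paper (reports|describes)\b"],
}

def detect_moves(sentence: str) -> list[str]:
    return [move for move, patterns in MOVE_PATTERNS.items()
            if any(re.search(p, sentence, re.IGNORECASE) for p in patterns)]

for s in ["Written communication is an important skill.",
          "We introduce the open-source AcaWriter tool."]:
    print(detect_moves(s), "-", s)
```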

A Comparison Analysis of Five Instructors’ Commenting Patterns of Audio and Written Feedback on Students’ Writing Assignments

2021

Instructors often use text-based methods when giving feedback to students on their papers. With the development of audio recording technologies, audio feedback has become an increasingly popular alternative to written feedback. This study analyzed five instructors' commenting patterns in both written and audio feedback. The five instructors, who taught sections of the same undergraduate composition class, provided written feedback to students on one writing assignment and audio feedback on another. A mixed-methods research methodology was employed. Data were collected through surveys, students' writing assignments, digital audio files (for audio feedback), and interviews. The findings indicated that the word count and the number of items commented on differed between audio and written commentary. In addition, there was a teacher effect and an interaction effect for both word count and number of items in the instructor feedback. The interview data offered explanations for why the teacher effect and the interaction effect might have occurred. The findings show that an individual teacher's commenting styles and strategies, as well as the medium used in commenting, have a strong influence on the nature and length of the commentary. Implications for future research and practice are discussed at the end of the paper.
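
The abstract reports a teacher effect and a teacher-by-medium interaction but does not reproduce the analysis; a two-way ANOVA is one conventional way to test such effects. The sketch below uses simulated word counts and assumed cell means, so it illustrates the procedure rather than the study's results.

```python
# Sketch of how a teacher effect and a teacher-by-medium interaction on
# comment word count might be tested with a two-way ANOVA. All data are
# simulated; the study's actual analysis may differ in its details.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(0)
cell_means = {("A", "written"): 110, ("A", "audio"): 300,
              ("B", "written"):  95, ("B", "audio"): 270,
              ("C", "written"): 150, ("C", "audio"): 240}

# 20 simulated comments per teacher per medium
rows = [{"teacher": t, "medium": m, "word_count": rng.normal(mu, 30)}
        for (t, m), mu in cell_means.items() for _ in range(20)]
df = pd.DataFrame(rows)

model = ols("word_count ~ C(teacher) * C(medium)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```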

College Writing Instructors Using Rubrics to Drive Instruction

Academia Letters, 2021

Based on data from the top 50 US News & World Report ranked national universities and top 10 liberal arts colleges, on average college classes contain anywhere from 18 to 117 students (Giddon). Most basic college composition courses require students to write thousands of words to get credit for the course, and class sizes are generally not small, which makes grading student writing a monumental task. Teachers spend many hours grading student work, and teachers who are contracted for more than one class find their workload multiplying rapidly. High-quality, individualized feedback must be delivered to the student in a timely manner, which can be both time-consuming and impractical. How, then, can a teacher reduce his or her grading time while still providing quality feedback? The answer is rubrics. Composition teachers should master how to use rubrics in a specific way within their writing classes because using a rubric is not a "one size fits all" solution. Students appreciate an instructor's use of a rubric because they perceive that it makes the analysis of their writing less subjective. Learning to write is an arduous and anxiety-riddled undertaking for many students, and knowing that their work will be graded according to a predetermined set of guidelines helps to ease student anxiety. Atkinson and Lim conducted a study focused on students' perception of rubrics embedded within a learning management system and concluded that 95% of students felt the use of rubrics "contributed consistency and fairness to their professors' evaluation," which substantiates this claim (Leader and Clinton 90). Learning to write academically is an arduous process that requires students to strategize the organization of their paper and grapple with its content. Knowing exactly what is required mitigates some of the stress associated with writing and frees students to express themselves, while ensuring that the teacher adheres

An Analysis of Feedback Given to Strong and Weak Student Writers

Reading Horizons, 2009

Improvement-oriented feedback has been shown to be more effective at raising writing achievement than simple evaluative feedback. This study investigates whether teachers differ in the feedback they give to weak and strong writers, as well as how feedback differs across grades. Interviews were conducted with 15 teachers about the feedback they gave students on their writing. Contrary to expectations, analyses indicate that both weak and strong writers received minimal improvement-oriented feedback. However, strong writers received more positive evaluative feedback while weak writers received more negative evaluative feedback. This research has implications for both teacher education and the professional development of teachers. "Writing today is not a frill of the few, but an essential skill for the many" (The National Commission on Writing, 2003, p. 11) sums up the importance of writing in our society today. The July 2005 report by the National Commission on Writing maintains that over 90% of state agencies surveyed acknowledged that writing is a key factor in determining whether one is hired or promoted. The pervasiveness of standardized assessments measuring progress, particularly under the No Child Left Behind Act of 2001, provides another example of the need to improve student writing. Research suggests formative assessment is effective in raising student

Across Performance Contexts: Using Automated Writing Evaluation to Explore Student Writing

The Journal of Writing Analytics, 2022

• Background: This exploratory writing analytics study uses argumentative writing samples from two performance contexts (standardized writing assessments and university English course writing assignments) to compare (1) linguistic features in argumentative writing and (2) relationships between linguistic characteristics and academic performance outcomes. Writing data for this study come from 180 students enrolled at five four-year universities in the United States. Automated writing evaluation (AWE) tools were used to generate linguistic features from students' writing.

• Literature Review: Few studies have used AWE to examine postsecondary writing skill and the relationships between writing and broader academic performance outcomes. To address this gap, our study draws upon research on standardized and coursework writing, construct modeling, and AWE feature design. We also draw on related work to demonstrate how AWE can provide insights about the linguistic characteristics of students' writing and the relationships of that writing to academic performance factors.
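
As a minimal illustration of the workflow this abstract describes, the sketch below extracts two surface features from short texts and correlates each with an outcome measure. The features, essays, and outcome values are invented; production AWE engines compute far richer, construct-motivated feature sets.

```python
# Minimal sketch of the AWE workflow described above: extract a few surface
# linguistic features from essays, then correlate each with an outcome
# measure. Features, essays, and outcomes are illustrative only.
import re
from statistics import correlation  # Python 3.10+

def features(text: str) -> dict:
    words = re.findall(r"[A-Za-z']+", text)
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    return {
        "mean_sentence_length": len(words) / max(len(sentences), 1),
        "type_token_ratio": len({w.lower() for w in words}) / max(len(words), 1),
    }

essays = [
    "Short essay. Plain words. Few ideas.",
    "The essay repeats the same words and the same ideas over and over.",
    "A more elaborate argument, developed across clauses, with varied diction.",
    "An extensively developed argumentative essay exhibiting considerable lexical diversity.",
]
outcomes = [2.7, 3.0, 3.4, 3.8]  # e.g., course grades (invented)

for name in ("mean_sentence_length", "type_token_ratio"):
    values = [features(e)[name] for e in essays]
    print(name, round(correlation(values, outcomes), 2))
```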

Towards Automatic Classification of Teacher Feedback on Student Writing

International Journal of Information and Education Technology, 2018

This paper reports and discusses the results of a study aimed at automatically categorising teacher feedback on student writing. A total of 3,412 teachers' written comments on 90 students' draft essays were collected from an EFL course offered by a Hong Kong university during the first semester of 2016/17. The data were used primarily to design and implement an automated tool that classifies teachers' comments with respect to a taxonomy of their characteristics. The findings show that the performance of the automated tool is comparable to that of human annotators, suggesting the feasibility of using an automatic approach to identify and analyse different types of teacher feedback. This study can contribute to future research investigating the impact of teacher feedback on student writing in a big data world.
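
The abstract does not name the classification method, so the sketch below shows one standard approach to the task: TF-IDF features with a linear classifier, compared against human labels via Cohen's kappa. All comments and labels are invented.

```python
# Sketch of one standard approach to the classification task described
# above: TF-IDF features with a linear classifier, checked against human
# labels via Cohen's kappa. The paper's abstract does not name its method;
# all comments and labels here are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import cohen_kappa_score
from sklearn.pipeline import make_pipeline

train_comments = ["Fix the comma splice here.",
                  "Your argument needs stronger evidence.",
                  "Check subject-verb agreement in this sentence.",
                  "This paragraph lacks a clear claim."]
train_labels = ["form", "content", "form", "content"]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(train_comments, train_labels)

test_comments = ["Spelling error in line two.", "The thesis is unsupported."]
human_labels = ["form", "content"]  # hypothetical annotator judgments
predictions = clf.predict(test_comments)
print(predictions, cohen_kappa_score(human_labels, predictions))
```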