TYCA White Paper on Placement Reform
Related papers
Writing Placement that Supports Teaching and Learning
WPA: Writing Program Administration, 2012
This article describes the development of a curriculum-based, expert-reader placement system called Placement and Teaching Together (PTT). The essay presents data and analysis of the PTT system as implemented and evaluated at one university for three years. In contrast to other placement methods such as standardized tests or Directed Self-Placement (DSP), PTT combines placement, teaching, and direct assessment of student writing by integrating the placement instrument into regular semester coursework and requiring students to produce writing similar to that expected in first-year writing classes. Including discussion of PTT's shortcomings and strengths (from teacher, student, and administrator perspectives), the study is rooted in analysis of several surveys, grade data, and other administrative documents, a review of the literature on writing placement with particular comparison to DSP, and self-reflection by the primary administrators of this approach to writing assessment.
the SATs and the like are not: deliberately subjective, putting each student into the driver's seat. With significantly less attention afforded to it, we observe a third movement that strives to return the acts of students writing and faculty reading to the enterprise of placement. Best represented by William Smith, Brian Huot, Irvin Peckham, Theresa Freda Nicolay, Richard Haswell, and Susan Wyche-Smith, among others, this movement offers a curriculum-based, expert-reader approach to placement. We call our version of such an approach Placement and Teaching Together (PTT) to emphasize its connection to curriculum, its embedding of assessment within teaching, and its aligning of assessment with curriculum. Unlike DSP, this approach relies on assessment by teachers, yet it significantly improves upon traditional practices of teacher placement that use an impromptu essay and objective test in two essential ways. First, PTT asks expert readers to make judgments based on their context knowledge: their knowledge of the writing courses offered at a given university. Second, PTT requires students to do work similar to that expected in first-year writing classes: writing produced and revised outside of a testing situation, under conditions that reflect the environments in which complex writing acts emerge, and in response to readings. In short, by combining reading, writing, discussion, and other activities that are guided and reviewed by teachers, our version of PTT does not merely introduce or mimic the work expected of students in first-year writing courses; it is that work.
PTT, like DSP, is a local, affordable measure that writing faculty can turn to in response to the weaknesses of the assessments offered by the testing establishment, as dominated by the two non-profit corporations, the College Board (CB) and ACT. The College Board and ACT each offer two different kinds of assessments that are used for placement: Accuplacer (CB) and Compass (ACT) are tests that are designed for placement; the SAT in Critical Reading and Writing (CB) and the ACT test (ACT) are admissions tests that these companies have relatively recently studied and re-labeled for use in placement (ACT; College Board). While these admissions and placement tests do offer data collection and organization, and therefore the potential for accumulative analysis and national and other comparisons, they provide little value for placement that is based on the local context and curriculum. Nevertheless, these tests are well positioned to reach practical and fiscally minded college administrators who are educated by a psychometrician mindset to value reliability, producing "perfectly consistent scores" (Huot and Neal 429), almost as much as efficiency, and who may pressure writing program administrators to adopt them for placement. Indeed, adoption
Writing placement tools: Constructing and understanding students’ transition into college writing
Assessing Writing, 2019
1. Key ideas about student writing and writing placement
Underpinning the Tools & Tech Forum are three ideas about student writing: One, student writing is constructed according to how it is assessed. Two, student writing is understood according to how it is analyzed. And three, because assessment and analysis constitute writing and our understanding of it, we need as much information as possible as we determine how to assess and analyze student writing. In support of this goal, the Tools & Tech Forum offers reviews of assessment tools and technologies, in an effort to support informed decisions about writing assessments and how they are interpreted and used.
This year's Tools & Tech Forum focuses on a set of assessment practices affecting millions of students each year: college writing placement. Whether students are non-native or native writers of English, and especially if they are pursuing higher education in the United States, they will likely need to complete a writing placement assessment as their very first writing task as enrolled college students. These writing placement assessments are used to pair incoming college students with a course or level appropriate for their writing preparation and skills by determining a point of entry within an institution's curricular sequence (Crusan, 2002; Haswell, 2005; Leki, 1991).
There are myriad options for assessing students' writing placement. Many institutions rely on locally-designed essay tests and/or multiple-choice questions (Gere, Aull, Lancaster, Perales Escudero, & Vander Lei, 2013). Some use writing portfolios (Yancey, 1999). Many draw on national placement tests or more general standardized tests, such as the Scholastic Aptitude Test (SAT), the Test of English as a Foreign Language (TOEFL), or the American College Test (ACT) (Crusan, 2002; Elliot, Deess, Rudniy, & Joshi, 2012a). Some institutions rely on a combination of standardized and locally-designed assessments (Peckham, 2009). And while some institutions consult TOEFL scores for international and/or English language learning students (Williams, 1995), many institutions use the same placement process for all incoming first-year students (Gere, Aull, Green, & Porter, 2010).
These various writing placement choices foreground different cultural and institutional values, from broad writing constructs (e.g., emphasis on student self-assessment via Directed Self-Placement) to more specific, corresponding choices (e.g., certain Directed Self-Placement questions and not others) (Toth & Aull, 2014). The results or scores of these placement assessments are then used by academic advisers to place students in writing courses (or to exempt them from such courses), sometimes with input from instructors or students (Crusan, 2002; Elliot, Deess, Rudniy, & Joshi, 2012b). That placement will, in turn, directly impact students' future coursework. Furthermore, the placement process and outcome will contribute to students' perceptions about the kind of writing they are expected to do in higher education, even if a placement task differs substantially from what they will later write as college students (Aull, 2015). Many students will, based on their writing placement, form perceptions regarding how they fare as writers with respect to the writing constructs they perceive are valued. Thus the stakes of writing placement are high, and they entail a range of important decisions.
In their introduction to the Journal of Writing Assessment Special Issue on Two-Year College Writing Placement, Kelly-Riley and Whithaus (2019) put it this way: Assessment practices reinforce cultural and educational values; the ways in which assessments, particularly writing placement assessments, work should be examined to understand the values they reinforce. If the assessments are not evolving to reflect current values and expectations, they may be detrimental to the intended social and educational effects of increasing access to higher education.
Placement assessment decisions entail conceptual and practical choices, including (1) valued writing constructs: what a given institution or set of institutions wants to know about student writing; (2) the type and design of the assessment: how such information will be gathered; and (3) the interpretation and consequence of said information: how the information will be used, and by whom. Any writing placement assessment that is used over time, then, ideally includes construct evidence, which forms a
Towards an Ethics of Writing Placement
CEA Critic, 2013
The increasing diversity of college student populations is hardly news to most in the business of higher education. The National Center for Education Statistics (NCES) reports an overall increase of 38% in enrollment from 1999 to 2009, an almost 30% jump from the previous ten-year period. Additionally, enrollment for minorities has jumped nearly 20% since the 1970s and more than 6% within the last eight years, the result of a continued push to make higher education more accessible to students of various racial and socioeconomic backgrounds ("Fast Facts"). More recently, the economy has played a significant role in spurring more students to seek college degrees. Within the last several years, enrollment has grown over 27% for students over the age of 25 ("Fast Facts"). These older, nontraditional students are attending college to gain the credentials they need to increase their chances of employment, and younger students are seeing that a college degree is crucial for ensuring career paths are open to them. These changing student demographics have now made it more important than ever for institutions of higher education to examine the role they play in mediating student access to college degrees.
One particular place where this accessibility comes into question is at the very beginning of students' college careers: placement into First-Year Composition (FYC) courses. Even though many scholars have discussed the gatekeeping function of FYC (i.e., to ensure that those who have passed FYC courses are prepared for both college and "real-world" writing), the idea of FYC as a speed bump or barrier to a college degree has not been discussed as widely in academic contexts. Outside academia, however, this issue is gaining attention. As a part of Washington Monthly's 2011 "College Guide," Susan Headden, in her article "How the Other Half Tests," recounts the story of Monica Dekany, a nontraditional student who was placed in remedial courses for English and mathematics based on the scores she received from the ACCUPLACER placement exam, despite having already passed many college courses earlier in her life. However much her life experiences might have demonstrated that her level of ability was far above the level of developmental courses, Dekany had no other option but to accept remedial placement. As a result, Headden writes, "Remediation cost [Dekany] several thousand dollars and set her education and her career back by a year" (par. 4). Stories such as Dekany's make clear that assessors need to take into account a range of financial and personal consequences of the writing
Building Bridges: Articulating Writing Programs between Two-and Four-Year Colleges
1993
Two faculty members from a California community college and a nearby state university worked together to articulate placement and assessment procedures in writing courses at the two institutions to better serve students transferring from one to the other. Historically, the institutions, Bakersfield College (BC) in California and California State University, Bakersfield (CSUB), had established hostile relations and independent placement and assessment standards which did not best serve their students. The project's goal was to learn more about each school's programs, particularly in assessment for placement, developmental standards, freshman composition standards, and proficiency standards for the two-year and four-year degrees. In addition, English as a Foreign Language and the special problems of minority-language speakers became topics of discussion. The schedule of seminars with faculty at both schools covered five aspects of the writing programs: (1) placement agreements, (2) developmental English standards, (3) freshman composition standards, (4) lower-division exit exams, and (5) proficiency standards. The project held two sessions for each phase, one at each campus, and a joint workshop off-campus to explore issues arising from the campus sessions. Seventy percent of CSUB and 90 percent of BC department faculty participated in at least one session. Outcomes included proposed formal articulation of placement standards and curriculum, faculty professional growth, improved communication on both campuses, and publication of a research manual. Also included are recommendations for replication.
Writing Proficiency and Student Placement in Community College Composition Courses
2019
Despite national efforts to accelerate students through precollegiate writing course sequences to transfer-level composition, questions persist regarding appropriate placement and the support needed for students to succeed. An analytical text-based writing assessment was administered to students across four levels of composition courses at a California community college. Differences in student writing scores between course levels and the relationships between writing score, course level, and high school GPA were examined. Key findings include (1) significant differences in average scores between the first precollegiate course and other courses in the sequence, and (2) weak relationships between course level and high school GPA and between assessment scores and high school GPA.
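The kind of analysis this abstract reports, comparing score distributions across course levels and relating scores to high school GPA, can be pictured with a short sketch. The Python snippet below is purely illustrative and is not the study's actual code; the data file and the column names writing_score, course_level, and hs_gpa are hypothetical assumptions, with course levels coded numerically.

# Illustrative sketch only: one-way ANOVA across course levels,
# plus correlations with high school GPA.
# "placement_scores.csv" and all column names are hypothetical.
import pandas as pd
from scipy.stats import f_oneway, pearsonr

df = pd.read_csv("placement_scores.csv")  # hypothetical data file

# (1) Do average writing scores differ between course levels?
groups = [g["writing_score"].values for _, g in df.groupby("course_level")]
f_stat, p_val = f_oneway(*groups)
print(f"ANOVA across course levels: F={f_stat:.2f}, p={p_val:.4f}")

# (2) How strongly do course level and assessment score relate to HS GPA?
r_level, p_level = pearsonr(df["course_level"], df["hs_gpa"])
r_score, p_score = pearsonr(df["writing_score"], df["hs_gpa"])
print(f"level~GPA r={r_level:.2f} (p={p_level:.4f}); "
      f"score~GPA r={r_score:.2f} (p={p_score:.4f})")

Under this reading, the abstract's "weak relationships" would correspond to small correlation coefficients in step (2), while the finding about the first precollegiate course would show up as the group driving the ANOVA difference in step (1).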
Research Problem: Recent research suggests that the standardized tests used for writing placement at a majority of open admissions community colleges may be systematically under-placing students in ways that undermine their likelihood of persistence and degree completion. These tests may have particularly negative consequences for students from some structurally disadvantaged groups. Directed Self-Placement (DSP) has been touted as a more socially just approach to writing placement, but to date there has been little published research on the consequences of DSP in community college settings.
Research Questions: What are the motivations of community colleges that adopt DSP? What have been the consequences of adopting DSP at these community colleges? What are the consequences of DSP for different groups of students at community colleges?
Literature Review: I ground this study in an examination of the social justice issues surrounding writing placement at open admissions community colleges and the various social justice-related arguments made for and against DSP. I also synthesize the available literature on how DSP affects different groups of students.
Methodology: I reviewed the scholarly literature, searched the archives of professional listservs, and used listservs and professional email lists to identify community colleges that have implemented DSP. I then conducted semi-structured interviews with faculty and administrators at twelve two-year colleges that either have imple