A Review on IELTS Writing Test, Its Test Results and Inter Rater Reliability
Related papers
On the Validity of IELTS Writing Component; Do Raters Assess What They Are Supposed To?
Modern Journal of Language Teaching Methods, 2014
ABSTRACT: Validity is a crucial test quality, and presenting a strong validity argument is a must and an ongoing process in the development of large-scale language tests such as IELTS and TOEFL. However, the validity evidence presented for writing and speaking skills, whose evaluation is subjective by nature, is somewhat shaky in comparison with the other two skills. The present study examined whether raters actually assess test takers' writing samples based on the constructs defined in the scoring rubric. Using standard multiple regression, the predictive ability of three objective measures, namely Fluency, Grammatical complexity, and Accuracy, was checked against learners' scores on IELTS Writing Task 2. The preliminary analysis showed no violation of the assumptions underlying multiple regression. The results indicate that the model explains 50% of the variance in the dependent variable, i.e., learners' scores on IELTS Writing Task 2 (adjusted R2 = .501), and the model was statistically significant: F(3, 37) = 14.40, p < .001. However, among the independent variables, only the Accuracy measure made a statistically significant unique contribution to R2 (40%), indicating that the accuracy of the texts written by L2 learners is the most important factor affecting the scores they receive on the IELTS writing task. It seems that raters are so heavily affected by the accuracy of test takers' texts that they overlook other text qualities specified in the scoring rubric. KEYWORDS: IELTS writing test, Validity, Fluency, Grammatical complexity, Accuracy
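To give a rough sense of the analysis reported above (a standard multiple regression with Fluency, Grammatical complexity, and Accuracy predicting Task 2 band scores), a minimal sketch is shown below. It uses synthetic data, not the study's materials: the column names, the operationalisations in the comments, and the simulated relationships are all assumptions; only the sample size is inferred from the reported F(3, 37).

```python
# Illustrative sketch only: synthetic data whose column names (fluency,
# complexity, accuracy, band) are assumptions, not the authors' variables.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 41  # consistent with F(3, 37): n - k - 1 = 37 with k = 3 predictors

df = pd.DataFrame({
    "fluency": rng.normal(150, 30, n),      # e.g., words produced per essay
    "complexity": rng.normal(1.5, 0.3, n),  # e.g., clauses per T-unit
    "accuracy": rng.normal(0.7, 0.15, n),   # e.g., proportion of error-free clauses
})
# Synthetic band scores driven mainly by accuracy, to mimic the reported pattern.
df["band"] = 3 + 4 * df["accuracy"] + 0.002 * df["fluency"] + rng.normal(0, 0.5, n)

X = sm.add_constant(df[["fluency", "complexity", "accuracy"]])
model = sm.OLS(df["band"], X).fit()
print(model.summary())               # overall F test and adjusted R-squared
print(model.params, model.pvalues)   # unique contribution of each predictor
```

The summary output reports the overall model fit (adjusted R-squared, F statistic) and per-predictor coefficients and p-values, which is the kind of evidence the abstract summarises.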
A critical review of the IELTS writing test
Administered at local centres in 120 countries throughout the world, IELTS is one of the most widely used large-scale ESL tests and one that includes a direct writing component. Because of its popularity and its use in making critical decisions about test takers, the present article draws attention to several issues in the IELTS assessment procedures. It offers a descriptive and critical review of the IELTS writing test, focusing in particular on reliability issues such as the single marking of papers, the readability of prompts, and the comparability of writing topics, and on validity issues such as the definition of an "international writing construct" that does not account for variation in rhetorical conventions and genres around the world. Consequential validity (impact) issues are also discussed, and suggestions are offered for the use of IELTS around the world and for future research to improve the test.
Authenticity and Validity of the IELTS Writing Test as Predictor of Academic Performance
2021
The International English Language Testing System (IELTS) has become one of the most widely used measurements of English proficiency in the world for academic, professional and migration purposes. For universities in particular, it is expected that applicants’ IELTS scores closely reflect their actual ability in communicating and doing their assignments in English. This study examines the authenticity and predictive validity of the writing section in the IELTS Academic Module by reviewing relevant research on IELTS within the last two decades. In general, those studies have provided evidence that the IELTS writing test suffers from low authenticity and predictive validity, and is thus an inaccurate predictor of a candidate’s performance in writing real-life academic tasks.
International Journal of Language Testing
This study explored possible reasons why IELTS candidates usually score low in writing by investigating the effects of two different test designs and scoring criteria on Iranian IELTS candidates' grades on IELTS and World Englishes (WEs) essay writing tests. To this end, a WEs essay writing test was first designed. Then, 17 Iranian IELTS candidates wrote two essays on the same topic, one under the IELTS test condition and one under the WEs test condition. Each of the 34 essays was scored six times: three times based on IELTS scoring criteria, each time by a different rater, and three times based on WEs scoring criteria. The results of a repeated-measures ANOVA showed that test design and scoring criteria had significant effects on essay grades. The study concludes that some of the reasons why IELTS candidates usually score low in writing may be rooted in the test design and scoring criteria of the IELTS essay writing test, not necessarily in the candidates' weaknesses in writing. The implications of the study focus on the importance and relevance of the results to IELTS candidates, international students, and the future of assessing writing in World Englishes contexts.
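For readers who want to see the shape of such an analysis, a repeated-measures ANOVA with test design and scoring criteria as within-subject factors might be set up roughly as below. This is a sketch with synthetic grades; the long-format layout, the column names, and the averaging of the three raters per condition into a single 2 x 2 cell value are assumptions, not the study's actual dataset or code.

```python
# Illustrative sketch only: synthetic grades in long format; column names
# (subject, design, criteria, grade) are assumptions.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(1)
rows = []
for s in range(1, 18):  # 17 candidates, as reported
    base = rng.normal(6.0, 0.8)
    for design in ["IELTS", "WEs"]:
        for criteria in ["IELTS", "WEs"]:
            # Synthetic effect: WEs design and WEs criteria nudge grades upward.
            bump = 0.5 * (design == "WEs") + 0.4 * (criteria == "WEs")
            rows.append({"subject": s, "design": design, "criteria": criteria,
                         "grade": base + bump + rng.normal(0, 0.3)})
data = pd.DataFrame(rows)

# Two within-subject factors: test design and scoring criteria.
res = AnovaRM(data, depvar="grade", subject="subject",
              within=["design", "criteria"]).fit()
print(res.anova_table)  # F and p-values for each factor and their interaction
```

The resulting table gives an F test for each factor and for their interaction, which is the kind of evidence behind the abstract's claim that test design and scoring criteria each had significant effects.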
Let Their Voices be Heard: IELTS Candidates' Problems with the IELTS Academic Writing Test
TESL-EJ, 2023
According to the IELTS official website, IELTS candidates usually score lower in the IELTS Writing test than in the other language skills. This is disappointing for the many IELTS candidates who fail to get the overall band score they need. Surprisingly enough, few studies have addressed this issue. The present study, then, is aimed at shedding some light on why IELTS candidates usually score lowest in writing by investigating IELTS candidates' problems with the IELTS Academic Writing test. To this end, 10 Iranian IELTS candidates were interviewed concerning the difficulties they had with this test. The interview summaries were subjected to thematic analysis. The results suggested that IELTS candidates may face four main problems with the IELTS Academic Writing test: insufficient time, unclear and difficult-to-understand task instructions, "distant" topics, and overvaluation of advanced vocabulary and grammar in the scoring system. The study suggests that IELTS candidates' problem of scoring lowest in the Writing test may not be entirely due to deficiencies in their writing skills, and that certain features of the IELTS Academic Writing test may aggravate undesirable testing outcomes. The implications of the results of the study are discussed.
Analysis of IAIN Parepare Students’ Academic Writing Levels on IELTS Preparation Class
Al-Irsyad: Journal of Education Science
This research describes the academic writing levels of 13 EFL students at IAIN Parepare and examines the students’ perceptions of their IELTS academic writing levels after finishing the IELTS Preparation class. The researcher used a qualitative descriptive method, collecting the students’ worksheets and assessing them against the IELTS writing band descriptors, which comprise four assessment criteria: task achievement and response, coherence and cohesion, lexical resource, and grammatical range and accuracy. To gain information about the students’ perceptions, the researcher conducted a guided interview. The results showed that 2 students achieved a band score of 7 (Good User Level) as the highest achievers, 2 students achieved a band score of 5.5 (Modest User Level) as the medium achievement, and only 1 student achieved a band score of 3 (Extremely Limited User Level) as the lowest achiever. The researcher chose three selected texts written by MA, who repr...
This paper examines the differences in writing between International English Language Testing System (IELTS) bands 6.0, 6.5 and 7.0. An analysis of exemplars provided by the IELTS test makers reveals that IELTS 6.0, 6.5 and 7.0 writers can make a minimum of 206, 96 and 35 errors per 1000 words, respectively. The paper then explores the differences in error patterns between IELTS 6.0, 6.5 and 7.0 writers and proposes that the IELTS 7.0 writer shows some convergence towards the error types found among native English writers. With regard to workload, the paper discusses the impact of errors as a distraction that affects reading time and gives an indication of the extra workload that may be required to assess IELTS 6.0, 6.5 and 7.0 writing. The paper concludes with remarks about entrance requirements for tertiary study and suggests that it may be simpler to raise entry standards than to attempt to remediate the writing of students with low IELTS scores.
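To put those rates in perspective (an illustrative calculation, not a figure from the paper): at the band-6.0 minimum of 206 errors per 1000 words, a typical 250-word Task 2 essay would contain roughly 0.206 × 250 ≈ 52 errors, compared with about 9 errors for a band-7.0 script at 35 errors per 1000 words.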
2013
The International English Language Testing System (IELTS) is the world's leading high-stakes test assessing the English language proficiency of candidates who speak languages other than English and wish to gain entry into universities where English is the language of instruction. Recently, over 3000 institutions in the United States have accepted the IELTS test as an indicator of language proficiency (IELTS, 2012a). Because of this preference for the IELTS test, and its worldwide recognition, the number of students taking the test has increased every year. According to the IELTS website, more than 7000 institutions around the world trust the test results and, not surprisingly, more than 1.7 million candidates take the test every year in one of the 800 recognised test centres across 135 countries (IELTS, 2012a). These candidates include not only people seeking admission to universities but also those taking the test for immigration authorities, employers of certain companies, and government agencies. Acknowledging this popularity and its importance to learners of English as a Foreign Language (EFL), this qualitative study investigated the construct validity of the academic writing module of the IELTS test from the perspectives of its stakeholders (i.e., candidates, lecturers and markers). The aim was to understand why some Saudi students fail to cope with the demands of the university despite having achieved the minimum requirements in IELTS. In this study, data were collected in two phases in two different settings through open-ended questionnaires, semi-structured observations and semi-structured interviews. Phase I was carried out in the Department of English Language (DEL) at King Faisal University in Saudi Arabia, while Phase II was conducted at one university in the UK. The sample of the study included 8 students, 6