Working Papers Series: Education

Which teaching practices improve student performance on high-stakes exams? Evidence from Russia

2014

This study examines the relationship between student achievement and teaching practices aimed at raising student performance on a high-stakes college entrance examination, the Russian Unified State Exam (USE). Data come from a survey of 3,000 students conducted in 2010 in three Russian regions, and the analysis employs a student fixed effects method that estimates the impact of mathematics and Russian language teachers' practices, in advanced and basic tracks, on students' exam results. The study finds that the only practice positively affecting test outcomes is assigning greater amounts of subject-specific homework, and that the most effective type of homework differs across tracks.
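A minimal sketch of the within-student comparison that such a fixed effects design implies (the notation is illustrative, not the authors' own specification):

$$ y_{is} = \alpha_i + \beta' X_{is} + \varepsilon_{is} $$

Here $y_{is}$ is student $i$'s USE score in subject $s$ (mathematics or Russian language), $X_{is}$ collects the practices of the teacher who taught student $i$ in subject $s$, and the student fixed effect $\alpha_i$ absorbs all subject-invariant student characteristics, so $\beta$ is identified from within-student, across-subject variation in teacher practices.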

Are teachers accurate in predicting their students' performance on high-stakes exams? The case of Russia

International Journal of Educational Development, 2015

The paper focuses on how accurate teachers are in gauging their classes' academic abilities. We use a sample of classrooms in three Russian regions to identify sources of mathematics and Russian teachers' inaccuracies in predicting their high school classes' scores on the Russian and mathematics high-stakes college entrance tests (the Unified State Exam, or USE). We test the hypothesis that teachers' perceptions of their relationship with their classes are good predictors of such inaccuracies. This is important because teachers often focus on their relationship with the class as an end in itself or as a means of engaging students. Good teacher–student relations may indeed result in more student learning, but perhaps not nearly as much as teachers believe. We find that both Russian and mathematics teachers make inaccurate predictions of their classes' high-stakes examination results based on how they perceive their relationship with their class. Teachers who believe they have a very good relationship with the class significantly overestimate their class's performance on the USE, and those who perceive a poor relationship underestimate their class's performance, although this underestimate is generally not statistically significant.
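One way to make the prediction inaccuracy concrete (illustrative notation, not the paper's own): for each class $c$, define

$$ \mathrm{Error}_c = \widehat{\mathrm{USE}}_c - \overline{\mathrm{USE}}_c, $$

where $\widehat{\mathrm{USE}}_c$ is the teacher's predicted mean USE score for the class and $\overline{\mathrm{USE}}_c$ is the class's actual mean. The hypothesis tested is that $\mathrm{Error}_c$ tends to be positive for teachers who report a very good relationship with the class and negative (though only weakly so) for those who report a poor one.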

Can International Test Score Comparisons Inform Educational Policy? A Closer Look at Student Performance in Russia and its Neighbors

SSRN Electronic Journal, 2000

In this paper, we develop a multi-level comparative approach to analyse Trends in International Mathematics and Science Study (TIMSS) and Programme for International Student Assessment (PISA) mathematics results for a country, Russia, where the two tests provide contradictory information about students' relative performance. Russian students do relatively well on the TIMSS mathematics test but relatively poorly on PISA. We compare the performance of Russian students with different levels of family academic resources over the past decade on these tests with that of students with similar family resources in Russia's neighbours and of Russian students studying in Latvian and Estonian Russian-medium schools. These comparisons, together with interviews with educators in Latvia and Estonia, help us understand why students in Russia may perform worse on PISA and allow us to draw education policy lessons for improving international test performance generally and Russian students' PISA mathematics performance specifically.

Analysis of Students' Performance in Relation to the Results of State Unified Exam: The Case of Russian University

Business, Management and Economics Engineering

Purpose – Considering the limited number of studies covering the topic, the goal is to test for a correlation between the results of Russia's Unified State Exam and performance at university. Research methodology – the article uses quantitative (regression) analysis of student performance on a sample of 4,664 students. For the statistical evaluation, the authors use SPSS Statistics software. Findings – the research suggests that the results of the Unified State Exam, and the individual students' scores awarded by the university under restrictions, are inefficient predictors of student performance. By contrast, students' performance during their first semester is a good predictor for the whole period of academic studies. As existing results of testing such hypotheses are inconsistent, the research provides value to the field of educational research. Research limitations – the data refer only to Kazan National Research Technical University named aft...
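A stylized form of the predictive regression described here (variable names are illustrative assumptions, not the authors'):

$$ \mathrm{GPA}_i = \beta_0 + \beta_1\,\mathrm{USE}_i + \beta_2\,\mathrm{FirstSem}_i + \varepsilon_i $$

where $\mathrm{GPA}_i$ is student $i$'s subsequent academic performance, $\mathrm{USE}_i$ the admission exam score, and $\mathrm{FirstSem}_i$ the first-semester grade average. The reported finding would correspond to a small or insignificant $\beta_1$ alongside a large, significant $\beta_2$.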

Raising the Stakes: Inequality and Testing in the Russian Education System

Social Forces, 2019

Sociologists have argued that high-stakes tests open the door to high levels of educational inequality at transition points: in a high-stakes testing regime, parents and students are able to focus all energy and resources on test preparation, thus enhancing pre-existing inequalities in academic performance. But arguments about a special role for high-stakes tests are often prosecuted without explicit comparisons to other types of tests and assessments, usually because information on other tests is not available. In this article, we analyze a unique dataset on a contemporary cohort of Russian students, for whom we have PISA and TIMSS scores, low-stakes test scores, and high-stakes test scores. We compare the role each test plays in mediating socioeconomic background inequalities at the important transitions in the Russian educational system: the transition to upper secondary education and the transition to university. We find evidence in favor of a special role for the high-stakes test at the transition to university, but we also find evidence that gives cause to question the standard assumption that high-stakes tests should be a primary focus for those concerned about inequality of educational opportunity.

Using TIMSS and PISA results to inform educational policy: a study of Russia and its neighbours

Compare: A Journal of Comparative and International Education, 2013

In this paper, we develop a multi-level comparative approach to analyse Trends in International Mathematics and Science Study (TIMSS) and Programme for International Student Assessment (PISA) mathematics results for a country, Russia, where the two tests provide contradictory information about students' relative performance. Russian students do relatively well on the TIMSS mathematics test but relatively poorly on PISA. We compare the performance of Russian students with different levels of family academic resources over the past decade on these tests with that of students with similar family resources in Russia's neighbours and of Russian students studying in Latvian and Estonian Russian-medium schools. These comparisons, together with interviews with educators in Latvia and Estonia, help us understand why students in Russia may perform worse on PISA and allow us to draw education policy lessons for improving international test performance generally and Russian students' PISA mathematics performance specifically.

Assessment Model of High School Students’ Performance: Experience of Ukraine

Neperervna profesìjna osvìta: teorìa ì praktika, 2021

The article considers modern approaches to assessment in schools based on an analysis of various assessment scales (ranging from three to one hundred points). The pedagogical regularities influencing the choice of assessment scale are identified, in particular: 1) increasing the number of points on the assessment scale; 2) using a three-point rating scale for single-element answers; 3) using indirect evaluation when the assessment scale has a large number of points; 4) applying mathematical methods for converting qualitative parameters into quantitative assessment indicators; 5) taking into account the structure of the subject and the relationship between learning and the development of learners. Finally, we propose three testing and assessment models for secondary schools providing mathematical, humanities, and general education. To identify the causal effects of different assessment scales, we conducted an educational experiment and a large-scale online survey in Ukrainian schools from 2019 to 2021. As a result of the experimental research, we identify the essential elements of testing and assessment activity: educational parameters, the structure of the components of subject knowledge, criteria, an assessment scale, an interval scale for converting scores into grades, and forms of final and local testing. The findings suggest that the developed approaches to assessing high school students' educational achievements are more effective than traditional ones. They strengthen schoolchildren's motivation to learn, particularly in completing independent (especially homework) tasks. The obtained data confirm the need to use new approaches to assessing student achievement.

The impact of high-stakes tests on the teachers: A case of the Entrance Exam of the Universities (EEU) in Iran

The Entrance Exam of the Universities (EEU) in Iran is a multiple-choice high-stakes test that affects teachers, students, parents, and other stakeholders. The EEU is designed to screen high school graduates for admission into higher education. This study aimed to explore the impact of this high-stakes test on Iranian high school English teachers. To achieve the aim of the study, a validated survey questionnaire was administered to a stratified random sample of 132 high school English teachers teaching in the five main educational districts of the city of Isfahan, Iran. The data analysis revealed that little attention was given to the three language skills of speaking, writing, and listening in the classroom, as these skills are not tested in the EEU. Moreover, the EEU exerts a negative, implicit influence on English teachers, leading them to teach to the test format.

Relationships between the use of test results and students’ academic performance

In this study, we examined relationships between the use of test results and U.S. students’ math, reading, and science performance in Programme for International Student Assessment (PISA) 2009. Based on a literature review, we hypothesized that the 16 items in the PISA school questionnaire, which are related to the use of test results, can be categorized according to four factors. We validated this hypothesized factor structure using a confirmatory factor analysis and then obtained composite scores for each factor. As revealed by a multilevel analysis, when student and school demographic variables were controlled for, using test results to hold schools accountable to authority and the public was significantly positively related to students’ performance across all three subjects. No statistically significant relationship, however, was detected between students’ performance and the following uses of test scores: informing parents of their children’s performance, providing information for instructional purposes, and evaluating teachers and principals.
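An illustrative two-level specification consistent with this description (names and notation are assumptions, not the study's reported model):

$$ y_{ij} = \gamma_{0} + \gamma_{1}\,\mathrm{Accountability}_j + \boldsymbol{\beta}' \mathbf{x}_{ij} + \boldsymbol{\delta}' \mathbf{w}_{j} + u_j + e_{ij} $$

where $y_{ij}$ is the PISA score of student $i$ in school $j$, $\mathrm{Accountability}_j$ is the school-level composite for using test results to hold schools accountable to authorities and the public, $\mathbf{x}_{ij}$ and $\mathbf{w}_{j}$ are student- and school-level demographic controls, and $u_j$ is a school random effect. The reported result corresponds to a positive, significant $\gamma_{1}$ in all three subjects, with the analogous coefficients for the other three composites not reaching significance.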

Updated Content of Education in Kazakhstan: Longitudinal Trajectories of Learning Performance in Mathematics and Science

Journal of Education and Human Development, 2020

Purpose. The purpose of this study is to examine the effect of the Updated Content of Education (UCE) project in Kazakhstan on longitudinal trajectories of learners' performance on diagnostic tests conducted during the piloting of the UCE (2015–2019) in Mathematics and Science. Research Methods. Longitudinal growth modeling was used to study the targeted UCE effects based on a study sample of 2,509 students from 30 pilot schools and 1,082 from control schools. Findings. The results revealed positive UCE effects on learners' growth and performance in Mathematics and Science, the presence of two latent classes of growth trajectories in Science, no gender effects, and partial effects of school location (rural/urban) on learners' performance. Implications for Research and Practice. It is necessary to continue work on developing teachers' professional competencies, teaching aids and didactic materials, and criteria-based assessments. Learners from different latent classes of performance in Science need to be identified and provided with support through differentiated learning strategies.
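A sketch of the kind of latent-class growth model this describes (purely illustrative notation, not the authors' specification): within latent trajectory class $k$,

$$ y_{it} = \pi_{0i} + \pi_{1i}\, t + e_{it}, \qquad \pi_{0i} = \beta_{00}^{(k)} + \beta_{01}^{(k)}\,\mathrm{UCE}_i + r_{0i}, \qquad \pi_{1i} = \beta_{10}^{(k)} + \beta_{11}^{(k)}\,\mathrm{UCE}_i + r_{1i}, $$

where $y_{it}$ is student $i$'s diagnostic score at occasion $t$, $\mathrm{UCE}_i$ indicates attendance at a pilot school, and the class-specific slopes $\beta_{11}^{(k)}$ capture how the UCE effect on growth differs across the latent trajectory classes identified in Science.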