Multiple-Choice Exams: An Obstacle for Higher-Level Thinking in Introductory Science Classes
Related papers
Multiple choice questions can be designed or revised to challenge learners' critical thinking
Multiple-choice (MC) questions from a graduate physiology course were evaluated by cognitive-psychology (but not physiology) experts and analyzed statistically in order to test the independence of content expertise and cognitive complexity ratings of MC items. Integrating higher-order thinking into MC exams is important but widely known to be challenging, perhaps especially when content experts must think like novices; expertise in the domain (content) may actually impede the creation of higher-complexity items. Three cognitive-psychology experts independently rated cognitive complexity for 252 multiple-choice physiology items using a six-level cognitive complexity matrix synthesized from the literature. Rasch modeling estimated item difficulties. The complexity ratings and difficulty estimates were then analyzed together to determine the relative contributions (and independence) of complexity and difficulty to the likelihood of a correct answer on each item. Cognitive complexity was found to be statistically independent of difficulty estimates for 88% of items. Using the complexity matrix, modifications were identified that increase some items' complexity by one level without affecting their difficulty. Cognitive complexity can effectively be rated by non-content experts. The six-level complexity matrix, if applied by faculty peer groups trained in cognitive complexity but without domain-specific expertise, could improve the complexity targeted in item writing and revision. Targeting higher-order thinking with MC questions can be achieved without changing item difficulties or other test characteristics, but this may be less likely if the content expert is left to assess items within their own domain of expertise.
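For readers unfamiliar with the psychometrics, the sketch below shows one way to estimate item difficulties under the dichotomous Rasch model that underlies this kind of analysis, using joint maximum likelihood on a synthetic 0/1 response matrix. It is illustrative only, not the authors' analysis pipeline; all names and data here are placeholders.

```python
# Sketch: Rasch-model item difficulty estimation by joint maximum likelihood.
# Synthetic toy data; the paper's actual analysis is not reproduced here.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_persons, n_items = 200, 12
true_theta = rng.normal(0, 1, n_persons)      # person abilities
true_b = np.linspace(-2, 2, n_items)          # item difficulties
p = 1 / (1 + np.exp(-(true_theta[:, None] - true_b[None, :])))
responses = (rng.random((n_persons, n_items)) < p).astype(int)

def neg_log_lik(params):
    theta, b = params[:n_persons], params[n_persons:]
    logits = theta[:, None] - b[None, :]
    # Bernoulli log-likelihood of the observed 0/1 response matrix
    ll = responses * logits - np.logaddexp(0, logits)
    return -ll.sum()

fit = minimize(neg_log_lik, np.zeros(n_persons + n_items), method="L-BFGS-B")
b_hat = fit.x[n_persons:]
b_hat -= b_hat.mean()   # anchor the scale (difficulties sum to zero)
print(np.round(b_hat, 2))
```

The centering step fixes the translation invariance of the model (adding a constant to all abilities and difficulties leaves the likelihood unchanged), which is why estimated difficulties are reported relative to a zero mean.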
Pushing Critical Thinking Skills With Multiple-Choice Questions
Academic Medicine, 2018
Medical school assessments should foster the development of higher-order thinking skills to support clinical reasoning and a solid foundation of knowledge. Multiple-choice questions (MCQs) are commonly used to assess student learning, and well-written MCQs can support learner engagement in higher levels of cognitive reasoning such as application or synthesis of knowledge. Bloom's taxonomy has been used to identify MCQs that assess students' critical thinking skills, with evidence suggesting that higher-order MCQs support a deeper conceptual understanding of scientific process skills. Similarly, clinical practice also requires learners to develop higher-order thinking skills that include all of Bloom's levels. Faculty question-writers and examinees may approach the same material differently based on varying levels of knowledge and expertise, and these differences can influence the cognitive levels being measured by MCQs. Consequently, faculty question-writers may perceive that certain MCQs require higher-order thinking skills to process the question, whereas examinees may only need to employ lower-order thinking skills to render a correct response. Likewise, seemingly lower-order questions may actually require higher-order thinking skills in order to respond correctly. In this Perspective, the authors describe some of the cognitive processes examinees use to respond to MCQs. The authors propose that various factors affect both question-writers' and examinees' interactions with test material and the subsequent cognitive processes necessary to answer a question.
Assessing critical thinking in a student-active science curriculum
meeting of the National …, 1999
The desired student outcomes of a mature inquiry-oriented college science curriculum were identified by conducting and analyzing a series of faculty interviews. Both motivational and cognitive-intellectual learning goals were identified. The present study concerns two classes of target cognitive skills: skills involved in the cycle of scientific inquiry and skills involved in constructing quantitative models and interpreting quantitative data. A paper-and-pencil inventory consisting of open-ended questions about simple scientific scenarios was constructed to assess these skills. The effect of one semester of inquiry-oriented instruction on these skills was assessed by administering the inventory pre- and post-semester to a group of students that took inquiry-oriented science courses, a group that took no science courses, and a group that took more traditional biology courses. The scores of students in the inquiry courses improved significantly, while the scores of the other two groups showed no change. The results suggest that inquiry-oriented science instruction can improve students' ability to reason scientifically about issues outside the domain of instruction.
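A minimal sketch of the kind of per-group pre/post comparison this design calls for, run on synthetic placeholder scores (the study's instrument, data, and exact tests are not reproduced here):

```python
# Sketch: paired pre/post comparison of inventory scores within each group.
# All scores are synthetic placeholders on a hypothetical 0-20 scale.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
groups = {
    "inquiry":     (rng.normal(10, 3, 40), rng.normal(12, 3, 40)),
    "no_science":  (rng.normal(10, 3, 40), rng.normal(10, 3, 40)),
    "traditional": (rng.normal(10, 3, 40), rng.normal(10, 3, 40)),
}
for name, (pre, post) in groups.items():
    t, p = stats.ttest_rel(post, pre)   # paired t-test: did scores change?
    print(f"{name:12s} gain={np.mean(post - pre):+.2f}  t={t:.2f}  p={p:.3f}")
```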
Graded Response Method: Does Question Type Influence the Assessment of Critical Thinking?
Journal of Curriculum and Teaching, 2019
Graded Response Method (GRM) is an alternative to multiple-choice testing in which students rank options according to their relevance to the question. GRM requires discrimination and inference between statements and is a cost-effective critical thinking assessment in large courses where open-ended answers are not feasible. This study examined critical thinking assessment in GRM versus open-ended and multiple-choice questions composed from Bloom's taxonomy in an introductory undergraduate course in anthropology and archaeology (N = 53 students). Critical thinking was operationalized as the ability to assess a question with evidence to support or evaluate arguments (Ennis, 1993). We predicted that students who performed well on multiple-choice questions from Bloom's taxonomy levels 4-6 and on open-ended questions would perform well on GRM items involving similar concepts. High-performing students on GRM were predicted to have higher course grades. The null hypothesis was that question type would not have an effect on criti...
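The abstract does not specify GRM's scoring rule; a plausible sketch, assuming partial credit is awarded by rank agreement between a student's ordering of the options and the instructor key (the rank-correlation rule is an assumption, not the paper's rubric):

```python
# Sketch: scoring one GRM item by comparing a student's relevance ranking
# of the options against the instructor key. Scoring rule is hypothetical.
from scipy.stats import spearmanr

key_ranks     = [1, 2, 3, 4, 5]   # instructor's relevance ordering of options A-E
student_ranks = [1, 3, 2, 4, 5]   # one student's ordering of the same options

rho, _ = spearmanr(key_ranks, student_ranks)
score = max(0.0, rho)             # map agreement to partial credit in [0, 1]
print(f"partial credit: {score:.2f}")
```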
An Investigation of Explanation Multiple-Choice Items in Science Assessment
Educational Assessment, 2011
Both multiple-choice and constructed-response items have known advantages and disadvantages in measuring scientific inquiry. In this article we explore the function of explanation multiple-choice (EMC) items and examine how EMC items differ from traditional multiple-choice and constructed-response items in measuring scientific reasoning. A group of 794 middle school students was randomly assigned to answer either constructed-response or EMC items following...
Cognitive Difficulty and Format of Exams Predicts Gender and Socioeconomic Gaps in Exam Performance of Students in Introductory Biology Courses
CBE-Life Sciences Education, 2016
Recent reform efforts in undergraduate biology have recommended transforming course exams to test at more cognitively challenging levels, which may mean including more cognitively challenging and more constructed-response questions on assessments. However, changing the characteristics of exams could result in bias against historically underserved groups. In this study, we examined whether and to what extent the characteristics of instructor-generated tests impact the exam performance of male versus female and middle/high- versus low-socioeconomic status (SES) students enrolled in introductory biology courses. We collected exam scores for 4810 students from 87 unique exams taken across three years of the introductory biology series at a large research university. We determined the median Bloom's level and the percentage of constructed-response questions for each exam. Despite controlling for prior academic ability in our models, we found that males and middle/high-SES students were disproport...
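A hedged sketch of the style of analysis described, regressing exam score on exam characteristics, demographic group, and their interactions while controlling for prior ability; the column names, model specification, and data below are placeholders, not the study's actual models:

```python
# Sketch: exam performance vs. exam characteristics with controls.
# Synthetic data; variable names are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 500
df = pd.DataFrame({
    "score":           rng.normal(75, 10, n),
    "bloom_median":    rng.integers(1, 7, n),    # exam's median Bloom level
    "pct_constructed": rng.uniform(0, 100, n),   # % constructed-response items
    "prior_ability":   rng.normal(0, 1, n),      # e.g., incoming GPA, z-scored
    "gender":          rng.choice(["F", "M"], n),
    "ses":             rng.choice(["low", "mid_high"], n),
})
model = smf.ols(
    "score ~ bloom_median * C(gender) + pct_constructed * C(ses) + prior_ability",
    data=df,
).fit()
print(model.summary().tables[1])   # coefficient table
```

The interaction terms are where a disproportionate effect of exam characteristics on one demographic group would show up, which is the question the abstract poses.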
Overtly Teaching Critical Thinking and Inquiry-Based Learning: A Comparison of Two Undergraduate Biotechnology Classes
Journal of Agricultural Education, 2008
Some researchers have argued that science classrooms must move away from rote and passive application of memorized concepts toward the use of critical thinking skills as a primary component in facilitating learning. Yet few studies have examined the effect of overtly teaching for critical thinking on subsequent skill development. The purpose of this study was to assess whether overtly teaching for critical thinking, as a teaching method, contributed to explaining increases in critical thinking skill scores of undergraduate students enrolled in agricultural biotechnology. One group of students was taught components of critical thinking and then asked to use the newly learned skills in class; a nonequivalent control group was instructed using the inquiry-based teaching method. The data exhibited significant differences between groups, giving evidence that overtly teaching for critical thinking improves students' critical thinking skills relative to the inquiry-based teaching method. Adding gender to the model did not significantly increase the explained variance in critical thinking skills. A weak positive correlation was also found between the total critical thinking skill score and the total critical thinking disposition score.
A New Method for Assessing Critical Thinking In the Classroom
BioScience, 2006
To promote higher-order thinking in college students, we undertook an effort to learn how to assess critical-thinking skills in an introductory biology course. Using Bloom's taxonomy of educational objectives to define critical thinking, we developed a process by which (a) questions are prepared with both content and critical-thinking skills in mind, and (b) grading rubrics are prepared in advance that specify how to evaluate both the content and critical-thinking aspects of an answer. Using this methodology has clarified the course goals (for us and the students), improved student metacognition, and exposed student misconceptions about course content. We describe the rationale for our process, give detailed examples of the assessment method, and elaborate on the advantages of assessing students in this manner.
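A minimal sketch of the two-dimension rubric idea the authors describe, scoring an answer separately on content and on the targeted critical-thinking skill; the criteria, weights, and names below are hypothetical, not the authors' actual rubrics:

```python
# Sketch: a rubric that grades content and critical thinking separately.
# All criteria and point values are invented for illustration.
from dataclasses import dataclass

@dataclass
class RubricDimension:
    name: str
    criteria: dict[str, int]   # criterion -> points if met

rubric = [
    RubricDimension("content", {
        "states the correct mechanism": 2,
        "uses course terminology accurately": 1,
    }),
    RubricDimension("critical_thinking", {   # e.g., Bloom 'analyze'
        "identifies the relevant variables": 1,
        "justifies the prediction from evidence": 2,
    }),
]

def score(met: dict[str, set[str]]) -> dict[str, int]:
    """met maps dimension name -> set of criteria the answer satisfied."""
    return {d.name: sum(pts for c, pts in d.criteria.items()
                        if c in met.get(d.name, set()))
            for d in rubric}

print(score({"content": {"states the correct mechanism"},
             "critical_thinking": {"identifies the relevant variables",
                                   "justifies the prediction from evidence"}}))
```

Keeping the two dimensions separate is what lets a grader see, per the abstract, whether a wrong answer reflects a content misconception or a reasoning failure.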
International Journal of Emerging Technologies in Learning (iJET), 2019
This study aims to describe and test students' perception of scientific learning, students' critical thinking skill level in scientific learning, differences in perception and critical thinking skill level based on school and gender, and the influence of perception on students' critical thinking skill in scientific-approach learning. It involved 206 students from three high schools in Banjarmasin, Indonesia. Quantitative data were obtained from a perception questionnaire and a critical thinking skill test. The perception questionnaire refers to the Perception of Science Classes Survey (PSCS); the test refers to indicators of critical thinking skill from Ennis (2011). Data were analyzed using nonparametric statistical tests with SPSS version 23 for Windows. The results showed that 1) students' perception of scientific learning was in the medium category, 2) students' critical thinking skill level was in the low category, 3) t...
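The study reports nonparametric tests run in SPSS; a sketch of equivalent tests with scipy, on synthetic placeholder data (the study's scores are not reproduced here):

```python
# Sketch: nonparametric tests of the kind the study reports, via scipy.
# All data below are synthetic placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
perception = rng.normal(60, 10, 206)     # questionnaire scores
skill      = rng.normal(40, 12, 206)     # critical thinking test scores
school     = rng.integers(0, 3, 206)     # three schools
gender     = rng.integers(0, 2, 206)

# Skill differences across the three schools (Kruskal-Wallis H test)
h, p = stats.kruskal(*(skill[school == s] for s in range(3)))
print(f"school effect:   H={h:.2f}  p={p:.3f}")

# Skill differences by gender (Mann-Whitney U test)
u, p = stats.mannwhitneyu(skill[gender == 0], skill[gender == 1])
print(f"gender effect:   U={u:.0f}  p={p:.3f}")

# Does perception predict skill? (Spearman rank correlation)
rho, p = stats.spearmanr(perception, skill)
print(f"perception~skill: rho={rho:.2f}  p={p:.3f}")
```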