Procedures for establishing defensible programmes for assessing practice performance
Related papers
When enough is enough: a conceptual basis for fair and defensible practice performance assessment
Medical Education, 2002
Introduction An essential element of practice performance assessment involves combining the results of various procedures in order to see the whole picture. That picture must be derived from both objective and subjective assessment, and from a combination of quantitative and qualitative assessment procedures. Because an assessment of practice performance may have severe consequences, it is essential that the procedure is defensible to the stakeholders and fair, in that it distinguishes well between good performers and underperformers.
Implementing workplace-based assessment across the medical specialties in the United Kingdom
Medical Education, 2008
OBJECTIVES To evaluate the reliability and feasibility of assessing the performance of medical specialist registrars (SpRs) using three methods: the mini-clinical evaluation exercise (mini-CEX), directly observed procedural skills (DOPS) and multi-source feedback (MSF), to help inform annual decisions about the outcome of SpR training.

METHODS We conducted a feasibility study and generalisability analysis based on the application of these assessment methods and the resulting data. A total of 230 SpRs (from 17 specialties) in 58 UK hospitals took part from 2003 to 2004. Main outcome measures included: time taken for each assessment, variance component analysis of mean scores, and derivation of 95% confidence intervals for individual doctors' scores based on the standard error of measurement. Responses to direct questions on questionnaires were analysed, as were the themes emerging from open-comment responses.

RESULTS The methods can provide reliable scores with appropriate sampling. In our sample, all trainees who completed the number of assessments recommended by the Royal Colleges of Physicians had scores that were 95% certain to be better than unsatisfactory. The mean time taken to complete the mini-CEX (including feedback) was 25 minutes. The DOPS required the duration of the procedure being assessed plus an additional third of this time for feedback. The mean time required for each rater to complete his or her MSF form was 6 minutes.

CONCLUSIONS This is the first attempt to evaluate the use of comprehensive workplace assessment across the medical specialties in the UK. The methods are feasible to conduct and can make reliable distinctions between doctors' performances. With adaptation, they may be appropriate for assessing the workplace performance of other grades and specialties of doctor. This may be helpful in informing foundation assessment.
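The confidence intervals described in this abstract follow the standard classical-test-theory relation between the standard error of measurement and score reliability. The sketch below is illustrative only, not the study's own analysis code; the function name, the scale, and all numbers are hypothetical, assuming SEM = SD x sqrt(1 - reliability) and a 95% interval of mean +/- 1.96 x SEM.

```python
import math

def sem_confidence_interval(mean_score, score_sd, reliability, z=1.96):
    """95% confidence interval for an individual's mean score.

    Classical-test-theory relation: SEM = SD * sqrt(1 - reliability),
    CI = mean +/- z * SEM. Illustrative sketch, not the study's analysis.
    """
    sem = score_sd * math.sqrt(1.0 - reliability)
    return mean_score - z * sem, mean_score + z * sem

# Hypothetical example: a trainee's mean mini-CEX score of 4.2 on a 1-6 scale,
# assuming a between-doctor SD of 0.6 and reliability of 0.8 after the
# recommended number of encounters.
low, high = sem_confidence_interval(4.2, 0.6, 0.8)
print(f"95% CI: {low:.2f} to {high:.2f}")  # approximately 3.67 to 4.73
```

Under these assumed values, the whole interval sits above an "unsatisfactory" cut-off near the bottom of the scale, which is the sense in which the abstract reports scores as 95% certain to be better than unsatisfactory.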
Selecting performance assessment methods for experienced physicians
Medical Education, 2002
Background While much is now known about how to assess the competence of medical practitioners in a controlled environment, less is known about how to measure the performance in practice of experienced doctors working in their own environments. The performance of doctors depends increasingly on how well they function in teams and how well the health care system around them functions.
Linking assessment to learning: a new route to quality assurance in medical practice
Medical Education, 2002
Background If continuing professional development is to work and be sensible, an understanding of clinical practice is needed, based on the daily experiences of doctors within the multiple factors that determine the nature and quality of practice. Moreover, there must be a way to link performance and assessment to ensure that ongoing learning and continuing competence are, in reality, connected. Current understanding of learning no longer holds that a doctor enters practice thoroughly trained with a lifetime's storehouse of knowledge. Rather, a doctor's ongoing learning is a 'journey' across a practice lifetime, involving the doctor as a person interacting with their patients, other health professionals, and the larger issues of society and community.

Objectives In this paper, we describe a model of learning and practice that proposes how change occurs, and how assessment links practice performance and learning. We describe how doctors define desired performance, compare actual with desired performance, define educational need and initiate educational action.

Method To illustrate the model, we describe how doctor performance varies over time for any one condition, and across conditions. We discuss how doctors perceive and respond to these variations in their performance. The model is also used to illustrate different formative and summative approaches to assessment, and to highlight the aspects of performance each can assess.

Conclusions We conclude by exploring the implications of this model for integrated medical services, highlighting the actions and directions that would be required of doctors, medical and professional organisations, universities and other continuing education providers, credentialling bodies and governments.