Viewpoint: Taking Apart the Art: The Risk of Anatomizing Clinical Competence

Assessing Residents’ Competencies at Baseline: Identifying the Gaps

Academic Medicine, 2004

Purpose. Entering residents have variable medical school experiences and differing knowledge and skill levels. To structure curricula, enhance patient safety, and begin to meet accreditation requirements, baseline assessment of individual residents' knowledge and skills is needed. To this end, in 2001 the University of Michigan Health System created the Postgraduate Orientation Assessment (POA), an eight-station, objective structured clinical examination for incoming residents. Method. The POA, administered at orientation, included items addressing critical laboratory values, cross-cultural communication, evidence-based medicine, radiographic image interpretation, informed consent, pain assessment and management, aseptic technique, and system compliance such as fire safety. The POA assessed many of the skills needed by interns in their initial months of training, when supervision by senior physicians might not be present. Results. In 2002, 132 interns from 14 different specialties and 59 different schools participated in the POA. The mean score was 74.8% (SD = 5.8). When scores were controlled for U.S. Medical Licensing Examination scores, there were no significant differences in performance across specialties. There were differences between University of Michigan Medical School graduates and those from other institutions (p < .001). Eighty-one percent of the residents would recommend the POA. Conclusions. The POA provides a feasible format to measure initial knowledge and skills and identify learning needs. Orientation is an effective time to identify important gaps in learning between medical school and residency. This is the first step in a continuing evaluation of the Accreditation Council for Graduate Medical Education's general competencies.

A New Model for Accreditation of Residency Programs in Internal Medicine

Annals of Internal Medicine, 2004

A renewed emphasis on clinical competence and its assessment has grown out of public concerns about the safety, efficacy, and accountability of health care in the United States. Medical schools and residency training programs are paying increased attention to teaching and evaluating basic clinical skills, stimulated in part by these concerns and the responding initiatives of accrediting, certifying, and licensing bodies. This paper,

The role of assessment in competency-based medical education

Medical Teacher, 2010

Competency-based medical education (CBME), by definition, necessitates a robust and multifaceted assessment system. Assessment, and the judgments or evaluations that arise from it, is important at the level of the trainee, the program, and the public. When designing an assessment system for CBME, medical education leaders must attend to the context of the multiple settings where clinical training occurs. CBME further requires assessment processes that are more continuous and frequent, criterion-based, developmental, and work-based where possible; that use assessment methods and tools meeting minimum requirements for quality; that use both quantitative and qualitative measures and methods; and that involve the wisdom of group process in making judgments about trainee progress. Like all changes in medical education, CBME is a work in progress. Given the importance of assessment and evaluation for CBME, the medical education community will need more collaborative research to address several major challenges in assessment, including "best practices" in the context of systems and institutional culture, and how best to train faculty to be better evaluators. Finally, we must remember that expertise, not competence, is the ultimate goal. CBME does not end with graduation from a training program, but should represent a career that includes ongoing assessment.

Meeting the Accreditation Council for Graduate Medical Education competencies using established residency training program assessment tools

The American Journal of Surgery, 2004

Background: Most existing residency evaluation tools were not constructed to evaluate the Accreditation Council for Graduate Medical Education (ACGME) competencies. Methods: Before ACGME's six competency-based assessment requirements for resident performance were developed, we created a residency evaluation tool with 5 domains important to successful surgical resident performance. Reliability was determined after 6 months of use. Factor analysis assessed whether the evaluation tool was a construct-valid measure of the ACGME competencies. Results: Three hundred forty-three evaluations for 36 surgical residents were tested. The original evaluation tool was highly reliable, with an overall reliability of 0.97. Factor analysis defined 4 new combinations of questions analogous to 4 of the ACGME competencies: professionalism (reliability 0.95), patient care (reliability 0.93), medical knowledge (reliability 0.92), and communication (reliability 0.92). The new competency clusters were correlated with each other to a moderate degree. Conclusions: Our locally developed tool demonstrated high reliability and construct validity for 4 of 6 ACGME competencies. The correlation between factors suggests overlap between competencies.

Assessing Residents' Competency at Baseline: How Much Does the Medical School Matter?

Journal of Graduate Medical Education

Background Although there is some consensus about the competencies needed to enter residency, the actual skills of graduating medical students may not meet expectations. In addition, little is known about the association between undergraduate medical education and clinical performance at entry into and during residency. Objective We explored the association between medical school of origin and clinical performance using a multi-station objective structured clinical examination for incoming residents at the University of Michigan Health System. Methods Prior to assuming clinical duties, all first-year residents at the University of Michigan Health System participate in the Postgraduate Orientation Assessment (POA). This assesses competencies needed during the first months of residency. Performance data for 1795 residents were collected between 2002 and 2012. We estimated POA variance by medical school using linear mixed models. Results Medical school predicted the following amounts of variance in performance: data gathering scores, 1.67% (95% confidence interval [CI] 0.36-2.93); assessment scores, 4.93% (95% CI 1.84-6.00); teamwork scores, 0.80% (95% CI 0.00-1.82); communication scores, 2.37% (95% CI 0.66-3.83); and overall POA scores, 4.19% (95% CI 1.59-5.35). Conclusions The results show that residents' medical school of origin is weakly associated with clinical competency, highlighting a potential source of variability in undergraduate medical education. The practical significance of these findings needs further evaluation.

Competency-based medical education: origins, perspectives and potentialities

Medical Education, 2014

This discussion addressed the roots, significance and limitations of competency-based medical education, extending to considerations of canonical knowledge and skill versus context-dependent ability, and to legitimate peripheral participation through a growing portfolio of entrustable professional activities. OtC [Tuesday July 2, 2013]: Not long ago, when we met at the American Educational Research Association annual meeting in San Francisco and later in Utrecht, we discussed the topic of competency-based medical education and agreed to continue our conversation using email. You have extensive knowledge of the literature and a lot of experience in workplace learning in domains other than medicine, and I expect your insights will be very valuable for medical educators. I was impressed when you used a ruler during your invited talk in Utrecht to show how briefly we have used structured schooling approaches to learning compared with the existence of mankind. Workplace learning in clinical medicine (basically clerkships, internships and residency) is increasingly being structured with educational objectives, instructional theory, teaching methods and assessment approaches that are being validated. Competency-based medical education, around since the 1970s (1), has really only caught on at international scale in the last decade and a half.(2-4) I sincerely believe this has improved training, but it has also met with criticism.(5-10) Let me try to summarize the value and also the problems of competency-based education for undergraduate and postgraduate medical specialty courses as I see them. While university education arose from the wish to educate the population in general fields, or liberal arts, such as philosophy, rhetoric and music, the more vocational courses, medicine among them, have always combined university education with a practical purpose.

Faculty Development in Assessment: The Missing Link in Competency-Based Medical Education

Academic Medicine, 2011

As the medical education community celebrates the 100th anniversary of the seminal Flexner Report, medical education is once again experiencing significant pressure to transform. Multiple reports from many of medicine's specialties and external stakeholders highlight the inadequacies of current training models to prepare a physician workforce to meet the needs of an increasingly diverse and aging population. This transformation, driven by competency-based medical education (CBME) principles that emphasize outcomes, will require more effective evaluation and feedback by faculty. Substantial evidence suggests, however, that current faculty are insufficiently prepared for this task across both the traditional competencies of medical knowledge, clinical skills, and professionalism and the newer competencies of evidence-based practice, quality improvement, interdisciplinary teamwork, and systems. The implication of these observations is that the medical education enterprise urgently needs an international initiative of faculty development around CBME and assessment. In this article, the authors outline the current challenges and provide suggestions on where faculty development efforts should be focused and how such an initiative might be accomplished. The public, patients, and trainees need the medical education enterprise to improve training and outcomes now.

Just over 100 years ago, Abraham Flexner's seminal report, Medical Education in the United States and Canada (1), sparked widespread reform, and now medical education is once again experiencing significant pressure to transform. Multiple reports from many of medicine's specialty groups and external stakeholders highlight the inadequacies of current training models to prepare a physician workforce to meet the needs of an increasingly diverse and aging population across the globe.(2-9) Educators and regulatory bodies are responding to these calls for transformation by focusing on competency-based medical education (CBME), an amalgam of educational theories and approaches that emphasize the outcomes of training.(10-12) CBME was recently defined by a group of international collaborators as an outcomes-based approach to the design, implementation, assessment, and evaluation of a medical education program using an organizing framework of competencies. In CBME, the unit of progression is mastery of specific knowledge, skills, and attitudes, and progression is learner-centered.(13) One of the first competency-based frameworks to be introduced was CanMEDS in the mid-1990s.(14) The Accreditation Council for Graduate Medical Education followed with the development and introduction of the general competencies framework for residency and fellowship in 2001.(15) More recently, the Association of American Medical Colleges has strengthened its emphasis on competencies and outcomes for medical students,(16) and the United States Medical Licensing Examination will increasingly emphasize physician competencies.(17) Other countries, looking to improve the quality of training and potentially reduce costs, are also working to implement CBME.(18)

Advancing resident assessment in graduate medical education

Journal of Graduate Medical Education, 2009

The Outcome Project requires high-quality assessment approaches to provide reliable and valid judgments of the attainment of competencies deemed important for physician practice. The Accreditation Council for Graduate Medical Education (ACGME) convened the Advisory Committee on Educational Outcome Assessment in 2007-2008 to identify high-quality assessment methods. The assessments selected by this body would form a core set that could be used by all programs in a specialty to assess resident performance and enable initial steps toward establishing national specialty databases of program performance. The committee identified a small set of methods for provisional use and further evaluation. It also developed frameworks and processes to support the ongoing evaluation of methods and the longer-term enhancement of assessment in graduate medical education. The committee constructed a set of standards, a methodology for applying the standards, and grading rules for their review of assessment methods.