SGIM Forum

The Milestone Gap 

06-22-2023 10:48

Medical Education

Dr. Doolittle (Benjamin.doolittle@yale.edu) is a professor of internal medicine and pediatrics at the Yale School of Medicine. He serves as the program director of Yale’s Combined Internal Medicine-Pediatrics Residency Program and the medical director of the Medicine-Pediatrics Practice. Dr. Gielissen (katherine.gielissen@yale.edu) is an assistant professor of internal medicine and pediatrics at the Yale School of Medicine. She is the editor of the Yale Office Based Medicine Curriculum and the associate clerkship director for the Department of Internal Medicine.

In 2013, the Accreditation Council for Graduate Medical Education (ACGME) introduced the Milestones into graduate medical education as a method to describe competency-based education. The Milestones show promise to provide more accurate assessment of a trainee's progression to autonomous practice. Now 10 years into this project, as we incorporate Milestones 2.0 into graduate training programs, we recognize that a Milestone Gap persists between theory and practice, between experts well-versed in Milestone language and the frontline clinician-educators who evaluate trainees. The Milestones have been a labor-intensive project, mobilizing time and treasure from across the medical education community. The assessment of our trainees is critical: we have a public mandate to train competent, empathic, autonomous physicians. How do we close the Milestone Gap?

Despite growing evidence for content and internal-structure validity, there remains relatively scant evidence for response process validity, the extent to which assessors interpret the Milestones in the way the ACGME intended.1 A recent qualitative study of response process showed that faculty Milestone scoring was “not always aligned, and sometimes in conflict with, the intended purpose of [Milestone] assessment.”2 Moreover, existing evidence suggests that not only are individual Milestone assessments prone to bias, but there is also ongoing confusion about Milestone language.3 The heterogeneity in assessment is so well known among programs that some have begun weighting different inputs during analysis by the Clinical Competency Committee (CCC) and are developing continuous quality improvement systems to improve the quality of assessment data on an ongoing basis.4,5

In 2018, the ACGME launched Milestones 2.0 to make the Milestones more user-friendly and accessible to clinicians.3 Medical education experts streamlined the Milestones, simplified the language, and crafted a supplemental guide with real-world examples for each subcompetency. Assessment requires in-the-moment observation and feedback by trained clinician-educators; thoughtful, collaborative appraisal by CCCs; and global, summative review by trusted advisors. Perhaps the most important purpose of the Milestones is to identify struggling learners early in their training so that supportive action can be taken. This process assumes that the assessment data reflect the actual performance of the trainee. Assessment therefore requires medical educators who are one part data scientist and one part clinician: individuals who understand the validity, consequences, and implications of assessment outcomes.

The clarion call to close the Milestone Gap has been to increase faculty training.4 While we recognize its importance, isolated training sessions often have limited lasting impact. The ACGME is actively combating this issue by providing concentrated training opportunities and setting up a network of Assessment Hubs that serve as training centers where program leaders learn basic and advanced assessment skills.4 Even with this training, however, the Milestone Gap remains. Frontline clinicians are busy. Many of the Milestones are nuanced, such as “reflective practice and commitment to individual growth” or “the physician’s role in healthcare systems,” and are difficult to assess amidst the hubbub of patient care. Medical education has many other priorities: well-being, duty hours, procedural supervision, and patient safety. The reservoir of enthusiasm for the Milestones is limited, except among the most stalwart of medical educators.

While efforts to ground assessment in clinical work using Entrustable Professional Activities (EPAs) have been embraced by a number of specialties, there remains much speculation as to whether this approach improves the accuracy of frontline faculty assessments.5 Milestones 2.0 is an important step forward, incorporating 10 years of accumulated experience and feedback, but significant work remains to ensure valid assessments of trainees. A recent report by Hamstra and Yamazaki stressed the importance of implementing a continuous quality improvement (CQI) system to monitor the quality of assessment data so that it yields summative metrics that can be relied upon for high-stakes decisions; however, many programs lack the resources to implement such measures.5

Assessment and feedback have become an increasingly complex endeavor with an alphabet soup of jargon: CCCs, CQI, EPAs, and ILPs (Individualized Learning Plans). Further, significant time and growing expertise are required to sustain high-quality assessment programs.4 To close the Milestone Gap, we believe an important innovation is for institutions and programs to designate an assessment officer: an individual with advanced training in assessment, validity frameworks, and competency-based education who serves as a local resource for programs. The assessment officer coordinates efforts to implement the Milestones, trains clinicians to use them, monitors how they are used, coaches CCCs in effective practices, and works with struggling programs. In addition, assessment officers have training in psychometrics and data analysis, enabling ongoing evaluation of assessment data to ensure it is sufficient to support the high-stakes decisions required of residency training. In turn, each residency and fellowship program should identify its own assessment officer, trained and supported to focus on the specifics of that program.
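To make this data-analysis role concrete, the sketch below shows, in Python, two simple checks an assessment officer might run on a batch of Milestone ratings: flagging raters whose scores barely vary (possible “straight-lining”) and estimating each rater’s leniency or severity relative to consensus. This is a minimal, hypothetical illustration only; the data shape, threshold, and rater names are our own assumptions, not an ACGME method or tool.

    # Hypothetical sketch: two data-quality checks on Milestone ratings.
    # Assumed data shape: (rater, trainee, subcompetency, score on 1-9 scale).
    from collections import defaultdict
    from statistics import mean, pstdev

    ratings = [
        ("Dr. A", "PGY1-01", "PC-1", 5), ("Dr. A", "PGY1-01", "MK-1", 5),
        ("Dr. A", "PGY1-02", "PC-1", 5), ("Dr. A", "PGY1-02", "MK-1", 5),
        ("Dr. B", "PGY1-01", "PC-1", 4), ("Dr. B", "PGY1-01", "MK-1", 6),
        ("Dr. B", "PGY1-02", "PC-1", 3), ("Dr. B", "PGY1-02", "MK-1", 7),
    ]

    # Check 1: "straight-lining" -- a rater whose scores barely vary may be
    # clicking through the form rather than discriminating among behaviors.
    by_rater = defaultdict(list)
    for rater, _, _, score in ratings:
        by_rater[rater].append(score)
    for rater, scores in by_rater.items():
        if len(scores) >= 4 and pstdev(scores) < 0.5:
            print(f"{rater}: low score variance ({pstdev(scores):.2f}); review for straight-lining")

    # Check 2: crude leniency/severity -- how far each rater sits from the
    # mean of all raters for the same trainee and subcompetency.
    consensus = defaultdict(list)
    for _, trainee, sub, score in ratings:
        consensus[(trainee, sub)].append(score)
    offsets = defaultdict(list)
    for rater, trainee, sub, score in ratings:
        offsets[rater].append(score - mean(consensus[(trainee, sub)]))
    for rater, deltas in offsets.items():
        print(f"{rater}: mean offset from consensus {mean(deltas):+.2f}")

Neither check proves bias on its own; their value is in prompting the CCC to look at flagged evaluations rather than taking every score at face value.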

In recent years, institutions have appointed well-being and diversity, equity, and inclusion (DEI) officers. Often, there are leaders in these domains at every level of an organization: at many institutions, each hospital, department, and program has a designated person to support well-being and DEI initiatives. Given the complexity of assessment and feedback, and the challenges in closing the Milestone Gap, we believe institutions should likewise designate an assessment officer at each level of the organization, from the GME office to each residency and fellowship.

The duties of the assessment officer are manifold. First, they must design assessment tools that are user-friendly and relevant to the clinical experience. This is no easy task: a surgical theater is very different from a continuity clinic, and the expectations of trainees vary between and across specialties. Tailoring assessments requires a team approach, a deep understanding of the required tasks, and expertise in assessment design. Second, while faculty training is important, the assessment officer needs to incorporate assessment seamlessly into the rhythm of clinical duties, which requires innovation and continuous quality improvement. Third, the assessment officer needs to assess the overall progress of the program. Do we have a deficiency in professionalism across all trainees? Are there particular rotations where learners struggle? This big-picture view is critical for designing program-wide improvement initiatives. Fourth, the assessment officer can support the CCCs as they identify struggling trainees. While critics may highlight the cost, we believe there is a greater cost if we do not adequately assess our trainees and give them proper feedback.

The Milestone Gap endures. The resources marshaled by the ACGME to launch the Milestones project must be matched by support from our institutions.

References

  1. Swing SR, Beeson MS, Carraccio C, et al. Educational milestone development in the first 7 specialties to enter the next accreditation system. J Grad Med Educ. 2013 Mar;5(1):98-106. doi:10.4300/JGME-05-01-33.
  2. Maranich AM, Hemmer PA, Uijtdehaage S, et al. ACGME milestones in the real world: A qualitative study exploring response process evidence. J Grad Med Educ. 2022 Apr;14(2):201-209. doi:10.4300/JGME-D-21-00546.1. Epub 2022 Apr 14.
  3. Edgar L, Roberts S, Holmboe E. Milestones 2.0: A step forward. J Grad Med Educ. 2018 Jun;10(3):367-369. doi:10.4300/JGME-D-18-00372.1.
  4. Heath JK, Davis JE, Dine CJ, et al. Faculty development for milestones and clinical competency committees. J Grad Med Educ. 2021 Apr;13(2 Suppl):127-131. doi:10.4300/JGME-D-20-00851.1. Epub 2021 Apr 23.
  5. Hamstra SJ, Yamazaki K. A validity framework for effective analysis and interpretation of milestones data. J Grad Med Educ. 2021 Apr;13(2 Suppl):75-80. doi:10.4300/JGME-D-20-01039.1. Epub 2021 Apr 23.

