How do you know when pediatric trainees are ready to practice medicine without supervision?
In recent years, researchers have been working to develop reliable approaches to assessing competence based on direct observation, by members of the health care team, of trainees providing care. To do so, medical educators have created programs of assessment using frameworks such as Entrustable Professional Activities (EPAs), competencies, and milestones. They have also used advances in technology to make these assessments more user friendly by making them accessible on mobile devices. These new approaches are currently being developed and refined by various groups within the pediatric community, including the American Board of Pediatrics, the Council on Pediatric Subspecialties (CoPS), the Accreditation Council for Graduate Medical Education (ACGME), the Association of Pediatric Program Directors (APPD), the National Board of Medical Examiners (NBME), and others.
Research is underway to study the validity of these approaches. Recent results from two of these projects indicate that EPAs, milestones, and supervision scales are effective ways to assess whether a resident or fellow is ready to advance to the next step in training.
What are these approaches?
EPAs are essential activities that physicians are entrusted to perform safely and effectively without supervision. Competencies are the skills a physician must acquire in order to perform those EPAs. You might think of it this way: if riding a bicycle were an EPA, then pedaling, maintaining balance, and using the brakes would be competencies.
Milestones are narrative descriptions of behaviors for each competency along a continuum of development, ranging from a novice (an early medical student) to a master clinician who is years into practice.
“We create brief narrative descriptions of behaviors at different performance levels, from novice to expert,” says Carol Carraccio, MD, MA, Vice President for Competency-Based Assessment at the American Board of Pediatrics (ABP). “They give us a shared mental image of what trainees look like delivering care at various levels of performance.”
As Dr. Carraccio explains, a trainee who, needing to perform a physical exam, quickly takes a newborn from a mother’s arms and lays the baby on a cold examining table, prompting the baby to cry, would be demonstrating the behaviors of a novice in the competency of performing a physical exam. A resident much farther along the developmental continuum might come into the room, establish rapport with the mother, and observe the baby in the mother’s lap for rate and ease of breathing, color, and use of both arms and legs before ever touching the baby.
Milestones describe the stages of development within an area of competence. As such, they provide a roadmap for learners, who can use these descriptions of behaviors at each phase of training to set and reinforce learning goals. In delivering care to patients, trainees are called upon to integrate the behaviors of all the competencies needed to perform an entrustable professional activity, such as “providing care to a well newborn.”
What did we learn from the research?
A study led by Richard B. Mink, MD, MACM, Chief of the Division of Pediatric Critical Care, and Director of the Pediatric Critical Care Fellowship Program at Harbor-UCLA Medical Center in Torrance, Calif., and Professor of Pediatrics at the David Geffen School of Medicine at the University of California-Los Angeles (UCLA) has been presented in several formats at a number of academic medical meetings.
Dr. Mink and colleagues developed supervisory scales for EPAs. For example, a trainee who scored a 1 on the scale was trusted only to observe a more experienced doctor, while one who scored a 5 was trusted to practice that particular professional activity unsupervised. The research, which involved more than 200 pediatric subspecialty fellowship programs and 1,000 fellows, provided evidence for the validity of the scales as effective supervisory tools. The research also showed a correlation between levels of supervision and performance levels on milestones.
Another research project that is providing insight into the assessment of competencies and milestones is being led by Patricia J. Hicks, MD, MHPE, Professor of Clinical Pediatrics at the Perelman School of Medicine at the University of Pennsylvania and a general pediatrician at the Children’s Hospital of Philadelphia. Dr. Hicks is director of the Pediatrics Milestones Assessment Collaborative (PMAC), a joint effort of the ABP, APPD and NBME.
“The collaboration of the ABP, APPD and NBME has provided an exciting opportunity for experts in assessment to work in concert with the community to develop assessment items, tools and reports,” says Dr. Hicks. “The research and development work is focused on designing high-quality assessment content that provides guidance both to the learner and to those who are responsible for making decisions about learners and their competence. PMAC outcomes seek high levels of reliability, validity, ease of use, educational value and acceptability—all at a reasonable cost.”
Dr. Hicks and colleagues are assessing the validity of items and tools that can inform important decisions about trainees’ readiness to progress to the next level of responsibility. Because advancement from one training level to the next involves less direct supervision and more patient care responsibility, PMAC has constructed a framework in which assessment of the learner informs the decision about readiness to advance within the training program. Additional outcomes provided to the learner and the program include individual competency and milestone scores with narrative feedback recommending areas for improvement. Faculty members, nurses, and senior residents who work with and observe the resident complete these assessment tools on mobile devices. The researchers have found that having a variety of people assess a resident produces more thorough and specific feedback, enriched by the variety of perspectives.
Current research results indicate that participants find the tools easy to use and that completion times range from four to 10 minutes. Reports, which aggregate data across tools, provided residents with insight into competency-specific performance and areas for improvement and provided program directors with data to support decisions about readiness of trainees to advance.
Evidence to support the value of competency-based assessment in the workplace continues to grow. To date, five papers relating to PMAC have been published in peer-reviewed journals, and Dr. Hicks and colleagues have committed to a number of additional publications and presentations in 2017. Data from Dr. Mink’s and Dr. Hicks’ projects, as well as other ongoing research, are being collected in APPD LEARN (the Longitudinal Educational Assessment Research Network) for management and storage in its database.
Another general pediatrics EPA study was launched in January 2016 with the goal of determining clinical competency committee expectations for the levels of supervision expected at graduation from residency. The study also seeks to collect data on actual performance by following a cohort of residents over three years. “We have only collected one round of data from 23 participating programs,” Dr. Carraccio says. “But the partnership with APPD LEARN is leading to an amazing database that will allow the education research community to do major secondary analyses.”
With the increasing recognition of the importance of these approaches has come an increase in the number of residency and fellowship programs that are using EPAs, competencies and milestones for trainee assessment and feedback. While evidence supporting their use is just emerging, Dr. Carraccio says, “It’s positive enough that the message is to just keep going.”