Anatomy of an Exam

Thursday, February 21, 2019

Volunteers and ABP Staff Work Many Hours to Ensure Board Exams Reflect Knowledge Base Necessary for Practice

For many pediatricians, board certification is synonymous with examinations. For both initial certification and continuing certification, exams are a major component of assessing whether pediatricians have the knowledge that their peers have determined is essential for the safe practice of pediatrics.

But creating valid and reliable exams for General Pediatrics and 15 subspecialties — and then scoring them — is not as simple as it might seem. The test development and scoring processes are rigorous and the underlying science is complex, but ultimately, these methods produce a fair and strong measure of a pediatrician’s knowledge.

“Before I joined the ABP, I always wondered why we don’t get instantaneous results after taking the computer-based exams,” says Ritu Sachdeva, MD, Professor of Pediatrics in the Cardiology Division of the Department of Pediatrics, Emory University School of Medicine, and a member of the ABP Pediatric Cardiology Subboard since 2015. “During my orientation with the Board, when I learned about all that goes on behind the scenes to validate every single question following the exam, it was truly eye-opening! Learning about the science and statistical methods behind building and grading the certification exams gave me a newfound appreciation for what the Board does for its pediatricians.”

More than 300 certified pediatricians and pediatric subspecialists, supported by the ABP staff, perform the substantial job of creating exams and setting passing standards. Volunteers on the General Pediatrics Examination Committee and subspecialty subboards serve for terms of six years.

“It is important to keep in mind that the goal [of the certification process] is to ensure the proper medical treatment of children,” says Jonathan Teitelbaum, MD, a board-certified pediatric gastroenterologist in Long Branch, NJ, who also is maintaining his General Pediatrics certification and is a member of the General Pediatrics Examination Committee. “To that end, who better to determine the core knowledge that is needed to practice medicine than practicing pediatricians?”

THE PROCESS BEGINS WITH PRACTICE ANALYSIS

[Figure: The Life Cycle of an Exam, from Practice Analysis to Exam Administration]

Panels of volunteer pediatricians, certified in their areas of expertise, are heavily involved in all aspects of test development. The work of these various panels is facilitated primarily by two departments at the ABP: Psychometrics and Test Development. The first major step is to determine what topics the exams should cover — a process known as practice analysis.

“We [psychometrics staff] are involved in the ‘bookends’ of the test development process,” says Andrew Dwyer, PhD, ABP Director of Psychometrics. “At the beginning, we facilitate the practice-analysis process, and then, after the test is administered, we are responsible for scoring and score reporting.”

Every five or six years, for General Pediatrics and each of the subspecialties, the ABP recruits a special practice-analysis panel of 10 to 12 pediatricians in active practice who identify the knowledge areas that are required to care for patients.

The ABP then sends the list of knowledge areas, via online survey, to all pediatricians certified in General Pediatrics or in the relevant subspecialty, asking them to rate each area on frequency (i.e., “How frequently is knowledge in this area required in practice?”) and criticality (i.e., “Would a lack of knowledge in this area result in harm?”). After the psychometrics team analyzes the survey results, the panel reviews them, makes final revisions to the list of knowledge areas, and determines how many questions on the exam should be devoted to each area. Knowledge areas rated as frequently required and highly critical receive more questions on the exam. The survey results thus shape what is known as the content outline, which is published on the ABP website.
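The allocation step above can be sketched in code. This is a minimal illustration only: the weighting formula (frequency times criticality), the rating scale, and the knowledge-area names are assumptions for the example, not the ABP's actual method.

```python
# Hypothetical sketch: distributing a fixed number of exam questions
# across knowledge areas in proportion to survey ratings.
# The frequency x criticality weight is an illustrative assumption.

def allocate_questions(areas, total_questions):
    """areas: {name: (frequency, criticality)} on a 1-5 scale.
    Returns {name: question count}, summing to total_questions."""
    weights = {name: freq * crit for name, (freq, crit) in areas.items()}
    total_weight = sum(weights.values())
    # Proportional allocation, rounded down, then hand the remainder
    # to the highest-weighted areas.
    counts = {name: int(total_questions * w / total_weight)
              for name, w in weights.items()}
    remainder = total_questions - sum(counts.values())
    for name in sorted(weights, key=weights.get, reverse=True)[:remainder]:
        counts[name] += 1
    return counts

areas = {
    "respiratory": (5, 5),   # frequent and critical: most questions
    "cardiology": (3, 5),
    "dermatology": (4, 2),
}
print(allocate_questions(areas, 100))
# → {'respiratory': 53, 'cardiology': 31, 'dermatology': 16}
```

Areas rated both frequent and critical receive proportionally more questions, matching the principle described above.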

Once the content outline has been approved, the practice-analysis panel’s work is complete, and the annual, yearlong work of the volunteer pediatrician panels who write and review exam questions begins. A separate group of pediatricians is involved in developing test questions for each certification program (General Pediatrics and 15 subspecialties).

QUESTION ANALYSIS AND WRITING

In addition to writing new questions, volunteers analyze questions already in the “bank” to ensure that they are still current, relevant, and accurate, and that they align with the most current content outline. These analyses give the ABP additional information about gaps in the question bank and where focused item-writing may be needed.

“The industry standard for building a good, valid, reliable exam is to have on hand three times the questions you need,” says Jared Riel, MA, ABP Director of Test Development. “So, if we need five questions in a specific area or domain for an exam, then that area of the bank is not considered healthy unless it has 15 questions. If it has fewer than 15, it shows up as a need when we do our gap analysis.”
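The “three times” rule lends itself to a simple gap check. The sketch below is illustrative; the data layout and area names are assumptions, and only the 3× multiplier comes from the quote above.

```python
# Illustrative gap analysis: an area of the bank is "healthy" only if
# it holds at least 3x the questions the exam needs in that area.

def find_gaps(bank_counts, exam_needs, multiplier=3):
    """Return {area: shortfall} for every area whose bank count falls
    below multiplier x the exam's requirement."""
    return {area: multiplier * need - bank_counts.get(area, 0)
            for area, need in exam_needs.items()
            if bank_counts.get(area, 0) < multiplier * need}

exam_needs = {"cardiac differential diagnosis": 5, "asthma management": 4}
bank_counts = {"cardiac differential diagnosis": 11, "asthma management": 12}

print(find_gaps(bank_counts, exam_needs))
# → {'cardiac differential diagnosis': 4}
```

Here the cardiac area needs 15 banked questions but has only 11, so it surfaces as a four-question writing need, while the asthma area is healthy at exactly 3 × 4.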

The committee or subboard members are each assigned about 10 to 20 questions to draft and given two months to complete this task. The actual number varies, depending on the gap analysis and whether new areas are identified in the practice analysis, particularly as medicine evolves.

“We assign them a specific content area,” Riel says. “For example, we’ll say, ‘We want you to write five questions on the differential diagnosis of cardiac conditions.’”

After receiving the draft questions, the ABP editorial staff edits them for consistent style before each person on the subboard reviews another member’s questions and provides content edits or feedback. In particular, they are looking at questions for relevance and accuracy. The same set of questions then goes out for another review to other members of the committee or subboard.

“While I had done my very best to write questions that were unambiguous and develop evidence-based correct answers to them along with incorrect answers that could not simply be ‘guessed at,’ it was helpful to have a respected colleague review my work and provide me feedback so that I could do better in future attempts,” says Archana Chatterjee, MD, PhD, Professor and Chair, Department of Pediatrics and Senior Associate Dean for Faculty Development at the University of South Dakota Sanford School of Medicine.

After all this feedback, the committee or subboard meets in person to review the new questions and either approves them for use on an exam or flags them for further revisions in the future.

A final review of the questions is conducted remotely by a medical editor to ensure that all revisions or edits requested at the meeting have been made appropriately.

After a new version of an exam is built, a process known as form review is conducted: the entire committee or subboard reviews the full collection of items and suggests which ones should be replaced. At this point, the ABP committee or subboard members have deemed the content of the exam ready for administration, but many steps remain for the test development staff, including quality assurance reviews on the exam administration vendor’s systems.

SCORING AND SCORE REPORTING

[Figure: The Life Cycle of an Exam, from Key Validation to Score Reporting]

After an exam has been administered, the scoring data is sent to the ABP psychometrics team, who then reviews and analyzes the way questions were answered. For each exam, the psychometrics team usually flags fewer than 10 questions to be re-examined.

“If, for example, an unusually large percentage of the people answered a question incorrectly,” Dwyer says, “we flag that question and send it back to the committee or subboard that reviewed and approved it.”
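One common way to implement this kind of screen is to compute each item's proportion-correct (its “p-value”) and flag items that fall below a threshold. The sketch below is a hedged illustration: the 30% cutoff, the item IDs, and the data layout are assumptions, not the ABP's actual criteria.

```python
# Minimal item-flagging sketch: flag questions that an unusually large
# share of test takers answered incorrectly, for human review.
# The min_p threshold is an illustrative assumption.

def flag_items(responses, min_p=0.30):
    """responses: {item_id: list of 0/1 correctness values}.
    Returns [(item_id, proportion_correct)] for items below min_p."""
    flagged = []
    for item_id, scores in responses.items():
        p = sum(scores) / len(scores)
        if p < min_p:
            flagged.append((item_id, round(p, 2)))
    return flagged

responses = {
    "Q17": [1, 1, 0, 1, 1, 0, 1, 1],   # 75% correct: not flagged
    "Q42": [0, 0, 1, 0, 0, 0, 1, 0],   # 25% correct: flagged
}
print(flag_items(responses))
# → [('Q42', 0.25)]
```

A flagged item is not automatically discarded; as described next, the committee decides during key validation whether the keyed answer is truly correct or whether the item reflects a genuine knowledge gap in the community.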

During this process, called key validation, the subboard or exam committee looks at each flagged question to make sure that the option identified as the correct answer really is correct and that none of the other options could also be considered correct.

Often, a question that many people get wrong points out a knowledge gap in the community. But sometimes, the subboard members agree that a question or answer is confusing or outdated. If the subboard or committee identifies questions that shouldn’t count toward someone’s score, Dwyer says, then the test taker isn’t penalized for having received these questions.

Ensuring the fairness of each test question and computing each person’s total exam score is only part of the scoring process, however. Because the exam is used to make pass or fail decisions, one of the most important aspects of scoring an exam is determining the score that is needed to pass the exam, a process referred to as standard setting.

During the standard-setting process, yet another independent panel of 10 to 12 practicing pediatricians participates in a two-day (or longer) workshop where they are asked to review the questions on the exam, discuss the knowledge level needed for certification, and make a recommendation regarding the passing score. Typically, a standard-setting panel is asked to establish the passing standard for the first exam that follows a new practice analysis. As a result, the passing standard is typically revisited every five to six years, which mirrors the practice-analysis schedule.

Because each exam version (test form) may vary slightly in its overall difficulty level, the psychometrics team uses a statistical process known as equating to ensure that all test takers are treated fairly. “We rely on our volunteer pediatricians to identify the score that they feel reflects the level of knowledge required for board certification, and we use statistical methods to ensure that all test takers are held to that same passing standard, regardless of which test form they receive,” says Dwyer.
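The article does not say which equating method the ABP uses; one standard textbook approach is linear (mean-sigma) equating, which rescales scores on a new form so their mean and spread match a reference form. The sketch below illustrates only that general idea, with made-up scores.

```python
# Illustrative linear (mean-sigma) equating: map scores from a new
# test form onto a reference form's scale so one passing standard
# applies to both. This is a generic textbook method, shown as an
# example; it is not presented as the ABP's actual procedure.

import statistics

def linear_equate(new_scores, ref_mean, ref_sd):
    """Rescale new-form scores so their mean and standard deviation
    match the reference form's scale."""
    mean = statistics.mean(new_scores)
    sd = statistics.pstdev(new_scores)
    return [ref_mean + ref_sd * (x - mean) / sd for x in new_scores]

# Scores from a slightly harder form (lower raw mean) are shifted up
# onto the reference scale (mean 75, SD 10).
print(linear_equate([60, 70, 80], ref_mean=75, ref_sd=10))
```

After equating, a candidate is not disadvantaged for having received a harder form: the same scaled passing score applies to everyone.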

While the scoring process is taking place, another group within the psychometrics team is conducting data forensics and web patrolling to ensure the security of the exam material. Statistical analysis of response data is done after every administration to identify any potential patterns of test fraud. Web patrolling allows the ABP to determine if any exam questions have been inappropriately leaked to the public. Any incident reports from the testing centers also are reviewed.

Finally, once the security process is complete, the psychometrics team can proceed with score reporting. They send a letter and an accompanying report to each test taker. In addition to letting pediatricians know whether they passed, the report also is designed to provide more detailed feedback that pediatricians can use to assess their strengths and weaknesses.


ABP VOLUNTEERS REFLECT ON THEIR ROLE IN EXAM PREPARATION

“The ultimate value of the exams is that they reflect the knowledge base necessary for practice. The only way to ensure that is to have practicing pediatricians in the field participate in the question writing. I am tremendously impressed with how seriously test development is taken by the ABP.”

Daniel Rauch, MD
Professor of Pediatrics and Chief of Pediatric Hospital Medicine
Tufts University School of Medicine
The Floating Hospital for Children at Tufts Medical Center


“As pediatricians who are directly involved in patient care and intimately understand the workflow of a practicing pediatrician, it is important for us to have a seat at the table and use our experiences to positively influence and continually improve the board certification process.”

Nicole Washington, MD
Clinical Assistant Professor of Pediatrics
Perelman School of Medicine
Pediatric Hospitalist
The Children’s Hospital of Philadelphia


“The crew of pediatricians who volunteer much of their time to writing and reviewing questions is at the heart of the process. To be in the room and listen to the input of the participants, it quickly becomes evident that creation of a certification examination without the pediatric teams’ input would be challenging.”

Lisa Samson-Fang, MD
Developmental Behavioral Pediatrics
General Pediatrics
Intermountain Medical Group
Salt Lake City, UT


“Pediatricians from all over the country, who work in diverse practice settings, are able to come together and bring these different experiences and viewpoints to the table as we help create the exam questions. Seeing how much work goes into the creation of a single question has been astonishing.”

Karen Leonard, MD
Associate Professor of Pediatrics
University of Vermont
Director of Inpatient Pediatrics
University of Vermont Children’s Hospital


“I remember my first time discussing the certification exam at one of our subboard meetings. Just seeing how much time and effort is put into making the test a high quality and fair exam was very impressive to me.”

Debra Boyer, MD
Associate Professor of Pediatrics
Harvard Medical School
Pediatric Pulmonologist
Boston Children’s Hospital


“Clearly, pediatricians not only should be, but must be involved [in developing exams and other certification activities]. It is work, a lot of work! It is also humbling to be working with other experts in my field.”

Douglas Willson, MD
The John Mickell Professor of Pediatric Critical Care
Virginia Commonwealth University School of Medicine


This story was first published in the ABP's 2018 Annual Report.