Frequently Asked Questions About MOC Exam Scoring

Questions About Scoring

This page includes information about how examinations are scored and how scores are reported. If you have a question about your examination results, please look here before contacting our office.

How is the examination scored?

Your examination score is based on the total number of scored questions you answered correctly. Diplomates are encouraged to answer each question, as no points are deducted for questions answered incorrectly. After administration, statistical analyses are conducted, and a small number of questions may be deleted if they do not meet the standards for statistical and psychometric validity. Deleted questions are not included in calculating diplomates' final scores. Because a relatively large number of physicians take the General Pediatrics MOC exam compared with the subspecialty MOC exams, the General Pediatrics MOC exam uses a slightly different scoring model that includes some pretest questions that are not scored.

How is the passing score for the examination set?

The passing standard for each MOC examination is established through a standard setting process in which a panel of practicing, certified pediatricians determines the level of knowledge a diplomate must demonstrate in order to pass the secure examination. There are different panels for general pediatrics and for each subspecialty, as every panel is composed of subject matter experts specializing in the content of the examination. The required passing standard for each MOC examination is set using well-established, evidence-based approaches.

What is a scaled score?

A diplomate’s raw score (the total number of questions answered correctly) is transformed into a scaled score for reporting purposes. This transformation is necessary because multiple forms are administered for each of our examinations. Diplomates are held to the same passing standard regardless of which form they take, so scaled scores — which take into account any variation among the forms — are reported instead of raw scores. With scaled scores, a direct comparison of performance across forms and administrations can be made.

The reporting scale ranges from 1 to 300, with 180 designated as the passing score. Scores on this scale do not represent raw scores and cannot be used to determine the percent correct (i.e., a 180 on the 300-point scale does not mean that diplomates need a score of 60 percent to pass).
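To make the idea of a raw-to-scaled transformation concrete, here is a minimal sketch of one common approach: a linear mapping from raw score to the reporting scale, clamped to the published 1-300 range. The ABP does not publish its actual transformation; the `slope` and `intercept` values below are invented purely for illustration.

```python
# Hypothetical illustration of a raw-to-scaled score transformation.
# The actual ABP scaling is not public; slope and intercept here are
# invented for demonstration only, and differ from form to form in
# practice (that is what equating determines).

def scale_score(raw_correct: int, slope: float, intercept: float) -> int:
    """Linearly transform a raw score onto the 1-300 reporting scale."""
    scaled = slope * raw_correct + intercept
    # Clamp to the published reporting range of 1 to 300.
    return int(round(min(max(scaled, 1.0), 300.0)))

# Suppose (hypothetically) equating for one form yields slope=1.2,
# intercept=-12. Then 160 questions correct maps to the passing score:
print(scale_score(160, 1.2, -12.0))  # 180
```

Because the slope and intercept differ by form, the same raw score can map to different scaled scores on different forms, which is why a scaled score of 180 does not correspond to any fixed percent correct.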

How does the ABP ensure that examination forms are equal?

All examination forms are constructed using the same test blueprint (content outline). Although every attempt is made to ensure that the difficulty level of the various forms is as equivalent as possible, slight variance may occur due to the inclusion of new questions that have never before been tested. In order to account for any variation among the forms, a statistical process known as equating is applied, which ensures that diplomates are not advantaged or disadvantaged by taking one form of the examination over another. Through equating, a given scaled score on the examination will reflect the same level of content mastery regardless of the form administered.
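One simple way to picture equating is a linear ("mean-sigma") adjustment, in which scores on a new form are mapped onto a base form's scale by matching the two score distributions' means and standard deviations. This is only an illustrative sketch; operational equating for high-stakes examinations typically relies on item response theory and anchor items, and the data below are invented.

```python
# Hypothetical sketch of linear (mean-sigma) equating between two forms.
# Invented data; real equating methods are considerably more involved.
import statistics


def mean_sigma_equate(score_on_new_form: float,
                      new_form_scores: list[float],
                      base_form_scores: list[float]) -> float:
    """Map a score on a new form onto the base form's scale by matching
    the mean and standard deviation of the two score distributions."""
    mu_new = statistics.mean(new_form_scores)
    sd_new = statistics.stdev(new_form_scores)
    mu_base = statistics.mean(base_form_scores)
    sd_base = statistics.stdev(base_form_scores)
    return mu_base + (score_on_new_form - mu_new) * (sd_base / sd_new)


# If the new form runs 5 points harder on average (invented samples),
# an average score of 145 on it is equated to 150 on the base form:
print(mean_sigma_equate(145, [135, 145, 155], [140, 150, 160]))  # 150.0
```

The point of the adjustment is exactly what the answer above states: a given scaled score reflects the same level of content mastery regardless of which form was administered.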

What percentage of questions do I need to answer correctly to pass?

The percentage of questions that a diplomate needs to answer correctly is based on the difficulty of the form of the examination taken. Although every effort is made to create forms of an examination that are equivalent in difficulty, some differences may exist after final scoring. Therefore, because the percentage of correct answers required to pass the examination may vary among forms, a specific passing percentage is not available to diplomates.

Can I find out the raw score for my examination?

No. Because multiple examination forms are used within and across administrations, the raw score on an examination form is meaningless until it is transformed to a scaled score. For example, answering 160 out of 200 questions correctly on an easy examination form would have a different meaning than answering 160 questions correctly on a harder form. The conversion to a scaled score accounts for these differences, making score comparisons across forms meaningful.

Can I find out my score on the various sections of the examination?

No. Sections were implemented simply to allow for scheduled breaks within an examination. Section scores are not computed as part of the final scoring process.

My score was just 1 point below passing. How close was I to passing?

Because multiple forms of the examination are used, “how close” a score of 179 is to passing may vary slightly across forms, but it is likely that between one and five additional questions should have been answered correctly to pass. The ABP will not provide the exact number of questions a diplomate answered correctly on a given examination form.

Are repeat test takers scored the same way as those taking the examination for the first time?

Yes. The examination is scored in exactly the same way for all diplomates, regardless of whether they are first-time or repeat test takers. All diplomates must attain a Total Scaled Score of 180 or higher to pass the examination.

Why is content area information reported for the questions I answered incorrectly?

Content area information is provided solely to give diplomates a sense of their areas of strength and weakness. For those who fail the examination, the information may provide guidance regarding areas in which remediation is needed.

What is the passing rate for the examination?

The overall passing rate for any of the ABP MOC examinations varies from year to year depending upon how diplomates perform compared with the absolute passing standard for their particular examination. An absolute passing standard allows the pass rate to range anywhere from 0 to 100 percent, as each diplomate’s performance is judged independently of the performance of other diplomates. Over the past five years, more than 95 percent of diplomates passed their MOC examination on their first attempt.

How can I be assured that my responses were captured and transmitted appropriately?

The ABP works extensively with Prometric to ensure the necessary steps are in place for accurate transmittal of your responses to the ABP. When the ABP’s psychometric staff perform initial scoring, their results are compared to the response information provided by Prometric to ensure that all data were transferred accurately. This is just one of many quality control procedures that the ABP uses throughout the scoring process.

Can I ask to have my score verified?

Yes. Although it is very unlikely that an error occurred during the transfer and processing of your examination results, you may request to have your examination rescored or verified. Score verification requests must be made in writing and should include your name, ABP ID, mailing address, and a $250 check or money order payable to the American Board of Pediatrics. All requests must be made within one month of when results are made available.

During score verification, the psychometric staff at the ABP will use a completely separate method of reviewing and recalculating your results and then compare the recalculated score to your initial score results. Please note that the ABP does not encourage score verification requests, as we have multiple quality control procedures throughout the scoring process to ensure the accurate reporting of examination results.

If your question remains unanswered, please address your communication to:

The American Board of Pediatrics
111 Silver Cedar Court
Chapel Hill, North Carolina 27514-1513

Email for questions about scoring: scoring@abpeds.org
Email for all other questions: moc@abpeds.org

Please include your ABP ID # in all correspondence.

Please note that the American Board of Pediatrics (ABP) requires that all issues encountered on your examination day be reported to the ABP in writing within three business days of your examination date. Therefore, this page does not address technical difficulties or other administration issues experienced at Prometric testing centers.