Blueprint Review by Physicians Taking the Certification Assessments

While developing ‘blueprints’ for its exams, ABIM surveyed all of its diplomates about the content they believed was most relevant to their practices, asking about the importance and frequency of clinical topics in their discipline. This enhanced trust because it gave all diplomates input into defining what matters in their fields and demonstrated ABIM’s respect for their knowledge, concern for their experience, and commitment to a fair assessment process. It also meant that ABIM listened and responded to its diplomates’ most frequent and trust-undermining criticism: that its exams lacked relevance to their daily practices. How does this build trustworthiness? ABIM demonstrated that it cared about its diplomates’ views and respected their competence.

How It Works

The blueprints for the American Board of Internal Medicine (ABIM) exams were initially developed by expert consensus of the discipline-specific Exam Committees, which drew on the tables of contents of popular medical textbooks, established practice guidelines, milestones data from the Accreditation Council for Graduate Medical Education (ACGME), and published research that has influenced (or has strong potential to influence) practice. The Exam Committees determined both the major content categories and their relative percentages of the exam by reviewing these resources and reaching consensus on target percentages for the assessment.


Beginning in 2016, ABIM designed a new process to solicit and, importantly, incorporate practitioner feedback into the exam blueprints and the overall specifications for the design of the assessments. To accomplish this, ABIM invited certified physicians to rate the relative frequency and importance of blueprint topics in practice, and to identify other areas not covered in the blueprint. ABIM believes that the content on its assessments should be informed by front-line clinicians sharing their perspective on what is important to know. This review process, which resulted in a new, co-created exam blueprint, will be repeated periodically (approximately every five years) to inform and update ABIM assessments.


A representative sample of physicians, similar to the total diplomate population in age, time spent in direct patient care, and practice setting, provided the blueprint topic ratings. The discipline-specific Exam Committees and Specialty Boards then used this feedback to update the blueprint.


To inform how exam content should be distributed across the major blueprint content categories, ABIM considered the average respondent ratings of topic frequency and importance in each of the content categories. A second important source of information, independent of self-report, was the relative frequency of patient conditions in the content categories, as seen by certified internists and documented in national health care data sets (i.e., the Medicare database and the National Ambulatory Medical Care Survey (NAMCS)).


To prioritize specific exam content within each major medical content category, ABIM used the combined information to set new thresholds for these parameters when assembling the questions that would appear on the assessment.
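As a rough illustration of how survey ratings and external frequency data could be blended to prioritize content categories, consider the sketch below. The category names, numeric values, weights, and threshold are all invented for illustration; ABIM's actual scoring and threshold-setting method is not described at this level of detail in the text.

```python
# Hypothetical sketch: combining mean diplomate ratings with external
# frequency data to prioritize blueprint content categories.
# All values, weights, and the threshold are illustrative assumptions.

# Mean diplomate ratings (1-5 scale) of importance and frequency per category
survey_ratings = {
    "Cardiovascular Disease": {"importance": 4.6, "frequency": 4.4},
    "Endocrinology":          {"importance": 4.1, "frequency": 3.9},
    "Rare Presentations":     {"importance": 3.2, "frequency": 1.8},
}

# Relative frequency of conditions in external data sets (e.g., claims data),
# as a proportion of encounters -- hypothetical values
external_frequency = {
    "Cardiovascular Disease": 0.22,
    "Endocrinology": 0.12,
    "Rare Presentations": 0.02,
}

def priority_score(category, w_importance=0.5, w_frequency=0.3, w_external=0.2):
    """Weighted blend of self-report ratings (rescaled to 0-1) and
    normalized external frequency; the weights are illustrative."""
    r = survey_ratings[category]
    importance = (r["importance"] - 1) / 4   # rescale 1-5 -> 0-1
    frequency = (r["frequency"] - 1) / 4
    external = external_frequency[category] / max(external_frequency.values())
    return w_importance * importance + w_frequency * frequency + w_external * external

THRESHOLD = 0.5  # categories scoring below this receive less exam weight
for category in survey_ratings:
    score = priority_score(category)
    status = "emphasize" if score >= THRESHOLD else "de-emphasize"
    print(f"{category}: {score:.2f} -> {status}")
```

Under these made-up numbers, rarely seen topics score below the threshold even when rated moderately important, which mirrors the document's point that relevance is judged on both importance and frequency rather than importance alone.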

Skills and Competencies

The blueprint review process relies on ABIM diplomates’ medical knowledge, day-to-day practice experience, and expert judgment concerning the importance and frequency of various medical conditions that might be seen by a physician practicing in the relevant medical specialty.


Job analysis, or practice analysis, is the gold standard in the professions testing industry for validating exam blueprints. Practice analyses can be conducted in many ways, ranging from direct observation of professionals (in this case, physicians) in day-to-day practice to “chart pull” data collection and analysis. The ABIM blueprint review is a form of practice analysis used to validate the expert opinions of the Exam Committees after they initially developed the blueprints for assessments in their disciplines, as described above. The blueprint review correlated the expert judgments of the Exam Committees with self-report data from a representative sample of practicing physicians in each discipline about the relative importance and frequency of different clinical presentations. The external data sets (i.e., Medicare, NAMCS) were used to further corroborate the self-report frequency data so that the analysis did not rely entirely on self-report. This effort was undertaken to ensure that the assessments truly reflect what ABIM physicians consider to be important and/or frequently seen in practice, and it was conducted in the spirit of co-creation to keep the assessments relevant to practicing physicians.


Our primary quantitative measure of perceived relevance is a post-exam survey question that asks respondents to rate the statement “The examination was a fair assessment of clinical knowledge in this discipline” on a Likert-type scale ranging from 1 (Strongly Disagree) to 5 (Strongly Agree). After observing a steady increase in negative responses to this question (i.e., more respondents selecting “Disagree” or “Strongly Disagree”) in the years immediately preceding the blueprint review, we saw this trend reverse for most disciplines after the results of the blueprint review were used to create new, more relevant exam forms.
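The trend described above can be illustrated with a short sketch that computes the negative-response rate per exam year. The years and response counts below are invented for illustration, not ABIM data.

```python
# Hypothetical sketch: tracking the share of negative post-exam survey
# responses ("Disagree"/"Strongly Disagree" = ratings 1-2 on a 5-point
# Likert-type scale) across exam years. All counts are invented.

from collections import Counter

# Simulated response tallies per year: {rating: count}, rating 1-5
responses_by_year = {
    2014: Counter({1: 40, 2: 110, 3: 200, 4: 400, 5: 250}),
    2015: Counter({1: 60, 2: 140, 3: 190, 4: 380, 5: 230}),
    2018: Counter({1: 30, 2: 80, 3: 180, 4: 430, 5: 280}),  # post-review year
}

def negative_rate(tally):
    """Proportion of respondents selecting 1 (Strongly Disagree) or 2 (Disagree)."""
    total = sum(tally.values())
    return (tally[1] + tally[2]) / total

for year in sorted(responses_by_year):
    print(f"{year}: {negative_rate(responses_by_year[year]):.1%} negative")
```

In these invented numbers, the negative-response rate rises before the blueprint review and falls afterward, mirroring the reversal the document reports.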

We have also noted qualitative support for the effectiveness of this practice: a dramatic decrease in the number of negative comments about rare presentations appearing on the assessment, and a related increase in positive comments about the fairness of the content and its representativeness of day-to-day practice.


This practice is readily scalable. Every assessment program should use some form of practice analysis to ensure congruence between a proposed assessment blueprint and the realities of day-to-day practice. The methodology of the ABIM blueprint review is robust and could be replicated in a variety of settings. Recognizing this, we published two papers describing the methodology and findings from our blueprint review and validation process:

  • Gray B, Vandergrift J, Lipner RS, Green MM.  Comparison of content on the American Board of Internal Medicine Maintenance of Certification examination with conditions seen in practice by general internists. JAMA. 2017;317:2317–2324.
  • Poniatowski PA, Dugosh JW, Baranowski RA, Arnold GK, Lipner RS, Dec G, Jr., et al. Incorporating physician input into a Maintenance of Certification Examination: A content validity tool. Acad Med. 2019;94(9):1369–1375.