“The [Objective Structured Clinical Examination] testing paradigm offers great hope of assuring the American public that every Board-certified anesthesiologist not only knows what to do but also possesses the specific skills and abilities that are critical to the practice of anesthesiology.”

Miller’s Pyramid of Assessment, the well-recognized and widely used model for the development and assessment of medical competence, defines four stages of capability: “knows,” “knows how,” “shows how,” and “does.”1  Each stage builds on the prior one, and each requires specific assessment tools. Nearly four decades of experience with the Objective Structured Clinical Examination (OSCE) have yielded substantial evidence of the usefulness of this approach in testing higher levels of competency.2–4  The American Board of Anesthesiology (ABA) recently announced its intent to add the OSCE to enhance the part 2 (oral) board examination for primary certification of physicians in anesthesiology beginning in 2017.* In this issue of Anesthesiology, Hastie et al.5  review the history of the development of the OSCE, its current application in medical education, and its limited use to date in assessing anesthesiologists, and they warn that careful assessment of validity and reliability is essential to assure that the examination is sound. In the paragraphs that follow, we briefly discuss why the ABA is incorporating the OSCE into the primary board certification process and how the Board is working to ensure that the new examination is methodically developed and thoroughly evaluated so that it adds a valid measure of competence to the current certification process.

In the field of anesthesiology, there is evidence that use of the OSCE to assess physicians captures information about examinees that is not captured by either written or oral examinations. Examinees who do well on written and oral tests do not necessarily do well in OSCE evaluations, and vice versa.6  Heretofore, assessment of physician performance in the clinical setting has been a largely subjective process. Standardized anesthesia OSCE performance assessments offer hope of greater objectivity and have demonstrated excellent inter-rater reliability.7,8  The Israeli Board of Anesthesiology has used OSCEs as part of its board certification process since April 2003.9–11  Its examination incorporates five 15-min, hands-on, simulation-based examination stations in OSCE format: trauma management, resuscitation, operating room crisis management, mechanical ventilation, and regional anesthesia. The Israeli Board of Anesthesiology has closely examined candidate satisfaction with, and the validity of, its OSCEs, demonstrating overall examinee satisfaction and good inter-rater reliability with the OSCE format.10  In the United Kingdom, the Royal College of Anaesthetists has used the OSCE as an integral part of its examination process for nearly a decade. The OSCE portion of the Royal College of Anaesthetists assessment comprises 16 stations covering resuscitation, technical skills, anatomy (general procedure), history taking, physical examination, communication skills, anesthetic hazards, and the interpretation of X-rays. As with the Israeli Board of Anesthesiology, the validity of the Royal College of Anaesthetists OSCE as an assessment tool has been established.

Since its incorporation in 1938, the ABA’s purpose has been to establish and conduct assessment processes by which the Board can determine whether a physician meets the ABA’s definition of a Board-certified anesthesiologist. The ABA defines a Board-certified anesthesiologist as a physician who possesses the knowledge, judgment, adaptability, clinical skills, technical facility, and personal characteristics sufficient to carry out the entire scope of anesthesiology practice independently. An ABA diplomate must be able to logically organize and effectively present rational diagnoses and appropriate treatment protocols to peers, patients, patients’ families, and others involved in the medical community; serve as an expert in matters related to anesthesiology, deliberate with others, and provide advice and defend opinions in all aspects of the specialty of anesthesiology; and function as the leader of the anesthesiology care team.*

Over the course of its 75-yr history, the Board has continued to evaluate and update its assessment processes to better achieve this purpose. Relatively recent changes have included staging of the part 1 examination and greater focus on the perioperative management of hypothetical patients in the part 2 examination. In recent years, the added value of the current part 2 (oral) examination in identifying diplomates in anesthesiology has been debated. These debates have ultimately centered on two questions: (1) Is the part 2 examination an effective and objective tool for assessing a candidate’s decision making, judgment, adaptability, management of patients presented in clinical scenarios, and ability to logically organize and effectively present information? and (2) Are there ways to improve the discriminative value and objectivity of the part 2 examination?

The ABA recognizes that its current part 1 (written) and part 2 (oral) examinations assess only the “knows” and “knows how” stages of medical competence; they cannot reliably assess the higher “shows how” and “does” levels that are actually required of clinicians in practice. By incorporating the OSCE into the ABA examination process, examiners will be able to directly assess candidates’ clinical and communication skills, as well as their professionalism, in a highly structured environment. Based on this reasoning, the ABA Directors decided to move forward with adding the OSCE to the primary certification process.

Once this decision was made, the Board assembled the ABA OSCE Development Advisory Panel, which comprises anesthesiologists, research scientists, and medical educators. The OSCE Advisory Panel is assisting with development and validation of the new examinations. The panel sought broad input from diplomates (practicing anesthesiologists), current part 2 examiners, simulation experts, and residency Program Directors; it also examined the American Society of Anesthesiologists’ Closed Claims database for the most common medical errors leading to malpractice litigation among anesthesiologists. The panel asked specifically, “What are the behaviors that cause anesthesiologists to struggle in clinical practice that are not well assessed in the existing written and oral examinations?” From this broad input, the ABA developed a preliminary blueprint for the new OSCE examinations that will be refined in the months ahead as we develop specific examination scenarios.

In April 2013, the ABA relocated its corporate offices to the 15th floor of the CapTrust Building in Raleigh, North Carolina. This new space will include a dedicated assessment center where, beginning in 2015, all future part 2 examinations will be conducted. The OSCE Development Advisory Panel is working closely with the ABA to design and build the part of the assessment center where the OSCE examinations will be conducted.

As Hastie et al.5  detail in their review, the ABA has much work ahead, including the hard work of scientifically testing the validity and reliability of the new OSCE examinations. This is clearly a basic requirement before the OSCE is incorporated into the current certification process and will begin as soon as the new assessment center is constructed. The OSCE testing paradigm offers great hope of assuring the American public that every Board-certified anesthesiologist not only knows what to do but also possesses the specific skills and abilities that are critical to the practice of anesthesiology.

* Booklet of Information. The American Board of Anesthesiology, Inc., Raleigh, North Carolina, 2013. Available at: http://www.theaba.org/Home/publications. Accessed October 12, 2013.

Primary FRCA OSCE/SOE. The Royal College of Anaesthetists. London, England, 2013. Available at: http://www.rcoa.ac.uk/examinations/primary-frca-osce-soe. Accessed October 12, 2013.

1. Miller GE: The assessment of clinical skills/competence/performance. Acad Med 1990; 65(9 suppl):S63–7
2. Hodges B: OSCE! Variations on a theme by Harden. Med Educ 2003; 37:1134–40
3. Hodges B: Validity and the OSCE. Med Teach 2003; 25:250–4
4. Casey PM, Goepfert AR, Espey EL, Hammoud MM, Kaczmarczyk JM, Katz NT, Neutens JJ, Nuthalapaty FS, Peskin E; Association of Professors of Gynecology and Obstetrics Undergraduate Medical Education Committee: To the point: Reviews in medical education—The Objective Structured Clinical Examination. Am J Obstet Gynecol 2009; 200:25–34
5. Hastie MJ, Spellman JL, Pagano PP, Hastie J, Egan BJ: Designing and implementing the objective structured clinical examination in anesthesiology. Anesthesiology 2014; 120:196–203
6. Savoldelli GL, Naik VN, Joo HS, Houston PL, Graham M, Yee B, Hamstra SJ: Evaluation of patient simulator performance as an adjunct to the oral examination for senior anesthesia residents. Anesthesiology 2006; 104:475–81
7. Devitt JH, Kurrek MM, Cohen MM, Fish K, Fish P, Murphy PM, Szalai JP: Testing the raters: Inter-rater reliability of standardized anaesthesia simulator performance. Can J Anaesth 1997; 44:924–8
8. Weller JM, Bloch M, Young S, Maze M, Oyesola S, Wyner J, Dob D, Haire K, Durbridge J, Walker T, Newble D: Evaluation of high fidelity patient simulator in assessment of performance of anaesthetists. Br J Anaesth 2003; 90:43–7
9. Berkenstadt H, Ziv A, Gafni N, Sidi A: Incorporating simulation-based objective structured clinical examination into the Israeli National Board Examination in Anesthesiology. Anesth Analg 2006; 102:853–8
10. Berkenstadt H, Ziv A, Gafni N, Sidi A: The validation process of incorporating simulation-based accreditation into the anesthesiology Israeli national board exams. Isr Med Assoc J 2006; 8:728–33
11. Ben-Menachem E, Ezri T, Ziv A, Sidi A, Brill S, Berkenstadt H: Objective Structured Clinical Examination-based assessment of regional anesthesia skills: The Israeli National Board Examination in Anesthesiology experience. Anesth Analg 2011; 112:242–5