
J Educ Perioper Med. 1999 May-Aug; 1(2): E007. Published online 1999 May 1.

PMCID: PMC4803385

ORGANIZATION OF A COMPREHENSIVE ANESTHESIOLOGY ORAL PRACTICE EXAMINATION PROGRAM: Planning, Structure, Startup, Administration, Growth and Evaluation

Armin Schubert, MD, John Tetzlaff, MD, Michael Licina, MD, Edward Mascha, MS, and Michael P. Smith, MD, MSEd

Copyright © 1999 Journal of Education in Perioperative Medicine (JEPM). Published by the Society for Education in Anesthesia.

Abstract

Background

Oral practice examinations (OPEs) are used in many anesthesiology programs to familiarize residents with the format of the oral qualifying examination given by the American Board of Anesthesiology (ABA). The purpose of this communication is to describe the planning, structure, startup, administration, growth and evaluation of a comprehensive oral examination program at a sizeable residency program in the Midwest.

Methods and Results

A committee of three experienced faculty was formed to plan the effort. Planning involved consideration of format and frequency of administration, timing for best resident and faculty availability, communication, forms design, clerical support, record keeping and quality monitoring. To accommodate resident rotation and faculty work schedules, a semiannual administration schedule on 3-4 consecutive Mondays was chosen. The mock oral format was deliberately constructed to resemble that used by the ABA so as to enhance resident familiarity and comfort with ABA-style oral exams. Continued quality improvement tools put in place consisted of regular examiner and examinee inservice sessions, communication and feedback from ABA associate examiners to faculty examiners, as well as review of examinee exit questionnaires. A set of OPE databases was constructed to facilitate quality monitoring and educational research efforts.

Continued administration of the OPE program required ongoing construction of a pool of guided case-oriented questions, selection of appropriate questions based on examinee training exposure, advance publication of the exam calendar and scheduling of recurring examiner and examinee activities. Significant issues which required action by the governing committee were exam timing, avoidance of conflict with clinical demands, use of OPE results, and procurement of training resources.

Despite initial skepticism, the OPE program was begun successfully and grew substantially from 56 exams in the first year to 120 exams by year three. The OPE was perceived positively by the majority of residents: 90.2% of exit questionnaires acknowledged specific learning about oral exam technique, while only 0.3% indicated lack of meaningful information exchange at OPE sessions. Fewer than 10% of responses indicated misleading questions or badgering by examiners. Although anxiety remained constant over time, resident preparedness increased with repeat OPE exposure.

Summary

A comprehensive mock oral examination of substantial scope was successfully planned, initiated, and developed at our anesthesiology resident training program. It is well accepted by residents and faculty. Its inception was associated with an increase in resident preparedness. Now in its tenth year of existence, it continues to be an asset and essential component of our training program.

Keywords: Education, anesthesiologists, residents, oral examinations, board certification

INTRODUCTION

In an era of continued and enhanced importance of board certification in anesthesiology, anesthesiology programs have had many indirect and direct incentives to assure that their residents have the best possible chance of passing ABA (American Board of Anesthesiology) sponsored examinations. Anesthesiology mock oral examinations for residents have been conducted in various formats for many years. Programs in the 1970s and 1980s conducted mock orals largely on an individualized basis, frequently based on residents requesting them from specific faculty. Virtually no published material is available about practice oral examination efforts from this period. More recently, the ABA has encouraged the staging of mock oral examinations in all anesthesiology training programs.

Given the lack of available literature, the continued high level of interest among current and prospective residents, and the complexities of establishing and administering a large-scale OPE program, the authors set out to describe the planning process, structure, startup, administration, growth and evaluation of their comprehensive OPE program. This information may be helpful to others contemplating initiating, expanding or enriching similar programs.

Planning and Initiation

Prior to the organized OPE program, the department had deliberated about holding oral practice exams. The recruitment of several anesthesiologists from a program with an established oral practice effort galvanized these deliberations into action. Support from the department chair was secured after an overview of the goals, advantages and resources for an OPE program was presented. Shortly thereafter, a governing committee of three experienced faculty was formed to plan and administer the effort. Each had at least four years' experience with mock orals at another institution, and all were or soon became ABA question writers or associate examiners. The responsibilities of the committee included planning, policy setting, examiner selection, review of questions, ABA liaison, research coordination and results reporting.

Planning involved goal setting, consideration of format and frequency of administration, timing for best resident and faculty availability, communication, forms design, clerical support, record keeping and quality monitoring. The goals of the OPE planning committee were as follows: (1) providing a high-quality practice exam (defined as mirroring the ABA format, minimizing examinee confusion and maximizing value to examinees), (2) minimizing the impact on clinical responsibilities and (3) creating an opportunity for study of issues important to anesthesiology residency training.

In its planning effort, the committee gave several factors special consideration: the size of the training program (30-60 residents and fellows would need to be examined each session), the busy clinical schedule of an operating suite handling over 30,000 patients a year, the effect of off-campus rotations and the need to demonstrate value to the training program.

To accommodate resident rotations, faculty work schedules and the number of residents to be tested, a semiannual administration schedule on 3-4 consecutive Mondays was chosen. The mock oral format was deliberately constructed to resemble closely that used by the ABA so as to enhance resident familiarity and comfort level with ABA-style oral exams. Quality enhancement tools put in place consisted of regular examiner and examinee inservice sessions, communication and feedback from ABA associate examiners to faculty examiners, as well as review of examinee exit questionnaires. An electronic database was identified as a key resource to facilitate exam administration, quality monitoring and educational research efforts.

Format and Content

The OPE was modeled closely after the oral qualifying examination given by the ABA. It therefore makes use of the guided question format and includes a stem (case scenario) that is divided into sections for preoperative evaluation (Section A), intraoperative management (Section B) and postoperative care (Section C). In addition to the stem, each question also contains Section D ("additional topics") in the form of 2-3 clinical vignettes designed to explore expertise in areas different from the material covered in the stem. An example of such a guided question, as used in the OPE at this institution, appears in Appendix A.

Twenty-one stem questions (case scenarios) were used during the first five years of the OPE program (Table 1). Thereafter, new case scenarios were added at a rate of approximately three biannually. The authors of our OPE case scenarios were OPE faculty who based the content of the exam questions on their clinical, ABA or practice oral exam experience. An effort was made to achieve a diverse sampling of anesthetic problems within any one case scenario and through the use of additional topics (Section D). Questions thus generated were reviewed by the organizing committee and edited as necessary. The guided question format utilized by the ABA served to standardize trainee exposure to examination material. This format is designed to allow conclusions about such consultant attributes as application of knowledge, judgment, adaptability under changing clinical circumstances, ability to manage complications, and effective communication. All candidates are required to return their stem question papers in an effort to preserve security of the case scenario material.
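To make the structure concrete, the sketch below shows one way such a guided question could be represented in software. It is purely illustrative; the field names are ours, and the paper describes no data model.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class GuidedQuestion:
    """One OPE case scenario in the ABA guided-question format (illustrative only)."""
    qid: int                      # identifier used for assignment bookkeeping
    stem: str                     # short case narrative given to the candidate
    section_a: List[str]          # Section A: preoperative evaluation sub-questions
    section_b: List[str]          # Section B: intraoperative management sub-questions
    section_c: List[str]          # Section C: postoperative care sub-questions
    section_d: List[str]          # Section D: 2-3 "additional topics" vignettes
    subspecialty: Optional[str] = None  # hypothetical tag, e.g. "pediatrics", when the stem assumes rotation exposure
```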

Faculty

Initially, twelve faculty examiners were invited to participate in the OPE. Although experience level varied among examiners, all were board-certified anesthesiologist members of the professional staff of the Cleveland Clinic with a permanent faculty appointment and a demonstrated interest in resident education, and all had attended at least one OPE inservice session. Small-group inservice sessions were conducted at least yearly for all examiners. OPE examiners also attended briefings and participated in information exchange with ABA associate examiners and board members. Several examiners with substantial non-operating-room commitments (intensive care, pain therapy) were included to broaden the scope and to provide greater flexibility in scheduling. Several days before the examination, the faculty examiners received a packet of materials that included the examination schedule and the examination questions, as well as a synopsis of the OPE format, grading procedure, and desirable examining technique (see www.jepm.org Electronic Educational Library: "Mock Oral Examiner Briefing Packet"). Less experienced examiners were assigned to examine with more experienced faculty examiners. The training material, as well as each OPE case scenario question, was reviewed for applicability to the oral exam process by faculty members who were also ABA associate examiners.

Administration

Continued administration of the OPE program required ongoing review and construction of guided case-oriented questions, selection of appropriate questions consistent with examinee training exposure, advance publication of the exam calendar and scheduling of recurring examiner and examinee activities. A schedule of communications associated with a typical OPE session appears in Table 2. Selection and assignment of OPE questions to each candidate involved (1) avoidance of duplication for residents who had prior OPEs and (2) avoidance of questions whose stem dealt with an area covered during a subspecialty rotation to which the resident had not yet been exposed (e.g., pediatrics). Electronic database queries to accomplish the above were necessary prior to each OPE session. Significant issues which required action by the governing committee were exam timing, avoidance of conflict with clinical demands, use of OPE results, procurement of training resources, and communication of results to training program and development leadership.
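For illustration only, the two assignment rules above might be expressed as a filter like the following sketch. The names and record fields are hypothetical (the paper does not document its database schema), but the filtering logic is the one just described.

```python
def eligible_questions(questions, resident):
    """Return the stems a resident may draw: rule (1) excludes any stem the
    resident has already seen in a prior OPE; rule (2) excludes stems tied to
    a subspecialty rotation (e.g., pediatrics) not yet completed."""
    return [
        q for q in questions
        if q.qid not in resident.prior_question_ids              # rule (1): no duplicates
        and (q.subspecialty is None
             or q.subspecialty in resident.completed_rotations)  # rule (2): training exposure
    ]
```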

The OPE is administered in the spring and fall of each year. All Cleveland Clinic anesthesiology residents with at least 9 months of clinical anesthesia training are scheduled to participate in the semiannual OPE. Participation is mandatory. At least yearly, residents participate in a 2-hour inservice session during which faculty explain the format of, and the rationale for, the OPE. An agenda of a typical resident inservice session appears in Appendix B. OPE performance was deliberately not used in the evaluation of any resident, either by the program director or the clinical competence committee. However, it was resolved that the reliability and validity of the OPE as a resident assessment tool should be investigated.

Each examination session is preceded by a 7 to 10 minute preparation period during which the candidate reviews the short narrative of the stem question. The resident also completes a "candidate information sheet" (see Appendix C) prior to the exam. The candidate is then asked to enter a faculty office and is seated with two faculty examiners. Until the ABA changed its oral examination format in 1997, examiner 1 began with the preoperative evaluation (Section A) and questioned the candidate for 5 minutes. Examiner 2 continued for 15 minutes with the intra- and postoperative sections (Sections B & C), followed by examiner 1, who explored one to three additional topics (Section D) in depth for the next 10 minutes. This examiner returned to the stem question only if he or she felt that the conduct of the exam did not allow a conclusive grade (FG of 70 or 80) to be assigned. At the conclusion of the examination, the resident is briefly excused and the examiners independently complete the standardized grading sheet.

OPE Grading Technique


To guide examiner scoring, a standardized grading sheet is used (see Appendix D). Both examiners independently rate the examinee's performance on the preoperative (A), intraoperative (B), postoperative (C) and additional topics (D) sections of the OPE. Each section (A through D) contains 3 to 6 sub-questions that are graded separately. A "sub-score" is generated for each sub-question. Sub-scores are combined to yield the section score (see below). Permissible sub-scores are 1, 2, 3 and 4, with 1 denoting the best and 4 the lowest performance. In addition, examiners mark the presence of weakness in one of four areas (judgment, application of knowledge, clarity and adaptability) for each sub-question. A final grade (FG) is then assigned on a four-point scale (80, 77, 73, 70). A grade of 80 is defined as "definite pass", 70 as "definite fail". A definite pass was understood to mean that the candidate functioned at or near the level of a board-certified anesthesiologist. In grading, examiners are urged to identify one or two key areas in the exam. They are also reminded to accumulate sufficient information during the exam to be able to distinguish clearly between pass and fail. Only in situations where the examining procedure leaves room for uncertainty are the grades of 77 and 73 to be used. Questions demanding exclusively factual information were discouraged because they were considered ungradeable. The object of questioning during the exam is to elicit the candidate's thought processes (application of knowledge) and evidence of a consultant level of function (judgment, communication skills, adaptability).

During the first five years of the OPE program, a systematic quantitation of trainee performance was undertaken, which went beyond the P/F grading employed in the ABA exam. The OPE scoring procedure provided the inputs for the four major indices of candidate performance. These are (1) the FG from the OPE grading sheet; (2) the P/F grade (a candidate was considered to have passed when the average of the two examiners' FGs exceeded 75); (3) the section score (calculated for each section, i.e., A-D, as the sum of all reported sub-scores in that section divided by the number of sub-questions per section); and (4) the overall numerical score (ONS; defined as the sum of all sub-scores divided by the number of sub-questions). A summary of scores used during this period is found in the glossary. While OPE grades were not used for resident evaluation, this information was needed to enable faculty to conduct systematic educational investigation of a variety of OPE characteristics, such as inter-rater reliability, internal consistency, and validity.
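As a worked illustration of these indices (the function names are ours; the definitions follow the text above and the glossary):

```python
def section_score(sub_scores):
    # Section score: sum of sub-scores divided by the number of sub-questions
    # in that section (range 1-4, 1 = best).
    return sum(sub_scores) / len(sub_scores)

def overall_numerical_score(sections):
    # ONS: sum of all sub-scores across sections A-D divided by the total
    # number of sub-questions.
    all_subs = [s for sub_scores in sections.values() for s in sub_scores]
    return sum(all_subs) / len(all_subs)

def pass_fail(fg1, fg2):
    # P/F: pass when the average of the two examiners' FGs exceeds 75.
    return "pass" if (fg1 + fg2) / 2 > 75 else "fail"

# Example: FGs of 77 and 80 average 78.5, hence a pass.
print(pass_fail(77, 80))  # -> pass
```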

Debriefing Session and Exit Questionnaire

Within a few minutes of having completed the mock oral examination, residents were asked to return to the exam location for a 5-10 minute debriefing session with the faculty. During the debriefing period, examiners attempt to elicit feedback from the examinee about the subjective experience and provide feedback to the resident about his or her performance. Grading is de-emphasized. Rather, the focus is on improving the resident's approach to an oral exam question, pointing out good performance as well as pitfalls, and providing suggestions for further development. A specific effort was made to avoid teaching didactic material. Examiners point out speech and behavior patterns exhibited by the candidate and suggest modification as necessary. To ensure that specific behaviors and responses are discussed, examiners make notes on the scoring sheet and consult them during the debrief. Questions are answered and an opportunity for venting is created before the resident returns to his or her clinical assignment. Candidates also receive general information about the setting and characteristics of ABA-style oral examinations. Thereafter, examinees complete an anonymous exit questionnaire (see Appendix E), designed to assess the impact of the OPE on resident perceptions and, ultimately, to improve exam quality.


Clerical Support, Data Management and Quality Improvement

The information contained in the hand-marked examiner scoring sheet (Appendix D), the candidate information sheet (Appendix C) and the exit evaluation questionnaire (Appendix E) is transferred into Microsoft Excel for Windows (Microsoft Co., Redmond, Washington) and Superbase 4 for Windows (Superbase Inc., Bohemia, New York) utilizing a dedicated personal computer. Anonymity is created by replacing individuals' names with numerical examinee and examiner codes. The key to the code is safeguarded by the data management specialist. Examiners do not have access to the code.
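A minimal sketch of this pseudonymization step, assuming only what the text states (names are replaced by stable numerical codes, and the key is held separately); the class and method names are our own:

```python
import itertools

class CodeBook:
    """Maps names to stable numerical codes; the mapping itself is the 'key'
    held by the data management specialist, not by examiners."""

    def __init__(self):
        self._codes = {}
        self._counter = itertools.count(1)

    def code_for(self, name):
        # Assign a new sequential code on first sight, reuse it afterwards.
        if name not in self._codes:
            self._codes[name] = next(self._counter)
        return self._codes[name]

book = CodeBook()
print(book.code_for("A. Resident"))  # -> 1
print(book.code_for("A. Resident"))  # -> 1 (stable across records)
```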

After each semiannual OPE period, scoring sheets and exit questionnaires were summarized. The results, in particular the rates of disagreement on pass-fail and of non-definitive final grades (i.e., FGs that were neither 70 nor 80), were fed back to OPE faculty, along with a summary of those responses from the exit questionnaire which were considered to indicate quality issues. The rates at which residents reported badgering, misleading questions, being challenged, having their questions answered, having both strengths and weaknesses pointed out, etc., were considered indicators of exam technique and opportunities for improvement.
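For illustration, the two headline rates could be computed as in the sketch below (names are ours; "split" and "waffle" follow the definitions given with Table 4: examiners' FGs differing by more than 3 points, and individual FGs that are neither 70 nor 80).

```python
def split_rate(exam_fgs):
    # Percent of exams in which the two examiners disagreed on pass-fail,
    # i.e., their FGs differed by more than 3 points (e.g., 70 vs. 77).
    splits = sum(1 for fg1, fg2 in exam_fgs if abs(fg1 - fg2) > 3)
    return 100.0 * splits / len(exam_fgs)

def waffle_rate(exam_fgs):
    # Percent of individual final grades that were non-definitive,
    # i.e., neither a definite fail (70) nor a definite pass (80).
    grades = [fg for pair in exam_fgs for fg in pair]
    waffles = sum(1 for fg in grades if fg not in (70, 80))
    return 100.0 * waffles / len(grades)

# Example: one split exam (70 vs. 77) and one waffle grade (77) in two exams.
fgs = [(70, 77), (80, 80)]
print(split_rate(fgs))   # -> 50.0
print(waffle_rate(fgs))  # -> 25.0
```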

DISCUSSION

Oral examinations continue to be part of the certification process for many medical specialty boards, including the American Board of Anesthesiology. The format, strengths and weaknesses of oral qualifying examinations in anesthesiology have been well communicated. Throughout the years, many anesthesiology programs have recognized that practice exposure to oral examinations may help prepare candidates. Despite the long history of oral practice examinations, it was not until 1989 that the ABA officially encouraged the staging of "mock" oral examinations in all anesthesiology training programs [1]. In addition, courtesy observer status at the ABA new examiner workshop is extended to faculty from training programs without board examiners.

One justification for holding anesthesiology OPEs lies in the need for residents to be exposed to the style and content of an oral examination that represents, for many, the final hurdle to board certification as a consultant in anesthesiology. Anesthesiology residents take several written examinations during their training continuum, but consistent large-scale access to practice with ABA-type oral examinations is not available in many programs, leading to the proliferation of commercial oral examination training courses. Intuitively, a resident who is familiar with the format of such an examination should perform better because anxiety about format will be minimized, allowing a more effective demonstration of relevant skills. However, greater familiarity with exam format alone does not encompass the full scope of potential benefits that either the training program or the individual trainee may ultimately reap from an OPE program. Table 8 summarizes such additional benefits.

The methods used by the authors resulted in a successful mock oral examination effort at their institution. This statement is supported by the growth of the program in the early phase, its continued prominence as part of the residency program and its reasonable performance on the evaluation tool, the exit questionnaire. More than 90% of responses indicated that specific learning took place with regard to the taking of an oral examination where clinical judgment and communication skills are evaluated. Although, ideally, no resident should feel confused or badgered during an examination, the relatively low and constant rate (<10%) of these examination practices speaks in favor of the OPE program as administered. Another indication of the OPE program's success may be found in the observation that resident self-assessed preparedness increased after the first OPE session and confidence about taking oral exams increased with repeat OPE exposure.

Not all information was positive. The rate at which examiners addressed residents' strengths during the debriefing period decreased over time. The rate at which residents reported confusion with questioning was a steady 10-15%. These trends have been discussed at the faculty level and require continual attention.

Knowledge of the organization of the authors' OPE program may assist anesthesiology educators in expanding or adjusting their own programs. We believe that we are reporting the most detailed information about mock oral examinations available to date. We also recognize that the circumstances that conspired to produce a successful program at our institution may not apply elsewhere. A program with a more extensive geographical dispersion of clinical training sites may encounter substantially more difficult logistical problems than we did. Some programs may choose to use OPE results as a means of resident assessment. Such a change in milieu would likely alter resident perceptions and faculty-resident interactions surrounding the oral exam, resulting in different observations. A small program with several active ABA examiners on staff may be able to provide an experience much more intense and realistic than was the case for our OPEs. We offer our observations as one view into a fascinating aspect of resident education and hope that others will provide additional information to further refine and develop oral case-based testing for experience and, possibly, for assessment of clinical competence.

Oral examinations have been criticized for lack of reliability, validity, and objectivity. Specific areas of weakness are inter-examiner agreement, case specificity (and therefore poor generalizability), undue influence by extraneous factors, and poor correlation with "objective" measures of knowledge. On the other hand, oral examinations in anesthesiology have been refined considerably under the aegis of the ABA in the last two decades. In a previous report, the authors have shown that their OPEs have excellent internal consistency and an inter-rater reliability which was more than satisfactory for the purposes of an internal examination. OPE outcomes also correlated moderately well with global faculty evaluations and written in-training examination scores. This is consistent with other reports of successful oral examinations, including the ABA oral examination [3]. Recognized means of improving the reliability of oral examinations include standard grading forms, examiner training, grading from videotaped recordings, standardized scripted questions to reduce content sampling error, and the use of two simultaneous evaluators. The authors' OPE includes several of these features, perhaps accounting for the acceptable observed reliability and validity [2,11].

As part of their customer-focused definition of health care quality, policy makers may soon demand certification of competence based on performance in clinical situations. In 1991, the Liaison Committee on Medical Education formally incorporated performance-based assessment into the accreditation standards of medical schools. As of this year, the clinical competency examination for international medical graduates includes a clinical skills examination. Medical education, including that of the anesthesiologist, is already being shaped by an emphasis on demonstration of competence by simulated clinical performance in OSCEs and standardized patients, and is being linked to improvements in quality of care. This trend will generate a growing need for measures of trainee performance which directly relate to the quality of health care provided. In this context, standardized oral practice examinations might contribute to a "clinical performance criterion" necessary to graduate students of anesthesiology into independent practice.

The oral examination in anesthesiology is likely to retain its importance in the process of board certification because it assesses clinical judgment, problem-solving ability and communication skills, essential components of competence for anesthesiologists. Documentation of clinical competence throughout the residency training period is acknowledged to be a difficult task, with a general lack of agreement on what constitutes the best definition of competence. Measures other than written multiple-choice examinations are needed to make up for variations in program and rater standards, as well as for the limited observation of residents by faculty. The retention of a face-to-face examination by many specialty boards, including the ABA, provides evidence of an uneasiness among specialty boards and medical teaching faculty about entrusting certification of professional competence entirely to a written examination. Objective structured clinical examinations (OSCEs) are increasingly used for medical student assessment and have been introduced by several medical specialty boards. In some ways, the OPE resembles an OSCE. An OSCE presents a carefully scripted clinical situation to the candidate, whose responses are graded systematically. Validity of our OPE was similar to that of many OSCEs [7]. The OPE is a labor- and time-intensive process, requiring resources similar to those needed for OSCEs [26].

A comprehensive mock oral examination effort of substantial scope was successfully planned, rolled out, and grown at our anesthesiology resident training program. The feedback from residents and fellows who participate in this program is overwhelmingly positive. Now in its tenth year of existence, the OPE continues to be an acknowledged asset and essential component of our training program.

ACKNOWLEDGEMENT

The authors thank the following individuals without whom the OPE program would not have been possible: the residents, fellows and staff anesthesiologists of The Cleveland Clinic Foundation; Mrs. Shelly Sords, education coordinator of the Division of Anesthesiology, for her cheerful support and interest; Ms. Charlie Androjna for her skillful management of the large databank; Dr. Frances Rhoton for her encouragement and support during the project's early days; and Dr. Arthur Barnes, Vice Chairman, and Dr. Fawzy G. Estafanous, Chairman of the Division of Anesthesiology, for creating an environment in which the OPE program could flourish. We especially thank Mrs. Nikki Williams for her expert secretarial support.

Appendix A. Sample OPE Question

A 42-year-old hypertensive female (135 kg, 155 cm) is scheduled for elective cholecystectomy. She has a 40-pack-year smoking history and complains of epigastric pain. There is a history of barbiturate allergy. Arterial blood gas on room air reveals pH = 7.38, PO2 = 62 mmHg, PCO2 = 43 mmHg. BP = 160/100 mmHg, HR = 96, RR = 20, Hb = 16 g/dL.

A. PREOPERATIVE EVALUATION

1. Obesity: Is an obese person an increased anesthetic risk? Why? Criteria for severity? Further evaluation? Why? (For associated disorders: diabetes, cardiac reserve, pulmonary hypertension, hepatic problems.) What would you do with these test results?

2. Hypertension: Would you require normalization of blood pressure perioperatively? Why? Why not? Risks of hypertension? Perioperative MI risk? Neurologic risk?

3. Pulmonary Assessment: How do you explain the blood gas values? Hemoglobin? Are further pulmonary function tests needed? Under what circumstances? What if the PCO2 were 52?

B. INTRAOPERATIVE COURSE

1. Premedication: Do you think a sedative is necessary? Why? What would make you give a sedative premedication? Which and why? Is aspiration prophylaxis needed? Best regimen? Why?

2. Selection of Monitors: CVP catheter indicated? Why/why not? What would prompt you to recommend CVP? Arterial line? Why?

3. Choice of Anesthesia: Your surgical colleague asks if a combined regional/general anesthetic is better than general anesthesia. Effect of epidural anesthesia/analgesia on post-op ABG/FRC/VC? Epidural opiates vs. local anesthetics?

4. Induction of Anesthesia: How would you induce general anesthesia? (Consider airway, aspiration risk, CV status.) Justify each agent! When would you consider awake intubation/laryngoscopy? Assume the history of allergy to barbiturates is real. What are the implications for anesthetic management? Porphyria? Types? How would you manage? Why?

5. Maintenance of Anesthesia (Pharmacokinetics of Obesity): Is a narcotic or inhalational technique preferable in obesity? Why? Difference in recovery times? Dosing (TBW vs. BSA vs. LBW)? What pharmacokinetic properties would be desirable in the obese patient?

C. POSTOPERATIVE CARE

1. Should the patient be monitored in a special care unit postoperatively? How do you decide?

2. Postoperatively, the patient is in pain and has an O2 saturation of 87% on 50% O2 by mask. What are your recommendations? Why?

3. Suddenly the patient becomes extremely tachypneic and tachycardic, with concomitant hypotension. What would you do? Why? Assume pulmonary embolism: How would you diagnose and manage? Why?

D. ADDITIONAL TOPICS

1. A pregnant nurse anesthetist asks if she should be allowed to work. Risk of miscarriage/fetal malformation?

2. After induction with halothane and succinylcholine, a 10-year-old boy develops masseter spasm. Implications? Do you cancel the case? Why? What do you tell the parents/family? Why?

3. A 55-year-old war veteran presents with chronic burning upper extremity pain. How do you evaluate? Assume reflex sympathetic dystrophy: Outline treatment plan. How do you decide on treatment modality?


Appendix B. Agenda for Typical Resident OPE Inservice Session

1. Introduction: Background & benefit of mock orals
2. Scheduling of OPEs
3. Expectations for attendance
4. Description of activities on day of OPE session
5. How (and how not) to prepare for mock oral examinations
6. How to approach a stem question; use of time immediately before the exam
7. Conduct of the OPE; timing & content
8. What do examiners look for (communication, clinical judgment, adaptability, defense of decisions made, etc.)
9. Importance of exit questionnaire
10. Discussion of debriefing session
11. OPE scoring by examiners; implications for residents

Appendix C. Candidate Information Questionnaire

   Candidate Information Questionnaire

Name:__________________________ Started CA-1 Year: ________(m/y)

ON A SCALE OF 1-10, RATE HOW YOU FEEL PRESENTLY (10 = highest):

_____ I am anxious about the oral board exam.

_____ I feel prepared for the oral board exam.

_____ I feel I have adequate knowledge to take the oral board exam.

_____ I feel I have the knowledge, but may not be able to get it across to the examiners.

_____ I am so nervous I can’t think straight.

CHECK ALL APPLICABLE ITEMS:

_____ I have taken practice oral boards before.

_____ I have studied specifically for the practice oral boards by:

_____ Reading standard anesthesia texts.

_____ ASA Refresher courses and journal review articles.

_____ Editorials and journal articles.

_____ Attending a course geared to prepare for orals.

_____ Doing nothing specifically.

I look up pertinent information and read about anesthesia management in connection with the cases I do:

_____ > 75% of the time


_____ 50-75% of the time

_____ 25-50% of the time

_____ < 25% of the time

I feel I know:

_____ A little bit about everything

_____ A few topics in depth.

_____ A lot of topics in depth.

Appendix D. OPE Candidate Scoring Sheet

Candidate’s Name _____________________________________ Date ___________

CATEGORIES OF EVALUATION
A. Ability to organize and express thoughts clearly
B. Sound judgment in decision making and application
C. Ability to apply basic science principles to clinical problems
D. Adaptability to changing clinical conditions

SUBSCORE CODE
1 = Definite Pass
2 = Probable Pass
3 = Probable Fail
4 = Definite Fail

*Please circle topic number to designate major areas of test.

Areas of weakness (check all applicable)

Summary Statement:


Examiner _______________________________ Grade ___________

  (Print Name)

Appendix E. Exit Questionnaire

Fig 1. Early growth of the OPE program: Number of exams administered 1989-1993.

Fig 2. Resident Self-Assessed Anxiety and Preparedness*

Fig 3. Resident Ratings of the OPE Debrief

GLOSSARY

Acronyms

CA-1,2,3 Clinical Anesthesia, year one, two, three

ITE In-Training Examination (written exam sponsored by joint ABA/ASA Council)

OPE Oral Practice Examination

ONS Overall Numerical Score = average of all sub-scores

ABA American Board of Anesthesiology

OSCE Objective structured clinical examination

IRR Inter-rater reliability

Definition of Terms

Internal consistency: The degree to which performance in one area of the OPE relates to performance in a different area.

Inter-rater reliability: The agreement between scores assigned by two concurrent examiners.

Validity: Defined here as criterion or concurrent validity, referring to the degree to which OPE scores are related to other concurrent measures of anesthesiology resident performance.

Section: One of four parts of the OPE guided question: A = preoperative, B = intraoperative, C = postoperative, D = additional topics.

Sub-question: Specific question/question complex within a section. There are 3-6 sub-questions per section.

Sub-score: Examiner-assigned score based on examinee OPE performance on sub-questions within an OPE section (possible scores = 1, 2, 3, 4; 1 = best).

Section score: The average of all sub-scores in one section (range 1-4; 1 = best; continuous variable).

Final grade (FG): One of 4 possible scores (70 = definite fail, 73 = probable fail, 77 = probable pass, 80 = definite pass) assigned by examiners. Except for IRR calculations, the average of the two examiners' FGs is used.

Pass-fail (P/F): Pass is defined as an FG of >75, fail as ≤75.

OPE outcome: Any measure of examinee performance on the OPE.

REFERENCES

1. ABA News. 1989;2(1):4.

2. Carter HD. How reliable are good oral examinations? Calif J Educ Research. 1962;13:147–53.

3. Kelley PR, Matthews KH, Schumacher CF. Analysis of the oral examination of the American Board of Anesthesiology. J Med Education. 1971;46:982–8.

4. Pope WD. Anaesthesia oral examination (editorial). Can J Anaesth. 1993;40:907–10. [PubMed: 8222027]

5. Eagle CJ, Martineau R, Hamilton K. The oral examination in anaesthetic resident evaluation. Can J Anaesth. 1993;40:947–53. [PubMed: 8222035]

6. Colliver JA, Verhulst SJ, Williams RG, Norcini JJ. Reliability of performance on standardized patient cases: A comparison of consistency measures based on generalizability theory. Teaching and Learning in Medicine. 1989;1:31–7.

7. Schubert A, Hull A, Tetzlaff J, Maurer W, Barnes A. Reliability and validity of anesthesiology "mock orals" during a three-year period. Anesthesiology. 1992;77:A1118.

8. McGuire CH. Studies of the oral examination: experiences with orthopedic surgery. In: Lloyd JS, Langley DG, editors. Evaluating the skills of medical specialists. Chicago: American Board of Medical Specialists; 1983. pp. 105–109.

9. Yang JC, Laube DW. Improvement of reliability of an oral examination by a structured evaluation instrument. J Med Education. 1983;58:864–72.

10. Anastakis DJ, Cohen R, Reznick RK. The structured oral examination as a method for assessing surgical residents. Am J Surg. 1991;162:67–70. [PubMed: 2063973]

11. Muzzin LJ, Hart L. Oral Examinations (Ch 5). In: Neufeld VR, Norman GR, editors. Assessing Clinical Competence. New York: Springer Publishing Co; 1984.

12. Evans LR, Ingersoll RW, Smith EJ. The reliability, validity and taxonomic structure of the oral examination. J Med Educ. 1966;41:651–7. [PubMed: 5918831]

13. Elliott RL, Juthani NV, Rubin EH, Greenfeld D, Skelton WD, Yudkowsky R. Quality in residency training: toward a broader, multidimensional definition. Academic Medicine. 1996;71(3):243–7. [PubMed: 8607919]

14. Liaison Committee on Medical Education. Functions and structure of a medical school. Chicago, IL: AMA Council on Medical Education; 1991. Vol. 14, pp. 174–9.

15. Polk SL. Educational initiatives. Problems in Anesthesia. 1991;5:305–18.

16. Stillman PL, Regan MB, Philbin M, Haley HL. Results of a survey on the use of standardized patients to teach and evaluate clinical skills. Academic Medicine. 1990;65:288–92. [PubMed: 2337429]

17. Polk SL. Educational initiatives. Problems in Anesthesia. 1991;5:305–18.

18. Green JS, Robin HS, Schibanoff J, Goldstein P, Lorente JL, Hoagland P, et al. Continuous medical education: using clinical algorithms to improve the quality of health care in hospitals. J Cont Educ Health Prof. 1992;12:143–55.

19. Triplett HB, Wilson-Pessano S, Vintilla-Friedaman S, Levine R, Marshall G. Critical performance requirements for anesthesiology residency. Anesthesiology. 1988;69:A797.

20. Orkin FK, Greenhow DE. A study of decision making: How faculty define competence. Anesthesiology. 1978;48:267–71. [PubMed: 637335]

21. Stillman P, Swanson D, Regan MB, Philbin NM, et al. Assessment of clinical skills of residents utilizing standardized patients. A follow-up study and recommendations for application. Ann Intern Med. 1991;114(5):393–401. [PubMed: 1992883]

22. Van der Vleuten CPM, Norman GR, DeGraaff E. Pitfalls in the pursuit of objectivity: issues of reliability. Medical Education. 1991;25:110–8. [PubMed: 2023552]

23. Giles TJ. Task force completes four-year study of in-training evaluation in internal medicine. Ann R Coll Physicians Surg Can. 1983;16:265–6.

24. Newble DI, Baxter A, Elmslie RG. A comparison of multiple choice and free response tests in examinations of clinical competence. Medical Education. 1979;13:263–8. [PubMed: 470647]

25. Elstein AS. Beyond multiple choice questions and essays: The need for a new way to assess clinical competence. Academic Medicine. 1993;68:244–9. [PubMed: 8466598]

26. Quattlebaum TG, Darden PM, Sperry JB. In-training examinations as predictors of resident clinical performance. Pediatrics. 1989;84:165–72. [PubMed: 2740167]

27. Maatsch JL, Huang R. An evaluation of construct validity of four alternative theories of clinical competence. Proceedings of the Annual Conference on Research in Medical Education; 1986. pp. 69–74.

28. Kowlowitz V, Hoole AJ, Sloane PD. Implementing the objective structured clinical examination in a traditional medical school. Academic Medicine. 1991;66:345–7. [PubMed: 2069655]

Figures and Tables

Table 1

Frequency of OPE Stem Question Use*

Stem Question #   Frequency   Percent of Total OPEs
 1                18           4.0
 2                21           4.7
 3                24           5.4
 4                35           7.8
 5                18           4.0
 6                28           6.3
 7                15           3.4
 8                 6           1.3
 9                29           6.5
10                35           7.8
11                38           8.5
12                31           6.9
13                18           4.0
14                15           3.4
15                24           5.4
16                27           6.0
17                15           3.4
18                11           2.5
19                 5           1.1
20                22           4.9
21                12           2.7

* Data from the first five years of the program.

Table 2

Communications Schedule for a Typical OPE Session

Subject                                 Timing             Distribution
OPE session dates                       6-9 months prior   Department heads, anesthesia control desks, SICU director, pain center director, all residents & fellows
Arrange exam room availability          one month prior    Faculty affected
  & refreshments
Examiner material:                      1-2 weeks prior    OPE examiners; anesthesia control desks
  - OPE assignment schedule*
  - OPE questions**
  - Instructions#
  - Special communication(s)
Arrange feedback location               1-2 weeks after    Room scheduling
Examiner feedback:                      1-2 weeks after    Examiners
  - Thank you note
  - Invitation to feedback session
  - Summary of current statistics

* Sample schedule available in the JEPM electronic educational library.
** Available in the electronic educational library.
# Example available in the electronic educational library.

Table 3

Frequency of Examinations Given by Examiners*

Examiner   # of Exams   Percent of Total Exams   Pass Rate (%)
 1          65            7.5                     60.0
 2          18            2.1                     38.9
 3          57            6.6                     59.6
 4          78            9.0                     44.9
 5          44            5.1                     43.2
 6          53            6.1                     39.6
 7          51            5.9                     72.5
 8          19            2.2                     68.4
 9          80            9.2                     62.5
10         147           17.0                     39.5
11         170           19.6                     67.1
12          27            3.1                     44.4
13          23            2.7                     73.9
14          16            1.8                     93.8
15           6            0.7                     33.3
16          11            1.3                     36.4

* Data from the first five years of the program.

Table 4

Pass, Disagreement and Waffle Rates (First Five Years)

            1989           1990           1991           1992           1993
            Spring  Fall   Spring  Fall   Spring  Fall   Spring  Fall   Spring
n           24      33     39      35     41      46     53      59     61
Pass %      42      42     43      31     37      43     43      42     31
Split %*    8       9      13      43     39      37     32      29     36
Waffle %#   31      32     17      26     23      13     15      19     20

* Refers to percent of exams during which examiners disagreed on pass-fail status (gave FGs which differed by > 3 points).
# Refers to percent of examiners who gave FGs which were neither 70 nor 80.

Table 5

Preparedness and Anxiety over Time

              Preparedness                     Anxiety
              N     % > 5 (95% CI)   Mean      N     % > 5 (95% CI)   Mean
Spring 1989   0                                0
Fall 1989     19    16 (3,40)        3.4       20    70 (46,88)       6.3
Spring 1990   36    33 (19,51)       4.5       36    61 (43,77)       5.6
Fall 1990     32    41 (24,59)       4.9       32    53 (35,71)       5.2
Spring 1991   35    43 (26,61)       4.8       34    50 (32,68)       5.1
Fall 1991     38    61 (43,76)       5.6       43    51 (35,67)       5.8
Spring 1992   41    49 (33,65)       5.2       42    52 (36,68)       5.5
Fall 1992     45    51 (36,66)       5.0       47    60 (44,79)       5.8
Spring 1993   47    40 (26,56)       4.9       47    45 (30,60)       5.1
Fall 1993     49    37 (23,52)       4.5       52    52 (38,66)       5.7
Overall       342   43               4.8       353   54               5.5

Table 6

Anxiety and Confidence in First Time vs. Repeat Examinees (Range: 1-10)

            Anxiety (n=84)   Confidence (n=78)
First-time  5.5 ± 1.7        3.7 ± 2.1
Repeat      5.6 ± 1.6        5.1 ± 1.5*

* p < 0.001 vs. first-time.

Table 7

Fractional Responses (%) on OPE Exit Questionnaire Results (First Five Years, n=387)*

                        All       1989     1990     1991     1992      1993
                        (N=377)   (N=56)   (N=73)   (N=86)   (N=106)   (N=56)
Misled                  4.0%      5.4%     5.5%     2.3%     1.9%      7.1%
Badgered                4.5%      5.4%     2.7%     4.7%     6.6%      1.8%
Material too complex    4.5%      5.4%     5.5%     4.7%     3.8%      3.6%
Intimidated             22.3%     19.6%    23.3%    20.9%    24.5%     21.4%
Confident               10.6%     5.4%     8.2%     14.0%    9.4%      16.1%
In Control              16.2%     14.3%    16.4%    18.6%    16.0%     14.3%
Challenged              80.1%     82.1%    80.8%    81.4%    78.3%     78.6%
Difficulty Expressing   45.1%     37.5%    41.1%    47.7%    50.9%     42.9%
Missed Things           54.1%     60.7%    56.2%    41.9%    57.5%     57.1%

* See also Appendix E.

Table 8

Possible Benefits from an Organized Oral Practice Examination Program

∘ Resident familiarity with the ABA format
∘ Reduced anxiety with regard to the format and process (anxiety about lack of preparation may remain)
∘ Feedback on performance during an oral examination
∘ Ability of the resident and/or training director to initiate remedial actions
∘ Feedback to the training director on curriculum and program adequacy
∘ Asset in residency recruiting
∘ Assessment tool in resident evaluation
