An In-Training Assessment Examination in Family Medicine: Report of a Pilot Project

John E. Donnelly, MD, Bonnie Yankaskas, MPH, Craig Gjerde, PhD, John C. Wright, MD, and Douglas P. Longnecker, MD
Farmington, Connecticut and Dayton, Ohio

A family medicine in-training assessment examination was developed and piloted in 20 programs across the country. Core Content Review questions were used for the examination. Reporting of scores used both the traditional norm-referenced approach and a criterion-referenced approach. The latter permitted family medicine faculty to set passing standards for the examination.

The pilot project was well received and the examination will be offered to all family medicine residency programs this year.

Resident assessment examinations have been in use for over a decade. They were first designed to assist residents in preparation for board examinations and gradually have developed into examinations designed to aid individual residents and residency programs in assessment of strengths and weaknesses. The specialties of orthopedics, neurosurgery, ophthalmology, obstetrics and gynecology, and pediatrics have all developed resident assessment examinations.1-4 These are annual, norm-referenced examinations.

Family medicine, with its rapid development and growth, has not developed a national resident assessment examination. In 1975, the Core Content Review Board formed a subcommittee charged with the responsibility of developing an assessment examination that could be offered to family medicine programs nationally. The examination was to be designed in such a fashion that programs could assess how well they were meeting their instructional goals and offer feedback to individual residents on their areas of strength and weakness.

From the Departments of Family Medicine and Research in Health Education, University of Connecticut, Farmington, Connecticut, and the Department of Family Practice, Wright State University, Dayton, Ohio. Requests for reprints should be addressed to Dr. John E. Donnelly, Department of Family Medicine, University of Connecticut, Farmington, CT 06032.

This paper describes the nationwide pilot project, which accepted the following goals: to develop a criterion-referenced examination for family medicine residents; to test the acceptance of an in-training assessment examination by residency programs; and to test the acceptance of the criterion-referenced approach to setting passing standards.

Methodology

Study Group

Thirty family medicine residencies were selected nationwide and invited to participate in a pilot examination scheduled for July 1976. Twenty-eight residencies, with a total of 465 residents, agreed to participate. There were no quotas set as to numbers of residents or year of training. In some programs only a portion of the residents took the examination.

Two hundred eighty-five residents from 20 residencies completed the examination: 110 were beginning their first year, 93 were beginning their second year, and 82 were beginning their third year. The participants were from many geographically distinct areas and represented both university-based and community hospital-based programs.

Selection of Questions

Questions for the examination were taken from the bank of questions of the Core Content Review, a self-assessment multiple-choice examination developed by the Connecticut and Ohio Academies of Family Physicians for family practitioners. Core Content Review (CCR) questions are written by experts in various disciplines, and screened for teaching value and relevance by an executive committee of family physicians of CCR.

All CCR questions from the 1974-1975 and 1975-1976 examinations were subcategorized by content using the International Classification of Health Problems in Primary Care (ICHPPC) guidelines. From the 1,000 available questions, 400 were selected. This resulted in the desired balance of questions from the five major subject areas: medicine, pediatrics, surgery, psychiatry, and obstetrics and gynecology.

Questions were also categorized according to the cognitive level required to answer the question: recall, understanding, or problem solving. Thirty percent of the questions were recall, 35 percent understanding, and 35 percent problem-solving questions. The breakdown of the subcategories is shown in Table 1.
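As a quick arithmetic cross-check (a sketch, not from the paper), the per-subject cognitive-level counts in Table 1 can be summed to confirm the quoted 30/35/35 percent split over the 400 selected questions:

```python
# Counts copied from the cognitive-level rows of Table 1,
# one entry per subject column: MED, PED, PSYCH, SURG, OB/GYN.
recall        = [20, 24, 39, 15, 23]
understanding = [21, 43, 45, 19, 13]
problem       = [59, 33, 16, 16, 14]

total = sum(recall) + sum(understanding) + sum(problem)  # 400 questions
shares = [100 * sum(level) / total for level in (recall, understanding, problem)]
print(total, shares)  # → 400 [30.25, 35.25, 34.5]
```

The raw shares of 30.25, 35.25, and 34.5 percent round to the 30/35/35 split stated above.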

Setting a Minimum Pass Level (MPL)

For this examination, criterion-referenced passing standards were used.5-7 The criterion-referenced examination offers an alternative to the traditional norm-referenced approach. The difference is that norm-referenced evaluation compares an individual against average performance, while criterion-referenced evaluation measures the individual's performance against previously determined standards. The examination was designed to assess mastery of a core body of content, and there was no guarantee that normative scores based on a self-selected group of residents would be meaningful.
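The distinction can be made concrete with a small sketch. The roster and scores below are hypothetical; only the 66.2 percent MPL is taken from the pilot's total-examination standard:

```python
# Norm-referenced scoring ranks each resident against the group mean;
# criterion-referenced scoring compares each resident to a preset standard
# (the MPL), so in principle every resident can pass -- or every one can fail.

def score_both_ways(scores, mpl):
    mean = sum(scores) / len(scores)
    return [
        {
            "score": s,
            "norm": "above average" if s > mean else "at/below average",
            "criterion": "pass" if s >= mpl else "fail",
        }
        for s in scores
    ]

# five hypothetical residents, judged against the pilot's total-exam MPL
results = score_both_ways([58.0, 63.4, 66.5, 70.2, 61.1], mpl=66.2)
for r in results:
    print(r)
```

Note that the two labels need not agree: a resident slightly above the group mean can still fall below the MPL, which is exactly why the authors preferred a fixed standard over a self-selected norm group.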

Ten faculty members of family medicine residency programs were recruited to help set the MPL for the examination. Every answer to every question was evaluated and scored by at least two different raters. In establishing the criterion-referenced passing standards, raters assigned to each multiple-choice possibility a weight equal to the degree to which choosing that particular item would correctly reflect appropriate mastery of the material. The item MPL weights were averaged across raters and summed. These sums were then transformed to percent figures, with separate MPLs set for the total examination and each of the five subtests. The MPLs were 66.2 percent for the total examination, 66.0 percent for medicine, 65.8 percent for pediatrics, 65.3 percent for psychiatry, 68.9 percent for obstetrics and gynecology, and 66.4 percent for surgery. For the pilot project, the MPL criterion was based on the desired knowledge of a resident who had completed two years of training (a beginning third year resident).

Table 1. Number of Questions in Each Major Subject by ICHPPC Classification and Cognitive Level

ICHPPC Classification                          MED   PED   PSYCH   SURG   OB/GYN
Accidents, poison, violence                      3    10
Cardiology                                      15     5
Congenital and Perinatal                         -    10
Dermatology                                      5     5
Endocrinology                                   10    10
Gastroenterology                                10     5
Hematology                                       5     2
Infectious Disease                              15    25
Mental Disorder                                  -     5    100
Musculoskeletal System and Connective Tissue     5     5
Oncology                                         7     3
Renal Disease                                   10     5
Respiratory Disease                             10    10
Other                                            5     -
Total                                          100   100    100     50     50

Cognitive Level
Recall                                          20    24     39     15     23
Understanding                                   21    43     45     19     13
Problem Solving                                 59    33     16     16     14
Total                                          100   100    100     50     50

The Journal of Family Practice, Vol. 5, No. 6, 1977
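The weighting procedure described above (rate each item, average across raters, sum, convert to percent) can be sketched in a few lines. The raters and weights below are hypothetical, not taken from the study:

```python
# Hypothetical sketch of the MPL calculation described above: each rater
# assigns every question a weight in [0, 1] reflecting how strongly a
# correct answer indicates appropriate mastery; weights are averaged
# across raters, summed over questions, and expressed as a percent.

def minimum_pass_level(ratings):
    """ratings: one list of per-question weights per rater."""
    n_raters = len(ratings)
    n_questions = len(ratings[0])
    # average each question's weight across raters (the item MPL)
    item_mpls = [sum(r[q] for r in ratings) / n_raters for q in range(n_questions)]
    # sum the item MPLs and convert to a percent of the question count
    return round(100.0 * sum(item_mpls) / n_questions, 1)

# two hypothetical raters scoring a four-question test
rater_a = [0.7, 0.6, 0.8, 0.5]
rater_b = [0.6, 0.7, 0.7, 0.6]
print(minimum_pass_level([rater_a, rater_b]))  # → 65.0
```

Run per subtest on the items of that subtest, this yields the separate subtest MPLs the study reports alongside the total-examination MPL.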

Test Administration

The examination was available for administration between June 20 and July 15, 1976. Program directors received detailed instructions on administration. Six hours were allowed for testing and no references were permitted. Residents recorded their answers on computer-scorable sheets, and all testing materials were collected after the examination.

Following the computer scoring of results, each residency director received the results in package form. The package contained individual results of each resident and composite results of all the residents in that particular program by year of training. In addition, summary results of all residents who took the examination were included, also by year of training.

Surveys

The director of each participating program was sent a questionnaire before the examination to ascertain what types of resident assessments were in current use and to establish some data about the director’s expectations.

Six weeks after the results were mailed, a telephone interview was conducted with 15 of the 20 program directors. Its purpose was to get feedback on difficulties in administration of the examination, problems in evaluating its results, use of its information, and the desirability of an annual standardized examination like the pilot project.

Results

In the pretest questionnaire, 15 out of 17 directors who replied felt there was a need for an annual assessment examination. Six programs had used the standard Core Content Review for their residents as a self-assessment examination, two programs used patient management problems, and one used a pretest examination. The balance had not used any written test in their programs.

All program directors interviewed by telephone had received and reviewed the results of the examination. They had all used the examination for program assessment and for assessments of individual residents in a supportive and positive manner. No punitive action resulted from the pilot examination. Fourteen of the 15 program directors repeated their desire for a yearly examination of this type.

The MPL approach was acceptable to most programs, and it was frequently suggested that multiple pass levels be used (eg, one for each year of training) rather than one MPL for all residents.

Examination results are reported in Table 2. It should be repeated that the MPL was set for a beginning third year resident. The examination did serve as a pretest for residents beginning their first year of training. An analysis of test results comparing years one, two, and three on the examination showed that year two and year three had significantly higher scores than year one in all the subtests except psychiatry (analysis of variance, P<.001). On the total examination, 14 percent of the first year residents, 45 percent of the second year, and 57 percent of the third year residents attained the preset passing level (MPL).

Table 2. Examination Results (MPLs reported in percent)

                   Total    MED    PED   PSYCH  OB/GYN   SURG

All Residents, N=285
MPL                 66.2   66.0   65.8   65.3    68.9   66.4
Mean                63.4   60.0   62.3   67.8    63.1   63.8
SD*                  6.5    7.9    8.2    7.2     9.8    7.7
# Res above MPL      104     75     97    200      85    112
# Res below MPL      181    210    188     85     200    173

Third Year Residents, N=82
MPL                 66.2   66.0   65.8   65.3    68.9   66.4
Mean                65.6   63.2   65.0   68.9    64.6   65.9
SD                   6.0    7.4    6.7    6.6     9.7    6.6
# Res above MPL       47     33     39     61      30     41
# Res below MPL       35     49     43     21      52     41

Second Year Residents, N=93
MPL                 66.2   66.0   65.8   65.3    68.9   66.4
Mean                65.3   62.2   65.3   68.2    65.6   65.1
SD                   5.5    6.9    7.7    6.1     8.7    7.0
# Res above MPL       42     34     47     64      36     41
# Res below MPL       51     59     46     29      57     52

First Year Residents, N=110
MPL                 66.2   66.0   65.8   65.3    68.9   66.4
Mean                60.1   55.7   57.6   66.7    59.8   61.2
SD                   6.2    7.1    7.5    8.2     9.9    8.1
# Res above MPL       15      8     11     75      19     30
# Res below MPL       95    102     99     35      91     80

*SD = Standard Deviation
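The quoted pass rates follow directly from the Table 2 counts of residents above the total-examination MPL, given that 110, 93, and 82 residents began years one, two, and three, respectively:

```python
# Recomputing the pass rates stated above from the Table 2 counts
# (# residents above the MPL on the total examination, per year).
above = {"year 1": 15, "year 2": 42, "year 3": 47}
n     = {"year 1": 110, "year 2": 93, "year 3": 82}

rates = {year: round(100 * above[year] / n[year]) for year in above}
print(rates)  # → {'year 1': 14, 'year 2': 45, 'year 3': 57}
```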

Conclusion

As a pilot project, the results were encouraging. Interest in an in-training assessment examination was documented. The method employed (MPL), together with the scoring results, indicated that a criterion-referenced examination could be used as an in-training assessment vehicle. Finally, the residency directors used the scores to assess strengths and weaknesses of individual residents and total programs.

In the future, as universally acceptable teaching objectives are developed, questions can be formulated to test those objectives. The use of a criterion-referenced examination would permit educators in family medicine to establish minimally acceptable passing standards for all programs and simultaneously assess strengths and weaknesses of individual residents and programs.

Acknowledgement

The support of the Core Content Review of the Connecticut and Ohio Academies of Family Physicians and the participating residency directors is gratefully acknowledged.

References

1. Natress LW: Orthopaedic in-training examination. J Med Educ 44:878, 1969
2. Hubbard JP, Furlow LT, Matson DD: An in-training examination for residents as a guide to learning. N Engl J Med 276:448, 1967
3. Rubin ML: The ophthalmology resident in-training examination. Am J Ophthalmol 67:70, 1969
4. Meskauskas J, Newton M, Russell K: The 1973 in-training examination in obstetrics and gynecology. Obstet Gynecol 44:463, 1974
5. Taylor DD, Reid JC: Criterion-referenced evaluation of student performance. J Med Educ 47:970, 1972
6. Nedelsky L: Absolute grading standards for objective tests. Educ Psychol Meas 14:3, 1954
7. Bobula J: Minimum Pass Level Study. Report to Faculty, Center for Educational Development, College of Medicine, University of Illinois. Urbana, Illinois, University of Illinois, 1974, pp 25-31


