The feasibility, reliability, and validity of a program director's (supervisor's) evaluation form for medical school graduates. Acad Med 2005 Oct;80(10):964-8
Date: 09/28/2005
PubMed ID: 16186618
DOI: 10.1097/00001888-200510000-00018
Scopus ID: 2-s2.0-25444464651
38 Citations

Abstract
PURPOSE: To determine the feasibility, reliability, and validity of the supervisor's evaluation form for first-year residents as an outcome measure for programmatic evaluation.
METHOD: Feedback has been prospectively sought from the supervisors of Uniformed Services University of the Health Sciences (USUHS) graduates during their internship year. Supervisors are sent yearly evaluation forms, with up to three additional mailings. Using a six-point scale, supervisors rate residents on 18 items. The authors used evaluation data from 1993 to 2002. Feasibility was estimated by response rate. Internal consistency was assessed by calculating Cronbach's alpha and analyzing scores on a year-to-year and interrater basis. Validity was determined by exploratory factor analysis with oblique rotations, by comparing ratings with end-of-medical school GPA and United States Medical Licensing Examination (USMLE) Step 1 and Step 2 scores (Pearson correlations), and by analyzing the range of scores, including the percentage of scores below the acceptable level.
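The internal-consistency statistic named above, Cronbach's alpha, can be computed directly from its standard formula: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). The sketch below is illustrative only; the rating matrix is invented (it is not the study's data) and uses 4 of the form's 18 items for brevity.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_residents x n_items) score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)
    """
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)          # per-item sample variance
    total_var = scores.sum(axis=1).var(ddof=1)      # variance of each resident's total
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical ratings: 5 residents scored on 4 items, six-point scale
ratings = np.array([
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 6, 5],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
])
print(round(cronbach_alpha(ratings), 2))  # -> 0.94
```

Values near 1 indicate that the items rise and fall together across residents, i.e., they measure a common construct; the study's reported alpha of .96 reflects this kind of agreement across all 18 items.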
RESULTS: A total of 1,247 evaluations were collected for the 1,559 USUHS graduates (80%). Cronbach's alpha was .96 with no significant difference in scores by supervisor specialty or year. Factor analysis found that the evaluation form collapsed into two domains accounting for 68% of the variance: professionalism and expertise. End-of-medical school GPA and USMLE Step 1 and 2 scores correlated with expertise but not with professionalism. Mean scores across items were 3.5-4.31 with a median of 4.0 for all items (SD .80-1.21). Four percent of graduates received less-than-satisfactory ratings.
CONCLUSIONS: This evaluation form has high feasibility and internal consistency. Factor analysis revealed two complementary domains, supporting its validity. Correlation with end-of-medical school measurements and analysis of the range of scores further support the form's validity.
Author List
Durning SJ, Pangaro LN, Lawrence LL, Waechter D, McManigle J, Jackson JL

Author
Jeffrey L. Jackson MD, Professor in the Medicine department at the Medical College of Wisconsin

MESH terms used to index this publication - Major topics in bold
Education, Medical, Undergraduate
Educational Measurement
Feasibility Studies
Humans
Maryland
Organization and Administration
Physician Executives
Professional Competence
Program Evaluation
Reproducibility of Results
Schools, Medical
Statistics as Topic
Students, Medical