Meeting the Accreditation Council for Graduate Medical Education competencies using established residency training program assessment tools. Am J Surg 2004 Jul;188(1):9-12
Date: 06/29/2004
PubMed ID: 15219477
DOI: 10.1016/j.amjsurg.2003.11.036
Scopus ID: 2-s2.0-3042532656
43 Citations

Abstract
BACKGROUND: Most existing residency evaluation tools were not constructed to evaluate the six Accreditation Council for Graduate Medical Education (ACGME) competencies.
METHODS: Before the ACGME developed its six competency-based assessment requirements for resident performance, we created a residency evaluation tool covering 5 domains important to successful surgical resident performance. Reliability was determined after 6 months of use. Factor analysis assessed whether the evaluation tool was a construct-valid measure of the ACGME competencies.
RESULTS: Three hundred forty-three evaluations of 36 surgical residents were analyzed. The original evaluation tool was highly reliable, with an overall reliability of 0.97. Factor analysis defined 4 new combinations of questions analogous to 4 of the ACGME competencies: professionalism (reliability 0.95), patient care (reliability 0.93), medical knowledge (reliability 0.92), and communication (reliability 0.92). The new competency clusters were moderately correlated with each other.
CONCLUSIONS: Our locally developed tool demonstrated high reliability and construct validity for 4 of the 6 ACGME competencies. The correlation between factors suggests overlap among competencies.
Author List
Brasel KJ, Bragg D, Simpson DE, Weigelt JA

MeSH terms used to index this publication - Major topics in bold
Accreditation
Clinical Competence
Educational Measurement
Factor Analysis, Statistical
General Surgery
Humans
Internship and Residency
Reproducibility of Results
United States