How accurate are faculty evaluations of clinical competence?

J Gen Intern Med. 1989 May-Jun;4(3):202-8. doi: 10.1007/BF02599524.

Abstract

Objective: To determine the degree and sources of variability in faculty evaluations of residents for the American Board of Internal Medicine (ABIM) Clinical Evaluation Exercise (CEX).

Design: A videotaped simulated CEX containing programmed resident strengths and weaknesses was shown to faculty evaluators; responses were elicited using the open-ended form recommended by the ABIM, followed by detailed questionnaires.

Setting: University hospital.

Participants: Thirty-two full-time faculty internists.

Intervention: After the open-ended form was completed and collected, faculty members rated the resident's performance on a five-point scale and rated the importance of various aspects of the history and physical examination for the patient shown.

Measurements and main results: Very few of the resident's strengths and weaknesses were mentioned on the open-ended form, although responses to specific questions revealed that faculty members had in fact observed many errors, and some strengths, that they had failed to document. Faculty members also varied widely in their global assessment of the resident: 50% rated him marginal, 25% failed him, and 25% rated him satisfactory. Only for performance areas not directly related to the patient's problems could substantial variability be explained by disagreement on standards.

Conclusions: Faculty internists vary markedly in their observations of a resident and document little of what they observe. To be useful for resident feedback and evaluation, exercises such as the CEX may need more specific and detailed forms for documenting strengths and weaknesses, and faculty evaluators probably need to be trained as observers.

Publication types

  • Research Support, U.S. Gov't, Non-P.H.S.

MeSH terms

  • Clinical Competence*
  • Faculty, Medical*
  • Humans
  • Internal Medicine / education*
  • Internship and Residency*
  • Surveys and Questionnaires