Abstract
Problem addressed: Family medicine residency programs require innovative means to assess residents’ competence in “soft” skills (eg, patient-centred care, communication, and professionalism) and to identify residents who are having difficulty early enough in their residency to provide remedial training.
Objective of program: To develop a method to assess residents’ competence in various skills and to identify residents who are having difficulty.
Program description: The Competency-Based Achievement System (CBAS) was designed to measure competence using 3 main principles: formative feedback, guided self-assessment, and regular face-to-face meetings. The CBAS is resident driven and provides a framework for meaningful interactions between residents and advisors. Residents use the CBAS to organize and review their feedback, to guide their own assessment of their progress, and to discern their future learning needs. Advisors use the CBAS to monitor, guide, and verify residents’ knowledge of and competence in important skills.
Conclusion: By focusing on specific skills and behaviour, the CBAS enables residents and advisors to make formative assessments and to communicate their findings. Feedback indicates that the CBAS is a user-friendly and helpful system to assess competence.
Competence in family medicine includes knowing what to do, when and how to do it, and whether to do it at all. A valid and reliable means of assessing the competence of family physicians preparing to enter practice is essential to ensure patient safety. There are 2 important areas of concern when teaching residents and assessing their competence in family medicine: 1) assessment of residents’ development of “soft skills,” such as patient-centred care and communication, and 2) identification of those residents who are having difficulty early enough in their residency to provide remedial training.
Accurate assessment of competence requires continuing, serial, and direct observation of workplace behaviour and monitoring of progress based on feedback. This highlights the inadequacy of multiple-choice examinations or one-time demonstrations of skill (eg, objective structured clinical examinations [OSCEs]) to assess competence. These methods of assessment do not reveal the realities of being competent in practice.
Our work at the Department of Family Medicine at the University of Alberta in Edmonton builds upon the global movement toward competency-based medical education1 by providing not only a means of assessing competence, but also a system to support the development of competence. Our work draws from existing pilot programs in other areas of health care2–4 that suggest competency-based curricula and assessments lead to greater success in producing skilled and effective physicians than traditional rotation- and time-based programs do.
The College of Family Physicians of Canada (CFPC) has also adopted a competency-based program. The skills family medicine residents need to learn and demonstrate have been determined from the results of a series of surveys conducted by the CFPC in which practising family physicians indicated the competency and skill domains they thought were most important for family physicians entering independent practice.5 The CFPC established separate working groups to develop guidelines for both curriculum development and competency-based assessment to ensure that residents were prepared for family practice. The challenge of implementing a competency-based framework in medical education is associated with the difficulties inherent in assessing competence. How do you assess competence? How do you know when someone has attained competence or is progressing toward it?
The most effective ways of measuring competence have yet to be clearly identified.6 Several medical education researchers suggest such methods as multiple-choice questionnaires, OSCEs, in-training evaluation reports (ITERs), and logs7,8; others build on that list by suggesting observer ratings.9 The concern with all of these proposed methods is that they are limited to generating a checklist of competencies. Checklists provide only a summary evaluation; they do not give learners useful information about what they are doing well or how they could improve before they are evaluated. Feedback should describe residents’ performance and be specific both to their progress toward clinical competence and to their development of a habitual approach to improving and applying their skills and knowledge.10
To provide this kind of information, we developed an innovative approach to assessing not only the achievement of the many dimensions of competence in family medicine but also the progress toward that achievement. Our work was guided by the principles of high-quality feedback and the CFPC’s mandate to move to competency-based residency education.5 For residents, we wanted a system that would guide their learning through formative feedback; for advisors and preceptors, we wanted a resident-driven system in which residents would recognize when feedback was being given, be able to act upon that feedback, and work toward competence by soliciting feedback in the areas where they needed it.
Drawing from work at the University of Maastricht in the Netherlands,2 the Cleveland Clinic Lerner College of Medicine in the United States,3 and other competency-based medical programs,4 our team determined the stages and requirements for a valid, reliable, and cost-effective system of evaluating competence using documented formative feedback. We called the system the Competency-Based Achievement System (CBAS). The key feature of the CBAS is that assessment goes both ways. Rather than relying on rotation-based summative evaluations in which preceptors pass judgment on the knowledge and abilities of the residents they supervise, the CBAS has both advisors and residents review cumulative evidence of residents’ demonstrated skills and competence in a variety of clinical settings. After reviewing this evidence, advisors and residents come to a mutual understanding of residents’ strengths and weaknesses. The intent of the CBAS is to facilitate student-centred learning by giving residents a system for guided self-assessment.
Program description
The CBAS was designed to measure competence using 3 main principles: best practices in formative feedback, guided self-assessment, and regular face-to-face meetings between residents and advisors.
The CBAS uses FieldNotes as the primary tool for collecting evidence of progress toward competence. FieldNotes are forms in a prescription-sized pad on which immediate feedback about directly observed events can be noted. FieldNotes are intended to be a qualitative account of feedback11,12 and a check-in on progress. FieldNotes can serve as reminders of verbal feedback and act as memory prompts in place of detailed discussions of residents’ performance. Progress levels indicated on FieldNotes give residents assessments of their learning in real-time workplace settings. The contents of FieldNotes must be discussed by observers and residents in a timely fashion, ideally at the time FieldNotes are created or within a few days.
FieldNotes can be about diagnoses or management decisions, presentations, team interactions, charts or letters, patient interactions, procedures, and any other aspect of practising medicine. Observers could be preceptors, advisors, nurses, patients, peers, or other people on the scene. Residents are expected to get 1 FieldNote per clinical day and 1 FieldNote per clinical call-back day in a first-year family medicine rotation; more FieldNotes than this are a bonus for residents and are encouraged. Our program requires that FieldNotes be completed in eCBAS, an online electronic workbook (Box 1). Administrative support is available to residents to ensure that paper-based FieldNotes are entered into eCBAS. Residents use the FieldNotes in eCBAS to assess their progress toward competence as they proceed through the residency program and to identify gaps in their knowledge (Figure 1).
Box 1. Information about the eCBAS (electronic workbook)
The eCBAS was
- developed in response to user demand for electronic FieldNote entry, and
- built as a custom configuration of Microsoft SharePoint.
The eCBAS acts as a repository and sorting tool for FieldNotes. Its contents are not individually evaluated unless a FieldNote requires program attention.
CBAS—Competency-Based Achievement System.
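For readers who want a concrete picture of what a “repository and sorting tool for FieldNotes” might look like, the sketch below is a minimal, hypothetical illustration in Python. The actual eCBAS is a custom SharePoint configuration; the record fields and the group_by_habit function shown here are assumptions made for illustration, not its real data model.

```python
from collections import defaultdict
from dataclasses import dataclass
from datetime import date


@dataclass
class FieldNote:
    """One directly observed event, recorded at or close to the time of observation."""
    resident: str
    observer: str          # eg, preceptor, advisor, nurse, patient, or peer
    observed_on: date
    sentinel_habit: str    # eg, "Seeks guidance and feedback"
    clinical_domain: str   # eg, "Care of adults"
    progress_level: str    # qualitative progress indicator noted by the observer
    comment: str           # narrative feedback on the observed event


def group_by_habit(notes):
    """Group a resident's FieldNotes by sentinel habit, the kind of sorting
    residents rely on when reviewing their feedback before a progress meeting."""
    grouped = defaultdict(list)
    for note in notes:
        grouped[note.sentinel_habit].append(note)
    return grouped
```

A resident could, for instance, run group_by_habit over a term’s notes and scan each habit’s narrative comments before meeting with an advisor.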
Box 2. Sentinel habits: Common skills and habits that make a good physician
Residents collect feedback through FieldNotes and have discussions with their advisors about progress in the following sentinel habits:
- Incorporates patient context in determining care and treatment
- Generates relevant hypotheses
- Uses best practice to manage patient care
- Selects appropriate focus in clinical encounters
- Applies key features for all procedures
- Demonstrates respect and responsibility
- Uses clear and timely verbal and written communication
- Helps others learn
- Promotes effective practice quality
- Seeks guidance and feedback
Residents and advisors have scheduled, structured, face-to-face meetings every 4 months to review residents’ progress (Figure 2). In preparation for these meetings, residents review all of their FieldNotes to assess their progress in the sentinel habits (Box 2) and clinical domains (Box 3). Residents record their self-assessments on their 4-month progress report forms, and residents and advisors review the reports together. In-training evaluation reports for all rotations are organized by the same sentinel habits and clinical domains as the 4-month progress reports, providing further support for residents’ self-assessments.
After residents have completed their portion of the 4-month progress report, they meet with their advisors to discuss the report. If the advisor agrees that the resident is accurate in his or her self-assessment, the 4-month progress report is left as completed by the resident. If the advisor believes that there are inaccuracies in the self-assessment or opportunities for additional insight, the resident and advisor mutually negotiate a more accurate progress report.
The final stage of the meeting is used to establish a learning action plan for the next 4 months. Together, advisors and residents identify possible opportunities for learning during upcoming scheduled clinical experiences or rotations. This learning action plan is recorded on the 4-month progress report, along with any comments on a resident’s progress. After the first meeting, each subsequent meeting begins with a review of the previous meeting’s learning action plan.
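To illustrate the kind of sorting that supports this 4-month review, here is another hypothetical sketch building on the FieldNote records above: it tallies notes per sentinel habit (the list is taken from Box 2) and flags sparsely documented habits as candidates for the next learning action plan. The four_month_summary function and its min_notes threshold are invented for illustration; they are not part of the CBAS or the eCBAS.

```python
# Reuses the hypothetical FieldNote records from the earlier sketch; any object
# with a `sentinel_habit` attribute will work.

SENTINEL_HABITS = [
    "Incorporates patient context in determining care and treatment",
    "Generates relevant hypotheses",
    "Uses best practice to manage patient care",
    "Selects appropriate focus in clinical encounters",
    "Applies key features for all procedures",
    "Demonstrates respect and responsibility",
    "Uses clear and timely verbal and written communication",
    "Helps others learn",
    "Promotes effective practice quality",
    "Seeks guidance and feedback",
]


def four_month_summary(notes, habits=SENTINEL_HABITS, min_notes=5):
    """Tally FieldNotes per sentinel habit over the review period and flag
    sparsely documented habits as possible gaps to raise at the meeting.
    The min_notes threshold is an arbitrary illustrative choice, not a CBAS rule."""
    counts = {habit: 0 for habit in habits}
    for note in notes:
        if note.sentinel_habit in counts:
            counts[note.sentinel_habit] += 1
    gaps = [habit for habit, n in counts.items() if n < min_notes]
    return counts, gaps
```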
Box 3. Clinical domains of family medicine
Residents collect feedback through FieldNotes and have discussions with their advisors about progress in the following clinical domains:
- Maternity and newborn care
- Care of children and adolescents
- Care of adults
- Care of elderly people
- Palliative and end-of-life care
- Behavioural medicine and mental health
- Surgical and procedural skills
- Care of vulnerable and underserviced people
Sentinel habits
In response to requests to make the categories on the FieldNotes more relevant to daily clinical practice, we derived the key skills of a good physician from the CFPC’s skill dimensions and the CanMEDS–Family Medicine roles. These key skills need to be learned and repeated until they become habitual behaviour. We have labelled these skills “sentinel habits,” and they are used in the CBAS to guide the evaluation of progress toward competence.
Evaluation of the CBAS
The CBAS was developed using participatory action research (PAR), and the PAR process was followed throughout pilot implementation to continuously monitor the system. In PAR, the members of a community of interest are directly involved in reflecting on a problem, in planning ways of addressing it, and in the action and observation that follow planning (Figure 3). Within this process, participants research the action, change it, and research it again.13 Table 1 summarizes each PAR cycle during development and implementation of the CBAS.
Throughout the PAR cycles, our participants included 34 residents, 3 program directors, 8 advisors, and the 4 members of the CBAS team, which was made up of education experts in our Department of Family Medicine. All participation was voluntary; all participants gave signed consent; and the program was approved by the University of Alberta Human Research Ethics Board. Changes to the system were made based on user feedback. These changes were then evaluated to ensure they responded to users’ comments. Considerable changes were made as a result of pilot feedback (Table 2).
We held focus groups to discuss the system. Comments from these groups revealed that, in the pilot implementation and in the first stages of full implementation, the CBAS was proving useful. Advisors and program directors had positive responses about the CBAS, and program directors indicated that they found that the CBAS gave more complete information than previous evaluation methods had. Before the CBAS, in-training evaluation reports and progress reports were in the form of checklists, which gave little information to program directors on residents’ progress.
Residents’ responses were more cautiously positive. Some residents at one of the pilot sites asked to have the CBAS discontinued. Residents at 3 other pilot sites saw potential in the CBAS and appreciated the quick response to requests for changes. End-user focus group data revealed the need for faculty development in giving good formative feedback and the need for continuous education for residents in using formative feedback to develop competence.
Focus groups with program directors revealed that the CBAS had proven to be an effective way of identifying residents having difficulty. The emphasis on regular use of FieldNotes and more frequent face-to-face meetings allowed residents who were having difficulty to be recognized and helped. The “stacking” (self-determined topic stacks for targeted learning) feature of the CBAS was used successfully to help residents focus on areas in which they needed further experience and more knowledge. Residents in need of remediation also benefited from the learning action plans embedded in the 4-month progress reports.
Discussion
The CBAS is unique among existing competency-based assessment systems owing to its focus on authentic workplace-based assessments. Unlike other systems,4–6 the CBAS derives all of its formative feedback from direct observation of clinical practice and behaviour during encounters. The CBAS does not rely on summative examinations, checklists, or OSCEs. All summative evaluations are derived directly from the actual behaviour and knowledge exhibited by residents in their daily practice.
The transition to the CBAS as the primary assessment method in our program has met with success from the perspective of program directors. The new 4-month progress reports give more detailed information about residents’ progress. Residents are seeing the connection between the formative feedback they receive during their clinical days and the summative assessments of progress they get about every 4 months. This connection between learning and progress is much clearer than with previous assessments.
Most important, the focus groups indicated that residents now recognized good formative feedback. This was both the greatest success and the greatest challenge of the CBAS. Whereas residents in our program previously complained about not receiving feedback, they now complain about not receiving good-enough feedback. Our residents are no longer satisfied with hearing “Good job!” They want to know why it is a good job. The feedback process is now clear to our residents, and they want the best constructive feedback they can get.
Limitations
The main limitation of the research portion of this study was the difficulty of recruiting participants. While many residents participated in the focus groups, not all residents shared their opinions within the PAR framework. Also, data collection during evaluation of the pilot implementation focused primarily on program directors and residents; advisors did not participate fully. We will address this limitation in our future research.
Future directions
As of July 2010, the CBAS was fully implemented across our entire residency program. We continue to evaluate and solicit feedback from users within the PAR framework and to make changes to the CBAS based on end-user feedback (from residents, advisors, program directors, program and site administrators, and off-service preceptors). This process will continue for the next several years as we validate the CBAS and expand it to other programs and specialties. We also intend to follow our CBAS-trained residents as they begin to practise in order to determine whether they are better able to assess themselves as practising physicians.
Conclusion
The CBAS is designed to sample certain skills and behaviour, which allows for formative assessment. The feedback gathered facilitates meaningful discussion around residents’ progress and allows residents to become more aware of how to direct their learning and practise guided self-assessment.
The CBAS approach to assessment incorporates PAR into each stage of development, implementation, monitoring, and readjustment. Our research indicates that involving all CBAS users in this dynamic process has made the system easy to use and useful for assessing competence.
Finally, there is a feeling of empowerment among our users as they realize that they can decide how competency-based assessment will happen for them. We think this will further improve the perception of a positive learning environment in our residency program.
Notes
EDITOR’S KEY POINTS
- The Department of Family Medicine at the University of Alberta in Edmonton created the Competency-Based Achievement System (CBAS) to assess residents’ competence during their preparation for practice.
- Through formative feedback, guided self-assessment, and regular face-to-face meetings, the CBAS provides a framework for meaningful communication between residents and advisors.
- Comments from those who have implemented the CBAS indicate that it is a user-friendly and helpful method to assess a resident’s progress.
POINTS DE REPÈRE DU RÉDACTEUR
- Le département de médecine familiale de l’université de l’Alberta à Edmonton a créé le Competency-Based Achievement System (CBAS) pour évaluer la compétence des résidents durant leur préparation à la pratique.
- Au moyen d’une rétroaction formatrice, d’une autoévaluation assistée et de rencontres individuelles, le CBAS assure une communication significative entre les résidents et leurs moniteurs.
- Les commentaires de ceux qui ont fait l’essai du CBAS indiquent que cette méthode d’évaluation des progrès des résidents est utile et facile à utiliser.
Footnotes
- This article has been peer reviewed.
- Cet article a fait l’objet d’une révision par des pairs.
- Contributors: All the authors contributed to concept and design of the program; data gathering, analysis, and interpretation; and preparing the manuscript for submission.
- Competing interests: None declared.
- Copyright © the College of Family Physicians of Canada