Korean Journal of Anesthesiology


This article has been corrected. See "Application of competency-based education in the Korean anesthesiology residency program and survey analysis" in Volume 76 on page 516.

Application of competency-based education in the Korean anesthesiology residency program and survey analysis

Kyung Woo Kim, Won Joo Choe, Jun Hyun Kim
Received June 30, 2022       Revised July 27, 2022       Accepted August 1, 2022
Abstract
Background
Although competency-based education (CBE) is becoming a popular form of medical education, it has not been used to train residents in Korea. Recently, the Korean Society of Anesthesiologists completed a pilot implementation and evaluation of a CBE program. This study aims to outline that experience.
Methods
The chief training faculty from each hospital took a one-hour online course about CBE. Emails on the seven core competencies and their evaluation were sent ahead of a pilot core competency evaluation (CCE) to residents and faculty. The pilot CCE took place in late 2021, followed by a survey.
Results
A total of 68 out of 84 hospitals participated in the pilot CCE. The survey response rate was 55.9% (38/68) for chief training faculty, 10.2% (91/888) for training faculty, and 30.2% (206/683) for residents. More than half of the training faculty thought that CCE was necessary for the education of residents. Residents’ and training faculty’s responses about CCE were generally positive, although their understanding of CCE criteria was low. More than 80% of the hospitals had a defibrillator and a cardiopulmonary resuscitation manikin, while the rarest piece of equipment was an ultrasound vessel model. Only defibrillators were used in more than half of the hospitals. Thoughts about CCE were related to various factors, such as length of employment, hospital location, and the number of residents per grade.
Conclusions
This study’s results may be helpful in improving resident education quality to meet the expectations of both teaching faculty and residents while establishing CBE.

Introduction

Competency-based education (CBE) has been the leading form of medical education for several decades [1–4]. In some countries, it is used not only in medical schools but also in resident education. Recently, it has become increasingly popular in medical schools in South Korea [5]. However, it is not widely used to educate residents in Korea.
In Korea, the first resident curriculum systemization project was implemented over eight months, from May 2020 to January 2021, and the second over six months, from July to December 2021. These projects were prompted by a law reducing medical residents’ training time and by residents’ growing demands for systematic education and reorganization of the training curriculum, with which their satisfaction was low. Accordingly, the Korean Society of Anesthesiologists (KSA) applied for and conducted the first and second implementation projects, which were financially sponsored and encouraged by the government. The projects involved developing a competency-based residency program, training guides for training faculty members, evaluation guidelines for core competencies, feedback channels for evaluation results, an operations plan, and an e-portfolio. The KSA conducted a pilot core competency evaluation (CCE) from the end of November to the beginning of December 2021. In January 2022, it surveyed residents, training faculty members, and chief training faculty members about their knowledge of and feelings about the core competencies, their experience with the pilot CCE, and the equipment used in the CCE. This study was conducted to analyze the survey results, identify problems with incorporating CBE into the residency program, and find ways to improve it.

Materials and Methods

The KSA’s Training and Education Committee set seven core competencies and related learning objectives and milestones for evaluation (Supplementary Materials 1 and 2 [Korean]). The seven core competencies are preoperative assessment, difficult airway management, central venous catheter insertion using ultrasound, spinal and epidural anesthesia, treatment of myofascial pain syndrome, advanced cardiovascular life support, and mechanical ventilator management. The committee provided this information to training faculty members during the first and second resident curriculum systemization projects. Before the pilot CCE, the chief training faculty members completed a one-hour online education course on CBE. Brief evaluation instructions were sent to residents and faculty members (Supplementary Material 3 [Korean]). Residents were evaluated on their mastery of the core competencies according to the KSA’s resident training curriculum (Supplementary Material 4). The chief training faculty members and the KSA Training and Education Committee held an online meeting prior to the pilot evaluation. The survey was conducted over one week, from January 14 to January 21, 2022. The respondents were divided into three groups (chief training faculty members, training faculty members, and residents). Chief training faculty members were asked about the importance and necessity of CCE and responded on a five-point Likert scale about whether the equipment necessary for CCE was provided and whether they thought it was necessary. In addition, they were asked how important they thought each core competency was. They were also given a multiple-choice questionnaire about the difficulties with CCE and how to improve it (Supplementary Material 5 [Korean]). Training faculty members were asked the same questions except for those about equipment.
In addition to the importance and necessity of competency evaluation, residents were asked about several factors shown to be important by a previous study [6]. Other information that may have been related to the survey results was also collected, such as the resident’s grade, the training faculty’s years of experience, hospital location, and the number of residents in each grade. All continuous variables were analyzed with Student’s t-test or the Mann-Whitney U test according to the results of a normality test. Categorical variables were compared using the chi-square test or Fisher’s exact test. A P value of less than 0.05 was considered statistically significant. This study was approved by the Institutional Review Board of Inje University Ilsan Paik Hospital (IRB no. 2022-05-014). The requirement that participants provide informed consent was waived.
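The test-selection logic described above (a normality check deciding between Student’s t-test and the Mann-Whitney U test, and Fisher’s exact test replacing the chi-square test for sparse 2 × 2 tables) can be sketched in Python with SciPy. The data below are hypothetical illustrations, not the study’s actual dataset, and the Shapiro-Wilk check is an assumption about which normality test was used.

```python
from scipy import stats

def compare_continuous(group_a, group_b, alpha=0.05):
    """Student's t-test if both groups pass a Shapiro-Wilk
    normality check, otherwise the Mann-Whitney U test."""
    normal = (stats.shapiro(group_a).pvalue > alpha
              and stats.shapiro(group_b).pvalue > alpha)
    if normal:
        return stats.ttest_ind(group_a, group_b).pvalue
    return stats.mannwhitneyu(group_a, group_b).pvalue

def compare_categorical(table):
    """Chi-square test, falling back to Fisher's exact test
    when a 2x2 table has an expected cell count below 5."""
    chi2, p, dof, expected = stats.chi2_contingency(table)
    if expected.min() < 5 and len(table) == 2 and len(table[0]) == 2:
        _, p = stats.fisher_exact(table)
    return p

# Hypothetical Likert-scale responses from two groups of respondents
a = [4, 5, 3, 4, 4, 5, 2, 4, 3, 5]
b = [3, 2, 4, 3, 2, 3, 3, 2, 4, 3]
print(compare_continuous(a, b) < 0.05)   # significant difference

# Hypothetical 2x2 table: equipment availability by hospital group
table = [[12, 3], [8, 15]]
print(compare_categorical(table) < 0.05)
```

A per-variable significance threshold of 0.05 mirrors the criterion stated in the text; no multiple-comparison correction is applied here, matching the paper’s reporting of individual P values.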

Results

The KSA has 84 training hospitals. Among those, 68 participated in the pilot evaluation. The number of chief training faculty members, training faculty members, and residents that participated in the pilot evaluation was 68, 888, and 683, respectively. Of those participants, 38 (55.9%), 91 (10.2%), and 206 (30.2%) responded to the surveys at the end of the program, respectively. Table 1 shows the demographic data of respondents by position. The responses to Likert scale items are shown in Figs. 1–3.
More than half of the chief training faculty and training faculty thought that CCE was necessary for educating residents. However, unlike the training faculty, less than half of the chief training faculty thought that it was important for educating residents. Less than half of the residents considered CCE to be both necessary and important for their education. Each core competency was considered important by all positions, though residents considered them more important than the faculty did. The chief training faculty and training faculty thought the treatment of myofascial pain syndrome and central venous line insertion using ultrasound were less important than residents did. Residents responded generally positively to items about CCE and their training faculty. However, their understanding of the CCE criteria was low.
Fig. 4 shows whether the various models were available and used and whether respondents thought they were useful. More than 80% of hospitals had a defibrillator and a cardiopulmonary resuscitation (CPR) manikin, but the rarest piece of equipment was an ultrasound vessel model. Defibrillators were used in more than half of the hospitals during the pilot CCE. The chief training faculty generally indicated that they thought the models were useful for assessing competency.
Factors significantly related to each other were as follows. Among the chief training faculty, hospital location was related to whether the hospital had a defibrillator (P = 0.013) and whether a CPR manikin was used (P = 0.022). The chief training faculty member’s gender was related to the perceived need for a spinal anesthesia training model (P = 0.031). Thoughts about the importance of spinal anesthesia as a core competency differed by hospital location (P = 0.001).
Among the training faculty, length of employment was related to thoughts about the importance of cardiopulmonary resuscitation as a core competency (P = 0.026). The number of residents per grade was related to their thoughts about the importance of mechanical ventilation as a core competency (P = 0.025).
Training faculty as a whole (including chief training faculty) had different thoughts about the importance of preoperative assessment (P < 0.001), spinal anesthesia (P = 0.017), and mechanical ventilation (P = 0.005) according to hospital location. Length of employment was related to thoughts about the importance of advanced cardiovascular life support as a core competency (P = 0.015).
The number of residents per grade affected their thoughts about the need (P = 0.025) and importance (P = 0.041) of core competencies generally and their thoughts about each competency individually (spinal and epidural anesthesia: P = 0.013, mechanical ventilation: P = 0.011). All of the P values of the relationships are presented in Supplementary Material 6. Frequency plots are provided for significantly related factors (Supplementary Material 7 [Korean]).

Discussion

According to the analysis of the survey conducted after this pilot competency evaluation, both training faculty and residents thought that the transition to CBE was necessary and important, but to different degrees. Given the responses about understanding of the core competencies and evaluation methods, it seems necessary to provide education about the individual core competencies and competency evaluation methods in the future. In addition, compensation or workload should be adjusted to avoid overloading training faculty. Interestingly, the degree of understanding of CBE among residents differed by region. Moreover, the number of residents per grade was negatively correlated with how well residents knew the competency evaluation and how important they thought it was. The number of residents per grade was also positively correlated with residents’ ratings of their training faculty (Supplementary Material 7 [Korean]).
Medical education has been transitioning to CBE for decades in the United States, Canada, and the United Kingdom [4]. The shift from knowledge- and time-based education to task-based CBE is also taking place in medical student education [7]. To nurture well-trained anesthesiologists who have not only the knowledge but also the skills, behaviors, and attitudes necessary to succeed in their profession, anesthesiology residents must be educated and evaluated accordingly, so the resident curriculum must provide CBE.
The KSA quickly introduced CBE, as described above. There were relatively few personnel involved in the process, including the Training and Education Committee members and members of related task force teams under the committee. The time provided for the validation, distribution, and education about each core competency was also insufficient.
As a result, as this study’s results show, although most training faculty and residents agreed on the importance and necessity of CCE, only a small percentage of them knew its content well. Thus, education about CBE should continue in the future.
One of the problems with CBE is that related terms are used interchangeably in the literature. Competence refers to the array of abilities across multiple domains or aspects of physician performance in a specific context. Competency, on the other hand, is an observable ability of a health professional that integrates multiple components, such as knowledge, skills, values, and attitudes [8]. Entrustable professional activities (EPAs) are tasks that learners can execute unsupervised once they have attained a sufficient level of competency [9]. Milestones are achievements or behaviors presented by a physician that reflect their competency to execute EPAs [10]. However, in practice, these terms’ definitions can vary significantly [8]. The Korean Society of Otorhinolaryngology-Head and Neck Surgery’s competency-based residency program teaches eight clinical competencies and four conceptual EPAs. The Korean Association of Internal Medicine defines 18 EPAs and 80 competencies (Supplementary Material 8 [Korean]). The KSA residency program defines seven core competencies, and its evaluation guidelines define EPAs and milestones (Supplementary Materials 1 and 2 [Korean]). The KSA deliberately minimized the number of core competencies to avoid overloading teaching faculty. It is unclear whether reducing the number of core competencies being taught undermines the quality of education. Thus, this small number of competencies should be reevaluated when establishing CBE or modifying the current core competencies.
The KSA’s core competencies were intentionally designed to avoid textbook knowledge transfer and promote the learning of clinical techniques. As a result, they can be criticized for addressing only part of the resident education. However, the KSA’s CBE is only in its early stage. Furthermore, as shown in the survey results, most teaching faculty complained about the workload of teaching even this small number of core competencies. Increasing the number of competencies and content in the curriculum should be done gradually, even if it is essential.
This study’s results showed that thoughts about the core competencies varied with several factors, such as the number of residents per grade, hospital location, length of employment, and gender. Moreover, the number of residents per grade was negatively correlated with thoughts about CCE. These differences between faculty and residents should be considered when designing CBE programs and distributing resources.
The first limitation of this study was that more than 30% of the residents responded that they did not know about CBE or its evaluation standards, which is not a small proportion. This result likely reflects the fact that residents have so far received insufficient education, so it would be expected to change as residents become better acquainted with CBE. The second limitation was that the overall response rate was low, particularly for training faculty, so there may have been selection bias in the results. The third limitation was that the questionnaire asked what respondents thought about CCE, not CBE. Most of the faculty and residents likely did not know about CBE, so asking about the pilot CCE was the only feasible option. Thus, the survey results may not reflect their thoughts about CBE. The fourth limitation was that the training faculty’s thoughts about each competency may have differed by subspecialty, but the survey did not collect respondents’ subspecialties, so this relationship was not analyzed.
This article is the only one that contains the results of an extensive survey of educators and trainees after the pilot implementation of a CCE. It may provide useful information on what needs to be implemented and corrected for the successful establishment of CBE in residency programs in the future.
To conclude, the KSA’s establishment of CBE is in its beginning stage. This study’s results may be used to improve resident education quality to meet the expectations of both teaching faculty and residents.
Acknowledgments
The survey in this study was supported financially by the Korean Society of Anesthesiologists (KSA). We thank the members of the 22nd Training and Education Committee of the KSA (listed in Supplementary Material 9 [Korean]) and the task force teams of the KSA for their help in establishing and applying competency-based education. We would also like to thank the training faculty members, chief training faculty members, and residents of the KSA for participating and cooperating in the pilot evaluation. The authors appreciate Prof. Yoon Bo-Young of the Department of Rheumatology, Inje University Ilsan Paik Hospital, for her precious advice.
NOTES

Funding

None.

Conflicts of Interest

No potential conflict of interest relevant to this article was reported.

Data Availability

The datasets generated and/or analyzed during the current study are not publicly available because the survey was conducted by the Korean Society of Anesthesiologists rather than by the authors, but they are available from the corresponding author on reasonable request.

Author Contributions

Kyung Woo Kim (Data curation; Formal analysis; Writing – original draft; Writing – review & editing)

Won Joo Choe (Conceptualization; Supervision; Writing – review & editing)

Jun Hyun Kim (Conceptualization; Data curation; Formal analysis; Investigation; Methodology; Supervision; Writing – original draft; Writing – review & editing)

Supplementary Materials

Supplementary Materials

Supplementary Material 1.
Seven core competencies with detailed learning objectives (Korean).
kja-22383-suppl1.pdf
Supplementary Material 2.
Milestones for evaluation (Korean).
kja-22383-suppl2.pdf
Supplementary Material 3.
Instructions for evaluation (Korean).
kja-22383-suppl3.pdf
Supplementary Material 4.
Evaluation items for residents of each grade.
kja-22383-suppl4.pdf
Supplementary Material 5.
Questionnaire on competency evaluation (Korean).
kja-22383-suppl5.pdf
Supplementary Material 6.
Differences in equipment, feedback about the core competency evaluation, and residents’ answers.
kja-22383-suppl6.pdf
Supplementary Material 7.
Frequency plots of significantly related factors (Korean).
kja-22383-suppl7.pdf
Supplementary Material 8.
Competency-based education programs in ENT and internal medicine.
kja-22383-suppl8.pdf
Supplementary Material 9.
The 22nd training committee members (Korean).
kja-22383-suppl9.pdf

Fig. 1.
Likert-scale responses of the chief training faculty to each survey question. CCE: core competency evaluation.
kja-22383f1.tif
Fig. 2.
Likert-scale responses of the training faculty to each survey question. CCE: core competency evaluation.
kja-22383f2.tif
Fig. 3.
Likert-scale responses of the residents to each survey question. CCE: core competency evaluation, KSA: Korean Society of Anesthesiologists.
kja-22383f3.tif
Fig. 4.
Survey results on whether the various models were available (A), whether they were used (B), and thoughts on their usefulness (C).
kja-22383f4.tif
Table 1.
Demographic Data of Respondents
                          Chief training faculty   Training faculty   Resident
Total number of answers   38                       91                 206
Gender
 M                        30 (79)                  54 (59)            117 (57)
 F                        8 (21)                   36 (40)            82 (40)
 No answer                0 (0)                    1 (1)              7 (3)
Length of employment (resident grade for residents)
 1–5 yr                   3 (8)                    26 (29)            R1: 42 (20)
 5–10 yr                  14 (37)                  28 (31)            R2: 49 (24)
 10–15 yr                 13 (34)                  10 (11)            R3: 58 (28)
 ≥ 15 yr                  8 (21)                   27 (30)            R4: 57 (28)
Number of residents per grade
 1                        6 (16)                   13 (14)            18 (9)
 2                        18 (47)                  40 (44)            62 (30)
 3                        7 (18)                   15 (16)            43 (21)
 ≥ 4                      7 (18)                   23 (25)            83 (40)
Hospital location
 Seoul                    15 (39)                  29 (32)            84 (41)
 Incheon, Gyeonggi-do     5 (13)                   23 (25)            42 (20)
 Daejeon, Chungcheong-do  3 (8)                    11 (12)            18 (9)
 Busan, Gyeongsangnam-do  6 (16)                   14 (15)            20 (10)
 Daegu, Gyeongsangbuk-do  3 (8)                    3 (3)              14 (7)
 Gwangju, Jeollanam-do    1 (3)                    0 (0)              7 (3)
 Jeonju, Jeollabuk-do     2 (5)                    2 (2)              8 (4)
 Gangwon-do               1 (3)                    7 (8)              5 (2)
 Jeju-do                  2 (5)                    2 (2)              2 (1)
 Rotation                 0 (0)                    0 (0)              6 (3)

Values are presented as number (%).

References

1. Ten Cate O. Competency-based postgraduate medical education: past, present and future. GMS J Med Educ 2017; 34: Doc69.
2. Kearney RA. Defining professionalism in anaesthesiology. Med Educ 2005; 39: 769-76.
3. Frank JR, Danoff D. The CanMEDS initiative: implementing an outcomes-based framework of physician competencies. Med Teach 2007; 29: 642-7.
4. Lee SW. Overseas residency training systems and implications for Korea. Korean Med Educ Rev 2018; 20: 128-34.
5. Ahn M, Lee SW, Lee HJ. National support plan for general competency education for medical residents [Internet]. Seoul: Research Institute for Healthcare Policy; 2021 June [cited 2022 Jun 30]. Available from https://rihp.re.kr/bbs/board.php?bo_table=research_report&wr_id=316&sfl=wr_subject&stx=%EC%97%AD%EB%9F%89&sop=and
6. Fluit C, Bolhuis S, Grol R, Ham M, Feskens R, Laan R, et al. Evaluation and feedback for effective clinical teaching in postgraduate medical education: validation of an assessment instrument incorporating the CanMEDS roles. Med Teach 2012; 34: 893-901.
7. Han JJ. The development of outcome-based curriculum in medical schools outside Korea. Korean Med Educ Rev 2013; 15: 19-24.
8. Frank JR, Snell LS, Cate OT, Holmboe ES, Carraccio C, Swing SR, et al. Competency-based medical education: theory to practice. Med Teach 2010; 32: 638-45.
9. Ten Cate O. Entrustability of professional activities and competency-based training. Med Educ 2005; 39: 1176-7.
10. Hsiao CT, Chou FC, Hsieh CC, Chang LC, Hsu CM. Developing a competency-based learning and assessment system for residency training: analysis study of user requirements and acceptance. J Med Internet Res 2020; 22: e15655.
