Satisfaction and attrition in Canadian surgical training program leadership: a survey of program directors
==========================================================================================================

Farhana Shariff, Frances C. Wright, Najma Ahmed, Fahima Dossa, Ashlie Nadler, Julie Hallet

## Abstract

**Background:** Surgical program directors (PDs) play an integral role in the well-being and success of postgraduate trainees. Although studies in medical specialties have documented factors contributing to PD burnout, early attrition rates and contributory factors among surgical PDs have not yet been described. We aimed to evaluate Canadian surgical PD satisfaction, stressors in the role and areas that institutions could target to improve PD support.

**Methods:** We administered a cross-sectional survey of postgraduate Canadian surgical PDs from all Royal College of Physicians and Surgeons of Canada–accredited surgical specialties. Domains we assessed included PD demographics and compensation, availability of administrative support, satisfaction with the PD role and factors contributing to PD challenges and burnout.

**Results:** Sixty percent of eligible surgical PDs (81 out of 134) from all 12 surgical specialties responded to the survey. We found significant heterogeneity in PD tenure, compensation models and available administrative support. All respondents reported exceeding their weekly protected time for the PD position, and 66% received less than 0.8 full-time equivalent of administrative support. One-third of respondents were satisfied with overall compensation, whereas 43% were unhappy with compensatory models. Most respondents (70%) enjoyed many aspects of the PD role, including relationships with trainees and shaping the education of future surgeons. Significant stressors included insufficient administrative support, complexities in resident remediation and inadequate compensation, which contributed to 37% of PDs having considered leaving the post prematurely.

**Interpretation:** Most surgical PDs enjoyed the role. However, intersecting factors such as disproportionate time demands, lack of administrative support and inadequate compensation for the role contributed to significant stress and risk of early attrition.

Surgical program directors (PDs) face administrative demands from internal (faculty and supporting institution) and external (i.e., Royal College of Physicians and Surgeons of Canada [RCPSC]) sources.
In addition, PDs are responsible for dynamic resident support, serving as career counsellors and safeguarding the educational and emotional needs of trainees while balancing busy clinical practices, and ensuring clinical services are staffed to maximize both trainee learning and patient care.1–3 Program directors of clinical programs in other medical specialties in Canada and the United States have been identified as being at increased risk for emotional exhaustion and burnout.4,5 Consequent turnover can affect faculty, trainees and the quality of residency programs.6,7 Studies in medicine and radiation oncology have found that factors associated with burnout and early attrition include administrative burden, available supportive resources, management of residents facing remediation, remuneration and limited opportunities for promotion.4–6,8,9

Surgical training programs are uniquely complex, requiring achievement not only of a broad range of medical competencies but also of the complex technical skills needed for proficiency in both open and minimally invasive surgical approaches. At present, it is unknown whether surgical PDs face similar risks for burnout and early attrition from the PD role as do medical PDs. We aimed to evaluate the satisfaction of Canadian surgical PDs, prominent stressors and potential factors contributing to early attrition from the position, with a view to guiding appropriate standardization of program structure, management and support.

## Methods

### Study design and population

We conducted a Web-based self-administered cross-sectional survey of Canadian surgical PDs from postgraduate programs accredited by the RCPSC, which we identified through the RCPSC website. We identified 134 eligible PDs from cardiac surgery, colorectal surgery, general surgery, neurosurgery, orthopedic surgery, otolaryngology, pediatric surgery, plastic surgery, surgical oncology, thoracic surgery, urology and vascular surgery.

### Survey development

We conducted a literature search exploring evidence pertaining to PD compensation, wellness and burnout on PubMed and MEDLINE, which we used to form the foundation for our survey. We followed recommendations for survey development and reporting.10,11 Team members with various areas of expertise (F.S., F.C.W., N.A. and J.H.; surgeons, PDs and medical educators) identified important domains and subdomains related to PD experience, stressors and compensation. Domains were generated individually and reviewed as a group, and were informed by both review of the literature and professional experiences. Survey items were generated first without restriction, and later reduced to include only the most relevant items via 3 iterations of review by authors (F.S., F.C.W., N.A. and J.H.).12 We assessed 5 major domains for both quantitative and qualitative analysis: demographic characteristics, compensation, administrative support, satisfaction, and challenges and factors contributing to burnout. We used a combination of 5-point Likert scale questions and open-ended questions that facilitated expression of opinions not covered in the survey.13 Open-ended questions prompted participants to comment on the “best” and “worst” parts of the position, as well as to provide suggestions for how the experience might be improved. We also included open-ended text entry options where participants could comment on pertinent issues not directly covered by the survey questions.
### Survey testing

We pilot tested the survey using 3 steps to ensure face validity, content validity, clarity and feasibility of administration. The survey was pilot tested electronically using a group of 5 PDs (representative of the cohort) who were asked to comment on clarity, flow and difficulties with the survey instrument. Program directors were recruited from the identified eligible cohort and selected for pan-Canadian representation. None were in the author group. All were asked to participate in the final study survey. We then confirmed test–retest reliability by asking the pilot group to recomplete the survey within 1 month; answers were compared to ensure consistency. We revised survey items where necessary based on responses from the pilot surveys. Authors performed clinical sensibility analysis throughout the process using a standardized guide14 to ensure face validity, clarity and comprehensiveness.12,14 We used a translation–retranslation approach to translate the original validated English survey into French. We assessed test–retest reliability using weighted Cohen κ coefficients. More than 80% of κ scores were greater than 0.4, which indicated moderate agreement for most components of the survey.15

### Survey administration

We administered the survey online using SurveyMonkey (SurveyMonkey Inc.) from December 2019 to January 2020. Program directors were identified through the RCPSC program index and included in the eligible PD population. We contacted eligible participants by email in both English and French, and provided a single-use survey Web link as well as information regarding privacy, data storage, informed consent process and study purpose. We sent email reminders after 1, 2 and 4 weeks. Participation was voluntary and anonymous, and no incentives were offered. Respondents’ names and email addresses were not linked to survey responses. In compliance with American Association for Public Opinion Research standards, we considered surveys to be completed when more than 80% of the questions were answered.10

### Data analysis

Quantitative responses were summarized and are reported as frequencies and proportions (F.S., F.D. and A.N.). Given the relatively small number of participants, we collapsed Likert responses into 3 categories (agree/strongly agree, neutral and disagree/strongly disagree) to facilitate meaningful analysis. For comparisons, we stratified programs into 3 groups based on program size (a small program comprised 1–10 residents, a mid-size program comprised 11–20 residents and a large program comprised > 20 residents). These sizes were selected with consideration for the spectrum of program sizes in Canada, and informed by our PD authors’ experiences. We compared Likert response distributions across groups using the Kruskal–Wallis test; where results were statistically significant (*p* < 0.05), we performed post hoc pairwise comparisons using the Wilcoxon rank sum test. To reduce the likelihood of type I error, we performed a limited number of statistical comparisons; decisions for these comparisons were driven by clinical relevance (e.g., residency v. fellowship or comparison of PD compensation to satisfaction). Quantitative analyses were performed using R version 3.3.

We analyzed narrative responses using an open coding strategy for qualitative research16 to ensure that valuable additional information provided by participants was captured, and to ensure issues not covered comprehensively in the survey were identified (F.S., F.C.W. and J.H.).17,18 We assigned codes to narrative comments, which were grouped into relevant categories or themes in sequential analyses.
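
As an illustration of the quantitative comparison described above, the following is a minimal sketch in R (the language used for the quantitative analyses). The data frame `responses` and the variables `satisfaction` and `program_size` are hypothetical placeholders, not the study dataset, and the code uses only base R functions.

```r
# Minimal sketch of the group comparison described above (hypothetical data).
set.seed(1)
responses <- data.frame(
  satisfaction = sample(1:5, 60, replace = TRUE),          # 5-point Likert item
  program_size = rep(c("small", "mid-size", "large"), 20)  # program-size strata
)

# Collapse the 5-point Likert scale into 3 categories for descriptive reporting
responses$satisfaction_3 <- cut(
  responses$satisfaction,
  breaks = c(0, 2, 3, 5),
  labels = c("disagree/strongly disagree", "neutral", "agree/strongly agree")
)
table(responses$program_size, responses$satisfaction_3)

# Compare Likert response distributions across program-size groups
kw <- kruskal.test(satisfaction ~ program_size, data = responses)
print(kw)

# If the omnibus test is significant, run post hoc pairwise Wilcoxon rank sum tests
if (kw$p.value < 0.05) {
  print(pairwise.wilcox.test(responses$satisfaction, responses$program_size))
}
```
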
### Reflexivity

Reflexivity is described as an important methodologic component of qualitative research; we engaged in it to keep track of potential preconceptions and inferences from team members.19 Although most team members are subspecialty general surgeons, there is variability in formal educational training and experience with surgical program leadership. F.C.W. and N.A. have served as fellowship and residency PDs. Two authors (F.S., a surgical fellow, and F.D., a surgical resident) provided the trainee perspective. J.H. is a surgical oncologist with expertise in survey methodology. F.S. and F.C.W. have expertise in qualitative research. Memos were kept during qualitative analysis, and regular cross-checking between team members allowed for discussion and for potential biases to be unveiled.

### Ethics approval

This study was approved by the Research Ethics Board at Sunnybrook Health Sciences Centre, Toronto, Ontario.

## Results

Eighty-one of 134 (60%) eligible PDs responded, including the 5 pilot group members. Two respondents partially completed the survey; final analyzed responses represented 55% of potential participants, from all provinces with training programs. We excluded 5 PDs because they could not be contacted via email or had transitioned out of the PD role during the study’s course (Figure 1).

Figure 1: Flow chart for participant recruitment. Note: PD = program director.

Twenty out of 22 eligible female PDs and 59 out of 112 eligible male PDs participated. Response rates by program were cardiac surgery 54% (7 out of 13), colorectal surgery 40% (2 out of 5), general surgery 75% (12 out of 16), neurosurgery 86% (12 out of 14), orthopedic surgery 47% (8 out of 17), otolaryngology and head and neck surgery 64% (9 out of 14), pediatric surgery 12.5% (1 out of 8), plastic surgery 82% (9 out of 11), surgical oncology 80% (4 out of 5), thoracic surgery 50% (4 out of 8), urology 67% (8 out of 12) and vascular surgery 36% (4 out of 11). Respondent characteristics are detailed in Table 1. Forty-six percent of participants were in practice for 5–10 years before becoming a PD. Most participants (68%) did not have training in education before becoming a PD.

Table 1: Demographic characteristics of participating program directors

Expected PD tenure varied across programs. Options to renew the position after each term ranged from 1 or 2 renewals (31%, *n* = 25; and 20%, *n* = 16, respectively) to “no maximum” (32%, *n* = 26), with the latter more common in subspecialty programs such as colorectal surgery, general surgical oncology and thoracic surgery. Eighty-six percent (*n* = 70) of participants listed the total expected term, including renewals, as 10 years or more. Additional program characteristics are summarized in Table 2.

Table 2: Program characteristics

The median program size was 10 trainees (range 1–85). Protected time for the role varied by program size.
Program directors of small and mid-size programs frequently reported having less than 1 hour of protected time per week (*n* = 22, 54%; *n* = 10, 42%, respectively); this was less frequently seen among PDs of large programs (*n* = 1, 7%) (Table 3).

Table 3: Responsibilities of the program directors and administrative support available

We also found that the amount of time spent on the PD role was often discordant with the amount of protected time. Although only 7% (*n* = 3) of PDs from small programs reported more than 5 hours per week of protected time, 56% (*n* = 23) reported spending more than 5 hours on the role. Similarly, only 8% (*n* = 2) of PDs from mid-size programs reported more than 5 hours per week of protected time, whereas 71% (*n* = 17) reported spending more than 5 hours per week on the role (Table 3). About half of the participants (52%, *n* = 41) reported spending more than half of their PD time on tasks related to program administration. Across all program sizes, PDs spent similar amounts of time on resident support, promotions and remediation, and curricular planning.

Administrative support available in full-time equivalents (FTEs; 1 FTE = 37.5 h/wk) varied by program size (Table 2). Seventy-three percent of all programs received less than 1 FTE of administrative support weekly, whereas 80% (*n* = 12) of large programs received 1 FTE or more weekly. Sixty percent (*n* = 48) of PDs stated they did not have access to additional administrative support during times of increased need (e.g., interviews, accreditation or remediation). Among those with access to additional support (*n* = 29, 36%), 31% (*n* = 9) reported the amount of support to be insufficient.

Funding for the PD position was provided by the university division (*n* = 17) or department (*n* = 36) for 65% of programs. Additional sources included postgraduate medical education offices and sponsor money received from international sources to fund out-of-country trainees. Salaries for PD activities ranged from $0 to more than $100 000, with 42% (*n* = 32) receiving between $10 000 and $30 000. Five percent (*n* = 4) received less than $5000 annually, and an equal number received no salary (Figure 2). Perceived fair salary amounts ranged from $20 000 to $125 000 (Figure 2). Forms of nonmonetary compensation, including pathway to promotion and recognition from colleagues, are presented in Figure 3. Satisfaction with compensation models varied, even though most PDs were satisfied with the position itself (Figure 4).

Figure 2: Current versus perceived fair salary for the program director (PD) position. Note: Other = remuneration by percentage of full-time equivalent, proportional remuneration by number of trainees, and compensation based on loss of clinical time and earnings.

Figure 3: Additional forms of compensation for performing the program director (PD) role (*n* = 81; 2 respondents provided other responses including teaching funding from the ministry and respect from residents/students).
Figure 4: Satisfaction with program director (PD) role compared with satisfaction with compensation for the PD role (*n* = 80 for satisfaction with PD role; *n* = 79 for satisfaction with compensation).

Seventy-seven percent (*n* = 62) of respondents reported enjoying the position, 17% (*n* = 14) were neutral and 2% (*n* = 2) did not enjoy the work (Figure 4). Seventy-nine percent (*n* = 64) of PDs felt that the position had increased their profile at the university, 65% (*n* = 53) agreed that the position had helped their career and 2% (*n* = 2) felt that the position had neither increased their university profile nor helped their overall career trajectory. Narrative comments about the most enjoyable aspects of the position centred on themes of fostering resident relationships, educational influence and personal fulfillment (Table 4). Ninety-six percent (*n* = 77) of PDs commented on the enjoyment and personal satisfaction gained from teaching and mentoring residents, and the ability to play a role in shaping tomorrow’s surgeons.

Table 4: Positive features and stressors of the role of program director

Overall, 68% (*n* = 55) of participants agreed that the surgical PD position was more work than expected, and 37% (*n* = 30) considered resigning before the end of their term. Narrative comments regarding challenges with the role fell into 5 themes: administrative demands, resident remediation, complexities of educational programming, faculty engagement challenges and insufficient overall compensation (Table 4). Forty percent of participants (*n* = 32) commented specifically on the challenges of managing a learner in difficulty with minimal training in this skill set, and the difficulty of balancing the time dedicated to a few residents in need against the need to “devote time to program improvement and skill development of the majority.”

We found no statistically significant association between program size (*p* = 0.45), time devoted to the role or administrative support available in the program (*p* = 1.0) and thoughts of resignation from the position (Table 5). Of those who had considered early exit from the position, a slightly higher proportion had less than 1 hour per week of protected time for PD activities (50%) compared with those with 1–5 hours per week (31.2%) and more than 5 hours per week (27.3%). Dissatisfaction with compensation was associated with thoughts of early resignation (*p* < 0.001). Only 18% of PDs who were satisfied or very satisfied with compensation had considered resigning, compared with 66% of those dissatisfied with compensatory models (*p* = 0.001).

Table 5: Factors associated with thoughts of resignation (responses to “I have considered giving up the program director position before the end of my term”)

## Interpretation

We report on a national evaluation of stressors, compensation and satisfaction among surgical PDs and highlight areas for improvement. Participants reported substantial time demands for the position, variability in compensation, and insufficient administrative and institutional support.
In keeping with previous evaluations of PDs in nonsurgical specialties, our results confirmed that surgical PDs balance a number of competing interests and are at risk for premature attrition from the position.

One of our key findings was the perceived lack of administrative support by PDs. Accredited residency programs must follow set rules to ensure the standardization and quality of training; however, such standards offer no explicit direction about the amount of administrative support required. Standards of accreditation for postgraduate programs across Canada include general requirements that “the program director has appropriate support to oversee and advance the residency program” (Canadian Residency Accreditation Consortium Standard 1.1.2)20 without further quantification. Common program requirements from the Accreditation Council for Graduate Medical Education stipulate that programs at a minimum “must be supported at 50% FTE (at least 20 h/wk) for administrative time,”21,22 with a 2021 change suggesting 100% FTE for some surgical programs, with additional support based on program size.23 Almost 1 out of 4 surgical programs (24%) included in our survey lacked this minimum 20 hours of administrative assistance. The increased workload and responsibilities associated with changes in program assessment in Canada (e.g., Competency-Based Medical Education) and the extensive documentation required for residents undergoing remediation were mentioned as specific stressors by participants, exacerbated by a lack of compensatory increase in administrative support during times of heightened workload. Such deficiencies have been documented in transitional year residency programs,24 as well as medicine residency programs,25 which suggests that current administrative supports for postgraduate programs fall short of what is needed, and that clearer guidelines and standardization of support between programs — ideally stemming from overarching accreditation bodies — are necessary.

More than one-third of the PDs in our survey (37%) considered giving up the position before the end of their term. Given factors such as the increased personal stress and time demands associated with managing and supporting residents in difficulty or remediation, coupled with a lack of administrative or faculty support, it is not surprising that, despite job enjoyment and opportunities for career advancement, many PDs considered leaving prematurely. In addition, there appeared to be no clear relation between actual versus expected workload and thoughts of leaving the position early, which suggests that unexpected workload is not the major issue. Such early attrition and burnout in PDs are widely documented among other specialties (medicine, radiation oncology and anesthesia),9,26 which speaks to the highly complex and challenging nature of the job. Most PDs in our study reported having little to no specialized training or support for skills acquisition in educational development or leadership. Given the importance of PDs in the trajectory of the learner, increased turnover undoubtedly carries negative implications for trainees’ education,27 the university’s reputation and clinical care. As a potential remedy, program interventions to support trainees and aid in educational planning may help to distribute workload and reduce overall PD burden by facilitating earlier resident feedback and coaching, and minimizing the number of trainees who require formal remediation involving the PD.
For example, a formal mentorship program trialled in otolaryngology at the University of Alberta resulted in lower resident stress scores, lower depersonalization and overall improved trainee quality of life,28 which suggests that resident well-being can be supported in ways independent of the PD. In addition, individual academic advisors for competency-based medical education may aid PDs in identifying barriers to progression in struggling learners, and may be uniquely positioned to contribute to educational developments as competency-based medical education progresses.29,30 Similarly, institutions should consider how they might mentor and support PDs as they move through challenges in their new roles. At present, such mentorship or coaching occurs in a largely ad hoc way, and formalized processes may enable PDs to better manage the complex and at times competing aspects of the position.

A considerable proportion of respondents endorsed dissatisfaction with current salary and compensatory models. Interestingly, wide heterogeneity existed in funding source, salary, time in practice before becoming a PD and additional forms of compensation. This variability seems unrelated to both a program’s number of residents and the number of trainees in remediation. Although all programs have baseline time requirements for accreditation and administration, resident scheduling, career counselling and mentorship undoubtedly require more time as trainee numbers increase. Despite the commitment of PDs, the absence of proportional compensation contributes to a multitude of challenges, with important implications for recruitment and retention. To compensate PDs equitably, institutions should look to standardize compensation and remuneration based on defined role requirements and on workload, which varies with the number of trainees.

In contrast to reports in other specialties,31 our analysis showed no difference in duration of PD tenure between male and female physicians. Of note, the rate of female surgical PDs at present in Canada is about 16%, which is comparable to recently reported rates in the United States.32,33

### Limitations

Forty percent of eligible PDs did not complete the survey, with nonresponders distributed across specialties. Reasons for this are unknown and, conceivably, response bias could be tied to satisfaction with the PD position. This limited our sample size and, ultimately, our ability to detect statistically significant associations. In some instances, we did not find statistically significant differences between groups, which may be a function of our small sample size rather than a true lack of associations. In addition, we did not collect data on the accreditation status of individual programs, specifically, whether they were on probation or were in the process of preparing for an external review — both of which may have influenced responses or response rate. As in all survey studies, recall bias had the potential to affect participant responses; however, this was likely minimized in our study because all included PDs were current PDs with ideas and concerns regarding the position fresh in their minds. It is worth noting that the small sample size for instrument development may have limited our ability to accurately assess test–retest reliability. Finally, the confidence intervals for the estimated κ values based on *n* = 5 were wide.
### Conclusion

Program directors of surgical specialty training programs reported heterogeneity in salary, administrative support and overall compensation across subspecialties. PDs reported a need for increased administrative resources during times of heightened program demand, more robust compensation, and increased support in counselling and mentoring trainees who are in difficulty. Academic surgical institutions should consider systematic culture change to support PDs through better-defined structural processes and sufficient resources, to keep these educators engaged and improve both PDs’ and trainees’ experiences.

## Footnotes

* **Competing interests:** Julie Hallet has received speaking honoraria from Ipsen Biopharmaceuticals Canada, Novartis Oncology and Advanced Accelerator Applications. Frances Wright has received speaking honoraria from Merck and Novartis (donated to the University of Toronto). No other competing interests were declared.

* This article has been peer reviewed.

* **Contributors:** Farhana Shariff, Frances Wright, Najma Ahmed and Julie Hallet conceptualized the study topic and design, and created the study tool, which was approved by all members of the author team. Fahima Dossa and Ashlie Nadler performed the statistical data analysis, and all of the authors reviewed qualitative and quantitative data results and contributed to interpretation of the data. All of the authors participated in writing and editing the manuscript, gave final approval of the version to be published and agreed to be accountable for all aspects of the work.

* **Data sharing:** The authors are happy to make their data available to other researchers, with identifying institution names and gender that may compromise anonymity removed. Data may be accessed through formal request to the study authors with an outline of how the data will be used. Agreement to share the data will be established by all team members.

* **Supplemental information:** For reviewer comments and the original submission of this manuscript, please see [www.cmajopen.ca/content/11/2/E237/suppl/DC1](http://www.cmajopen.ca/content/11/2/E237/suppl/DC1).

This is an Open Access article distributed in accordance with the terms of the Creative Commons Attribution (CC BY-NC-ND 4.0) licence, which permits use, distribution and reproduction in any medium, provided that the original publication is properly cited, the use is noncommercial (i.e., research or educational use), and no modifications or adaptations are made. See: [https://creativecommons.org/licenses/by-nc-nd/4.0/](https://creativecommons.org/licenses/by-nc-nd/4.0/)

## References

1. Willett LL, Halvorsen AJ, McDonald FS, et al. (2015) Gender differences in salary of internal medicine residency directors: a national survey. Am J Med 128:659–65.
2. Lypson M, Simpson D (2011) It all starts and ends with the program director. J Grad Med Educ 3:261–3.
3. Kumar B, Swee ML, Suneja M (2019) The ecology of program director leadership: power relationships and characteristics of effective program directors. BMC Med Educ 19:436.
4. West CP, Halvorsen AJ, Swenson SL, et al. (2013) Burnout and distress among internal medicine program directors: results of a national survey. J Gen Intern Med 28:1056–63.
5. Monga M, Doyle NM, Campbell D, et al. (2003) Job satisfaction among program directors in obstetrics and gynecology: a national portrait. Am J Obstet Gynecol 189:628–30.
6. O’Connor AB, Halvorsen AJ, Cmar JM, et al. (2019) Internal medicine residency program director burnout and program director turnover: results of a national survey. Am J Med 132:252–61.
7. Fountain D, Quach C, Norton D, et al. (2017) The perfect storm is on the horizon! J Surg Educ 74:e120–3.
8. Beasley BW, Kern DE, Kolodner K (2001) Job turnover and its correlates among residency program directors in internal medicine: a three-year cohort study. Acad Med 76:1127–35.
9. Aggarwal S, Kusano AS, Carter JN, et al. (2015) Stress and burnout among residency program directors in United States radiation oncology programs. Int J Radiat Oncol Biol Phys 93:746–53.
10. (2016) Standard definitions: final dispositions of case codes and outcome rates for surveys, 9th ed (American Association for Public Opinion Research, Alexandria (VA)) Available: [https://www.aapor.org/aapor_main/media/publications/standard-definitions20169theditionfinal.pdf](https://www.aapor.org/aapor_main/media/publications/standard-definitions20169theditionfinal.pdf), accessed 2021 Feb. 10.
11. Eysenbach G (2004) Improving the quality of Web surveys: the Checklist for Reporting Results of Internet E-Surveys (CHERRIES). J Med Internet Res 6:e34.
12. Burns KEA, Kho ME (2015) How to assess a survey report: a guide for readers and peer reviewers. CMAJ 187:E198–205.
13. Rea LM, Parker RA (2014) Designing and conducting survey research: a comprehensive guide (Jossey-Bass/John Wiley & Sons, San Francisco (CA)), 4th ed.
14. Burns KE, Duffett M, Kho ME, et al., ACCADEMY Group (2008) A guide for the design and conduct of self-administered surveys of clinicians. CMAJ 179:245–52.
15. Viera AJ, Garrett JM (2005) Understanding interobserver agreement: the kappa statistic. Fam Med 37:360–3.
16. Charmaz K (2014) Constructing grounded theory (SAGE Publications, London (UK) and Thousand Oaks (CA)), 2nd ed, pp 1–416.
17. O’Cathain A, Thomas KJ (2004) “Any other comments?” Open questions on questionnaires: a bane or a bonus to research? BMC Med Res Methodol 4:25.
18. Decorte T, Malm A, Sznitman SR, et al. (2019) The challenges and benefits of analyzing feedback comments in surveys: lessons from a cross-national online survey of small-scale cannabis growers. Methodol Innov 12, doi:10.1177/2059799119825606.
19. Berger R (2015) Now I see it, now I don’t: researcher’s position and reflexivity in qualitative research. Qual Res 15:219–34.
20. (2020) General standards of accreditation for residency programs (Canadian Residency Accreditation Consortium, Ottawa).
21. (2018) ACGME common program requirements (residency) (Accreditation Council for Graduate Medical Education, Chicago) effective 2019 July 1. Available: [https://www.acgme.org/globalassets/PFAssets/ProgramRequirements/CPRResidency2019.pdf](https://www.acgme.org/globalassets/PFAssets/ProgramRequirements/CPRResidency2019.pdf), accessed 2021 Feb. 10.
22. (2018) ACGME common program requirements (fellowship) (Accreditation Council for Graduate Medical Education, Chicago) effective 2019 July 1. Available: [https://www.acgme.org/globalassets/PFAssets/ProgramRequirements/CPRFellowship2019.pdf](https://www.acgme.org/globalassets/PFAssets/ProgramRequirements/CPRFellowship2019.pdf), accessed 2021 Feb. 10.
23. (2022) Specialty-specific program requirements: program coordinator dedicated time (Accreditation Council for Graduate Medical Education, Chicago) Available: [https://www.acgme.org/globalassets/pdfs/specialty-specific-requirement-topics/dio-dedicated\_time\_coordinator\_2022.pdf](https://www.acgme.org/globalassets/pdfs/specialty-specific-requirement-topics/dio-dedicated_time_coordinator_2022.pdf), accessed 2021 Feb. 10.
24. Craig SR, Smith HL, Short MW (2012) Results from a transitional-year program director survey: identifying crucial issues and concerns. J Grad Med Educ 4:28–33.
25. Aronica M, Williams R, Dennar PE, et al. (2015) Benchmarks for support and outcomes for internal medicine-pediatrics residency programs: a 5-year review. J Grad Med Educ 7:574–9.
26. De Oliveira GS Jr., Almeida MD, Ahmad S, et al. (2011) Anesthesiology residency program director burnout. J Clin Anesth 23:176–82.
27. Appelbaum NP, Lee N, Amendola M, et al. (2019) Surgical resident burnout and job satisfaction: the role of workplace climate and perceived support. J Surg Res 234:20–5.
28. Zhang H, Isaac A, Wright ED, et al. (2017) Formal mentorship in a surgical residency training program: a prospective interventional study. J Otolaryngol Head Neck Surg 46:13.
29. Soleas E, Dagnone D, Stockley D, et al. (2020) Developing Academic Advisors and Competence Committees members: a community approach to developing CBME faculty leaders. Can Med Educ J 11:e46–56.
30. Rich JV, Fostaty Young S, Donnelly C, et al. (2020) Competency-based education calls for programmatic assessment: But what does this look like in practice? J Eval Clin Pract 26:1087–95.
31. Wolfsthal SD, Beasley BW, Kopelman R, et al., Membership Survey and Scientific Data Committee, The Association of Program Directors of Internal Medicine (2002) Benchmarks of support in internal medicine residency training programs. Acad Med 77:50–6.
32. Carpenter A-M, Tan SA, Costopoulos K, et al. (2018) Gender diversity in general surgery residency leadership. J Surg Educ 75:e68–71.
33. Filiberto AC, Le CB, Loftus TJ, et al. (2019) Gender differences among surgical fellowship program directors. Surgery 166:735–7.

© 2023 CMA Impact Inc. or its licensors