
      Staffing Patterns of Non-ACGME Fellowships with 4-Year Residency Programs: A National Survey

      Research Article


          Abstract

          Introduction

          Emergency medicine (EM) is one of the few specialties with variable training lengths. Hiring a three-year graduate to continue fellowship training in a department that supports a four-year residency program can lead to conflicts around resident supervision. We sought to understand the hiring and clinical supervision, or staffing, patterns of non-Accreditation Council for Graduate Medical Education (ACGME) fellowships hosted at institutions supporting four-year residency programs.

          Methods

          We performed a web-based, cross-sectional survey of non-ACGME fellowship directors (FD) at institutions supporting four-year EM residency programs and calculated descriptive statistics. Our primary outcome was the proportion of programs with four-year EM residencies that hire non-ACGME fellows graduating from three-year EM residencies.

          Results

          Of 119 eligible FDs, 88 (74%) completed the survey. Seventy of the 88 respondents (80%) indicated that they hire graduates of three-year residencies, and 56 of those 70 (80%) indicated that three-year graduates supervise residents. Most FDs (74%) indicated that no additional requirements exist for supervising residents beyond being hired as faculty. FDs who do not hire three-year graduates cited department policy, concerns about the quality and length of training, and resident complaints as reasons; a majority of these FDs (10/18, 56%) noted that not hiring fellows from three-year programs negatively impacts recruitment and limits them to a smaller applicant pool.

          Conclusion

          Most non-ACGME fellowships at institutions with four-year EM programs recruit three-year graduates and allow them to supervise residents. This survey provides programs with information on how comparable fellowships recruit and staff their departments, which may inform policies that fit the needs of their learners, the fellowship, and the department.
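          For readers who want to sanity-check the descriptive statistics above, the short Python sketch below reproduces the proportions reported in the Results and adds Wilson 95% confidence intervals for context. The abstract reports only the point proportions; the interval calculation, the confidence level, and the grouping labels are illustrative assumptions, not figures from the paper.

from math import sqrt

def wilson_ci(k, n, z=1.96):
    # Wilson score interval for a binomial proportion k/n (z = 1.96 for ~95%).
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Counts taken from the Results section: (numerator, denominator).
results = {
    "FDs who completed the survey": (88, 119),
    "Responding FDs who hire three-year graduates": (70, 88),
    "Hiring programs whose three-year graduates supervise residents": (56, 70),
    "Non-hiring FDs reporting a negative recruitment impact": (10, 18),
}

for label, (k, n) in results.items():
    lo, hi = wilson_ci(k, n)
    print(f"{label}: {k}/{n} = {100 * k / n:.0f}% (95% CI {100 * lo:.0f}-{100 * hi:.0f}%)")

          Running this reproduces the 74%, 80%, 80%, and 56% figures quoted in the Results; the intervals simply convey how imprecise proportions estimated from 18 to 119 respondents are.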


                Author and article information

                Journal
                Western Journal of Emergency Medicine (West J Emerg Med; WestJEM)
                Publisher: Department of Emergency Medicine, University of California, Irvine School of Medicine
                ISSN: 1936-900X (print); 1936-9018 (electronic)
                Published: 28 February 2024 (online); 01 March 2024 (issue)
                Volume 25, Issue 2: 175-180
                Affiliations
                [*] University of Michigan, Department of Emergency Medicine, Ann Arbor, Michigan
                [†] University of Colorado, Department of Emergency Medicine, Aurora, Colorado
                Author notes
                Address for Correspondence: David A. Haidar, MD, University of Michigan, Department of Emergency Medicine, 1500 E. Medical Center Dr., B1-380, Ann Arbor, MI 48109. Email: dahaidar@med.umich.edu
                Article
                DOI: 10.5811/westjem.18454
                PMCID: PMC11000558
                © 2024 Haidar et al.

                This is an open access article distributed in accordance with the terms of the Creative Commons Attribution (CC BY 4.0) License. See: http://creativecommons.org/licenses/by/4.0/

                History
                Received: 13 September 2023
                Revision received: 10 November 2023
                Accepted: 07 December 2023
                Categories
                Education
                Education Special Issue: Brief Research Report

                Emergency medicine & Trauma
