
      Application of the Team Emergency Assessment Measure Scale in undergraduate medical students and interprofessional clinical teams: validity evidence of a Spanish version applied in Chile

      research-article


          Abstract

          Background

Teamwork is one of the competencies physicians need to work effectively in health systems, and it can be developed through simulation in both professionals and medical students. The Team Emergency Assessment Measure (TEAM) was created to evaluate the non-technical performance of team members during resuscitation events in real teams. The TEAM scale includes items assessing leadership, teamwork, situational awareness, and task management. An objective evaluation tool in Spanish is valuable for training health professionals at all undergraduate and continuing-education levels. This study aimed to generate validity evidence for a Spanish version of the TEAM scale for measuring the performance of medical students and of adult, pediatric, and obstetric emergency clinical teams in simulated emergencies, including its use as a self-assessment tool.

          Methods

To develop the Spanish version of the instrument, a forward and backward translation was carried out by independent translators, native and fluent in English and Spanish, followed by a review of semantic and cultural equivalence by a panel of Chilean experts comprising three trained simulation instructors. High-fidelity simulations with debriefing were conducted with 5th-year medical students, in which both students and instructors applied the Spanish version of the TEAM scale. In a second stage, adult, pediatric, and obstetric emergency management simulations were conducted with real clinical teams, using the TEAM scale as a self-assessment tool.

          Findings

Applying the overall TEAM scale to medical students and clinical teams yielded a Cronbach's alpha of 0.921. For the medical students' self-assessment, Cronbach's alpha was 0.869. No significant differences were found between the overall scores, or the scores by dimension, assigned by instructors and by students (p > 0.05). For clinical team training, Cronbach's alpha was 0.755 for adult emergency teams, 0.797 for pediatric emergency teams, and 0.853 for obstetric emergency teams.
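The internal-consistency figures reported above are Cronbach's alpha values. As a minimal sketch of how that statistic is computed (the ratings below are hypothetical, not the study's data; rows are respondents, columns are scale items):

```python
# Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance(total scores)),
# using sample variances (ddof = 1) throughout.

def cronbach_alpha(scores):
    """scores: list of rows, each row one respondent's item ratings."""
    k = len(scores[0])  # number of items

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [var([row[i] for row in scores]) for i in range(k)]
    total_var = var([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Hypothetical ratings on a 4-item subset, five respondents:
ratings = [
    [3, 4, 3, 4],
    [2, 3, 3, 3],
    [4, 4, 4, 4],
    [3, 3, 2, 3],
    [2, 2, 3, 2],
]
print(round(cronbach_alpha(ratings), 3))  # → 0.883
```

An alpha near or above 0.9, as obtained for the overall scale here, is conventionally read as high internal consistency.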

          Conclusion

          The validated instrument is adequate for evaluating teamwork in medical student simulations by instructors and peers and for self-assessment in adult, pediatric, and obstetric emergency clinical teams.

          Related collections

Most cited references (23)


          COSMIN Risk of Bias checklist for systematic reviews of Patient-Reported Outcome Measures

Purpose

The original COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist was developed to assess the methodological quality of single studies on measurement properties of Patient-Reported Outcome Measures (PROMs). Now it is our aim to adapt the COSMIN checklist and its four-point rating system into a version exclusively for use in systematic reviews of PROMs, aiming to assess risk of bias of studies on measurement properties.

Methods

For each standard (i.e., a design requirement or preferred statistical method), it was discussed within the COSMIN steering committee if and how it should be adapted. The adapted checklist was pilot-tested to strengthen content validity in a systematic review on the quality of PROMs for patients with hand osteoarthritis.

Results

Most important changes were the reordering of the measurement properties to be assessed in a systematic review of PROMs; the deletion of standards that concerned reporting issues and standards that do not necessarily lead to biased results; the integration of standards on general requirements for studies on item response theory with standards for specific measurement properties; the recommendation to the review team to specify hypotheses for construct validity and responsiveness in advance, and subsequently the removal of the standards about formulating hypotheses; and the change in the labels of the four-point rating system.

Conclusions

The COSMIN Risk of Bias checklist was developed exclusively for use in systematic reviews of PROMs to distinguish this application from other purposes of assessing the methodological quality of studies on measurement properties, such as guidance for designing or reporting a study on the measurement properties. The online version of this article (10.1007/s11136-017-1765-4) contains supplementary material, which is available to authorized users.

            STARD 2015 guidelines for reporting diagnostic accuracy studies: explanation and elaboration

            Diagnostic accuracy studies are, like other clinical studies, at risk of bias due to shortcomings in design and conduct, and the results of a diagnostic accuracy study may not apply to other patient groups and settings. Readers of study reports need to be informed about study design and conduct, in sufficient detail to judge the trustworthiness and applicability of the study findings. The STARD statement (Standards for Reporting of Diagnostic Accuracy Studies) was developed to improve the completeness and transparency of reports of diagnostic accuracy studies. STARD contains a list of essential items that can be used as a checklist, by authors, reviewers and other readers, to ensure that a report of a diagnostic accuracy study contains the necessary information. STARD was recently updated. All updated STARD materials, including the checklist, are available at http://www.equator-network.org/reporting-guidelines/stard. Here, we present the STARD 2015 explanation and elaboration document. Through commented examples of appropriate reporting, we clarify the rationale for each of the 30 items on the STARD 2015 checklist, and describe what is expected from authors in developing sufficiently informative study reports.

              Defining and assessing professional competence.

              Current assessment formats for physicians and trainees reliably test core knowledge and basic skills. However, they may underemphasize some important domains of professional medical practice, including interpersonal skills, lifelong learning, professionalism, and integration of core knowledge into clinical practice. To propose a definition of professional competence, to review current means for assessing it, and to suggest new approaches to assessment. We searched the MEDLINE database from 1966 to 2001 and reference lists of relevant articles for English-language studies of reliability or validity of measures of competence of physicians, medical students, and residents. We excluded articles of a purely descriptive nature, duplicate reports, reviews, and opinions and position statements, which yielded 195 relevant citations. Data were abstracted by 1 of us (R.M.E.). Quality criteria for inclusion were broad, given the heterogeneity of interventions, complexity of outcome measures, and paucity of randomized or longitudinal study designs. We generated an inclusive definition of competence: the habitual and judicious use of communication, knowledge, technical skills, clinical reasoning, emotions, values, and reflection in daily practice for the benefit of the individual and the community being served. Aside from protecting the public and limiting access to advanced training, assessments should foster habits of learning and self-reflection and drive institutional change. Subjective, multiple-choice, and standardized patient assessments, although reliable, underemphasize important domains of professional competence: integration of knowledge and skills, context of care, information management, teamwork, health systems, and patient-physician relationships. Few assessments observe trainees in real-life situations, incorporate the perspectives of peers and patients, or use measures that predict clinical outcomes. 
In addition to assessments of basic skills, new formats that assess clinical reasoning, expert judgment, management of ambiguity, professionalism, time management, learning strategies, and teamwork promise a multidimensional assessment while maintaining adequate reliability and validity. Institutional support, reflection, and mentoring must accompany the development of assessment programs.

                Author and article information

                Contributors
URI: http://loop.frontiersin.org/people/2090199/overview
URI: http://loop.frontiersin.org/people/739177/overview
Journal
Frontiers in Medicine (Front Med (Lausanne); abbrev. Front. Med.)
Publisher: Frontiers Media S.A.
ISSN: 2296-858X
Published: 13 September 2023
Volume: 10
Article: 1256982
                Affiliations
                [1] 1Escuela de Medicina, Facultad de Medicina Clínica Alemana Universidad del Desarrollo , Santiago, Chile
                [2] 2Unidad de Calidad y Seguridad del Paciente, Hospital Padre Hurtado , Santiago, Chile
                [3] 3Departamento de Desarrollo de las Personas, Hospital Padre Hurtado , Santiago, Chile
                [4] 4Centro de Habilidades Clínicas, Facultad de Medicina, Universidad de Chile , Santiago, Chile
                [5] 5Centro de Habilidades Clínicas y Disciplinares, Universidad de O'Higgins , Rancagua, Chile
                Author notes

                Edited by: Sebastian Schnaubelt, Medical University of Vienna, Austria

                Reviewed by: Christoph Veigl, Medical University of Vienna, Austria; Andrea Kornfehl, Medical University of Vienna, Austria

Article
DOI: 10.3389/fmed.2023.1256982
PMCID: 10525305
PMID: 37771978
7e59b88c-6514-416d-9aed-8182e659980a
                Copyright © 2023 Armijo-Rivera, Ferrada-Rivera, Aliaga-Toledo and Pérez.

                This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

History
Received: 11 July 2023
Accepted: 23 August 2023
                Page count
                Figures: 3, Tables: 1, Equations: 0, References: 24, Pages: 7, Words: 4058
                Funding
                This study was supported with internal funding of Universidad del Desarrollo.
                Categories
                Medicine
                Original Research
                Custom metadata
                Healthcare Professions Education

teamwork, leadership, interprofessional simulation, emergency, medical education
