
      Usability and preference of electronic vs. paper and pencil OSCE checklists by examiners and influence of checklist type on missed ratings in the Swiss Federal Licensing Exam Translated title: Usability und Präferenz von elektronischen OSCE-Checklisten im Vergleich zu papierbasierten Checklisten gemäss Prüfenden und Einfluss des Checklisten-Typs auf fehlende Bewertungen in der Eidgenössischen Prüfung

      research-article


          Abstract

          Background:

          Only a few studies with small sample sizes have compared electronic Objective Structured Clinical Examination (OSCE) rating checklists with traditional paper-based OSCE rating checklists. In this study, the examiner-perceived usability and preference for type of OSCE checklist (electronic vs. paper based) were compared, and the influence of OSCE checklist type on missed ratings was determined, for the Swiss Federal Licensing Examination in clinical skills for human medicine.

          Methods:

          All examiners in the Swiss Federal Licensing Examination in clinical skills for human medicine were invited, in two consecutive years, to evaluate the OSCE checklist type they had worked with during the examination. The evaluation used a questionnaire with 14 closed questions (demographics, checklist-type experience, perceived usability, and checklist-type preference). In addition, the number of missed ratings on the paper-based checklists was recorded.

          Results:

          The data from the examiners (n=377) with experience of both OSCE checklist types were analyzed. The electronic OSCE checklist was rated significantly higher on all usability aspects (ease of use, candidate rating and error correction, clarity, distraction while using the checklist, overall satisfaction), except for the speed of registering comments, where there was no significant difference. In both years, the majority of examiners (2014: 54.5%, n=60; 2015: 89.8%, n=230) reported a preference for working with the electronic OSCE checklist in the future. Missed ratings occurred on 14.2% of the paper-based OSCE checklists; the electronic OSCE checklists prevented such missed ratings.

          Conclusions:

          Electronic OSCE checklists were rated as significantly more user-friendly than paper-based OSCE checklists and were preferred by a broad national sample of examiners, supporting previous results from faculty-level examinations. Furthermore, missed ratings were prevented with the electronic OSCE checklists. Overall, the use of electronic OSCE checklists is therefore advisable.
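          The prevention of missed ratings follows from a simple design property: an electronic form can refuse submission while any checklist item is still unrated, which a paper form cannot enforce. A minimal illustrative sketch of this validation step (hypothetical code, not the actual exam software; item names are invented):

```python
class ChecklistError(Exception):
    """Raised when a checklist is submitted with unrated items."""
    pass

def submit_checklist(ratings):
    """Accept a checklist only if every item has been rated.

    ratings maps item name -> rating, with None meaning "not yet rated".
    """
    missing = [item for item, score in ratings.items() if score is None]
    if missing:
        raise ChecklistError("Unrated items: " + ", ".join(missing))
    return ratings

# A paper checklist can be handed in with gaps; the electronic form
# catches the gap before submission:
complete = submit_checklist({"anamnesis": 2, "physical_exam": 1})

try:
    submit_checklist({"anamnesis": 2, "physical_exam": None})
except ChecklistError as err:
    print(err)  # the unrated item is reported to the examiner
```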



                Author and article information

                Journal
                GMS Journal for Medical Education (GMS J Med Educ)
                German Medical Science GMS Publishing House
                ISSN 2366-5017
                14 April 2022
                2022, Volume 39, Issue 2: Doc24
                Affiliations
                [1 ]University of Bern, Institute for Medical Education, Department for Assessment and Evaluation, Bern, Switzerland
                [2 ]University of Bern, Institute for Medical Education, Department for Software Development, Usability Consulting and IT Infrastructure, Bern, Switzerland
                [3 ]University of Bern, Institute for Medical Education, Bern, Switzerland
                Author notes
                *To whom correspondence should be addressed: Felicitas L. Wagner, University of Bern, Institute for Medical Education, Department for Assessment and Evaluation, Mittelstr. 43, CH-3012 Bern, Switzerland, E-mail: felicitas.wagner@iml.unibe.ch
                Article
                zma001545 Doc24 urn:nbn:de:0183-zma0015453
                DOI: 10.3205/zma001545
                PMC: 9174065
                PMID: 35692359
                Copyright © 2022 Wagner et al.

                This is an Open Access article distributed under the terms of the Creative Commons Attribution 4.0 License. See license information at http://creativecommons.org/licenses/by/4.0/.

                History
                13 January 2021
                09 February 2022
                28 January 2022
                Categories
                Article

                Keywords
                osce, checklists, electronic, usability, evaluation, national
