      Evaluation of objective tools and artificial intelligence in robotic surgery technical skills assessment: a systematic review

      Review article


          Abstract

          Background

          There is a need to standardize training in robotic surgery, including objective assessment for accreditation. This systematic review aimed to identify objective tools for technical skills assessment, providing evaluation statuses to guide research and inform implementation into training curricula.

          Methods

          A systematic literature search was conducted in accordance with the PRISMA guidelines. Ovid Embase/Medline, PubMed and Web of Science were searched. Inclusion criterion: robotic surgery technical skills tools. Exclusion criteria: non-technical, laparoscopy or open skills only. Manual tools and automated performance metrics (APMs) were analysed using Messick's concept of validity and the Oxford Centre of Evidence-Based Medicine (OCEBM) Levels of Evidence and Recommendation (LoR). A bespoke tool analysed artificial intelligence (AI) studies. The Modified Downs–Black checklist was used to assess risk of bias.

          Results

          Two hundred and forty-seven studies were analysed, identifying: 8 global rating scales, 26 procedure-/task-specific tools, 3 main error-based methods, 10 simulators, 28 studies analysing APMs and 53 AI studies. The Global Evaluative Assessment of Robotic Skills and the da Vinci Skills Simulator were the most evaluated tools, at LoR 1 (OCEBM). Three procedure-specific tools, 3 error-based methods and 1 non-simulator APM reached LoR 2. AI models estimated skill or clinical outcomes, with higher accuracy in the laboratory, where 60 per cent of methods reported accuracies over 90 per cent, than in real surgery, where accuracies ranged from 67 to 100 per cent.

          Conclusions

          Manual and automated assessment tools for robotic surgery are not well validated and require further evaluation before use in accreditation processes.

          PROSPERO: registration ID CRD42022304901

          Abstract

          This systematic review provides a comprehensive evaluation of current objective manual, automated and artificial intelligence (AI) methods used in robotic technical skills assessment. Many tools lack full evaluation, and AI-based assessment remains at a conceptual stage.


          Most cited references (260)


          Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement

          David Moher and colleagues introduce PRISMA, an update of the QUOROM guidelines for reporting systematic reviews and meta-analyses

            The feasibility of creating a checklist for the assessment of the methodological quality both of randomised and non-randomised studies of health care interventions.

            To test the feasibility of creating a valid and reliable checklist with the following features: appropriate for assessing both randomised and non-randomised studies; provision of both an overall score for study quality and a profile of scores not only for the quality of reporting, internal validity (bias and confounding) and power, but also for external validity.

            A pilot version was first developed, based on epidemiological principles, reviews, and existing checklists for randomised studies. Face and content validity were assessed by three experienced reviewers and reliability was determined using two raters assessing 10 randomised and 10 non-randomised studies. Using different raters, the checklist was revised and tested for internal consistency (Kuder-Richardson 20), test-retest and inter-rater reliability (Spearman correlation coefficient and sign rank test; kappa statistics), criterion validity, and respondent burden.

            The performance of the checklist improved considerably after revision of a pilot version. The Quality Index had high internal consistency (KR-20: 0.89), as did the subscales apart from external validity (KR-20: 0.54). Test-retest (r = 0.88) and inter-rater (r = 0.75) reliability of the Quality Index were good. Reliability of the subscales varied from good (bias) to poor (external validity). The Quality Index correlated highly with an existing, established instrument for assessing randomised studies (r = 0.90). There was little difference between its performance with non-randomised and with randomised studies. Raters took about 20 minutes to assess each paper (range 10 to 45 minutes).

            This study has shown that it is feasible to develop a checklist that can be used to assess the methodological quality not only of randomised controlled trials but also non-randomised studies. It has also shown that it is possible to produce a checklist that provides a profile of the paper, alerting reviewers to its particular methodological strengths and weaknesses. Further work is required to improve the checklist and the training of raters in the assessment of external validity.
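            For readers unfamiliar with the statistic cited above, the Kuder-Richardson Formula 20 (KR-20) is the standard internal-consistency coefficient for dichotomously scored items. A minimal sketch of its textbook form follows; the symbols are generic definitions and are not notation or values taken from the paper itself:

            \mathrm{KR\text{-}20} = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} p_i (1 - p_i)}{\sigma_X^{2}}\right)

            where k is the number of items, p_i is the proportion of studies scoring 1 on item i, and \sigma_X^2 is the variance of total scores. Values close to 1, such as the 0.89 reported for the Quality Index, indicate that items vary together consistently.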

              Appraising the quality of medical education research methods: the Medical Education Research Study Quality Instrument and the Newcastle-Ottawa Scale-Education.

              The Medical Education Research Study Quality Instrument (MERSQI) and the Newcastle-Ottawa Scale-Education (NOS-E) were developed to appraise methodological quality in medical education research. The study objective was to evaluate the interrater reliability, normative scores, and between-instrument correlation for these two instruments.

                Author and article information

                Journal
                Br J Surg (The British Journal of Surgery)
                Publisher: Oxford University Press (US)
                ISSN: 0007-1323 (print); 1365-2168 (online)
                Published online: 10 November 2023; issue: January 2024
                Volume 111, Issue 1, article znad331
                Affiliations
                The Griffin Institute, Northwick Park & St Mark's Hospital, London, UK
                Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), London, UK
                Division of Surgery and Interventional Science, Research Department of Targeted Intervention, UCL, London, UK
                Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), London, UK
                Medical Physics and Biomedical Engineering, UCL, London, UK
                The Griffin Institute, Northwick Park & St Mark's Hospital, London, UK
                Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), London, UK
                The Griffin Institute, Northwick Park & St Mark's Hospital, London, UK
                Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), London, UK
                Medical Physics and Biomedical Engineering, UCL, London, UK
                Department of General Surgery, Dorset County Hospital NHS Foundation Trust, Dorchester, UK
                Division of Surgery and Interventional Science, Research Department of Targeted Intervention, UCL, London, UK
                University College London Hospitals NHS Foundation Trust, London, UK
                Division of Surgery and Interventional Science, Research Department of Targeted Intervention, UCL, London, UK
                University College London Hospitals NHS Foundation Trust, London, UK
                Division of Surgery and Interventional Science, Research Department of Targeted Intervention, UCL, London, UK
                University College London Hospitals NHS Foundation Trust, London, UK
                Wellcome/EPSRC Centre for Interventional and Surgical Sciences (WEISS), University College London (UCL), London, UK
                Computer Science, UCL, London, UK
                The Griffin Institute, Northwick Park & St Mark's Hospital, London, UK
                Division of Surgery and Interventional Science, Research Department of Targeted Intervention, UCL, London, UK
                Yeovil District Hospital, Somerset Foundation NHS Trust, Yeovil, Somerset, UK
                Author notes
                Correspondence to: Nader Francis, The Griffin Institute, Y Block, Northwick Park & St Mark's Hospital, London HA1 3UJ, UK (e-mail: n.francis@griffininstitute.org.uk)
                Author information
                https://orcid.org/0000-0002-7288-3354
                https://orcid.org/0000-0001-5655-6823
                https://orcid.org/0000-0002-0980-3227
                https://orcid.org/0000-0001-8498-9175
                Article
                znad331
                DOI: 10.1093/bjs/znad331
                PMCID: PMC10771126
                PMID: 37951600
                © The Author(s) 2023. Published by Oxford University Press on behalf of BJS Society Ltd.

                This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted reuse, distribution, and reproduction in any medium, provided the original work is properly cited.

                History
                11 July 2023
                18 September 2023
                19 September 2023
                Page count
                Pages: 20
                Categories
                Systematic Review

                Surgery
