
      Domain-Specific Physical Activity and Mental Health: A Meta-analysis.


          Abstract

          The mental health benefits of physical activity are well established. However, less is known about whether the relationship between physical activity and mental health is consistent across different life domains. It is important to understand how context may influence the relationship between physical activity and mental health so that interventions and policy guidelines can be tailored to maximize positive effects.
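For readers unfamiliar with how domain-specific effects are usually combined, the sketch below illustrates a standard random-effects meta-analysis (DerSimonian-Laird) of correlation coefficients grouped by life domain. It is a generic Python illustration with invented study data; it is not the authors' actual analysis, code, or results.

import math

def pool_random_effects(effects, variances):
    """DerSimonian-Laird random-effects pooling; returns (pooled effect, SE, tau^2)."""
    k = len(effects)
    w = [1.0 / v for v in variances]                        # fixed-effect weights
    y_fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                       # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]           # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se, tau2

# Hypothetical inputs: (correlation with a mental health outcome, sample size)
# per study, grouped by life domain. All numbers are invented for illustration.
studies_by_domain = {
    "leisure-time": [(0.15, 250), (0.22, 410), (0.10, 180)],
    "work":         [(-0.05, 300), (0.02, 520)],
}

for domain, studies in studies_by_domain.items():
    z = [math.atanh(r) for r, _ in studies]                  # Fisher's z transform
    v = [1.0 / (n - 3) for _, n in studies]                  # variance of z
    pooled_z, se, tau2 = pool_random_effects(z, v)
    pooled_r = math.tanh(pooled_z)                           # back-transform to r
    print(f"{domain}: pooled r = {pooled_r:.3f}, tau^2 = {tau2:.3f}")

Pooling within each domain separately, as in this sketch, is one simple way to ask whether the physical activity and mental health association differs by context; the pooled r and between-study variance can then be compared across domains.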


                Author and article information

                Journal
                Am J Prev Med (American Journal of Preventive Medicine)
                Publisher: Elsevier BV
                ISSN (print): 0749-3797; ISSN (electronic): 1873-2607
                May 2017, Volume 52, Issue 5

                Affiliations
                [1] Institute for Positive Psychology and Education, Australian Catholic University, Strathfield, New South Wales, Australia.
                [2] Priority Research Centre for Physical Activity and Nutrition, University of Newcastle, Callaghan, New South Wales, Australia.
                [3] Population Wellbeing and Environment Research Lab (PowerLab), School of Health and Society, Faculty of Social Sciences, University of Wollongong, Wollongong, New South Wales, Australia; Early Start Research Institute, Faculty of Social Sciences, University of Wollongong, Wollongong, New South Wales, Australia; Illawarra Health and Medical Research Institute, University of Wollongong, Wollongong, New South Wales, Australia.
                [4] Institute for Positive Psychology and Education, Australian Catholic University, Strathfield, New South Wales, Australia. Electronic address: chris.lonsdale@acu.edu.au.

                Article
                PII: S0749-3797(16)30689-4
                DOI: 10.1016/j.amepre.2016.12.008
                PMID: 28153647
