
      Regression analysis for continuous independent variables in medical research: statistical standard and guideline of Life Cycle Committee

Life Cycle



Most cited references (17)


          Statistics corner: A guide to appropriate use of correlation coefficient in medical research.

          M M Mukaka (2012)
Correlation is a statistical method used to assess a possible linear association between two continuous variables. It is simple both to calculate and to interpret. However, misuse of correlation is so common among researchers that some statisticians have wished that the method had never been devised at all. The aim of this article is to provide a guide to appropriate use of correlation in medical research and to highlight some misuses. Examples of applications of the correlation coefficient are provided using data from statistical simulations as well as real data. A rule of thumb for interpreting the size of a correlation coefficient is also provided.
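As a minimal sketch of the calculation this abstract describes, the snippet below estimates a Pearson correlation coefficient with NumPy. The paired measurements are hypothetical illustration data, not values from the article.

```python
import numpy as np

# Hypothetical paired measurements on the same subjects
# (e.g., weight in kg and systolic blood pressure in mmHg)
x = np.array([60.0, 72.0, 55.0, 80.0, 68.0, 90.0, 75.0, 62.0])
y = np.array([118.0, 130.0, 112.0, 141.0, 127.0, 152.0, 135.0, 121.0])

# Pearson correlation: covariance scaled by both standard deviations,
# so r always lies in [-1, 1]
r = np.corrcoef(x, y)[0, 1]
print(round(r, 3))
```

A coefficient this close to 1 only says the relationship is strongly linear; as the abstract stresses, it says nothing about agreement, causation, or non-linear association.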

            Measuring agreement in method comparison studies

Agreement between two methods of clinical measurement can be quantified using the differences between observations made using the two methods on the same subjects. The 95% limits of agreement, estimated by the mean difference ± 1.96 standard deviations of the differences, provide an interval within which 95% of differences between measurements by the two methods are expected to lie. We describe how graphical methods can be used to investigate the assumptions of the method and we also give confidence intervals. We extend the basic approach to data where there is a relationship between difference and magnitude, both with a simple logarithmic transformation approach and a new, more general, regression approach. We discuss the importance of the repeatability of each method separately and compare an estimate of this to the limits of agreement. We extend the limits of agreement approach to data with repeated measurements, proposing new estimates for equal numbers of replicates by each method on each subject, for unequal numbers of replicates, and for replicated data collected in pairs, where the underlying value of the quantity being measured is changing. Finally, we describe a nonparametric approach to comparing methods.
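The basic limits-of-agreement calculation described above fits in a few lines. The two arrays of paired readings are invented illustration data, and the 1.96 multiplier follows the definition in the abstract (mean difference ± 1.96 SD of the differences).

```python
import numpy as np

# Hypothetical paired readings of the same quantity by two methods
a = np.array([100.0, 105.0, 98.0, 110.0, 102.0, 95.0, 108.0, 101.0])  # method A
b = np.array([102.0, 103.0, 99.0, 113.0, 100.0, 96.0, 110.0, 103.0])  # method B

d = a - b                     # per-subject differences
bias = d.mean()               # mean difference (systematic bias)
sd = d.std(ddof=1)            # sample SD of the differences
lower = bias - 1.96 * sd      # 95% limits of agreement
upper = bias + 1.96 * sd
print(f"bias={bias:.2f}, LoA=({lower:.2f}, {upper:.2f})")
```

In a Bland–Altman plot these three values become horizontal lines over a scatter of differences against the pairwise means, which is how the paper's assumption checks (constant bias, constant spread) are inspected visually.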

              Multicollinearity and misleading statistical results

              Jong Kim (2019)
Multicollinearity represents a high degree of linear intercorrelation between explanatory variables in a multiple regression model and leads to incorrect results of regression analyses. Diagnostic tools for multicollinearity include the variance inflation factor (VIF), the condition index and condition number, and the variance decomposition proportion (VDP). Multicollinearity can be expressed by the coefficient of determination (Rh²) of a multiple regression model with one explanatory variable (Xh) as the model's response variable and the others (Xi, i ≠ h) as its explanatory variables. The variance (σh²) of each regression coefficient in the final regression model is proportional to the VIF, defined as 1 / (1 − Rh²). Hence, an increase in Rh² (strong multicollinearity) increases σh². A larger σh² produces unreliable probability values and confidence intervals for the regression coefficients. The square root of the ratio of the maximum eigenvalue to each eigenvalue of the correlation matrix of the standardized explanatory variables is referred to as the condition index; the condition number is the maximum condition index. Multicollinearity is present when the VIF is higher than 5 to 10 or the condition indices are higher than 10 to 30. However, these measures cannot identify which explanatory variables are multicollinear. VDPs obtained from the eigenvectors can identify the multicollinear variables by showing the extent of the inflation of σh² associated with each condition index. When two or more VDPs corresponding to a common condition index higher than 10 to 30 exceed 0.8 to 0.9, their associated explanatory variables are multicollinear. Excluding multicollinear explanatory variables leads to statistically stable multiple regression models.
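A minimal sketch of the VIF and condition-index diagnostics described above, implemented with ordinary least squares in NumPy. The simulated predictors (x2 constructed to be nearly collinear with x1, x3 independent) are illustrative assumptions, not data from the article.

```python
import numpy as np

def vif(X):
    """VIF for each column h of X: regress Xh on the other columns
    (plus an intercept) and return 1 / (1 - Rh^2)."""
    n, p = X.shape
    out = []
    for h in range(p):
        y = X[:, h]
        Z = np.column_stack([np.ones(n), np.delete(X, h, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        r2 = 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = x1 + 0.05 * rng.normal(size=200)   # nearly collinear with x1
x3 = rng.normal(size=200)               # unrelated predictor
X = np.column_stack([x1, x2, x3])

print(vif(X))  # x1 and x2 far above the 5-10 threshold; x3 near 1

# Condition indices: sqrt(max eigenvalue / each eigenvalue)
# of the correlation matrix of the explanatory variables
R = np.corrcoef(X, rowvar=False)
eigvals = np.linalg.eigvalsh(R)
print(np.sqrt(eigvals.max() / eigvals))  # condition number = the largest
```

The VIF flags that *some* inflation is present; locating *which* variables share a near-dependency is what the VDPs (not sketched here) add on top of the condition indices.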

                Author and article information

Journal: Life Cycle
ISSN: 2799-8894
February 19 2022
                : 2
                Article
                10.54724/lc.2022.e3
                © 2022

                http://creativecommons.org/licenses/by-nc/4.0
