      A simple imputation algorithm reduced missing data in SF-12 health surveys.

      Journal of Clinical Epidemiology
      Keywords: Algorithms; Data Interpretation, Statistical; Health Surveys; Humans; Questionnaires; Selection Bias


          Abstract

          The SF-12 Health Survey is a 12-item questionnaire that yields two summary scores (physical and mental health). Neither score can be computed when an item is missing. We explored imputation methods for missing scores for this instrument. Using data from a population-based survey, we tested several ways of imputing simulated missing data. Among 1250 participants, 118 (9.6%) had at least one missing SF-12 item. Missing data were more common among women, older respondents, non-Swiss nationals, and health service users. Among the 1132 respondents with complete data, replacement of any item with the mean population item weight yielded good results: the mean correlation between imputed and true score was 0.979 for both the physical and mental score. Results remained satisfactory when up to three of the six key items for each score (items that contribute predominantly to a given score), and any number of non-key items, were replaced by the mean. Application of this imputation algorithm to the original survey reduced the proportion of missing scores to <1%. Respondents with incomplete surveys, hence imputed scores, had lower scores than respondents with complete data (physical score: 44.9 vs. 49.8, p < 0.001; mental score: 44.4 vs. 46.3, p = 0.064). A simple imputation algorithm can substantially reduce the proportion of missing scores for the SF-12 health survey, and consequently reduce non-response bias.
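
          The imputation rule described in the abstract is simple enough to sketch. The following Python (pandas) snippet is a minimal illustration, not the authors' implementation: the item names, the assignment of six "key" items to each summary score, and the use of plain per-item means in place of the paper's "mean population item weight" are placeholder assumptions; only the rule that a score stays computable when at most three of its six key items (and any number of non-key items) are imputed comes from the abstract.

```python
import numpy as np
import pandas as pd

# Illustrative item labels; the split into "key" items per summary score is a
# placeholder, since the abstract only states that each score has six key items.
ALL_ITEMS = [f"item{i}" for i in range(1, 13)]
KEY_ITEMS = {
    "physical": ["item1", "item2", "item3", "item4", "item5", "item8"],     # assumed
    "mental":   ["item6", "item7", "item9", "item10", "item11", "item12"],  # assumed
}
MAX_MISSING_KEY_ITEMS = 3  # threshold reported in the abstract


def impute_sf12_items(responses: pd.DataFrame) -> tuple[pd.DataFrame, pd.DataFrame]:
    """Mean-impute missing SF-12 items and flag which summary scores remain computable.

    Returns the imputed item matrix and a boolean frame indicating, per respondent,
    whether the physical and mental scores may be computed (i.e., no more than three
    of that score's six key items were missing before imputation).
    """
    # Per-item mean of observed responses, used here as a stand-in for the
    # paper's "mean population item weight".
    item_means = responses[ALL_ITEMS].mean()
    imputed = responses[ALL_ITEMS].fillna(item_means)  # fill every gap with the item mean

    valid = pd.DataFrame(index=responses.index)
    for score, key_items in KEY_ITEMS.items():
        n_missing_key = responses[key_items].isna().sum(axis=1)
        valid[score] = n_missing_key <= MAX_MISSING_KEY_ITEMS
    return imputed, valid


if __name__ == "__main__":
    # Toy example: respondent 1 misses one key item (scores still computable),
    # respondent 2 misses four physical key items (physical score stays missing).
    rng = np.random.default_rng(0)
    data = pd.DataFrame(rng.integers(1, 6, size=(3, 12)).astype(float), columns=ALL_ITEMS)
    data.loc[1, "item3"] = np.nan
    data.loc[2, ["item1", "item2", "item3", "item4"]] = np.nan
    items, valid = impute_sf12_items(data)
    print(valid)
```

          Respondents who exceed the key-item threshold keep a missing score rather than receiving an imputed one, which mirrors the paper's finding that imputed scores remain reliable only when at most three key items per score are replaced.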


          Author and article information

          Journal: Journal of Clinical Epidemiology
          PMID: 15680747
          DOI: 10.1016/j.jclinepi.2004.06.005
