
      Sparse Multivariate Regression With Covariance Estimation.

Journal of Computational and Graphical Statistics: a joint publication of the American Statistical Association, the Institute of Mathematical Statistics, and the Interface Foundation of North America
      Informa UK Ltd.
      Sparsity, Multiple output regression, Lasso, High dimension low sample size


          Abstract

We propose a procedure for constructing a sparse estimator of a multivariate regression coefficient matrix that accounts for correlation of the response variables. This method, which we call multivariate regression with covariance estimation (MRCE), involves penalized likelihood with simultaneous estimation of the regression coefficients and the covariance structure. An efficient optimization algorithm and a fast approximation are developed for computing MRCE. Using simulation studies, we show that the proposed method outperforms relevant competitors when the responses are highly correlated. We also apply the new method to a finance example on predicting asset returns. An R package containing this dataset and code for computing MRCE and its approximation is available online.
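The penalized likelihood described above combines a Gaussian fit term with sparsity penalties on both the coefficient matrix and the inverse error covariance. The following is a minimal numerical sketch of such an objective in our own notation (B is the coefficient matrix, Omega the inverse covariance; the exact penalty placement, e.g. penalizing only the off-diagonal of Omega, and the tuning parameters lam1 and lam2 are assumptions, not the authors' precise formulation):

```python
import numpy as np

def mrce_objective(Y, X, B, Omega, lam1, lam2):
    """Penalized negative Gaussian log-likelihood (up to constants).

    Y: (n, q) responses, X: (n, p) predictors,
    B: (p, q) regression coefficients, Omega: (q, q) inverse covariance.
    """
    n = Y.shape[0]
    R = Y - X @ B                              # residual matrix
    fit = np.trace(R.T @ R @ Omega) / n        # Gaussian fit term
    _, logdet = np.linalg.slogdet(Omega)       # numerically stable log-det
    off_diag = Omega - np.diag(np.diag(Omega))
    return (fit - logdet
            + lam1 * np.abs(off_diag).sum()    # sparsity in Omega
            + lam2 * np.abs(B).sum())          # sparsity in B
```

With Omega set to the identity and both penalties zero, this reduces to the residual sum of squares divided by n, which makes the implementation easy to sanity-check.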


Most cited references (19)


          Sparse inverse covariance estimation with the graphical lasso.

We consider the problem of estimating sparse graphs by a lasso penalty applied to the inverse covariance matrix. Using a coordinate descent procedure for the lasso, we develop a simple algorithm, the graphical lasso, that is remarkably fast: it solves a 1000-node problem (approximately 500,000 parameters) in at most a minute and is 30 to 4000 times faster than competing methods. It also provides a conceptual link between the exact problem and the approximation suggested by Meinshausen and Bühlmann (2006). We illustrate the method on some cell-signaling data from proteomics.
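The Meinshausen and Bühlmann approximation mentioned in that abstract can be illustrated with a short sketch: lasso-regress each variable on all the others and declare an edge wherever a coefficient is nonzero. This is not the graphical lasso itself, only the neighborhood-selection idea it is linked to; the bare-bones coordinate-descent lasso below, its sweep count, and the symmetrization rule are our own assumptions (columns of X are assumed non-degenerate):

```python
import numpy as np

def lasso_cd(X, y, lam, n_sweeps=100):
    """Coordinate-descent lasso: min (1/2n)||y - Xb||^2 + lam*||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_norm = (X * X).sum(axis=0) / n          # per-coordinate curvature
    for _ in range(n_sweeps):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]      # partial residual excluding j
            z = X[:, j] @ r / n
            b[j] = np.sign(z) * max(abs(z) - lam, 0.0) / col_norm[j]
    return b

def neighborhood_edges(X, lam):
    """Estimate graph edges by lasso-regressing each node on the rest."""
    n, p = X.shape
    adj = np.zeros((p, p), dtype=bool)
    for j in range(p):
        others = np.delete(np.arange(p), j)
        b = lasso_cd(X[:, others], X[:, j], lam)
        adj[j, others] = b != 0
    return adj | adj.T                          # "OR" rule to symmetrize
```

For a large enough penalty every coefficient is soft-thresholded to zero and the estimated graph is empty, which gives a quick correctness check.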

            Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties


              Model selection and estimation in the Gaussian graphical model

M. Yuan and Y. Lin (2007)

                Author and article information

Journal
PMID: 24963268
PMCID: 4065863
DOI: 10.1198/jcgs.2010.09188

