      Is Open Access

      Testing composite hypotheses, Hermite polynomials and optimal estimation of a nonsmooth functional

      Preprint


          Abstract

          A general lower bound is developed for the minimax risk when estimating an arbitrary functional. The bound is based on testing two composite hypotheses and is shown to be effective in estimating the nonsmooth functional \({\frac{1}{n}}\sum|\theta_i|\) from an observation \(Y\sim N(\theta,I_n)\). This problem exhibits some features that are significantly different from those that occur in estimating conventional smooth functionals. This is a setting where standard techniques fail to yield sharp results. A sharp minimax lower bound is established by applying the general lower bound technique based on testing two composite hypotheses. A key step is the construction of two special priors and bounding the chi-square distance between two normal mixtures. An estimator is constructed using approximation theory and Hermite polynomials and is shown to be asymptotically sharp minimax when the means are bounded by a given value \(M\). It is shown that the minimax risk equals \(\beta_*^2M^2({\frac{\log\log n}{\log n}})^2\) asymptotically, where \(\beta_*\) is the Bernstein constant. The general techniques and results developed in the present paper can also be used to solve other related problems.


          Most cited references (2)


          Effect of mean on variance function estimation in nonparametric regression

          Variance function estimation in nonparametric regression is considered and the minimax rate of convergence is derived. We are particularly interested in the effect of the unknown mean on the estimation of the variance function. Our results indicate that, contrary to common practice, it is not desirable to base the estimator of the variance function on the residuals from an optimal estimator of the mean when the mean function is not smooth. Instead, it is more desirable to use estimators of the mean with minimal bias. On the other hand, when the mean function is very smooth, our numerical results show that the residual-based method performs better, but not substantially better, than the first-order-difference-based estimator. In addition, our asymptotic results also correct the optimal rate claimed in Hall and Carroll [J. Roy. Statist. Soc. Ser. B 51 (1989) 3-14].
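The first-order-difference idea mentioned above can be sketched in a few lines. This is a deliberately simplified homoscedastic demo (constant \(\sigma\), a synthetic sine mean, and the sample size are all choices for the illustration, not from the cited paper): successive differences cancel a slowly varying mean almost exactly, so \((y_{i+1}-y_i)^2/2\) estimates the noise variance without ever estimating the mean.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
x = np.linspace(0.0, 1.0, n)
mean_fn = np.sin(4 * np.pi * x)   # smooth mean function (demo choice)
sigma = 0.5                       # constant noise level for the demo
y = mean_fn + sigma * rng.standard_normal(n)

# First-order differences cancel a slowly varying mean almost exactly,
# and E[(y_{i+1} - y_i)^2] = 2*sigma^2 plus a negligible bias term of
# order (mean increment)^2.
sigma2_hat = np.mean(np.diff(y) ** 2) / 2.0
print(sigma2_hat)   # close to sigma**2 = 0.25
```

The appeal of the difference-based route is exactly the point the abstract makes: its bias depends only on local increments of the mean, not on how well the mean itself can be estimated.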

            On a Conjecture of S. Bernstein in Approximation Theory


              Author and article information

              Journal: Annals of Statistics 2011, Vol. 39, No. 2, 1012-1041
              Published: 16 May 2011
              DOI: 10.1214/10-AOS849
              arXiv: 1105.3039
              Record ID: a30e9b35-9693-414d-97b9-75ae08f1ac9c
              License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
              Custom metadata: IMS-AOS-AOS849
              Published at http://dx.doi.org/10.1214/10-AOS849 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org)
              Subjects: math.ST stat.TH
