Comparing Bhattacharyya and Kshirsagar bounds with bootstrap method

Authors: Mohammad Khorashadizadeh, Mohtashami Borzadaran G.R., Nayeban S., Rezaei Roknabadi A.H.
Journal: Hacettepe Journal of Mathematics and Statistics
Pages: 564-579
Volume: 48
Issue: 2
Impact Factor: 0.415
Paper Type: Full Paper
Published: 2019
Journal Grade: ISI
Journal Type: Print
Journal Country: Turkey
Journal Index: ISI, JCR, ISC, Scopus

Abstract

In the class of unbiased estimators of a parameter function, the variance of the estimator is one of the basic criteria for comparing and evaluating the accuracy of estimators. In many cases the variance has a complicated form and cannot be computed exactly, so lower bounds can be used to approximate it. Many studies have been devoted to lower bounds for the variance of an unbiased estimator of a parameter. Another common and popular tool used in many statistical problems, such as variance estimation, is the bootstrap method; it has advantages and disadvantages that must be kept in mind when it is applied. In this paper, we first briefly introduce two well-known lower bounds, the "Kshirsagar" bound (one-parameter case) and the "Bhattacharyya" bound (one- and multi-parameter cases), and then extend the Kshirsagar bound to the multi-parameter case. Also, through examples with different distributions, we compare the one- and multi-parameter Bhattacharyya and Kshirsagar lower bounds with the bootstrap method for approximating the variance of unbiased estimators, and show that these bounds perform better than the bootstrap method.
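To make the comparison concrete, the following is a minimal Python/NumPy sketch, not taken from the paper: for an exponential sample with mean theta, the sample mean is an unbiased estimator of theta whose variance theta^2/n coincides with the first-order Bhattacharyya (Cramer-Rao) bound, so the bound can be checked directly against a nonparametric bootstrap estimate of that variance. The values of theta, the sample size, and the number of bootstrap replicates are arbitrary choices for illustration.

```python
# Illustrative sketch (assumed example, not the paper's): compare the first-order
# Bhattacharyya (Cramer-Rao) bound with a nonparametric bootstrap estimate of the
# variance of the sample mean for an Exponential(mean = theta) sample.
import numpy as np

rng = np.random.default_rng(0)

theta = 2.0   # true mean of the exponential distribution (arbitrary)
n = 50        # sample size (arbitrary)
B = 2000      # number of bootstrap replicates (arbitrary)

# Observed sample and the unbiased estimator (sample mean)
x = rng.exponential(scale=theta, size=n)
theta_hat = x.mean()

# First-order Bhattacharyya (Cramer-Rao) lower bound theta^2 / n, evaluated at the estimate
crb = theta_hat ** 2 / n

# Nonparametric bootstrap estimate of Var(sample mean): resample with replacement
boot_means = np.array([rng.choice(x, size=n, replace=True).mean() for _ in range(B)])
boot_var = boot_means.var(ddof=1)

# For this model the sample mean attains the bound, so theta^2 / n is the exact variance
print(f"true variance      : {theta ** 2 / n:.5f}")
print(f"Cramer-Rao bound   : {crb:.5f}")
print(f"bootstrap estimate : {boot_var:.5f}")
```

In this example the lower bound is attained exactly, while the bootstrap value fluctuates around it with resampling noise, which is the kind of behavior the paper's comparisons examine across several distributions.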


tags: Bhattacharyya bound, Bootstrap method, Cramer-Rao bound, Hammersley-Chapman-Robbins bound, Fisher information, Kshirsagar bound.