Authors | Mohammad Khorashadizadeh, Mohtashami Borzadaran G.R., Nayeban S., Rezaei Roknabadi A.H. |
---|---|
Journal | Hacettepe Journal of Mathematics and Statistics |
Pages | 564-579 |
Volume | 48 |
Issue | 2 |
Impact Factor (IF) | 0.415 |
Article type | Full Paper |
Year of publication | 2019 |
Journal rank | ISI |
Publication format | Print |
Country of publication | Turkey |
Indexed in | ISI, JCR, ISC, Scopus |
Abstract
In the class of unbiased estimators of a parameter function, the variance of the estimator is one of the basic criteria for comparing and evaluating the accuracy of estimators. In many cases the variance has a complicated form and cannot be computed exactly, so lower bounds are used to approximate it. Many studies have addressed lower bounds for the variance of an unbiased estimator of a parameter. Another common and popular method used in many statistical problems, such as variance estimation, is the bootstrap; it has advantages and disadvantages that must be kept in mind when it is used. In this paper, we first briefly introduce two well-known lower bounds, the "Kshirsagar" bound (one-parameter case) and the "Bhattacharyya" bound (one- and multi-parameter cases), and then extend the Kshirsagar bound to the multi-parameter case. Also, through examples in different distributions, we compare the one- and multi-parameter Bhattacharyya and Kshirsagar lower bounds with the bootstrap method for approximating the variance of unbiased estimators and show that these bounds perform better than the bootstrap method.
tags: Bhattacharyya bound, Bootstrap method, Cramer-Rao bound, Hammersley-Chapman-Robbins bound, Fisher information, Kshirsagar bound.
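
The comparison described in the abstract can be illustrated numerically. Below is a minimal Python sketch, not taken from the paper, that contrasts the first-order Bhattacharyya bound (which reduces to the Cramer-Rao bound) with a nonparametric bootstrap approximation of the variance of the sample mean. The normal model with known variance, the sample size, and the number of bootstrap replications are illustrative assumptions; the paper's own examples use other distributions and higher-order bounds.

```python
# Minimal sketch (assumed setup, not from the paper): compare the first-order
# Bhattacharyya / Cramer-Rao lower bound with a bootstrap estimate of the
# variance of an unbiased estimator.
# Illustrative model: X_1, ..., X_n ~ N(mu, sigma^2) with sigma known;
# the estimator is the sample mean, which is unbiased for mu.
import numpy as np

rng = np.random.default_rng(0)

mu_true, sigma, n = 2.0, 1.5, 30
sample = rng.normal(mu_true, sigma, size=n)

# Cramer-Rao (first-order Bhattacharyya) bound; for this model it equals the
# exact variance of the sample mean, sigma^2 / n, so the bound is sharp.
cramer_rao_bound = sigma**2 / n

# Nonparametric bootstrap approximation of Var(sample mean).
B = 2000
boot_means = np.empty(B)
for b in range(B):
    resample = rng.choice(sample, size=n, replace=True)
    boot_means[b] = resample.mean()
bootstrap_var = boot_means.var(ddof=1)

print(f"Cramer-Rao / first-order Bhattacharyya bound: {cramer_rao_bound:.4f}")
print(f"Bootstrap variance estimate:                  {bootstrap_var:.4f}")
```

In this toy case the bound is exactly the variance being approximated, so any gap between the two printed numbers reflects bootstrap resampling error; the paper's point is that in less tractable models the (higher-order) bounds can approximate the variance more reliably than the bootstrap.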