A First Course on Parametric Inference, Second Edition
Author: B. K. Kale
ISBN: 978-81-7319-599-0
Publication Year: 2005
Pages: 308 Binding: Paperback
About the book
After a brief historical perspective, the text discusses the basic concept of a sufficient statistic and the classical approach based on minimum variance unbiased estimation. There is a separate chapter on simultaneous estimation of several parameters. Large sample theory of estimation, based on consistent asymptotically normal estimators obtained by the method of moments, the method of percentiles, and the method of maximum likelihood, is also introduced. Testing of hypotheses for finite samples is developed using the classical Neyman–Pearson theory, pointing out its connection with the Bayesian approach. Hypothesis testing and confidence interval techniques are then extended to likelihood ratio tests, score tests, and tests based on maximum likelihood estimators.
Table of Contents
Preface to the Second Edition / Preface to the First Edition / Introduction / Sufficient Statistic / Minimum Variance Unbiased Estimation / Simultaneous Estimation of Several Parameters / Consistent Estimators / Consistent Asymptotically Normal Estimators / Method of Maximum Likelihood / Tests of Hypotheses – I / Tests of Hypotheses – II / Interval Estimation / Nonparametric Statistical Inference / References / Index
Audience
Senior Undergraduate (Hons.) and Graduate Students, and Teachers