Statistical Inference Based on Divergence Measures
Book Details
Format
Paperback / Softback
Book Series
Statistics: A Series of Textbooks and Monographs
ISBN-10
0367578018
ISBN-13
9780367578015
Publisher
Taylor & Francis Ltd
Imprint
Chapman & Hall/CRC
Country of Manufacture
GB
Country of Publication
GB
Publication Date
Jun 30th, 2020
Print length
512 Pages
Weight
950 grams
Product Classification:
Probability & statistics
Ksh 9,850.00
Werezi Extended Catalogue
Delivery in 28 days
Organized in a systematic way, Statistical Inference Based on Divergence Measures presents classical problems of statistical inference, such as estimation and hypothesis testing, on the basis of measures of entropy and divergence, with applications to multinomial and general populations. On the basis of divergence measures, this book introduces minimum phi-divergence estimators and phi-divergence test statistics.
The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. However, in spite of the fact that divergence statistics have become a very good alternative to the classical likelihood ratio test and the Pearson-type statistic in discrete models, many statisticians remain unaware of this powerful approach.
Statistical Inference Based on Divergence Measures explores classical problems of statistical inference, such as estimation and hypothesis testing, on the basis of measures of entropy and divergence. The first two chapters form an overview, from a statistical perspective, of the most important measures of entropy and divergence and study their properties. The author then examines the statistical analysis of discrete multivariate data, with emphasis on problems in contingency tables and loglinear models, using phi-divergence test statistics as well as minimum phi-divergence estimators. The final chapter looks at testing in general populations, presenting the interesting possibility of introducing alternative test statistics to classical ones like Wald, Rao, and likelihood ratio. Each chapter concludes with exercises that clarify the theoretical results and present additional results that complement the main discussions.
Clear, comprehensive, and logically developed, this book offers a unique opportunity to gain not only a new perspective on some standard statistics problems, but the tools to put it into practice.
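To make the idea concrete: one widely used family of phi-divergence test statistics for multinomial data is the Cressie-Read power-divergence family, which contains both Pearson's chi-square (lambda = 1) and the likelihood-ratio statistic G^2 (lambda -> 0) as special cases. The sketch below is illustrative only and not taken from the book; the function name and signature are assumptions.

```python
import math

def power_divergence(observed, expected_probs, lam=2/3):
    """Cressie-Read power-divergence statistic for a multinomial
    goodness-of-fit test (one family of phi-divergence statistics).
    lam=1 recovers Pearson's chi-square; lam->0 recovers the
    likelihood-ratio statistic G^2. Illustrative sketch only."""
    n = sum(observed)
    expected = [n * p for p in expected_probs]
    if abs(lam) < 1e-12:
        # Limiting case lam -> 0: likelihood-ratio statistic G^2
        return 2.0 * sum(o * math.log(o / e)
                         for o, e in zip(observed, expected) if o > 0)
    return (2.0 / (lam * (lam + 1))) * sum(
        o * ((o / e) ** lam - 1.0) for o, e in zip(observed, expected)
    )

# Example: testing uniformity of a die from 60 rolls
obs = [8, 12, 9, 11, 10, 10]
probs = [1 / 6] * 6
stat = power_divergence(obs, probs, lam=1.0)  # Pearson chi-square
# Under H0 the statistic is asymptotically chi-square with k - 1 = 5 df
```

Varying `lam` moves smoothly through the family, which is what makes these statistics a flexible alternative to the classical Pearson and likelihood-ratio tests in discrete models.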
Get Statistical Inference Based on Divergence Measures at the best price, with quality guaranteed, only at Werezi, Africa's largest book ecommerce store. The book was published by Taylor & Francis Ltd and has 512 pages.