On Some Relative Entropy Statistics
Abstract
Statistical entropy is a measure of the dispersion or spread of a random variable. In particular, when the random variable is nominal, classical measures of dispersion such as the standard deviation cannot be computed. In such cases, measures of variation computed from the cell frequencies of a distribution, including entropy-based statistics, must be used. The asymptotic properties of entropy statistics have long been studied in the literature. Relative entropy plays an important role in evaluating the degree of fit; in other words, relative entropy measures the goodness of fit of an empirical distribution to a theoretical or hypothesized distribution. In this study, relative entropy measures are derived for some frequently used probability distributions by exploiting the additivity property of the Kullback-Leibler divergence and the Jeffreys divergence. Their asymptotic properties under certain assumptions are discussed. Finally, several applications emphasize the close relation between relative entropy statistics and other classical test statistics.
Full Text: PDF DOI: 10.15640/jasps.v3n2a5
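The additivity property mentioned in the abstract can be illustrated numerically: for independent components, the Kullback-Leibler divergence (and hence the symmetrized Jeffreys divergence) of the joint distributions equals the sum of the componentwise divergences. A minimal sketch follows; the distributions used are illustrative placeholders, not data from the paper.

```python
# Sketch: KL and Jeffreys divergences for discrete distributions,
# illustrating the additivity property the abstract refers to:
# D(P1 x P2 || Q1 x Q2) = D(P1 || Q1) + D(P2 || Q2)
# for product (independent) distributions.
# The probability vectors below are illustrative only.
import math
from itertools import product

def kl_divergence(p, q):
    """D_KL(P || Q) = sum_i p_i * log(p_i / q_i), in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jeffreys_divergence(p, q):
    """Symmetrized KL divergence: J(P, Q) = D(P||Q) + D(Q||P)."""
    return kl_divergence(p, q) + kl_divergence(q, p)

def product_dist(p1, p2):
    """Joint distribution of two independent discrete components."""
    return [a * b for a, b in product(p1, p2)]

p1, q1 = [0.2, 0.8], [0.5, 0.5]
p2, q2 = [0.1, 0.3, 0.6], [1 / 3, 1 / 3, 1 / 3]

joint_kl = kl_divergence(product_dist(p1, p2), product_dist(q1, q2))
sum_kl = kl_divergence(p1, q1) + kl_divergence(p2, q2)
assert math.isclose(joint_kl, sum_kl)  # KL additivity holds

joint_j = jeffreys_divergence(product_dist(p1, p2), product_dist(q1, q2))
sum_j = jeffreys_divergence(p1, q1) + jeffreys_divergence(p2, q2)
assert math.isclose(joint_j, sum_j)  # Jeffreys additivity holds
```

Because the Jeffreys divergence is just the sum of the two directed KL divergences, its additivity over independent components follows directly from that of the KL divergence.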