On Some Relative Entropy Statistics
Erhan Ustaoglu, Atif Evren

Abstract
Statistical entropy is a measure of dispersion or spread of a random variable. Especially when the random variable is nominal, classical measures of dispersion like the standard deviation cannot be computed. In such cases, measures of variation, including entropy-based statistics computed from the cell frequencies of a distribution, must be used. The asymptotic properties of entropy statistics have long been studied in the literature. Relative entropy plays an important role in evaluating the degree of fit. In other words, relative entropy is a measure of the goodness of fit of an empirical distribution to a theoretical or hypothesized distribution. In this study, some relative entropy measures are derived for several frequently used probability distributions by exploiting the additivity property of the Kullback-Leibler divergence and the Jeffreys divergence. Their asymptotic properties under certain assumptions are discussed. Finally, some applications emphasize the close relation between relative entropy statistics and other classical test statistics.

Full Text: PDF     DOI: 10.15640/jasps.v3n2a5
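
The paper's derivations for specific distributions are in the full text; as a minimal illustrative sketch only, the Python snippet below shows the basic quantities the abstract refers to: the Kullback-Leibler divergence D(p‖q), the symmetrized Jeffreys divergence J(p,q) = D(p‖q) + D(q‖p), and the well-known link 2n·D(p̂‖p₀) = G (the likelihood-ratio statistic), which is asymptotically chi-square under the hypothesized distribution. The die-roll frequencies are hypothetical and not taken from the paper.

```python
import numpy as np
from scipy import stats

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0                     # terms with p_i = 0 contribute 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def jeffreys_divergence(p, q):
    """Symmetrized relative entropy J(p, q) = D(p || q) + D(q || p)."""
    return kl_divergence(p, q) + kl_divergence(q, p)

# Hypothetical example: fit of observed die rolls to a fair-die hypothesis.
observed = np.array([18, 22, 17, 25, 20, 18])   # made-up cell frequencies
n = observed.sum()
p_hat = observed / n                             # empirical distribution
p0 = np.full(6, 1 / 6)                           # hypothesized distribution

d_kl = kl_divergence(p_hat, p0)
g_stat = 2 * n * d_kl                            # likelihood-ratio (G) statistic
chi2_stat, _ = stats.chisquare(observed, n * p0) # classical Pearson statistic

print(f"D(p_hat || p0)     = {d_kl:.4f}")
print(f"G = 2n * D         = {g_stat:.4f}")
print(f"Pearson chi-square = {chi2_stat:.4f}")
print(f"J(p_hat, p0)       = {jeffreys_divergence(p_hat, p0):.4f}")
```

Under the null hypothesis, both G and the Pearson statistic are asymptotically chi-square with k − 1 degrees of freedom, which is one instance of the relation between relative entropy statistics and classical test statistics mentioned in the abstract.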