1996-1997 | 24 | 4 | 415-423

Article title

Information-type divergence when the likelihood ratios are bounded

Content

Title variants

Publication languages

EN

Abstracts

EN
The so-called ϕ-divergence is an important characteristic describing the "dissimilarity" of two probability distributions. Many traditional measures of separation used in mathematical statistics and information theory, some of which are mentioned in the note, correspond to particular choices of this divergence. An upper bound on a ϕ-divergence between two probability distributions is derived when the likelihood ratio is bounded. The usefulness of this sharp bound is illustrated by several examples of familiar ϕ-divergences. An extension of this inequality to ϕ-divergences between a finite number of probability distributions with pairwise bounded likelihood ratios is also given.
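
For context, a minimal sketch (in LaTeX) of the standard ϕ-divergence definition and the chord-type bound that convexity yields when the likelihood ratio is confined to an interval [m, M]; the symbols m and M and the resulting constant are introduced here for illustration only and are not quoted from the paper.

  % Standard phi-divergence: \varphi convex with \varphi(1) = 0
  D_\varphi(P,Q) = \int \varphi\!\left(\frac{dP}{dQ}\right) dQ .

  % If m \le dP/dQ \le M (so necessarily m \le 1 \le M), the chord inequality
  % for the convex function \varphi,
  %   \varphi(t) \le \frac{(M-t)\,\varphi(m) + (t-m)\,\varphi(M)}{M-m}, \qquad t \in [m,M],
  % integrated against dQ (using \int (dP/dQ)\, dQ = 1) gives
  D_\varphi(P,Q) \le \frac{(M-1)\,\varphi(m) + (1-m)\,\varphi(M)}{M-m} .

Choosing \varphi(t) = t\log t, for instance, turns this into a bound on the Kullback-Leibler divergence; other familiar ϕ-divergences follow from other choices of \varphi.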

Year

Volume

24

Issue

4

Pages

415-423

Physical description

Dates

published
1997
received
1996-09-16
revised
1996-12-10

Authors

  • Department of Mathematics and Statistics, University of Maryland at Baltimore County, 1000 Hilltop Circle, Baltimore, Maryland 21250, U.S.A.

Bibliography

  • [1] D. A. Bloch and L. E. Moses, Nonoptimally weighted least squares, Amer. Statist. 42 (1988), 50-53.
  • [2] T. M. Cover, M. A. Freedman and M. E. Hellman, Optimal finite memory learning algorithms for the finite sample problem, Information and Control 30 (1976), 49-85.
  • [3] T. M. Cover and J. A. Thomas, Elements of Information Theory, Wiley, New York, 1991.
  • [4] L. Devroye, Non-Uniform Random Variate Generation, Springer, New York, 1986.
  • [5] G. S. Fishman, Monte Carlo: Concepts, Algorithms and Applications, Springer, New York, 1996.
  • [6] L. Györfi and T. Nemetz, f-dissimilarity: A generalization of the affinity of several distributions, Ann. Inst. Statist. Math. 30 (1978), 105-113.
  • [7] G. Pólya and G. Szegő, Problems and Theorems in Analysis. Volume 1: Series, Integral Calculus, Theory of Functions, Springer, New York, 1972.
  • [8] A. L. Rukhin, Lower bound on the error probability for families with bounded likelihood ratios, Proc. Amer. Math. Soc. 119 (1993), 1307-1314.
  • [9] A. L. Rukhin, Recursive testing of multiple hypotheses: Consistency and efficiency of the Bayes rule, Ann. Statist. 22 (1994), 616-633.
  • [10] A. L. Rukhin, Change-point estimation: linear statistics and asymptotic Bayes risk, Math. Methods Statist. 5 (1996), 412-431.
  • [11] J. W. Tukey, Approximate weights, Ann. Math. Statist. 19 (1948), 91-92.
  • [12] I. Vajda, Theory of Statistical Inference and Information, Kluwer, Dordrecht, 1989.

Document type

Bibliography

Identifiers

YADDA identifier

bwmeta1.element.bwnjournal-article-zmv24i4p415bwm