ePrints@IISc

Gelfand-Yaglom-Perez theorem for generalized relative entropy functionals

Dukkipati, A and Bhatnagar, S and Murty, MN (2007) Gelfand-Yaglom-Perez theorem for generalized relative entropy functionals. In: Information Sciences, 177 (24). pp. 5707-5714.

Full text not available from this repository.
Official URL: http://www.sciencedirect.com/science?_ob=ArticleUR...

Abstract

The measure-theoretic definition of Kullback-Leibler relative-entropy (or simply KL-entropy) plays a basic role in defining various classical information measures on general spaces. Entropy, mutual information, and conditional forms of entropy can be expressed in terms of KL-entropy, and hence properties of their measure-theoretic analogs follow from those of measure-theoretic KL-entropy. These measure-theoretic definitions are key to extending the ergodic theorems of information theory to non-discrete cases. A fundamental theorem in this respect is the Gelfand-Yaglom-Perez (GYP) theorem [M.S. Pinsker, Information and Information Stability of Random Variables and Processes, 1960, Holden-Day, San Francisco, CA (English ed., 1964, translated and edited by Amiel Feinstein), Theorem 2.4.2], which states that measure-theoretic relative-entropy equals the supremum of relative-entropies over all measurable partitions. This paper states and proves the GYP theorem for Rényi relative-entropy of order greater than one. Consequently, the result extends easily to Tsallis relative-entropy.
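As a minimal numerical sketch (not from the paper) of what the GYP theorem asserts for KL-entropy: the discrete relative-entropy computed over a measurable partition never exceeds the measure-theoretic value, is non-decreasing under refinement of the partition, and approaches the measure-theoretic value as the partition is refined. The densities below (P uniform on [0,1], Q with density q(x) = 2x) are illustrative choices, for which the KL integral has the closed form 1 - log 2.

```python
import math

def partition_kl(n):
    """Relative entropy of P w.r.t. Q over the uniform partition of [0,1] into n bins.

    P is uniform on [0,1], so each bin has P-mass 1/n.
    Q has density q(x) = 2x, so bin [i/n, (i+1)/n] has
    Q-mass ((i+1)^2 - i^2) / n^2 = (2i + 1) / n^2.
    """
    total = 0.0
    for i in range(n):
        p = 1.0 / n
        q = (2 * i + 1) / n**2
        total += p * math.log(p / q)
    return total

# Measure-theoretic KL-entropy: integral of log(1/(2x)) dx over [0,1] = 1 - log 2.
exact = 1.0 - math.log(2.0)

# Dyadic refinements are nested, so the partition values increase monotonically
# toward the measure-theoretic value, as the GYP theorem predicts.
vals = [partition_kl(2**k) for k in range(1, 11)]
assert all(a <= b for a, b in zip(vals, vals[1:]))
assert vals[-1] <= exact
assert abs(vals[-1] - exact) < 1e-3
```

The monotonicity under refinement follows from the log-sum inequality; the supremum over all measurable partitions recovers the integral value, which is the content of the theorem (and what the paper extends to Rényi relative-entropy of order greater than one).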

Item Type: Journal Article
Additional Information: Copyright of this article belongs to Elsevier Science.
Keywords: Measure space; Kullback-Leibler; Rényi.
Department/Centre: Division of Electrical Sciences > Computer Science & Automation (Formerly, School of Automation)
Date Deposited: 22 Jul 2009 12:10
Last Modified: 22 Jul 2009 12:10
URI: http://eprints.iisc.ernet.in/id/eprint/18553
