Concepts and Recent Advances in Generalized Information Measures and Statistics

Essentials of Information Entropy and Related Measures

Author(s): Raul D. Rossignoli, Andres M. Kowalski and Evaldo M. F. Curado

Pp: 30-56 (27)

DOI: 10.2174/9781608057603113010007


Abstract

This introductory chapter provides a basic review of the Shannon entropy and of some important related quantities, such as the joint entropy, the conditional entropy, the mutual information, and the relative entropy. We also discuss the Fisher information, the fundamental property of concavity, the basic elements of the maximum entropy approach, and the definition of entropy in the quantum case. We close the chapter with the axioms that determine the Shannon entropy and a brief description of other information measures.
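As a minimal illustration of the quantities the chapter reviews, the sketch below computes the Shannon entropy of a discrete distribution and the mutual information of a joint distribution via the identity I(X;Y) = H(X) + H(Y) - H(X,Y). This is an assumed, self-contained Python example, not code from the chapter; the function names and the 2-D-list representation of the joint pmf are choices made here for illustration.

```python
from math import log2

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log2 p_i, in bits.
    Terms with p_i = 0 contribute 0 by the convention 0 log 0 = 0."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def mutual_information(joint):
    """Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y),
    with the joint pmf given as a 2-D list joint[x][y]."""
    px = [sum(row) for row in joint]          # marginal of X
    py = [sum(col) for col in zip(*joint)]    # marginal of Y
    pxy = [p for row in joint for p in row]   # flattened joint pmf
    return entropy(px) + entropy(py) - entropy(pxy)

# A fair coin carries one bit of entropy.
print(entropy([0.5, 0.5]))                          # 1.0
# Perfectly correlated variables: I(X;Y) = H(X) = 1 bit.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]])) # 1.0
# Independent variables: I(X;Y) = 0.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))
```

The same three entropies also give the conditional entropy, H(X|Y) = H(X,Y) - H(Y), so this small toolkit covers most of the Shannon-type quantities listed in the abstract.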


Keywords: Shannon Entropy, Mutual Information, Relative Entropy, Fisher Information, von Neumann Entropy.
