Entropy and Information Theory (2nd ed.)

Ina Fourie (University of Pretoria)

Online Information Review

ISSN: 1468-4527

Article publication date: 15 June 2012

Citation

Fourie, I. (2012), "Entropy and Information Theory (2nd ed.)", Online Information Review, Vol. 36 No. 3, pp. 481-482. https://doi.org/10.1108/14684521211241477

Publisher: Emerald Group Publishing Limited

Copyright © 2012, Emerald Group Publishing Limited


In Entropy and Information Theory, Robert Gray offers an excellent text to stimulate research in this field. He devotes his attention to the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels, with a strong emphasis on source coding and stationary codes. The intention is to reach engineers interested in the mathematical aspects and general models of the theory, as well as mathematicians interested in the engineering applications of performance bounds and code design for communication systems.

The introduction offers a brief history of the development of Shannon's information theory, emphasising its interactions with ergodic theory. This fits well with the intention of the book, which is to present a general development of Shannon's mathematical theory of communication for single-user systems, and to address the tools and methods required to prove the Shannon coding theorems, especially the notions of sources, channels, codes, entropy, information, and the entropy ergodic theorem. The 14 chapters that follow cover information sources; pair processes (including channels, codes and couplings); entropy; the entropy ergodic theorem; distortion and approximation; distortion and entropy; relative entropy; information rates; distortion and information; relative entropy rates; ergodic theorems for densities; source coding theorems; properties of good source codes; and coding for noisy channels.

Gray acknowledges the influence of Pinsker's Information and Information Stability of Random Variables and Processes (San Francisco: Holden Day, 1964), as well as the work of Kolmogorov, Gelfand, Yaglom and Dobrushin (several titles are listed) on his work. Entropy and Information Theory is also mentioned as a sequel to another of Gray's books, Probability, Random Processes and Ergodic Properties (2nd ed. New York: Springer, 2009).

The book concludes with a list of 199 references and an index. Although the index is of acceptable quality, I would expect a more detailed and in-depth index to accompany such a valuable theoretical text.

Entropy and Information Theory is highly recommended as essential reading for academics and researchers in the field, especially engineers interested in the mathematical aspects and mathematicians interested in the engineering applications. I hope that it will contribute to further synergy between the two fields and to the deepening of research efforts.
