Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities

A book by Tan
This self-contained tutorial presents a unified treatment of single- and multi-user problems in Shannon's information theory, focusing in particular on settings that depart from the requirement that the error probability vanish asymptotically in the blocklength. Instead, the error probabilities for the various problems are bounded above by a non-vanishing constant, and the spotlight is placed on the achievable coding rates as functions of the growing blocklength. This is the study of asymptotic estimates with non-vanishing error probabilities.

Divided into three parts, the monograph begins with an introduction to binary hypothesis testing. From there, the author develops the theme for point-to-point communication systems. Finally, Network Information Theory problems are considered, such as channels with random state, the multiple-encoder distributed lossless source coding (Slepian-Wolf) problem, and special cases of the Gaussian interference and multiple-access channels. The monograph is written in a didactic style that makes it accessible to students, researchers, and engineers building practical communication systems.
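As a brief illustration of the kind of estimate studied in this line of work (a sketch stated informally, not quoted from the monograph): for a discrete memoryless channel with capacity $C$ and channel dispersion $V$, the maximum code size $M^*(n,\epsilon)$ achievable at blocklength $n$ with error probability at most $\epsilon \in (0,1)$ admits the normal approximation
\[
  \log M^*(n,\epsilon) \;=\; nC \;-\; \sqrt{nV}\, Q^{-1}(\epsilon) \;+\; O(\log n),
\]
where $Q^{-1}$ is the inverse of the Gaussian complementary cumulative distribution function and the $O(\log n)$ remainder depends on the channel. Fixing $\epsilon$ at a non-vanishing constant, rather than letting it decay, is what produces the second-order $\sqrt{n}$ term that refines the classical capacity result.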