Some Generalized Information Measures: Their Characterization and Applications

Book by Dilip Kumar Sharma
Information theory is a relatively young branch of mathematics, made mathematically rigorous only in the 1940s. Since information, like energy, is a valuable resource, it must be measured, managed, regulated and controlled for the welfare of mankind. The principal use of information is to remove uncertainty: the information supplied by an experiment is measured by the amount of uncertainty it removes, so a measure of information is essentially a measure of uncertainty. Various measures of information, such as entropies and directed divergences, have recently attracted the interest of the scientific community, primarily because of their use in disciplines such as biology, psychology, economics, statistics, cybernetics, questionnaire theory, coding theory and many more.

The aim of this book is to study generalized information measures and their applications. New generalized exponential survival entropies are defined, and their important properties and applications are studied. Generalized 'useful' f-divergence information measures are introduced, and bounds on them are derived.
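For background on the terms used above (these are the classical definitions that the book's generalized 'useful' and exponential survival variants extend, not the book's own measures), the Shannon entropy of a discrete distribution P = (p_1, ..., p_n) and the Csiszár f-divergence between P and Q = (q_1, ..., q_n) take the standard forms

H(P) = -\sum_{i=1}^{n} p_i \log p_i, \qquad D_f(P \,\|\, Q) = \sum_{i=1}^{n} q_i \, f\!\left(\frac{p_i}{q_i}\right),

where f is convex with f(1) = 0; the particular choice f(t) = t \log t recovers the Kullback-Leibler directed divergence mentioned above.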