New Bounds for Entropy of Information Sources

Document Type: Research Paper


Department of Mathematics, Sirjan University of Technology, Sirjan, Islamic Republic of Iran.



Shannon's entropy plays an important role in information theory, dynamical systems, and thermodynamics. In this paper, we apply Jensen's inequality in information theory and obtain results for the Shannon entropy of random variables and of stochastic processes. We also obtain upper and lower bounds for the Shannon entropy of information sources.
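The paper's new bounds are not reproduced in this abstract, but the classical consequence of Jensen's inequality that they refine can be illustrated numerically: for a distribution $p$ on $n$ outcomes, concavity of the logarithm gives $0 \le H(p) \le \log n$, with equality on the right for the uniform distribution. The following sketch (an illustration, not the paper's method) checks this bound:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum p_i log p_i (natural log), skipping zero masses."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# By Jensen's inequality applied to the concave function log,
# H(p) <= log n for any distribution p on n outcomes,
# with equality exactly when p is uniform.
p = [0.5, 0.25, 0.125, 0.125]
H = shannon_entropy(p)
n = len(p)
assert 0 <= H <= math.log(n)          # Jensen upper bound log n
assert shannon_entropy([1 / n] * n) == max(
    shannon_entropy(q) for q in ([1 / n] * n, p)
)  # uniform distribution attains the maximum among these examples
```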

