Editorial
Entropy. 2021 Feb 17;23(2):232. doi: 10.3390/e23020232

Information Theory: Deep Ideas, Wide Perspectives, and Various Applications

Irad Ben-Gal 1,2, Evgeny Kagan 2,3,*

The history of information theory, as a mathematical principle for analyzing data transmission and information communication, was formalized in 1948 with the publication of Claude Shannon’s famous paper “A Mathematical Theory of Communication” [1]. At the heart of the new theory stood the definition of the entropy of a random variable as the average inherent “uncertainty” in the variable’s possible outcomes. However, the origins of this theory were incubated approximately seventy years earlier, when Willard Gibbs [2] and Ludwig Boltzmann [3] published definitions of entropy similar in form and meaning to Shannon’s. Gibbs and Boltzmann considered entropy within the framework of thermodynamics and used it to characterize the degree of order in a mechanical system. They framed the laws of thermodynamics in terms of the statistical properties of ensembles of possible states of a physical system composed of many particles.
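
Recall the standard form of Shannon’s definition (stated here only as background): for a discrete random variable X taking values x with probabilities p(x), the entropy is

H(X) = -\sum_{x} p(x) \log_2 p(x),

measured in bits when the logarithm is taken to base 2; it equals zero when one outcome is certain and is maximal when all outcomes are equally likely.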

In contrast to Boltzmann’s entropy, which relies on the number of system states, Gibbs’ and Shannon’s entropies rely on the probabilities of the system states, thereby introducing the concept of uncertainty into the entropy definition. This underlying uncertainty also appears in the von Neumann entropy, which extends the classical Gibbs entropy to the field of quantum mechanics.
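
Written out in their standard textbook forms, the contrast is the following: Boltzmann’s entropy counts the W equally probable microstates of a system,

S_B = k_B \ln W,

whereas Gibbs’ entropy weights the microstates by their probabilities p_i,

S_G = -k_B \sum_i p_i \ln p_i,

which coincides with Shannon’s H(X) up to the constant k_B and the base of the logarithm. The von Neumann entropy, in turn, replaces the probability distribution with a density operator \rho,

S_{vN} = -\mathrm{Tr}(\rho \ln \rho).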

Such an association between physical parameters and uncertainty led to a revolutionary change in the intuitions underlying both physical research and philosophical speculation; even contemporary cosmological theory builds on the entropy of black holes [4] and on a quantity regarded as the ‘information of the Universe’ [5].

More surprising is the relation between entropy and modern mathematical theories, in which entropy appears both as an internal concept and as an external parameter. For example, following the introduction of the entropy of a partition and, consequently, the entropy of a map [6], these notions were applied to the analysis of dynamical systems and used to represent the structure of a system’s evolution. In later studies, the concept of entropy was introduced into algebraic geometry and used to characterize properties of functional spaces [7]. Similarly, consideration of entropy within the framework of combinatorics led to the development of algebraic information theory [8], which associates uncertainty in the sense of Shannon’s measure with the formal structure of combinatorial objects.

The relation between entropy and uncertainty was further emphasized in optimal search and decision-making, where it was found that an optimal search procedure [9] is closely related to an optimal coding tree [10], with both governed by similar search and optimization rules. In another example, studies in biophysics demonstrated that Gibbs entropy and the corresponding statistical-physics terms can explain basic learning processes in neural networks [11]. Similarly, recent studies support the use of entropy and information to better understand the way that deep neural networks operate [12].
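
The link between search and coding can be made precise through the noiseless source-coding bound (stated here only as standard background): for a discrete random variable X, the expected codeword length L of an optimal binary code, such as the Huffman code [10], satisfies

H(X) \le L < H(X) + 1,

so the entropy lower-bounds the average number of binary questions needed to identify an outcome, which is precisely the quantity minimized by an optimal search procedure [9].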

In the introduction to the well-known textbook “Elements of Information Theory” [13], Cover and Thomas discuss the relationship of information theory with seven other fields and some of their leading applications, including mathematics (inequalities), computer science (Kolmogorov complexity), physics (the AEP, thermodynamics, quantum information theory), communication theory (limits of communication), probability theory (limit theorems, large deviations), statistics (hypothesis testing, Fisher information), and economics (portfolio theory, Kelly gambling). Today, thirty years after the initial publication of this textbook and more than seventy years after Shannon’s paper, such a list can be complemented with many more applications at various levels and in various fields. These include data mining, search theory, machine learning, AI, optimization, robotics, control theory, game theory, decision making, industrial engineering, and biology, as well as more distant research areas such as neuroscience, philosophy, the social sciences, and the cognitive sciences. The wide variety of fields that use concepts of information theory is represented in the different volumes of the Entropy journal and, in part, by this Special Issue, which focuses on the applications of information theory in industrial and service systems.

The current issue includes eight papers that represent various information theory applications in different fields of engineering and industry. The paper titled “Project Management Monitoring Based on Expected Duration Entropy” by S. Cohen Kaspi, S. Rozenes, and I. Ben-Gal deals with classical industrial and management engineering problems, proposing an analytical method that uses information theory measures to identify optimal inspection points in project management. The paper titled “Ordinal Decision-Tree-Based Ensemble Approaches: The Case of Controlling the Daily Local Growth Rate of the COVID-19 Epidemic” by G. Singer and M. Marudi presents an ordinal decision-tree approach in which an objective-based information gain measure is used to select the classifying attributes. The selected case study is highly relevant to the present time, as it classifies areas with different growth rates of COVID-19. In contrast, the paper titled “Event-Triggered Adaptive Fault Tolerant Control for a Class of Uncertain Nonlinear Systems” by C. Zhu, C. Li, X. Chen, K. Zhang, X. Xin, and H. Wei solves the problem of adaptive fault-tolerant control in nonlinear systems with uncertain feedback.

Other types of problems are considered in the paper titled “Cooperative Detection of Multiple Targets by the Group of Mobile Agents” by B. Matzliach, I. Ben-Gal, and E. Kagan and the paper titled “Multi-Harmonic Source Localization Based on Sparse Component Analysis and Minimum Conditional Entropy” by Y. Du, H. Yang, and X. Ma. The first paper presents information-theoretic algorithms for the search and detection of hidden targets by a group of autonomous mobile agents, while the second applies sparse component analysis and minimum conditional entropy techniques and proposes a method for identifying multiple harmonic source locations in a distributed system.

New directions for using information theory measures are presented in two papers that consider the relation between entropy and other measures of uncertainty developed in fuzzy logic and apply these concepts to the analysis of industrial engineering problems. In particular, the paper titled “Entropy-Based GLDS Method for Social Capital Selection of a PPP Project with q-Rung Orthopair Fuzzy Information” by L. Liu, J. Wu, G. Wei, C. Wei, J. Wang, and Y. Wei considers the problem of social capital selection in a public-private partnership, to which the method of gained and lost dominance score with the q-rung orthopair fuzzy entropy is applied. The second paper, titled “Negation of Pythagorean Fuzzy Number Based on a New Uncertainty Measure Applied in a Service Supplier Selection System” by H. Mao and R. Cai, follows the more traditional techniques of Pythagorean fuzzy numbers but, in contrast to conventional studies, introduces a new definition of the entropy of Pythagorean fuzzy numbers and applies it to the evaluation of service quality in service systems with supplier selection.

Finally, the paper titled “Inferring Authors’ Relative Contributions to Publications from the Order of their Names when Default Order is Alphabetical” by Y. Gerchak demonstrates the effectiveness of the entropy measure in allocating authors’ contributions from the ordering of their names, a question traditionally considered to depend strongly on human judgment or on game-theoretic concepts. Such an application sheds light on new perspectives for information-based decision-making and recommendation systems.

The variety of themes and applications considered even in this limited issue emphasizes, once again, that the significant contributions and breadth of information theory concepts are not limited to the analysis of traditional communication systems. The use of these concepts allows us to explain a wide range of natural and artificial phenomena, as well as to implement theoretical ideas and precise measures of uncertainty and information in engineering projects and complex systems.

In his famous lecture [14], Eugene Wigner discussed the unreasonable effectiveness of mathematics in the natural sciences and, in particular, remarked: “philosophy is the misuse of a terminology which was invented just for this purpose. In the same vein, I would say that mathematics is the science of skillful operations with concepts and rules invented just for this purpose”.

The basic concepts used in information theory are probabilities and bits; the former are continuous measures of uncertainty, while the latter are discrete levels of truth. Today, it seems meaningful to follow Wigner’s line of thinking and discuss the immense effectiveness of information theory concepts in human reasoning and in the operation and analysis of complex systems.

Acknowledgments

We thank the contributing authors, the referees, and the staff of Entropy for the excellent work that made this issue possible. Special thanks to Shayna Tang for her kind assistance throughout the editing and publication processes.

Author Contributions

I.B.-G. and E.K. contributed equally to the conceptualization and writing of the paper. All authors have read and agreed to the published version of the manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

Footnotes

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

References

1. Shannon C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948;27:379–423, 623–656. doi: 10.1002/j.1538-7305.1948.tb01338.x.
2. Gibbs J.W. On the equilibrium of heterogeneous substances. Trans. Conn. Acad. Arts Sci. 1875–1876;3:108–248. doi: 10.2475/ajs.s3-16.96.441.
3. Boltzmann L.E. Über die Beziehung zwischen dem zweiten Hauptsatze der mechanischen Wärmetheorie und der Wahrscheinlichkeitsrechnung respektive den Sätzen über das Wärmegleichgewicht. Wien. Ber. 1877;76:373–435. (English translation: Sharp K.A., Matschinski F. Entropy 2015;17:1971–2009).
4. Bekenstein J.D. Black holes and entropy. Phys. Rev. D 1973;7:2333–2346. doi: 10.1103/PhysRevD.7.2333.
5. Susskind L. The world as a hologram. J. Math. Phys. 1995;36:6377–6396. doi: 10.1063/1.531249.
6. Dinaburg E.I. On the relation between various entropy characteristics of dynamical systems. Math. USSR-Izv. 1971;5:337–378. doi: 10.1070/IM1971v005n02ABEH001050.
7. Gromov M. Entropy, homology and semialgebraic geometry. Astérisque 1987;663:225–240.
8. Goppa V.D. Group representations and algebraic information theory. Izv. Math. 1995;59:1123–1147. doi: 10.1070/IM1995v059n06ABEH000051.
9. Zimmerman S. An optimal search procedure. Am. Math. Mon. 1959;66:690–693. doi: 10.1080/00029890.1959.11989389.
10. Huffman D.A. A method for the construction of minimum-redundancy codes. Proc. IRE 1952;40:1098–1101. doi: 10.1109/JRPROC.1952.273898.
11. Klimontovich Y.L. Problems in the statistical theory of open systems: Criteria for the relative degree of order in self-organization processes. Sov. Phys. Usp. 1989;32:416–433. doi: 10.1070/PU1989v032n05ABEH002717.
12. Shwartz-Ziv R., Tishby N. Opening the black box of deep neural networks via information. arXiv 2017, arXiv:1703.00810.
13. Cover T.M., Thomas J.A. Elements of Information Theory. John Wiley & Sons; New York, NY, USA: 1991.
14. Wigner E. The unreasonable effectiveness of mathematics in the natural sciences. Commun. Pure Appl. Math. 1960;13:1–14. doi: 10.1002/cpa.3160130102.
