Entropy Theory and its Application in Environmental and Water
Engineering responds to the need for a book that presents the
basic concepts of entropy theory from a hydrologic and water
engineering perspective and then applies these concepts to a range
of water engineering problems. The range of applications of entropy
is constantly expanding, and new areas that find a use for the
theory continue to emerge. Because the applications of these
concepts and techniques vary across subject areas, this book aims
to relate them directly to practical problems of environmental and
water engineering.
The book presents and explains the Principle of Maximum Entropy
(POME) and the Principle of Minimum Cross Entropy (POMCE) and their
applications to different types of probability distributions.
Spatial and inverse spatial entropy, which are important for urban
planning, are presented with clarity. Maximum entropy spectral
analysis and minimum cross entropy spectral analysis, powerful
techniques for addressing a variety of problems faced by
environmental and water scientists and engineers, are described
here with illustrative examples.
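To give a flavor of the kind of derivation POME involves (a standard textbook sketch for orientation, not reproduced from the book itself): the Shannon entropy

$$H(f) = -\int f(x)\,\ln f(x)\,dx$$

is maximized subject to $\int f(x)\,dx = 1$ and moment constraints $\int g_i(x)\,f(x)\,dx = \bar{g}_i$, $i = 1,\dots,m$, and the Lagrange-multiplier solution takes the exponential form

$$f(x) = \exp\!\left(-\lambda_0 - \sum_{i=1}^{m} \lambda_i\, g_i(x)\right).$$

For example, constraining only the mean $\mu$ of a non-negative variable yields the exponential distribution $f(x) = (1/\mu)\,e^{-x/\mu}$.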
Giving a thorough introduction to the use of entropy to measure
the unpredictability in environmental and water systems, this book
will add an essential statistical method to the toolkit of
postgraduates, researchers, academic hydrologists, water resource
managers, and environmental scientists and engineers. It will also
offer a valuable resource for professionals in the same areas,
governmental organizations, and private companies, as well as for
students in earth sciences, civil and agricultural engineering, and
agricultural and rangeland sciences.
This book:
* Provides a thorough introduction to entropy for beginners and
more experienced users
* Uses numerous examples to illustrate the applications of the
theoretical principles
* Allows the reader to apply entropy theory to the solution of
practical problems
* Assumes minimal existing mathematical knowledge
* Discusses the theory and its various aspects in both univariate
and bivariate cases
* Covers newly expanding areas, including neural networks from an
entropy perspective, and future developments.
Author
Vijay P. Singh, Texas A & M University, USA
Contents
Preface, xv
Acknowledgments, xix
1 Introduction, 1
1.1 Systems and their characteristics, 1
1.1.1 Classes of systems, 1
1.1.2 System states, 1
1.1.3 Change of state, 2
1.1.4 Thermodynamic entropy, 3
1.1.5 Evolutive connotation of entropy, 5
1.1.6 Statistical mechanical entropy, 5
1.2 Informational entropies, 7
1.2.1 Types of entropies, 8
1.2.2 Shannon entropy, 9
1.2.3 Information gain function, 12
1.2.4 Boltzmann, Gibbs and Shannon entropies, 14
1.2.5 Negentropy, 15
1.2.6 Exponential entropy, 16
1.2.7 Tsallis entropy, 18
1.2.8 Renyi entropy, 19
1.3 Entropy, information, and uncertainty, 21
1.3.1 Information, 22
1.3.2 Uncertainty and surprise, 24
1.4 Types of uncertainty, 25
1.5 Entropy and related concepts, 27
1.5.1 Information content of data, 27
1.5.2 Criteria for model selection, 28
1.5.3 Hypothesis testing, 29
1.5.4 Risk assessment, 29
Questions, 29
References, 31
Additional References, 32
2 Entropy Theory, 33
2.1 Formulation of entropy, 33
2.2 Shannon entropy, 39
2.3 Connotations of information and entropy, 42
2.3.1 Amount of information, 42
2.3.2 Measure of information, 43
2.3.3 Source of information, 43
2.3.4 Removal of uncertainty, 44
2.3.5 Equivocation, 45
2.3.6 Average amount of information, 45
2.3.7 Measurement system, 46
2.3.8 Information and organization, 46
2.4 Discrete entropy: univariate case and marginal entropy, 46
2.5 Discrete entropy: bivariate case, 52
2.5.1 Joint entropy, 53
2.5.2 Conditional entropy, 53
2.5.3 Transinformation, 57
2.6 Dimensionless entropies, 79
2.7 Bayes theorem, 80
2.8 Informational correlation coefficient, 88
2.9 Coefficient of nontransferred information, 90
2.10 Discrete entropy: multidimensional case, 92
2.11 Continuous entropy, 93
2.11.1 Univariate case, 94
2.11.2 Differential entropy of continuous variables, 97
2.11.3 Variable transformation and entropy, 99
2.11.4 Bivariate case, 100
2.11.5 Multivariate case, 105
2.12 Stochastic processes and entropy, 105
2.13 Effect of proportional class interval, 107
2.14 Effect of the form of probability distribution, 110
2.15 Data with zero values, 111
2.16 Effect of measurement units, 113
2.17 Effect of averaging data, 115
2.18 Effect of measurement error, 116
2.19 Entropy in frequency domain, 118
2.20 Principle of maximum entropy, 118
2.21 Concentration theorem, 119
2.22 Principle of minimum cross entropy, 122
2.23 Relation between entropy and error probability, 123
2.24 Various interpretations of entropy, 125
2.24.1 Measure of randomness or disorder, 125
2.24.2 Measure of unbiasedness or objectivity, 125
2.24.3 Measure of equality, 125
2.24.4 Measure of diversi…