Handbook of Neural Computing Applications is a collection of articles on neural networks. Some papers review the biology of neural networks (their structure, dynamics, and learning) and compare network types, such as a back-propagating perceptron with a Boltzmann machine, or a Hopfield network with a Brain-State-in-a-Box network. Others deal with specific neural network types and with selecting, configuring, and implementing neural networks. Still others address specific applications, including neurocontrol, presented for the benefit of both control engineers and neural network researchers; further applications involve signal processing, spatio-temporal pattern recognition, medical diagnosis, fault diagnosis, robotics, business, data communications, data compression, and adaptive man-machine systems. One paper describes data compression and dimensionality reduction methods with characteristics such as high compression ratios to facilitate data storage, strong discrimination of novel data from a baseline, rapid operation in software and hardware, and the ability to recognize loss of data during compression or reconstruction. The collection can prove helpful to programmers, computer engineers, computer technicians, and computing instructors concerned with programming, hardware interfaces, networking, engineering, or design.
Contents
Acknowledgments
Preface
1 Introduction to Neural Networks
1.0 Overview
1.1 Practical Applications
1.2 The Advantages of Neural Networks
1.3 A Definition of Neural Networks
1.4 Summary
References
2 History and Development of Neural Networks
2.0 Overview
2.1 Early Foundations
2.2 Promising and Emerging Technology
2.3 Disenchantment
2.4 Innovation
2.5 Re-Emergence
2.6 Current Status
2.7 Summary
References
3 The Neurological Basis for Neural Computations
3.0 Neuroscience As A Model
3.1 The Single Neuron
3.2 Early Research
3.3 Structural Organization of Biological Neural Systems
3.4 Structurally Linked Dynamics of Biological Neural Systems
3.5 Emergent Properties Arise from the Dynamics of Biological Neural Systems
3.6 Learning in Biological Neural Systems
3.7 Functional Results of Neural Architecture
3.8 Computer Simulations Based on the Brain
References
4 Neural Network Structures: Form Follows Function
4.0 Overview
4.1 Levels of Structural Description
4.2 Neural Micro-Structures
4.3 Neural Meso-Structures
4.4 The Macro-Structure
4.5 Summary
5 Dynamics of Neural Network Operations
5.0 Overview
5.1 Typical Network Dynamics
5.2 Energy Surfaces and Stability Criterion
5.3 Network Structures and Dynamics
References
6 Learning Background for Neural Networks
6.0 Overview
6.1 Intelligence: An Operational Definition
6.2 Learning and Conditioning
6.3 Learned Performance
6.4 Motivation
6.5 Summary
References
7 Multilayer Feedforward Neural Networks I: Delta Rule Learning
7.0 Overview
7.1 Introduction
7.2 The Perceptron Network
7.3 Adaline and Madaline Neural Networks
7.4 The Back-Propagation Network
References
8 Multilayer Feedforward Neural Networks II: Optimizing Learning Methods
8.0 Overview
8.1 The Boltzmann Machine
8.2 The Cauchy Machine: A Refinement of the Boltzmann Machine
8.3 Summary
References
9 Laterally-Connected, Autoassociative Networks
9.0 Overview
9.1 Introduction to Association Networks
9.2 Autoassociative Networks
9.3 The Hopfield/Tank Network
9.4 The Brain-State-In-A-Box Network
9.5 Kanerva's Sparse Distributed Memory Network
9.6 Summary
References
10 Vector-Matching Networks
10.0 Overview
10.1 Introduction
10.2 The Kohonen Learning Vector Quantization Network
10.3 The Self-Organizing Topology-Preserving Map
10.4 Summary
References
11 Feedforward/Feedback (Resonating) Heteroassociative Networks
11.0 Chapter Overview
11.1 Introduction
11.2 The Carpenter/Grossberg Adaptive Resonance Theory Network
11.3 Bidirectional Associative Memories and Related Networks
11.4 Summary
References
12 Multilayer Cooperative/Competitive Networks
12.0 Overview
12.1 Introduction
12.2 Competitive Learning Networks
12.3 Masking Fields
12.4 The Boundary Contour System
12.5 Hierarchical Scene Structures
12.6 The Neocognitron
12.7 Summary
References
13 Hybrid and Complex Networks
13.0 Overview
13.1 Introduction
13.2 Hybrid Networks: The Hamming Network and the Counter-Propagation Network
13.3 Neural Networks Operating in Parallel
13.4 Hierarchies of Similar Networks
13.5 Systems of Different Types of Neural Networks
13.6 Systems of Networks are Useful for Adaptive Control
13.7 Summary
References
14 Choosing A Network: Matching the Architecture to the Application
14.0 Chapter Overview
14.1 When to Use a Neural Network
14.2 What Type of Network?
14.3 Debugging, Testing, and Verifying Neural Network Codes
14.4 Implementing Neural Networks
References
15 Configuring and Optimizing the Back-Propagation Network
15.0 Overview
15.1 Issues in Optimizing and Generalizing Feedforward Networks
15.2 Micro-Structural Considerations
15.3 Meso-Structural Considerations
15.4 Optimizing Network Dynamics
15.5 Learning Rule Modifications
15.6 Modifications to Network Training Schedules and Datasets
References
16 Electronic Hardware Implementations
16.0 Overview
16.1 Analog Implementations
16.2 Digital Neural Network Chips
16.3 Hybrid Neural Network Chips
16.4 Method for Comparing Neural Network Chips
16.5 Summary
Further Reading in Neural Network Hardware Implementation
17 Optical Neuro-Computing
17.0 Overview
17.1 Historical Introduction to Optical Neurocomputing
17.2 Review of Learning Algebras and Architectures
17.3 Associative Memory vs. Wiener Filter and Self-Organization-Map vs. Kalman Filters
17.4 Optical Implementations of Neural Networks
17.5 Comparison Between Electronic and Optic Implementations of Neural Networks
17.6 Hybrid Neurocomputing
17.7 Application to Pattern Recognition and Image Processing
17.8 The Superconducting Mechanism
17.9 The Super-Triode
17.10 The Super-Triode Neurocomputer
17.11 Wave-Front Imaging Telescope with a Focal Plane Array of Super-Triodes
17.12 Space-Borne In-Situ Smart Sensing with Neurocomputing
17.13 Conclusion
Bibliography
18 Neural Networks for Spatio-Temporal Pattern Recognition
18.0 Overview
18.1 Creating Spatial Analogues of Temporal Patterns
18.2 Neural Networks with Time Delays
18.3 Storing and Generating Temporal Patterns Via Recurrent Connections
18.4 Using Neurons with Time-Varying Activations and Summing Information Over Time Intervals
18.5 Neural Nets which have Short-Term and Long-Term Memories
18.6 Frequency Coding in Neural Networks
18.7 Networks with Combinations of Different Tempor…