Title: Neural Networks and Their Applications
1 Neural Networks and Their Applications
2 OUTLINE
- Introduction
- Software
  - Basic Neural Network Processing
  - Software Exercise Problem/Project
- Complementary Technologies
  - Genetic Algorithms
  - Fuzzy Logic
- Examples of Applications
  - Manufacturing
  - R&D
  - Sales & Marketing
  - Financial
3 Introduction
What is a Neural Network?
- A computing system made up of a number of simple, highly interconnected processing elements, which process information by their dynamic state response to external inputs. (Dr. Robert Hecht-Nielsen)
- A parallel information processing system, based on the human nervous system, consisting of a large number of neurons that operate in parallel.
4 Biological Neuron & Its Function
Information is processed in the neuron cell body and transferred to the next neuron via the synaptic terminal.
5 Processing in a Biological Neuron
Neurotransmitters carry information to the next neuron, where it is processed further in that neuron's cell body.
6 Artificial Neuron & Its Function
[Figure: an artificial neuron as a processing element; inputs correspond to dendrites, the processing element to the neuron cell body, and outputs to the axon]
7 Processing Steps Inside a Neuron: Electronic Implementation
- Combine the Inputs into a Summed Value
  - Sum, Min, Max, Mean, OR/AND
- Add Bias Weight
- Transform the Result
  - Sigmoid, Hyperbola, Sine, Linear
[Figure: inputs entering a processing element and producing outputs]
8 Sigmoid Transfer Function
Transfer Function = 1 / (1 + e^(-sum))
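The summation, bias, and sigmoid transform described above can be sketched as a single processing element in Python (the input, weight, and bias values below are illustrative, not from the slides):

```python
import math

def neuron_output(inputs, weights, bias):
    """One processing element: weighted sum, add bias, sigmoid transform."""
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-s))  # sigmoid: 1 / (1 + e^(-sum))

print(neuron_output([1.0, 0.5], [0.4, -0.2], 0.1))
```

With a summed value of 0, the sigmoid returns exactly 0.5, which is why the bias weight shifts where the neuron's output crosses its midpoint.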
9 Basic Neural Network & Its Elements
- Clustering of Neurons
- Bias Neurons
- Output Neurons
- Hidden Neurons
- Input Neurons
10 Back-Propagation Network: Forward Output Flow
- A Random Set of Weights is Generated
- Send Inputs to Neurons
- Each Neuron Computes Its Output
  - Calculate the Weighted Sum: I_j = Σ_i W_{i,j-1} X_{i,j-1} + B_j
  - Transform the Weighted Sum: X_j = f(I_j) = 1 / (1 + e^-(I_j + T))
- Repeat for All the Neurons
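The forward flow above can be sketched layer by layer in Python; the 2-3-1 layer sizes and the random initial weights are illustrative (matching the slide's "random set of weights" step), and the threshold T is folded into the per-neuron bias:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(layer_weights, layer_biases, inputs):
    """Propagate inputs forward: for each layer, each neuron j computes
    I_j = sum_i W_ij * x_i + B_j, then outputs X_j = f(I_j)."""
    x = inputs
    for W, b in zip(layer_weights, layer_biases):
        x = [sigmoid(sum(wi * xi for wi, xi in zip(row, x)) + bj)
             for row, bj in zip(W, b)]
    return x

random.seed(0)
# 2 inputs -> 3 hidden -> 1 output, random initial weights
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(3)]
W2 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(1)]
b1, b2 = [0.0] * 3, [0.0]
print(forward([W1, W2], [b1, b2], [0.5, -0.3]))
```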
11 Back-Propagation Network: Backward Error Propagation
- Errors are Propagated Backwards
- Update the Network Weights
- Gradient Descent Algorithm
  - ΔW_ji(n) = η δ_j X_i
  - W_ji(n+1) = W_ji(n) + ΔW_ji(n)
- Add Momentum for Convergence
  - ΔW_ji(n) = η δ_j X_i + α ΔW_ji(n-1)
Where n = Iteration Number, η = Learning Rate, α = Rate of Momentum (0 to 1)
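The weight-update rule with momentum can be written out directly; the learning rate, momentum, error term, and starting weight below are illustrative values, not from the slides:

```python
def update_weight(w, delta_j, x_i, prev_dw, eta=0.5, alpha=0.9):
    """One gradient-descent step with momentum:
    dW_ji(n)   = eta * delta_j * x_i + alpha * dW_ji(n-1)
    W_ji(n+1)  = W_ji(n) + dW_ji(n)"""
    dw = eta * delta_j * x_i + alpha * prev_dw
    return w + dw, dw

# Three iterations with a constant error term: the momentum term makes
# each successive step larger, accelerating movement down the slope.
w, dw = 0.2, 0.0
for _ in range(3):
    w, dw = update_weight(w, delta_j=0.1, x_i=1.0, prev_dw=dw)
print(w)  # steps of 0.05, 0.095, 0.1355 -> w = 0.4805
```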
12 Back-Propagation Network: Backward Error Propagation
- Gradient Descent Algorithm
  - Minimization of Mean Squared Errors
- Shape of the Error Surface
  - Complex
  - Multidimensional
  - Bowl-Shaped, with Hills and Valleys
- Training by Iterations
  - Finding the Global Minimum is Challenging
13 Simple Transfer Functions
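The four simple transfer functions named on the processing-element slide can be written out directly; interpreting "Hyperbola" as the hyperbolic tangent is an assumption on my part:

```python
import math

# The four transfer functions listed earlier: Sigmoid, Hyperbola, Sine, Linear.
transfer = {
    "sigmoid": lambda s: 1.0 / (1.0 + math.exp(-s)),
    "hyperbolic": math.tanh,  # assumption: "Hyperbola" = hyperbolic tangent
    "sine": math.sin,
    "linear": lambda s: s,
}

for name, f in transfer.items():
    print(name, f(0.5))
```

The sigmoid squashes any sum into (0, 1), tanh into (-1, 1), while the linear function leaves the sum unchanged.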
14 Recurrent Neural Network
15 Time Delay Neural Network
16 Training - Supervised
- Both Inputs & Outputs are Provided
- The Designer Can Manipulate
  - Number of Layers
  - Neurons per Layer
  - Connections Between Layers
  - The Summation & Transform Functions
  - Initial Weights
- Rules of Training
  - Back-Propagation
  - Adaptive Feedback Algorithm
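Supervised training as described above, with inputs and target outputs both provided, can be sketched as a toy back-propagation run. This is a minimal sketch, assuming a 2-2-1 network trained on XOR; the network size, learning rate, seed, and epoch count are all illustrative choices, not from the slides:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

random.seed(1)
# 2 inputs -> 2 hidden -> 1 output; weights start random (forward-flow slide),
# then are adjusted by back-propagating the output error (supervised training).
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b1 = [random.uniform(-1, 1) for _ in range(2)]
W2 = [random.uniform(-1, 1) for _ in range(2)]
b2 = random.uniform(-1, 1)

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]  # XOR pairs

def predict(x):
    h = [sigmoid(W1[j][0] * x[0] + W1[j][1] * x[1] + b1[j]) for j in range(2)]
    return h, sigmoid(W2[0] * h[0] + W2[1] * h[1] + b2)

def mse():
    return sum((t - predict(x)[1]) ** 2 for x, t in data) / len(data)

before = mse()
eta = 0.5
for _ in range(5000):
    for x, t in data:
        h, y = predict(x)
        dy = (t - y) * y * (1 - y)                                # output error term
        dh = [dy * W2[j] * h[j] * (1 - h[j]) for j in range(2)]   # hidden error terms
        for j in range(2):
            W2[j] += eta * dy * h[j]
            b1[j] += eta * dh[j]
            for i in range(2):
                W1[j][i] += eta * dh[j] * x[i]
        b2 += eta * dy

print(before, "->", mse())
```

Because the error surface has hills and valleys, a small net like this can still settle in a local minimum; the iterations reliably reduce the mean squared error, but reaching the global minimum is not guaranteed.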
17 Training - Unsupervised
- Only Inputs are Provided
- The System has to Figure Out
  - Self-Organization
  - Adaptation to Input Changes/Patterns
  - Grouping of Neurons into Fields
  - Topological Order
  - Based on the Mammalian Brain
- Rules of Training
  - Adaptive Feedback Algorithm (Kohonen)
Topology: mapping one space to another without changing the geometric configuration
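The Kohonen-style self-organization above, where only inputs are provided and the map orders itself topologically, can be sketched minimally in one dimension; the map size, learning rate, neighbourhood radius, and epoch count are illustrative assumptions:

```python
import random

random.seed(0)
# A 1-D Kohonen-style map: 10 units with scalar weights. No target outputs
# are given; units self-organize so neighbours respond to similar inputs.
units = [random.random() for _ in range(10)]

def train(units, samples, epochs=50, eta=0.3, radius=1):
    for _ in range(epochs):
        for x in samples:
            # Winner: the unit whose weight is closest to the input.
            w = min(range(len(units)), key=lambda j: abs(units[j] - x))
            # Move the winner and its topological neighbours toward the input.
            for j in range(max(0, w - radius), min(len(units), w + radius + 1)):
                units[j] += eta * (x - units[j])
    return units

samples = [random.random() for _ in range(100)]
train(units, samples)
print(units)
```

Updating neighbours along with the winner is what preserves topological order: adjacent units end up tuned to adjacent regions of the input space.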
18 Traditional Computing vs. NN Technology

| CHARACTERISTIC | TRADITIONAL COMPUTING | ARTIFICIAL NEURAL NETWORKS |
|---|---|---|
| Processing Style | Sequential | Parallel |
| Functions | Logic, via rules, concepts, calculations | Mapping, via images, pictures, and controls |
| Learning Method | By rules | By example |
| Applications | Accounting, word processing, communications, computing | Sensor processing, speech recognition, pattern recognition, text recognition |
19 Traditional Computing vs. NN Technology

| CHARACTERISTIC | TRADITIONAL COMPUTING | ARTIFICIAL NEURAL NETWORKS |
|---|---|---|
| Processors | VLSI (traditional) | ANN and other technologies |
| Approach | One rule at a time, sequential | Multiple processing, simultaneous |
| Connections | Externally programmable | Dynamically self-programmable |
| Learning | Algorithmic | Continuously adaptable |
| Fault Tolerance | None | Significant, via neurons |
| Programming | Rule-based | Self-learning |
| Ability to Test | Need big processors | Require multiple custom-built chips |
20 HISTORY OF NEURAL NETWORKS

| TIME PERIOD | Neural Network Activity |
|---|---|
| Early 1950s | IBM tries to simulate the human thought process but fails; traditional computing progresses rapidly |
| 1956 | Dartmouth research project on AI |
| 1959 | Stanford: Bernard Widrow's ADALINE/MADALINE, the first NN applied to a real-world problem |
| 1960s | PERCEPTRON, by Cornell neurobiologist Rosenblatt |
| 1982 | Hopfield (CalTech) models the brain for devices; Japanese 5th-generation computing |
| 1985 | NN conference by the IEEE, spurred by the Japanese threat |
| 1989 | US defense sponsors several projects |
| Today | Several commercial applications; still processing limitations; chips (digital, analog, optical) |