Title: Introduction to Neural Networks
1 Introduction to Neural Networks
2 Contents
- Neuronal Networks
- Real to artificial NN
- Bits of history
- Learning
- Information Processing
- Preprocessing
- Variable selection
- Network parameters
- Postprocessing
- NN flavors
- Kohonen
- Time recursive
- NN for time series and finances
- Structure of time series
- NN enhancement
- Tutorials
- Antibiotics
- Car insurance
- Credit card
- Sales forecast
- Stocks
3 What is a neural network?
NN Basics
Model for the brain
Data
Historic data: variables -> goals. New data: variables -> ?
Neural Networks learn from examples
Mathematician / Physicist: a universal approximant (a huge set of functions which is unbiased, robust, flexible and implements Bayesian inference)
Business person: a prediction tool (objective, consolidated, adaptable to complex problems, integrable)
4 What are they good for?
NN Basics
- Classification
- Good/bad client, helicity of a particle
- Interpolation
- I need to guess the behavior of a client
- Optimize the operation of a chemical oven
- Modeling
- Build a quantitative model for fire propagation in cables
- Prediction
- Sunspots, sales forecast
They can be used to deal with any statistical
inference problem
5 Idea: copy nature. Real neural networks
Real to artificial NN
6 Real to artificial NN
Large sets of neurons take control of highly specialized tasks.
Connectivity among sets is very complex.
7 Real to artificial NN
Real neural networks differ in shape and tasks.
Our brain contains over 1 000 000 000 000 neurons.
Each neuron handles thousands of connections.
Every minute some 10 000 neurons die in our brain!
8 The neuron
Real to artificial NN
- Neuron
- Dendrites
- Axon
- ....
How does a neuron work?
9 Real to artificial NN
Flow of charged ions (calcium)!
Synapses
- A neuron can
- collaborate in the activation of other neurons
- inhibit the activation of other neurons
10 Real to artificial NN
Incoming potentials V1, V2, V3 arrive at a neuron with threshold U.
If the incoming potential gets over the threshold, the neuron fires.
11 Short summary of real NN
Real to artificial NN
- Information processing takes place in neural networks
- Information is transferred by flows of electricity
- Neurons die, but information processing remains robust
- A neuron fires depending on a local processing of inputs versus a threshold
- Synapses evolve in time (enhanced / suppressed)
12 Artificial Neural Networks
13 The big picture
Bits of history
- Alan Turing (37), Church, Post: the Turing machine
- McCulloch and Pitts (43): the binary neuron
- John von Neumann: the von Neumann computer
- Two major schools of thought, 50s-60s
- Symbol manipulation
- Intelligent behavior consists of rules to manipulate symbols (the subsymbolic level is overlooked)
- Pattern matching, or feature detection
- Hearing, vision, taste, and tactile input to the brain
- People develop many context-sensitive models of what to expect as we interact with the world
14 Bits of history
- Bottom-up: examples, parallel, fuzzy, robust, general
- Top-down: rules, serial, Boolean, brittle, expert
- Prolog and Lisp, AI machines
- Rule-based expert systems
- By the mid-1980s it was realized that the idea was not a full success
- This led to a reexamination of the work from the 1960s on neural networks
15 Learning to learn
Bits of history
- Hebb (49), Caianiello (61)
- First learning algorithms
- Rosenblatt (62)
- Perceptron learning rule
- Minsky and Papert (69)
- XOR (CNOT) cannot be learnt by a perceptron
- Little (74), Hopfield (82), ...
- Relation to spin glasses
- Content-addressable associative memory
- (80s) Kohonen, Carpenter, Grossberg, Rumelhart, Zipser
- Unsupervised learning
- Werbos (74), then Parker, Rumelhart, Hinton, Williams (85)
- Error back-propagation learning
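To make the perceptron rule and its XOR limitation concrete, here is a minimal sketch (the learning rate, epoch count and helper names are illustrative assumptions, not part of the original material):

```python
import numpy as np

def train_perceptron(inputs, targets, epochs=100, lr=0.1):
    """Rosenblatt's rule: nudge the weights only when the thresholded
    output disagrees with the target."""
    rng = np.random.default_rng(0)
    w = rng.normal(size=inputs.shape[1])    # input weights
    b = 0.0                                 # bias (negative threshold)
    for _ in range(epochs):
        for x, target in zip(inputs, targets):
            y = 1 if x @ w + b > 0 else 0   # hard-threshold neuron
            w += lr * (target - y) * x      # no update when y == target
            b += lr * (target - y)
    return w, b

def accuracy(w, b, inputs, targets):
    preds = (inputs @ w + b > 0).astype(int)
    return (preds == targets).mean()

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
and_y = np.array([0, 0, 0, 1])   # linearly separable: learnable
xor_y = np.array([0, 1, 1, 0])   # not linearly separable: the rule fails

for name, y in [("AND", and_y), ("XOR", xor_y)]:
    w, b = train_perceptron(X, y)
    print(name, "accuracy:", accuracy(w, b, X, y))
```

Running it, AND is learnt perfectly while XOR stays stuck below full accuracy, which is exactly the limitation Minsky and Papert pointed out.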
16 Real vs artificial neuron
NN Basics
Diagram: input weights, activation, threshold, output weights
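A minimal sketch of such an artificial neuron, assuming a sigmoid activation and made-up example values:

```python
import numpy as np

def neuron(x, w, t):
    """Artificial neuron: weighted sum of the inputs minus a threshold,
    passed through an activation function (here a sigmoid)."""
    potential = np.dot(w, x) - t             # incoming potential vs. threshold
    return 1.0 / (1.0 + np.exp(-potential))  # smooth "fire / don't fire"

x = np.array([0.2, 0.7, 0.1])   # input signals (e.g. V1, V2, V3)
w = np.array([0.5, -1.0, 2.0])  # input weights
print(neuron(x, w, t=0.3))      # an output close to 1 means the neuron "fires"
```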
17 How does a neural network work?
NN Basics
Multilayer feedforward neural network
layer 1, layer 2, ..., layer l
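A rough sketch of how an input propagates layer by layer (the layer sizes, random weights and sigmoid activation are assumptions for illustration):

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def forward(x, layers):
    """Propagate an input through a multilayer feedforward network.
    `layers` is a list of (W, t) pairs: the weight matrix and the
    threshold vector of layer 1, layer 2, ..., layer l."""
    out = x
    for W, t in layers:
        out = sigmoid(W @ out - t)   # every neuron: weighted sum vs. threshold
    return out

rng = np.random.default_rng(1)
layers = [(rng.normal(size=(4, 3)), rng.normal(size=4)),   # layer 1: 3 inputs -> 4 neurons
          (rng.normal(size=(2, 4)), rng.normal(size=2))]   # layer 2: 4 -> 2 outputs
print(forward(np.array([0.1, 0.5, 0.9]), layers))
```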
18 NN Basics
- The activation function can make the response of the neurons non-linear
- The weights w and the thresholds t define the way information is processed in every neuron
- The number of layers and of neurons in each layer defines the architecture of the neural network
The error back-propagation learning algorithm (1985) is a systematic procedure for adjusting the weights and thresholds of a neural network so that it reproduces known example patterns. No knowledge of the underlying model is necessary.
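A minimal back-propagation sketch for a small two-layer network, assuming a squared-error loss and sigmoid activations (the layer sizes, learning rate and the XOR toy task are illustrative choices, not prescribed by the slides):

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

rng = np.random.default_rng(0)
# Toy example patterns: 2 inputs -> 1 target (XOR, which needs a hidden layer).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
Y = np.array([[0.], [1.], [1.], [0.]])

W1, t1 = rng.normal(size=(2, 4)), np.zeros(4)   # input -> hidden
W2, t2 = rng.normal(size=(4, 1)), np.zeros(1)   # hidden -> output
lr = 1.0

for epoch in range(10000):
    # Forward pass: each neuron compares a weighted sum to its threshold.
    h = sigmoid(X @ W1 - t1)
    y = sigmoid(h @ W2 - t2)

    # Backward pass: propagate the output error back through the layers.
    err = y - Y                             # derivative of the squared error
    d_out = err * y * (1 - y)               # output-layer delta
    d_hid = (d_out @ W2.T) * h * (1 - h)    # hidden-layer delta

    # Gradient-descent updates of the weights and thresholds.
    W2 -= lr * h.T @ d_out
    t2 += lr * d_out.sum(axis=0)            # sign flipped: the potential is h @ W2 - t2
    W1 -= lr * X.T @ d_hid
    t1 += lr * d_hid.sum(axis=0)

# The outputs typically approach the target pattern [0, 1, 1, 0];
# plain batch gradient descent can occasionally stall in a local minimum.
print(np.round(y, 2))
```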
19 NN Basics
T vs C?
(Example input patterns: images of the letters T and C)
- Training (sketched in code below)
- 0. Random w and t
- 1. Feed an example (T)
- 2. Output T: fine; output C: error
- 3. Propagate a change of w and t through the net to reduce the error
- 4. Go to 1
Supervised learning of T / C
Robust if a neuron dies!
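A compact sketch of this training loop, with a single logistic neuron on a made-up binary toy task standing in for the T / C images (all sizes, names and the labelling rule are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy stand-in for the T / C patterns: 6-pixel binary "images".
X = rng.integers(0, 2, size=(20, 6)).astype(float)
labels = (X.sum(axis=1) > 3).astype(float)   # arbitrary separable rule

# 0. Random w and t
w, t = rng.normal(size=6), 0.0
lr = 0.5

for epoch in range(200):
    for x, target in zip(X, labels):
        # 1. Feed an example
        y = 1.0 / (1.0 + np.exp(-(w @ x - t)))
        # 2. Compare the output with the known class (fine / error)
        err = y - target
        # 3. Propagate a change of w and t to reduce the error
        grad = err * y * (1 - y)
        w -= lr * grad * x
        t += lr * grad          # sign flipped: the potential is w @ x - t
    # 4. Go to 1 (next example / next epoch)

pred = (1.0 / (1.0 + np.exp(-(X @ w - t))) > 0.5).astype(float)
print("training accuracy:", (pred == labels).mean())   # should be high on this separable toy task
```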
20 Serious pattern recognition
A neural network is trained to recognize military plane patterns.
The NN detects a military plane hidden under a commercial one.
Belgrade, 19/04/1999
21 Summary
- Nature has tried many problem-solving approaches
- Neural networks implement inference through learning
- NNs are robust, non-linear, adaptable, consolidated
- They learn from incomplete, deteriorated data
- Standard in scientific data analysis