1
Neural Networks
  • Lecture 2
  • By Rob Buxton

2
Last Lecture
  • We looked at the biological underpinnings of
    neural networks
  • This week we will look at some simple
    supervised neural networks
  • We will discuss:
  • How they work
  • What their limitations are

3
Artificial Neurons
  • We have already briefly discussed the structure
    of a typical artificial neuron, commonly referred
    to as the McCulloch and Pitts neuron
  • What are its key features?

4
An artificial neuron
[Diagram: input signals enter the neuron along weighted connections; the neuron sums the weighted inputs, applies an activation, and produces the output signals]
5
Inputs
  • These connections receive signals from input
    devices or other neurons
  • The inputs represent the dendrite structures that
    we discussed in lecture 1
  • It is usual for a neuron to have one input per
    component of the input vector

6
Input vectors
  • This is something we've seen before!
  • An input vector has several components, each
    corresponding to a feature, or a meaningful
    measurement that we have derived

A 4-component vector: (0.2, 0.4, 0.7, 0.9)
7
Weights
  • The weights are a simplistic way of trying to
    emulate the complex patterns of activity that we
    see in the brain
  • The weights regulate the impact of each input
    value (vector component)
  • The weights can be changed during a training
    period to allow a network to learn to solve a
    problem

8
Neuron summation
  • The neuron needs a method of combining all the
    inputs so that it can act upon the information it
    receives
  • Unlike the biological neuron, which combines
    inputs in a complex fashion, the artificial
    neuron simply sums the weighted input values

9
Decision function
  • Also sometimes called an activation function
  • This is a simple mathematical function that
    determines whether the neuron will fire
  • Many different types of function have been
    tried; we will only look at the most common ones

10
Output
  • This is a value indicating the state of the
    neuron dependent on the inputs presented to it
  • The output value can be discrete, indicating a
    class (e.g. 0 and 1, or -1 and 1), or real-valued
    (e.g. 0.7), depending upon the type of network
    and the nature of the problem

11
Let's put all that together!
  • The neuron computes the weighted sum of the input
    signals
  • Compares the result with a threshold value
  • If the net input is less than the threshold,
    output -1
  • Otherwise, output 1


Note: this is for the McCulloch and Pitts neuron;
the output values can vary in other models
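The weighted-sum-and-threshold behaviour described above can be sketched in a few lines. The input, weight, and threshold values below are made-up placeholders, not values from the lecture:

```python
# A minimal sketch of the McCulloch and Pitts neuron described above.

def mp_neuron(inputs, weights, threshold):
    """Weighted sum of the inputs, compared against a threshold."""
    net = sum(x * w for x, w in zip(inputs, weights))
    # Output -1 if the net input is below the threshold, else 1
    return 1 if net >= threshold else -1

# Hypothetical example values
print(mp_neuron([0.2, 0.4, 0.7], [0.5, -1.0, 0.8], 0.2))  # prints 1
```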
12
Activation functions
[Diagram: plots of three activation functions, Y against X]
Step function: Y = 1 if X ≥ 0, else Y = 0
Sign function: Y = 1 if X ≥ 0, else Y = -1
Sigmoid function: Y = 1 / (1 + e^-X)
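The three common activation functions on this slide translate directly into code:

```python
import math

def step(x):
    """Step function: Y = 1 if X >= 0, else 0."""
    return 1 if x >= 0 else 0

def sign(x):
    """Sign function: Y = 1 if X >= 0, else -1."""
    return 1 if x >= 0 else -1

def sigmoid(x):
    """Sigmoid function: Y = 1 / (1 + e^-X)."""
    return 1.0 / (1.0 + math.exp(-x))
```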
13
Turing's ideas
  • It is not widely realised, but Turing was
    probably the first person to consider building
    computing machines out of simple neuron-like
    elements
  • He proposed a training regime and his networks,
    had they been adopted, might have been
    considerably more advanced than those we will now
    consider

14
Rosenblatt's perceptron
  • The simplest supervised neural network
  • This is a single layer of one or more
    artificial neurons, depending upon the problem at
    hand
  • For example…

15
How the problem dictates the network structure
Our system classifies cubes as large, medium, and small.
The cubes have 3 dimensions, so 3 inputs are needed; 3 output classes means 3 neurons are needed (L → Class 1, M → Class 2, S → Class 3).
[Diagram: only the weighted connections w1, w2, w3 from one input, i1, are shown; if all were shown there would be 9 weights]
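The 3-input, 3-neuron layer above can be sketched as follows: each output neuron has its own weight for each input, giving the 9 weights mentioned on the slide. All weight and input values here are made-up placeholders:

```python
# A sketch of a single-layer perceptron with 3 inputs and 3 output neurons.

def layer_output(x, W, thresholds):
    """One output per neuron: weighted sum of the inputs vs its threshold."""
    outputs = []
    for weights, theta in zip(W, thresholds):
        net = sum(xi * wi for xi, wi in zip(x, weights))
        outputs.append(1 if net >= theta else 0)
    return outputs

x = [0.2, 0.4, 0.7]            # the 3 cube dimensions (hypothetical values)
W = [[0.5, 0.1, -0.3],         # weights into the "large" neuron
     [-0.2, 0.6, 0.4],         # weights into the "medium" neuron
     [0.3, -0.5, 0.4]]         # weights into the "small" neuron
print(layer_output(x, W, [0.0, 0.0, 0.0]))  # prints [0, 1, 1]
```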
16
How does the perceptron learn?
  • Rosenblatt proposed a simple learning algorithm
    whereby the weights could be modified according
    to the amount of error present during training
  • This sounds very confusing, how does it work?

17
Supervised
  • We started the lecture by saying we were going to
    discuss supervised neural networks today; this
    is where we find out what that means!
  • The learning strategy employed by a supervised NN
    (e.g. a perceptron) needs a target value to
    update itself during the training process

18
The algorithm
[Diagram: the perceptron training loop. The data is presented at the weighted inputs (w1, w2, w3) and summed; the output (0 or 1) is compared with the target value; the error e(p) = Yd(p) − Y(p) drives the weight changes Δw1, Δw2, Δw3; then p = p + 1 and the next training example is presented]
19
The Equations
Y(p) = step[ Σ xi(p) · wi(p) − θ ]
The output Y is calculated using the equation above.
Δwi(p) = α · xi(p) · e(p), where e(p) = Yd(p) − Y(p)
The changes in weights are derived using the error, as above.
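The training loop described on the previous two slides can be sketched as below. The toy dataset (the logical AND function), the learning rate alpha, and the threshold theta are illustrative assumptions, not values from the lecture:

```python
# A sketch of the perceptron learning rule: weights change in proportion
# to the error e(p) = Yd(p) - Y(p) on each training example.

def step(x):
    return 1 if x >= 0 else 0

def train_perceptron(data, alpha=0.1, theta=0.2, epochs=50):
    w = [0.0, 0.0]
    for _ in range(epochs):
        for x, yd in data:                   # p-th training example
            y = step(sum(xi * wi for xi, wi in zip(x, w)) - theta)
            e = yd - y                       # e(p) = Yd(p) - Y(p)
            w = [wi + alpha * xi * e         # delta wi = alpha * xi * e(p)
                 for wi, xi in zip(w, x)]
    return w

and_data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w = train_perceptron(and_data)
```

AND is linearly separable, so (per the limitations slide that follows) the perceptron can learn it.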
20
Limitations
  • These simple perceptron type neural networks do
    have limitations
  • They can only solve problems where there is a
    straight-line (linear) decision boundary between
    classes
  • What does this mean?
  • A lot of real-life problems are too complex to be
    bounded by this criterion

21
The solution
  • Perceptrons with multiple layers
  • These have the flexibility to be able to model
    the curved decision boundaries needed in most
    real life situations

22
Multi Layer Perceptrons
  • This is a feedforward neural network with 1 or
    more hidden layers
  • The layout is:
  • Input layer of source neurons
  • 1 or more hidden layers
  • An output layer

23
Multi Layer Perceptron
[Diagram: a multi-layer perceptron with an input layer, hidden layers, and an output layer]
24
The Back Propagation Algorithm
  • This is a more complex, 2-pass learning strategy
  • Activations are calculated in the same way as in
    Rosenblatt's perceptron EXCEPT that the
    activation function is sigmoidal
  • The error at each layer is then calculated, and
    this is used to update the weights on the
    connections
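The 2-pass idea above can be sketched for a tiny network: a forward pass with sigmoidal activations, then an error signal propagated back from the output layer to update the weights on the connections. The network size, learning rate, and training example are illustrative assumptions (and biases are omitted for brevity):

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_step(x, target, W1, W2, lr=0.5):
    # Pass 1 (forward): hidden activations, then the output
    h = [sigmoid(sum(xi * w for xi, w in zip(x, col))) for col in W1]
    y = sigmoid(sum(hi * w for hi, w in zip(h, W2)))

    # Pass 2 (backward): output delta, then hidden deltas via the chain rule
    d_out = (target - y) * y * (1 - y)
    d_hid = [d_out * W2[j] * h[j] * (1 - h[j]) for j in range(len(h))]

    # Update the weights on both layers using the propagated error
    W2 = [W2[j] + lr * d_out * h[j] for j in range(len(h))]
    W1 = [[w + lr * d_hid[j] * xi for w, xi in zip(col, x)]
          for j, col in enumerate(W1)]
    return W1, W2, (target - y) ** 2

random.seed(0)
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]  # 2 hidden neurons
W2 = [random.uniform(-1, 1) for _ in range(2)]
x, target = [0.0, 1.0], 1.0
errs = []
for _ in range(100):
    W1, W2, err = train_step(x, target, W1, W2)
    errs.append(err)  # the squared error shrinks as training proceeds
```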

25
Advantages
  • Neural networks are a robust and proven AI
    technique
  • They are good at dealing with uncertainty
  • They perform well when dealing with noisy or
    incomplete data

26
Disadvantages
  • They require a lot of training data
  • They provide an opaque solution
  • They can be prone to overtraining