Transcript and Presenter's Notes


1
(No Transcript)
2
I welcome you all to this presentation on
3
Neural Network Applications
Imran Nadeem (220504), Naveed R. Butt (230353)
  • Systems Engineering Dept., KFUPM

4
  • Part I: Introduction to Neural Networks
  • Part II: LMS & RBF
  • Part III: Control Applications
5
Part I: Introduction to NNs
  • There is no restriction that the unknown function be linear; neural
    networks therefore provide a natural extension to nonlinear adaptive
    control schemes.
  • Universal Approximation Theorem: a neural network can approximate any
    continuous nonlinear function arbitrarily well over a bounded (compact)
    input set.
  • Neural networks are parameterized nonlinear functions whose parameters
    can be adjusted to realize differently shaped nonlinearities.
  • In essence, we adjust the neural network to serve as an approximator for
    an unknown function that we know only through its inputs and outputs
    (a minimal sketch follows this list).
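The bullets above describe adjusting a parameterized nonlinear function from input-output samples alone. As a minimal illustrative sketch (not from the slides; the unknown function, network size, and learning rate are arbitrary choices):

```python
import numpy as np

# Unknown function, available only through input-output samples.
def unknown_f(x):
    return np.sin(2.0 * x) + 0.3 * x**2

rng = np.random.default_rng(0)
x = rng.uniform(-2.0, 2.0, size=(200, 1))
y = unknown_f(x)

# Parameterized nonlinear approximator: y_hat = W2 @ tanh(W1 x + b1) + b2
hidden = 10
W1 = rng.normal(0, 1, (hidden, 1)); b1 = np.zeros((hidden, 1))
W2 = rng.normal(0, 1, (1, hidden)); b2 = np.zeros((1, 1))
lr = 0.05

for _ in range(2000):
    h = np.tanh(W1 @ x.T + b1)          # hidden layer, shape (hidden, N)
    y_hat = (W2 @ h + b2).T             # network output, shape (N, 1)
    e = y_hat - y                       # approximation error
    # Gradients of the mean-squared error w.r.t. each parameter
    gW2 = (e.T @ h.T) / len(x)
    gb2 = e.mean(axis=0, keepdims=True)
    dh = (W2.T @ e.T) * (1 - h**2)      # backpropagate through tanh
    gW1 = (dh @ x) / len(x)
    gb1 = dh.mean(axis=1, keepdims=True)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1
```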

6
Human Neuron
7
Artificial Neuron
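This slide is a figure with no transcript. As a minimal sketch of the standard artificial-neuron model it presumably depicts (weighted sum of inputs plus a bias, passed through an activation; the sigmoid here is an assumption):

```python
import numpy as np

def artificial_neuron(x, w, b):
    """Weighted sum of inputs plus bias, passed through a sigmoid activation."""
    v = np.dot(w, x) + b               # induced local field
    return 1.0 / (1.0 + np.exp(-v))    # sigmoid output

# Example: 3 inputs with arbitrary weights and bias
print(artificial_neuron(np.array([0.5, -1.0, 2.0]),
                        np.array([0.8, 0.2, -0.5]),
                        b=0.1))
```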
8
Adaptation in NNs
9
Single Layer Feedforward NNs
10
Multi-Layer Feedforward NNs
11
Recurrent (feedback) NNs
A recurrent neural network distinguishes itself
from a feed-forward network in that it has at
least one feedback loop. For example, a recurrent
network may consist of a single layer of neurons,
with each neuron feeding its output signal back
to the inputs of all the other neurons.
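As a minimal sketch of the single-layer recurrent structure just described, in which every neuron's output is fed back and combined with the external input (sizes and weights are arbitrary placeholders):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4                                    # neurons in the single layer
W_in = rng.normal(0, 0.5, (n, 2))        # weights on the external inputs
W_fb = rng.normal(0, 0.5, (n, n))        # feedback weights (outputs -> inputs)
y = np.zeros(n)                          # previous outputs, fed back

for t in range(10):
    u = np.array([np.sin(0.3 * t), 1.0])     # external input at time t
    y = np.tanh(W_in @ u + W_fb @ y)          # each neuron sees all fed-back outputs
```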
12
Recurrent (feedback) NNs
The presence of feedback loops has a profound
impact on the learning capability of the network
and on its performance.
13
Applications of NNs
Neural networks are applicable in virtually every
situation in which a relationship exists between the
predictor variables (independents, inputs) and the
predicted variables (dependents, outputs), even when
that relationship is very complex and not easy to
articulate in the usual terms of "correlations" or
"differences between groups".
14
Applications of NNs
  • Detection of medical phenomena
  • Stock market prediction
  • Credit assignment
  • Condition Monitoring
  • Signature analysis
  • Process control
  • Nonlinear Identification & Adaptive Control

15
End of Part I
16
Part II: LMS & RBF
  • LMS: The Adaptation Algorithm
  • RBF: Radial Basis Function NN
17
LMS: The Adaptation Algorithm
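The update equations themselves are only in the slide image. Assuming the standard LMS rule w ← w + μ·e·x with error e = d − y, a minimal sketch:

```python
import numpy as np

def lms_update(w, x, d, mu=0.01):
    """One LMS step: output, error, and the gradient-descent weight update."""
    y = np.dot(w, x)        # filter/neuron output
    e = d - y               # error against the desired response d
    return w + mu * e * x, e

# Example: identify a 3-tap linear system from input/desired pairs
rng = np.random.default_rng(2)
w_true = np.array([0.7, -0.3, 0.1])
w = np.zeros(3)
for _ in range(5000):
    x = rng.normal(size=3)
    d = np.dot(w_true, x) + 0.01 * rng.normal()   # noisy desired response
    w, _ = lms_update(w, x, d, mu=0.05)
```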
18
RBF-NNs
Radial functions are a special class of
functions. Their characteristic feature is that
their response decreases (or increases)
monotonically with distance from a central point
and they are radially symmetric.
19
RBF-NNs
Gaussian RBF
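The Gaussian RBF formula on this slide is not in the transcript. Assuming the common form φ(r) = exp(−r² / 2σ²), where r is the distance from the center, a minimal sketch showing the monotone decay described on the previous slide:

```python
import numpy as np

def gaussian_rbf(x, center, width):
    """Gaussian radial basis function: depends only on the distance ||x - center||."""
    r = np.linalg.norm(x - center)
    return np.exp(-r**2 / (2.0 * width**2))

# Response decays monotonically with distance from the center
for r in [0.0, 0.5, 1.0, 2.0]:
    print(r, gaussian_rbf(np.array([r]), center=np.array([0.0]), width=1.0))
```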
20
RBF-NNs
Neural networks based on radial basis functions
are known as RBF neural networks and are among
the most commonly used neural networks.
21
RBF-NNs
  • Two-layer feed-forward networks.
  • Hidden nodes: radial basis functions.
  • Output nodes: linear summation.
  • Very fast learning.
  • Good for interpolation, estimation & classification
    (a training sketch follows this list).
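As a minimal sketch of the two-layer structure above, with fixed Gaussian hidden units and a linear output layer. Because the output layer is a linear summation, its weights come from a single least-squares solve, which is what makes learning fast (the centers, width, and target function here are arbitrary choices):

```python
import numpy as np

def rbf_design_matrix(X, centers, width):
    """Hidden-layer outputs: one Gaussian basis function per center."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width**2))

rng = np.random.default_rng(3)
X = rng.uniform(-2, 2, (100, 1))
y = np.sin(2 * X[:, 0])                       # target to interpolate

centers = np.linspace(-2, 2, 10).reshape(-1, 1)   # fixed hidden-node centers
Phi = rbf_design_matrix(X, centers, width=0.5)

# Linear output layer: weights from one least-squares solve
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
y_hat = rbf_design_matrix(X, centers, 0.5) @ w
```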

22
Part III: Control Applications
  • Nonlinear System Identification
  • Adaptive Tracking of Nonlinear Plants
23
Nonlinear System Identification
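The identification slides that follow are SIMULINK figures with no transcript. As a generic, hedged sketch of neural-network system identification in the series-parallel arrangement, using an RBF model adapted online with LMS (the plant, centers, and step size below are placeholders, not the CSTR example of the next slide):

```python
import numpy as np

rng = np.random.default_rng(4)

def plant(y_prev, u):
    """Hypothetical first-order nonlinear plant (stand-in for the slide's example)."""
    return 0.6 * np.sin(y_prev) + 0.4 * u

# Gaussian RBF features of the regressor [y(k), u(k)]
centers = rng.uniform(-1, 1, (15, 2))
def phi(reg, width=0.7):
    return np.exp(-((reg - centers) ** 2).sum(axis=1) / (2 * width**2))

w = np.zeros(15)        # output weights, adapted online with LMS
y = 0.0
for k in range(3000):
    u = rng.uniform(-1, 1)
    reg = np.array([y, u])                     # regressor from measured signals
    y_next = plant(y, u)                       # measured plant output
    y_hat = w @ phi(reg)                       # model prediction (series-parallel)
    w += 0.05 * (y_next - y_hat) * phi(reg)    # LMS weight update
    y = y_next
```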
24
Nonlinear System Identification
Continuously Stirred Tank Reactor
25
Nonlinear System Identification
Simulation Results Using SIMULINK
26
Adaptive Nonlinear Tracking
27
Adaptive Nonlinear Tracking
Hammerstein Model
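The Hammerstein model slide is a figure with no transcript. Its defining structure is a static (memoryless) input nonlinearity in cascade with a linear dynamic block; the specific nonlinearity and coefficients below are placeholders:

```python
import numpy as np

def hammerstein_step(y_prev, u):
    """Hammerstein model: static input nonlinearity followed by linear dynamics."""
    v = np.tanh(u)                  # static nonlinearity (assumed form)
    return 0.8 * y_prev + 0.2 * v   # first-order linear block (assumed coefficients)

y = 0.0
for t in range(50):
    u = np.sin(0.1 * t)             # example input signal
    y = hammerstein_step(y, u)
```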
28
Adaptive Nonlinear Tracking
Simulation Results Using SIMULINK
29
Adaptive Nonlinear Tracking
Simulation Results Using SIMULINK
30
Thank you