Title: Functional Networks Framework
1 Functional Networks Framework
ICS 581 Advanced Artificial Intelligence
Lecture 15
Dr. Emad A. A. El-Sebakhy
Term 061, Meeting Time: 6:30-7:45, Location: Building 22, Room 132
2 Agenda
- Introduction
- What is a Neural Network (NN)?
- What is a Functional Network (FunNet)?
- How do Functional Networks work?
- Differences between FunNets and NNs
- Examples of some applications of FunNets
- Summary
3 Introduction
The Data Pyramids
4 Knowledge Discovery (KD)
- Data accumulates rapidly and needs to be analyzed.
- KD is the use of computational intelligence schemes to extract hidden patterns (useful information) from bodies of data for use in decision support and estimation. It is the automated extraction of hidden predictive information from large databases.
- Prediction or estimation of an outcome
- Classification (supervised learning)
- Clustering (unsupervised learning)
5 The Common Learning Schemes
6 Artificial Neural Networks (ANNs): Background
A neural network is a powerful data modeling tool that is able to represent complex input/output relationships (relationships that cannot be described by traditional methods).
An ANN is an information processing system that tries to simulate the human brain in the following two ways:
- A NN acquires knowledge through learning.
- A NN's knowledge is stored within inter-neuron connection strengths known as synaptic weights.
The goal of a NN:
- Create a model that maps the input to the output using input data.
- Use the model to predict the desired output when it is unknown.
7 The Most Common Neural Network Architecture: Multilayer Perceptron (MLP)
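For concreteness, the sketch below (not from the slides) shows how such an MLP computes its output: every neuron applies the same fixed activation to a weighted sum of its inputs, and all the learned knowledge lives in the weight matrices. The layer sizes, activation, and weight names here are illustrative assumptions.

```python
# Minimal sketch (assumed architecture, not from the slides) of an MLP forward pass:
# one hidden layer with a sigmoid activation and a linear output unit.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mlp_forward(x, W1, b1, W2, b2):
    """Map an input vector x to an output: each neuron computes a weighted
    sum of its inputs (the synaptic weights) followed by a fixed activation."""
    h = sigmoid(W1 @ x + b1)   # hidden layer: identical, single-argument activations
    return W2 @ h + b2         # linear output layer

# Hypothetical dimensions: 3 inputs, 4 hidden neurons, 1 output.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)
print(mlp_forward(np.array([0.2, 0.5, 0.1]), W1, b1, W2, b2))
```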
8 Advantages of Neural Networks
- The true power and advantage of neural networks lies in their ability to
- represent both linear and non-linear relationships
- learn these relationships directly from the data being modeled
Disadvantages of Neural Networks
- The weights of the MLP network are initially set to random values, so the learning algorithm may take a large number of iterations to converge.
- The speed of convergence and the stability of the backpropagation learning algorithm depend on the magnitude of the learning rate parameter; a poor choice can cause oscillations during training.
- The configuration of the MLP network is determined by the number of hidden layers and the number of neurons in each hidden layer. These choices are still critical, and the number of hidden layers is typically determined by trial and error.
9 Limitations of Neural Networks
- Ad hoc approach for determining the network structure and the training process.
- Significant inputs are not immediately obvious.
- When to stop training to avoid over-fitting?
- Stuck at local minima: the network may be unable to converge to the optimal solution because of the initial random weights.
- A neural network model is a relative "black box" and has limited ability to explicitly identify possible causal relationships.
- The multi-layer feed-forward perceptron requires off-line training and iterative presentation of the training data.
- The choice of the number of hidden layers and the number of neurons in each hidden layer is arbitrary (trial and error).
- In practice, it is difficult to determine a sufficient number of neurons necessary to achieve the desired degree of approximation accuracy.
10 Agenda
- References
- What is a Neural Network (NN)?
- What is a Functional Network (FunNet)?
- How do Functional Networks work?
- Differences between FunNets and NNs
- Examples of some applications of FunNets
- Summary
11 What is a Functional Network (FunNet)?
Like a neural network, a functional network is a powerful data modeling tool that is able to capture and represent complex input/output relationships.
Functional networks, however, are a generalization or extension of neural networks. They are also problem driven (not a black box).
12 The Mathematical Definition of a FunNet
A functional network is a pair (X, F), where X is a set of nodes and F is a set of neuron functions over X, such that every node must be either an input or an output node of at least one neuron function in F.
13 A FunNet is Analogous to a Printed Circuit Board (PCB)
14 Elements of Functional Networks
1. Input units: a, b, c, d, e, f, g
(Figure: example functional network with units a-j and neuron functions K, L, M, N.)
15 Elements of Functional Networks
2. Computing neurons: K, L, M, N
16 Elements of Functional Networks
3. Output units: i, j
17 Elements of Functional Networks
4. Intermediate units: h
18 Elements of Functional Networks
5. Directed links: arrows
19 Note: neuron N produces two outputs, i and j, and its output i must be identical to the output of neuron M.
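To make the definition on slide 12 and the shared-output requirement on slide 19 concrete, here is a small illustrative sketch. The topology and the neuron functions are hypothetical (they do not reproduce the figure on slides 14-19), and the dictionary-based encoding is just one possible representation.

```python
# Illustrative sketch: a functional network as nodes plus neuron functions,
# where each neuron maps some input nodes to an output node, and two neurons
# that write to the same node must agree on its value (slide 19).
def evaluate(inputs, neurons):
    values = dict(inputs)
    pending = list(neurons)
    while pending:
        progressed = False
        for neuron in list(pending):
            in_nodes, out_node, func = neuron
            if all(v in values for v in in_nodes):
                out = func(*(values[v] for v in in_nodes))
                # Shared outputs of different neurons are forced to be identical.
                if out_node in values and abs(values[out_node] - out) > 1e-9:
                    raise ValueError(f"neurons disagree on node {out_node!r}")
                values[out_node] = out
                pending.remove(neuron)
                progressed = True
        if not progressed:
            raise ValueError("network has neurons whose inputs never become available")
    return values

# Hypothetical topology: two neurons feed the same intermediate node h and must agree.
neurons = [
    (["a", "b"], "h", lambda a, b: a + b),
    (["c", "d"], "h", lambda c, d: c * d),   # must agree with the neuron above on h
    (["h", "e"], "i", lambda h, e: h - e),
]
print(evaluate({"a": 1.0, "b": 2.0, "c": 3.0, "d": 1.0, "e": 0.5}, neurons))
```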
20 Agenda
- References
- What is a Neural Network (NN)?
- What is a Functional Network (FN)?
- How do Functional Networks work?
- Differences between FNs and NNs
- Examples of some applications of FNs
- Summary
21 How Do Functional Networks Work?
- Selection of the initial topology
- Simplifying the initial topology
- Uniqueness of representation
- Parametric Learning
- Model selection
- Model validation
22 1. Selection of the Initial Topology
- Problem-driven design: the selection of the initial topology of a functional network is often based on the characteristics of the problem at hand.
23 Example: Medical Diagnosis
Suppose the level of a disease d is a function of three symptoms x, y and z, that is, d = D(x, y, z).
Suppose we obtain the symptoms in three different sequences.
Case 1: We measure x and y, then z.
24 Example: Medical Diagnosis
Case 2: We measure y and z, then x.
25 Example: Medical Diagnosis
Case 3: We measure x and z, then y.
26 Example: Medical Diagnosis
Combine the three cases:
Intermediate units
Output unit
This is the initial topology of the functional network.
27 Corresponding Topology of the ANN
What is the difference between ANNs and FunNets?
28 2. Simplifying Functional Nets
- Can the initial topology of a FunNet be simplified?
- Using functional equations, we can determine whether or not there exists a simpler but equivalent functional network which gives the same output for the same input.
29 Simplifying Functional Nets (cont.)
From this, we have d = D(x, y, z) = k[p(x) + q(y) + r(z)] for some unknown functions k, p, q and r.
Therefore, the initial topology can be replaced by a simpler, equivalent functional network.
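The step from the different measurement orderings to this additive form follows the standard generalized-associativity argument from the functional-equations literature cited on slide 30. The note below is an added sketch of that step; the intermediate function names I, J, K, L are chosen here for illustration and are not from the slides.

```latex
% Added note (standard functional-equations argument, not from the slides).
% Case 1 computes d through an intermediate u = I(x, y); Case 2 through v = K(y, z):
\[
d = J\bigl(I(x, y), z\bigr) = L\bigl(x, K(y, z)\bigr).
\]
% Requiring both decompositions to give the same d for all x, y, z is the
% generalized associativity equation, whose general (regular, invertible)
% solution is the additive form used on this slide:
\[
d = D(x, y, z) = k\bigl(p(x) + q(y) + r(z)\bigr).
\]
```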
30 3. Uniqueness of Representation
- Given the topology of a FunNet, we need to know the conditions for uniqueness, i.e., whether or not several sets of functions (neurons) lead to exactly the same output for the same input.
- See the following list of references for more details.
- Castillo, E., Cobo, A., Gómez, N., and Hadi, A. (2000), "A General Framework for Functional Networks," Networks, 35, 70-82.
- Castillo, E., Gutiérrez, J. M., Hadi, A. S., and Lacruz, B. (2001), "Some Applications of Functional Networks in Statistics and Engineering," Technometrics, 43, 10-24.
- Castillo, E., Hadi, A., and Lacruz, B. (2001), "Optimal Transformations in Multiple Linear Regression Using Functional Networks," Proceedings of the International Work-Conference on Artificial and Natural Neural Networks (IWANN 2001), Lecture Notes in Computer Science 2084, Part I, 316-324.
- Castillo, E., Hadi, A. S., Lacruz, B., and Pruneda, R. E. (2003), "Functional Network Models in Statistics," Monografías del Seminario Matemático García de Galdeano, 27, 174-177.
31 More References
- El-Sebakhy, E. A. (2004), "Functional Networks Training Algorithm for Statistical Pattern Recognition," Ninth IEEE International Symposium on Computers and Communications, Vol. 1, 92-97.
- Castillo, E. and Hadi, A. S. (2006), "Functional Networks," in Encyclopedia of Statistical Sciences (S. Kotz, N. Balakrishnan, C. B. Read, and B. Vidakovic, eds.), 4, 2573-2583.
- El-Sebakhy, E. A. (2005), "Unconstrained Functional Networks Classifier," International Conference on Artificial Intelligence and Machine Learning (AIML05), Vol. 3, 19-21 December 2005, 99-105.
- El-Sebakhy, E. A., Faisal, K., El-Bassuny, T., Azzedin, F., and Al-Suhaim, A. (2006), "Evaluation of Breast Cancer Tumor Classification with Unconstrained Functional Networks Classifier," 4th ACS/IEEE International Conference on Computer Systems and Applications, 281-287.
- El-Sebakhy, E. A., Kanaan, F. A., and Hadi, A. S. (2006), "Iterative Least Squares Functional Networks Classifier," IEEE Transactions on Neural Networks, Vol. 2, March 2007.
- El-Sebakhy, E. A. (2007), "Functional Networks as a Novel Approach for Building Knowledge-Based Classification Systems," Journal of Artificial Intelligence (in press).
- El-Sebakhy, E. A. (2007), "Constrained Estimation Functional Networks for Statistical Pattern Recognition Problems," International Journal of Machine Learning (in press).
- El-Sebakhy, E. A. (2007), "Mining the Breast Cancer Diagnosis Using Functional Networks-Maximum Likelihood Classifier," International Journal of Bioinformatics (in press).
32 4. Parametric Learning
Each neuron function in a functional network can be approximated by a family of linearly independent functions Φj = {φj1(x), ..., φjq(x)}, j = 1, ..., p, where p is the number of neuron functions and q is the number of elements in a family.
The common families of linearly independent functions:
Polynomial family
Exponential family
Fourier family
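As a concrete illustration (not from the slides), the sketch below builds the three families named above, taking the conventional bases {1, x, x², ...}, {1, e^x, e^(2x), ...} and {1, sin x, cos x, sin 2x, ...} as an assumption, and fits one unknown neuron function to data by linear least squares. The toy data and function are made up.

```python
# Hedged sketch of approximating one neuron function f(x) = sum_j theta_j * phi_j(x)
# by linear least squares, using assumed versions of the three families named above.
import numpy as np

def polynomial_basis(x, q):
    return np.column_stack([x**j for j in range(q)])           # {1, x, x^2, ...}

def exponential_basis(x, q):
    return np.column_stack([np.exp(j * x) for j in range(q)])  # {1, e^x, e^2x, ...}

def fourier_basis(x, q):
    cols = [np.ones_like(x)]
    k = 1
    while len(cols) < q:
        cols += [np.sin(k * x), np.cos(k * x)]
        k += 1
    return np.column_stack(cols[:q])                            # {1, sin x, cos x, ...}

# Toy data from an unknown function, here f(x) = log(1 + x) plus small noise.
rng = np.random.default_rng(1)
x = rng.uniform(0.0, 2.0, size=50)
y = np.log1p(x) + rng.normal(scale=0.01, size=x.size)

# Least-squares fit of the coefficients for a degree-3 polynomial family.
Phi = polynomial_basis(x, q=4)
theta, *_ = np.linalg.lstsq(Phi, y, rcond=None)
print("estimated coefficients:", np.round(theta, 4))
```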
33 Example: Nonlinear Parametric Learning
Medical Diagnosis Example (cont.)
Let {(xi, yi, zi, di), i = 1, ..., n} be the training set. We can then write the model as di = k[p(xi) + q(yi) + r(zi)], i = 1, ..., n.
These functions can be approximated by linear combinations from a chosen family, e.g., p(x) ≈ Σj aj φj(x), q(y) ≈ Σj bj φj(y), r(z) ≈ Σj cj φj(z), and similarly for k with coefficients dj.
34 Nonlinear Parametric Learning
The parameters Θ = {aj, bj, cj, dj; j = 1, ..., q} can then be estimated by minimizing some function of the errors, such as the sum of squared differences between di and the model output, subject to the uniqueness constraints.
These lead to a nonlinear system of equations or to nonlinear programming problems.
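A heavily simplified sketch of this learning step follows. It is not the slides' algorithm: as an assumption, k is taken to be the identity, which makes the fit a linear least-squares problem, and uniqueness is imposed by dropping the constant terms from p, q and r (i.e., fixing p(0) = q(0) = r(0) = 0) while keeping a single global intercept. The data are synthetic and only exercise the code.

```python
# Hedged sketch: parametric learning for the additive diagnosis model under the
# simplifying assumption k = identity, so d ~ c0 + p(x) + q(y) + r(z).
import numpy as np

def poly_no_constant(v, degree):
    """Polynomial family {v, v^2, ..., v^degree} without the constant term."""
    return np.column_stack([v**j for j in range(1, degree + 1)])

rng = np.random.default_rng(2)
n = 200
x, y, z = rng.uniform(0, 1, (3, n))
# Synthetic "disease level" used only to exercise the code.
d = 0.5 + np.sin(x) + y**2 + 0.3*z + rng.normal(scale=0.01, size=n)

degree = 3
Phi = np.column_stack([np.ones(n),                 # global intercept c0
                       poly_no_constant(x, degree),
                       poly_no_constant(y, degree),
                       poly_no_constant(z, degree)])
theta, *_ = np.linalg.lstsq(Phi, d, rcond=None)
print("fitted coefficients:", np.round(theta, 3))

# If k is also learned, as on the slide, the objective becomes nonlinear in the
# parameters and would be handled with a nonlinear least-squares solver instead.
```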
35 5. Model Selection
- There are two questions to be answered when selecting a functional network:
- Which family of functions to use?
- Which terms in the family are important?
6. Model Validation
Tests for quality and cross-validation are performed, using internal and external validation techniques. See the following list of references for more details.
36 Model Selection
Let x be a sample of size n and let Θ be the set of parameters to be estimated.
- We select the best model as follows:
- Selection methods:
- Backward-Forward (BF)
- Forward-Backward (FB)
- Quality criterion: we use the minimum description length (MDL).
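The exact MDL expression is not reproduced in this transcript, so the sketch below assumes a common form, MDL = (n/2)·log(RSS/n) + (m/2)·log(n) with m selected terms, and uses it to drive a greedy forward-backward term search over an illustrative polynomial candidate family. Both the criterion and the data are assumptions for illustration.

```python
# Hedged sketch of forward-backward term selection for a functional network,
# scored with an assumed MDL-style criterion.
import numpy as np

def mdl_score(Phi, y, idx):
    """MDL-style score of the least-squares fit using the columns in idx."""
    n = len(y)
    if not idx:
        rss = float(np.sum((y - y.mean())**2))
        return 0.5 * n * np.log(rss / n)
    theta, *_ = np.linalg.lstsq(Phi[:, idx], y, rcond=None)
    rss = float(np.sum((y - Phi[:, idx] @ theta)**2))
    return 0.5 * n * np.log(max(rss, 1e-12) / n) + 0.5 * len(idx) * np.log(n)

def forward_backward(Phi, y):
    selected, score = [], mdl_score(Phi, y, [])
    improved = True
    while improved:
        improved = False
        # Forward step: add the candidate term that most reduces the MDL.
        for j in range(Phi.shape[1]):
            if j not in selected:
                s = mdl_score(Phi, y, selected + [j])
                if s < score:
                    score, best = s, j
                    improved = True
        if improved:
            selected.append(best)
        # Backward step: drop any term whose removal reduces the MDL further.
        for j in list(selected):
            trial = [k for k in selected if k != j]
            s = mdl_score(Phi, y, trial)
            if s < score:
                selected, score = trial, s
                improved = True
    return selected, score

# Illustrative candidates: polynomial family {1, x, x^2, x^3, x^4}.
rng = np.random.default_rng(3)
x = rng.uniform(-1, 1, 100)
y = 1.0 + 2.0 * x**2 + rng.normal(scale=0.05, size=x.size)
Phi = np.column_stack([x**j for j in range(5)])
print(forward_backward(Phi, y))
```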
37 More References
See the reference list on slide 31.
38 From Data to Predictions
FunNets Architectural Design
Implementation and Prediction
Functional Networks Training Algorithm
FunNets Learning Algorithm
39 Agenda
- References
- What is a Neural Network (NN)?
- What is a Functional Network (FN)?
- How do Functional Networks work?
- Differences between FNs and NNs
- Examples of some applications of FNs
- Summary
40 Differences Between FunNets and ANNs
- The topology of a NN is chosen from among several topologies using trial and error. The initial topology of a FunNet is problem driven and can be simplified using functional equations.
- In standard NNs, the neuron functions are given and the weights are learned. In FunNets, the neuron functions are learned from data.
- In standard NNs, all the neuron functions are identical, univariate and single-argument (a weighted sum of input values). In FunNets, the neuron functions can be different, multivariate, and/or multi-argument.
- In FunNets, common outputs of different functions (neurons) are forced to be identical. This structure is not possible in standard neural networks.
41 Agenda
- References
- What is a Neural Network (NN)?
- What is a Functional Network (FunNets)?
- How do Functional Networks work?
- Differences between FunNets and NNs
- Examples of some applications of FunNets
- Summary
42 Typical Applications of Functional Networks
- Optical Character Recognition (OCR): scanning typewritten/handwritten documents, fingerprints, etc.
- Voice Recognition: transcribing spoken words into ASCII text.
- Medical Diagnosis: assisting doctors with their diagnosis by analyzing the reported symptoms and/or medical imaging data such as MRIs or X-rays.
- Machine Diagnostics: detecting when a machine has failed so that the system can automatically shut down the machine when this occurs.
- Target Recognition: military application which uses video and/or infrared image data to determine if an enemy target is present.
43 Typical Applications of Functional Networks
- Targeted Marketing: finding the set of demographics which have the highest response rate for a particular marketing campaign.
- Intelligent Searching: an internet search engine that provides the most relevant content and banner ads based on the users' past behavior.
- Fraud Detection: detecting fraudulent credit card transactions and automatically declining the charge.
44 Some Examples of Functional Networks
- Modeling structural engineering problems
- Bayesian Statistics
- Time series
- Iterative problems
- Bioinformatics
- Transformations of variables
- Nonlinear Regression
- Pattern Classification
- Cryptography and Security
- Signal Processing with complex arguments
45 Example: Iterative Functions
- Suppose that we wish to calculate the n-th iterate of a given function f, that is, f^(n)(x) = f(f(... f(x) ...)) (n times).
1. Selecting the Initial Topology
A chain of n identical f-neurons maps the input x to the output y = f^(n)(x).
2. Simplifying the Initial Topology
Let F(x, n) = f^(n)(x). Since f^(m+n)(x) = f^(m)(f^(n)(x)), then F satisfies the translation equation F(F(x, m), n) = F(x, m + n).
46 2. Simplifying the Initial Topology
A general solution of this functional equation is F(x, n) = g^(-1)(g(x) + n) for some invertible function g, so that f(x) = g^(-1)(g(x) + 1).
So, the two functional networks are equivalent.
47 3. Uniqueness of Representation
Since two functions g1 and g2 give the same network output when g1^(-1)(g1(x) + n) = g2^(-1)(g2(x) + n) for all x and n, the uniqueness of representation implies solving this functional equation, whose unique solution is g2(x) = g1(x) + c, where c is an arbitrary constant.
So, the function g must be fixed at a point.
48 4. Learning the Model
49 4. Learning the Model
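The learning step on slides 48-49 is not reproduced in this transcript. The sketch below is an illustration of one way it might proceed, not the slides' algorithm: g in f(x) = g^(-1)(g(x) + 1) is approximated with a polynomial family without a constant term (which plays the role of fixing g at a point, g(0) = 0), the coefficients are found by least squares from pairs (x_i, f(x_i)), and the n-th iterate is then predicted as g^(-1)(g(x) + n) with a brute-force numerical inverse. The toy function is made up.

```python
# Hedged sketch of learning the simplified iterative-function model from data.
import numpy as np

def basis(v, degree):
    return np.column_stack([v**j for j in range(1, degree + 1)])  # {x, x^2, ...}

# Toy data from an illustrative function f(x) = 2x + 1 (so that g(x) = log2(x + 1)).
f = lambda x: 2.0 * x + 1.0
x = np.linspace(0.0, 2.0, 60)
fx = f(x)

# Least squares for the Abel-type condition g(f(x_i)) - g(x_i) = 1.
degree = 5
A = basis(fx, degree) - basis(x, degree)
theta, *_ = np.linalg.lstsq(A, np.ones_like(x), rcond=None)
g = lambda v: basis(np.atleast_1d(v), degree) @ theta

# Brute-force inverse of g on a grid covering the range of interest.
grid = np.linspace(0.0, 15.0, 20000)
g_grid = g(grid)
g_inv = lambda t: grid[np.argmin(np.abs(g_grid - t))]

# Predict the 2nd iterate and compare with the exact f(f(x)) at a test point.
x0 = 0.7
pred = g_inv(g(x0)[0] + 2)
print("predicted f^(2)(x0):", round(float(pred), 3), " exact:", f(f(x0)))
```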
50 Example: Nonlinear Regression
- Consider the semi-parametric regression model
- h(y) = f1(x1) + ... + fq(xq)   (1)
- FunNets do not require the functions h(.), f1(.), ..., fq(.) to be known.
51 Example: Nonlinear Regression
Assuming that h(.) is invertible, the semi-parametric regression model can be represented by y = h^(-1)[f1(x1) + ... + fq(xq)] and the following functional network:
52 Numeric Example
A data set consisting of n = 40 observations is generated, where X1, X2 are U(0, 1) and the error term is U(-0.005, 0.005).
53 Numeric Example
We now fit the following model to these data: h(y) = f1(x1) + f2(x2), which is equivalent to y = h^(-1)[f1(x1) + f2(x2)].
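The slides' generating model and fitted results are not in this transcript, so the sketch below makes up an illustrative data set (matching n = 40 and the U(-0.005, 0.005) error) and fits the separable model h(y) = f1(x1) + f2(x2) by linear least squares. The uniqueness constraints are assumptions chosen for this sketch: h has no constant term and its linear coefficient is fixed to 1, f2(0) = 0, and f1 carries the single constant term.

```python
# Hedged sketch of fitting h(y) = f1(x1) + f2(x2) with polynomial families.
import numpy as np

rng = np.random.default_rng(4)
n = 40
x1, x2 = rng.uniform(0, 1, (2, n))
eps = rng.uniform(-0.005, 0.005, n)
# Illustrative generating model (not the slides'): exp(y) - 1 = x1 + x2^2 + eps.
y = np.log(1.0 + x1 + x2**2 + eps)

def powers(v, lo, hi):
    return np.column_stack([v**j for j in range(lo, hi + 1)])

# Residual: y + sum_j c_j y^j - f1(x1) - f2(x2) ~ 0, linear in all coefficients.
A = np.column_stack([powers(y, 2, 4),          # c_2..c_4 for h (linear coeff fixed to 1)
                     -np.ones(n),              # constant term of f1
                     -powers(x1, 1, 3),        # f1 polynomial terms
                     -powers(x2, 1, 3)])       # f2 polynomial terms (f2(0) = 0)
theta, *_ = np.linalg.lstsq(A, -y, rcond=None)
residual = A @ theta + y
print("RMS residual of the fitted functional equation:",
      float(np.sqrt(np.mean(residual**2))))
```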
54 An Example
Consider a given function f(x) and suppose that we are interested in its n-th iterate.
Assume also that we have a set of data points with which to learn the function f(x).
Then, we select a polynomial family and learn f(x).
55 An Example
Then, we select a polynomial family and minimize the sum of squared errors over the data points.
We get the following models
Exhaustive and Backward-Forward Method
57An Example
Forward-backward method
58An Example
Forward-backward method
59An Example
Exact and predicted values for different values
of n.
60 Questions?