Title: Hypercubes and Neural Networks
Hypercubes and Neural Networks
Modeling
Activation Level
Net Input
Saturation
0 < a_i < 1
Dynamics
da_j/dt = Net_j (1 - a_j) a_j
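The dynamics can be integrated numerically. A minimal sketch, assuming a 3-neuron network with a made-up symmetric weight matrix (not from the slides) and Net_j = sum_i w_ji a_i:

```python
# Euler integration of the saturating dynamics da_j/dt = Net_j (1 - a_j) a_j.
# The weight matrix W is a hypothetical example; Net_j = sum_i W[j][i] * a[i].

def step(a, W, dt=0.01):
    net = [sum(W[j][i] * a[i] for i in range(len(a))) for j in range(len(a))]
    return [a[j] + dt * net[j] * (1 - a[j]) * a[j] for j in range(len(a))]

W = [[0, 1, -1],
     [1, 0, 1],
     [-1, 1, 0]]           # symmetric example weights (assumed)
a = [0.5, 0.4, 0.6]        # initial brain state <a1, a2, a3>

for _ in range(1000):
    a = step(a, W)
print(a)
```

The factor (1 - a_j) a_j vanishes at 0 and 1, which is exactly what keeps each activation saturated inside (0, 1).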
3-Neuron Example
Brain State <a_1, a_2, a_3>
Thinking
Binary Model
a_j = 0 or 1
Neurons are either on or off.
Binary Stability
a_j = 1 and Net_j > 0, or a_j = 0 and Net_j < 0
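The stability condition can be checked mechanically. A small sketch, again with a hypothetical 3-neuron weight matrix:

```python
# A binary state is stable when every neuron satisfies
# a_j = 1 with Net_j > 0, or a_j = 0 with Net_j < 0.

def is_stable(a, W):
    for j in range(len(a)):
        net = sum(W[j][i] * a[i] for i in range(len(a)))
        if a[j] == 1 and net <= 0:
            return False
        if a[j] == 0 and net >= 0:
            return False
    return True

W = [[0, 2, -1],
     [2, 0, -1],
     [-1, -1, 0]]                   # hypothetical symmetric weights
print(is_stable([1, 1, 0], W))      # True: on-neurons excite each other, off-neuron is inhibited
print(is_stable([1, 0, 1], W))      # False: neuron 0 is on but its net input is negative
```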
Hypercubes
4-Cube
5-Cube
Hypercube Computer Game
http://www1.tip.nl/t515027/hypercube.html
Hypercube Graph
2-Cube
Adjacency Matrix
Recursive Definition
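The slide's figure is not in the transcript; the standard recursive definition builds Q_n from two copies of Q_(n-1) joined by a perfect matching, Q_n = [[Q_(n-1), I], [I, Q_(n-1)]]. A sketch of that construction:

```python
# Build the n-cube adjacency matrix by the recursion
#   Q_0 = [0],   Q_n = [[Q_{n-1}, I], [I, Q_{n-1}]],
# i.e. two copies of the (n-1)-cube joined by a perfect matching.

def hypercube_adjacency(n):
    Q = [[0]]                               # Q_0: a single vertex, no edges
    for _ in range(n):
        m = len(Q)
        new = [[0] * (2 * m) for _ in range(2 * m)]
        for i in range(m):
            for j in range(m):
                new[i][j] = Q[i][j]             # first copy of Q_{n-1}
                new[m + i][m + j] = Q[i][j]     # second copy of Q_{n-1}
            new[i][m + i] = new[m + i][i] = 1   # matching edge between the copies
        Q = new
    return Q

Q3 = hypercube_adjacency(3)
print([sum(row) for row in Q3])  # [3, 3, 3, 3, 3, 3, 3, 3]: every 3-cube vertex has degree 3
```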
Eigenvectors of the Adjacency Matrix
Theorem 1: If v is an eigenvector of Q_(n-1) with eigenvalue x, then the concatenated vectors [v, v] and [v, -v] are eigenvectors of Q_n with eigenvalues x + 1 and x - 1, respectively.
Proof
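The proof slide is an image in the original; the computation it presumably shows is the block-matrix check below, using the recursion Q_n = [[Q_(n-1), I], [I, Q_(n-1)]] and Q_(n-1) v = x v:

```latex
Q_n \begin{pmatrix} v \\ v \end{pmatrix}
  = \begin{pmatrix} Q_{n-1} & I \\ I & Q_{n-1} \end{pmatrix}
    \begin{pmatrix} v \\ v \end{pmatrix}
  = \begin{pmatrix} Q_{n-1}v + v \\ v + Q_{n-1}v \end{pmatrix}
  = (x+1) \begin{pmatrix} v \\ v \end{pmatrix},
\qquad
Q_n \begin{pmatrix} v \\ -v \end{pmatrix}
  = \begin{pmatrix} Q_{n-1}v - v \\ v - Q_{n-1}v \end{pmatrix}
  = (x-1) \begin{pmatrix} v \\ -v \end{pmatrix}.
```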
Generating Eigenvectors and Eigenvalues
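Theorem 1 turns directly into a generator: start from the single eigenpair of Q_0 and, at each of n steps, take either the [v, v] branch (eigenvalue +1) or the [v, -v] branch (eigenvalue -1). A sketch that builds all 2^n eigenpairs and verifies them against the adjacency matrix (built here by the bit-flip rule: two vertices are adjacent when their binary labels differ in exactly one bit):

```python
# Generate all 2^n Walsh eigenvectors of Q_n via Theorem 1 and verify Q w = x w.

def walsh_eigenpairs(n):
    pairs = [([1], 0)]                                     # Q_0: eigenvector [1], eigenvalue 0
    for _ in range(n):
        pairs = [p for v, x in pairs
                 for p in ((v + v, x + 1),                 # [v, v]  -> eigenvalue x + 1
                           (v + [-c for c in v], x - 1))]  # [v, -v] -> eigenvalue x - 1
    return pairs

def adjacency(n):
    N = 1 << n
    return [[1 if bin(i ^ j).count("1") == 1 else 0 for j in range(N)]
            for i in range(N)]

n = 3
Q = adjacency(n)
for w, x in walsh_eigenpairs(n):
    Qw = [sum(Q[i][j] * w[j] for j in range(len(w))) for i in range(len(w))]
    assert Qw == [x * c for c in w]                        # Q w = x w holds for every pair
print(sorted(x for _, x in walsh_eigenpairs(n)))           # [-3, -1, -1, -1, 1, 1, 1, 3]
```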
Walsh Functions for n = 1, 2, 3
Eigenvector:     1   1   1   1  -1  -1  -1  -1
Binary number: 000 001 010 011 100 101 110 111
n = 3
Theorem 3: Let k be the number of +1 choices in the recursive construction of the eigenvectors of the n-cube. Then for k not equal to n, each Walsh state has 2^(n-k-1) non-adjacent subcubes of dimension k that are labeled +1 on their vertices, and 2^(n-k-1) non-adjacent subcubes of dimension k that are labeled -1 on their vertices. If k = n, then all the vertices are labeled +1. (Note: here "non-adjacent" means the subcubes share no edges or vertices and there are no edges between the subcubes.)
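Theorem 3 can be spot-checked computationally. A sketch for the n = 5, k = 3 case; the choice sequence below is one arbitrary example with three +1 choices, and a vertex's neighbors are obtained by flipping one bit of its label:

```python
# Spot-check of Theorem 3 for n = 5, k = 3: the +1-labeled vertices of the
# Walsh state should form 2^(n-k-1) = 2 connected components, each a
# k-dimensional subcube with 2^k = 8 vertices (and likewise for the -1 labels).

def walsh_state(choices):
    v = [1]
    for c in choices:                 # c = +1 appends [v, v]; c = -1 appends [v, -v]
        v = v + [c * x for x in v]
    return v

def components(vertices, n):
    # connected components of the n-cube subgraph induced on `vertices`
    left, comps = set(vertices), []
    while left:
        start = left.pop()
        stack, comp = [start], {start}
        while stack:
            u = stack.pop()
            for b in range(n):
                w = u ^ (1 << b)      # neighbor across bit b
                if w in left:
                    left.remove(w)
                    comp.add(w)
                    stack.append(w)
        comps.append(comp)
    return comps

n, choices = 5, (1, 1, 1, -1, -1)     # one example sequence with k = 3 "+1" choices
v = walsh_state(choices)
for label in (1, -1):
    comps = components([i for i in range(2 ** n) if v[i] == label], n)
    print(label, len(comps), sorted(len(c) for c in comps))  # 2 components of size 8 each
```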
n = 5, k = 3
n = 5, k = 2
Inhibitory Hypercube
Theorem 5: Each Walsh state with positive, zero, or negative eigenvalue is an unstable, weakly stable, or strongly stable state of the inhibitory hypercube network, respectively.