Title: Robust Neural Networks using Motes
1. Robust Neural Networks using Motes
- James Hereford, Tüze Kuyucu
- Department of Physics and Engineering
- Murray State University
2. Outline
- Introduction
- Goal
- System overview
- Background information
- Results
- Neural net with independent nodes (motes)
- Training of distributed neural net
- Demonstration of fault recovery
3. Why we are interested
- Ultimate goal
- Build devices that never fail
- Applications
- NASA: long journeys (e.g., a round trip to Mars), space probes
- Hazardous/dangerous operations that are difficult for humans to repair
4. System Overview
One approach to fault tolerance: redundant components.
[Diagram: multiple redundant temperature sensors T1, T2, T3 feed a processing circuit that outputs the average, Tavg]
- Steps (see the sketch after this list)
- Derive/evolve a circuit to average N sensors.
- Detect a failure.
- Re-evolve to average the remaining sensors.
Simple processing nodes provide redundancy, which opens the possibility of fault tolerance in the processing circuit itself.
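As a minimal sketch of these three steps in Python (names and values are illustrative; in the actual system the averaging circuit is evolved rather than hard-coded):

```python
def average_working_sensors(readings, failed):
    """Average the readings of every sensor not marked as failed."""
    working = [r for i, r in enumerate(readings) if i not in failed]
    return sum(working) / len(working)

# Three redundant temperature sensors T1, T2, T3.
readings = [20.1, 19.8, 20.3]
print(average_working_sensors(readings, failed=set()))  # average of all three

# Steps 2 and 3: sensor T2 is detected as failed, so re-derive
# the average from the remaining sensors only.
print(average_working_sensors(readings, failed={1}))    # average of T1 and T3
```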
5. System Overview - neural net
- Neural net: each artificial neuron (node) receives weighted information from the previous layer, sums the data, then passes the result to the next layer (see the sketch below).
- Neural nets are trained with an iterative technique that determines the interconnection weights.
- Challenge: reprogram the neural net when it is unknown a priori which node failed.
Idea: if one of our processing units fails, re-train the whole network using evolvable programming techniques.
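A single node's operation can be sketched in a few lines of Python (a generic weighted-sum-plus-activation illustration, not the motes' actual nesC code; the sigmoid activation is an assumption):

```python
import math

def neuron(inputs, weights, bias):
    """Sum the weighted info from the previous layer, then squash it
    before passing the result on to the next layer."""
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-total))  # sigmoid activation

# Example: one hidden node of a 2xNx1 network with two inputs.
print(neuron([0.0, 1.0], weights=[0.8, -0.5], bias=0.1))
```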
6. System Overview
- 2 key questions
- How to build the neural net? What hardware device should be used for each node?
- How to program the neural net? We need a programming technique that does not require a priori information.
7. System Overview
Hardware components

[Diagram: multiple redundant sensors T1, T2, T3 feed a processing node that outputs Tavg]
- Required node characteristics
- Multiply/add
- Memory (store weights)
- Internode communication
- Power
- Mote characteristics
- Processor
- Memory
- Transmit/receive (wireless)
- Power
- Interface to sensor boards
- Software infrastructure (TinyOS, nesC)
Devices?
8. Background information
The function of each node is performed by a mote.
[Photos: Mica2Dot mote and Mica2 mote]
Practicalities: Crossbow, $125 (Mica2), range of tens of meters, event-driven OS, 433 MHz/900 MHz/2.4 GHz, and programming is non-intuitive!
9. Background information - Particle Swarm Optimization
Use PSO to train and re-train the neural net
[Diagram: a 40 x 40 "corn field" grid illustrating a 2-D PSO search space]
10. Background information - PSO
- 2 update equations (sketched in code below)
- Velocity: v_{n+1} = v_n + c1*rand()*(pbest_n - p_n) + c2*rand()*(gbest_n - p_n)
- Position: p_{n+1} = p_n + v_{n+1}
- Advantages for our application
- Simple update equations
- No hard functions (e.g., sqrt, sin, fft)
- Can tune algorithm via constants c1 and c2
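A minimal Python sketch of one PSO iteration, directly implementing the two update equations above (the constants and example values are illustrative):

```python
import random

def pso_step(positions, velocities, pbest, gbest, c1=2.0, c2=2.0):
    """One PSO iteration: pull each particle's velocity toward its
    personal best (pbest) and the swarm's global best (gbest), then
    move the particle. Only adds and multiplies; no hard functions."""
    for i in range(len(positions)):
        for d in range(len(positions[i])):
            velocities[i][d] += (c1 * random.random() * (pbest[i][d] - positions[i][d])
                                 + c2 * random.random() * (gbest[d] - positions[i][d]))
            positions[i][d] += velocities[i][d]

# Example: 3 particles in a 2-D search space.
pos = [[1.0, 2.0], [3.0, 1.0], [0.0, 0.0]]
vel = [[0.0, 0.0] for _ in pos]
pso_step(pos, vel, pbest=[p[:] for p in pos], gbest=[2.0, 2.0])
```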
11. Results
- Results in 3 major areas
- Training of the neural network with PSO (simulation)
- Fault recovery (simulation and hardware)
- Building the neural net with independent (physically distinct) nodes
12. Neural Net Training Results
Comparison of classical NN training techniques vs. PSO

[Charts: training results for 2-layer and 3-layer networks]
13. Neural Net Training Results
Used PSO to re-train the neural net for a different operation.
Successful in all cases!
14. Neural Net Training Results
Failure recovery: failure in hidden node(s)

[Plots: recovery results for 2x4x1, 2x3x1, and 2x2x1 network topologies]

Showed fault recovery for NAND and XOR operations. Successful in all cases, again! Concern: a highly variable number of evaluations is needed to reprogram; an extreme case showed a 200:1 variation.
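The recovery setup can be sketched as follows (Python; the function names, the 2x3x1 sizing, the weight values, and the omission of bias terms are all illustrative): a failed hidden node is masked to zero, and PSO re-trains the surviving weights against the same truth table.

```python
import math

def forward(x, w_hidden, w_out, failed=set()):
    """Evaluate a 2xNx1 net, forcing failed hidden nodes to output 0."""
    sig = lambda s: 1.0 / (1.0 + math.exp(-s))
    hidden = [0.0 if j in failed else sig(sum(w * xi for w, xi in zip(wj, x)))
              for j, wj in enumerate(w_hidden)]
    return sig(sum(w * h for w, h in zip(w_out, hidden)))

def fitness(w_hidden, w_out, failed, truth_table):
    """Sum-squared error over a truth table; this is what PSO would
    minimize when re-training after a hidden-node failure."""
    return sum((forward(x, w_hidden, w_out, failed) - t) ** 2
               for x, t in truth_table)

xor = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
w_hidden = [[5.0, 5.0], [-5.0, -5.0], [3.0, 3.0]]  # 2x3x1, biases omitted
w_out = [4.0, 4.0, -6.0]
print(fitness(w_hidden, w_out, failed={1}, truth_table=xor))  # node 1 dead
```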
15. Neural Net - Hardware
- Built hardware NN out of motes
- 2 layer (no hidden layer) neural net
- Training times were shorter with PSO than with the perceptron learning rule (sketched below)
- Demonstrated fault recovery
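For reference, the classical perceptron learning rule that PSO's training time was compared against (a textbook sketch, not the authors' mote implementation):

```python
def perceptron_train(samples, eta=0.1, epochs=100):
    """Perceptron rule: after each sample, nudge every weight by
    eta * error * input."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in samples:
            y = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = target - y
            w = [wi + eta * err * xi for wi, xi in zip(w, x)]
            b += eta * err
    return w, b

# NAND is linearly separable, so a 2-layer (no hidden layer) net can learn it.
nand = [((0, 0), 1), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
print(perceptron_train(nand))
```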
16. Neural Net - Hardware
2 layer neural net with fault recovery
17. Neural Net - Hardware
3 layer neural net built using motes
- Programmed successfully off-line
- Weights from simulation, stored on the motes, worked fine.
18. Neural Net - Hardware
- Embedded programming
- Programmed using base mote as master
- All training is done by the output (base) node; weight updates are sent to the hidden-layer nodes
- Developing a distributed training method
- Experiments are on-going; mote-to-mote feedback communications are problematic.
Once programmed, the system is able to withstand the hammer test.
19. Acknowledgements
- David Gwaltney, NASA-MSFC
- James Humes
- Funding
- Murray State Committee on Institutional Studies and Research
- KY NASA EPSCoR program
20. Conclusions
- Simulated and built a neural network with an independent processor used for each node
- Trained (and re-trained) the neural net with PSO
- Used motes to do processing, not just sensing
- Failure recovery approach: every node is identical and replaceable