Behavioral Fault Model for Neural Networks

A. Ahmadi, S. M. Fakhraie, and C. Lucas

Silicon Intelligence and VLSI Signal Processing Laboratory,School of Electrical and Computer Engineering, University of Tehran,

Tehran, Iran.

International Conference on Computer Engineering and Technology 2009 (ICCET 2009) January 23, 2009

Outline

• Introduction to ANNs

• Faults in ANNs

• Conventional Fault Models for ANNs and their Limitations

• Proposed Fault Model and Simulation Results


Introduction to ANNs

What are (everyday) computer systems good at... and not so good at?

Good at:

• Rule-based systems: doing what the programmer wants them to do

Not so good at:

• Dealing with noisy data

• Dealing with unknown environment data

• Massive parallelism

• Fault tolerance

• Adapting to circumstances

Introduction to ANNs

• Neural network: information processing paradigm inspired by biological nervous systems, such as our brain

• Structure: large number of highly interconnected processing elements (neurons) working together

• Like people, they learn from experience (by example)

Introduction to ANNs (Applications)

• Prediction: learning from past experience

– pick the best stocks in the market

– predict weather

– identify people with cancer risk

• Classification

– image processing

– predict bankruptcy for credit card companies

– risk assessment

Introduction to ANNs (Applications)

• Recognition

– Pattern recognition: SNOOPE (bomb detector in U.S. airports)

– Character recognition

– Handwriting: processing checks

• Data association

– Not only identify the characters that were scanned, but also identify when the scanner is not working properly

Introduction to ANNs (Applications)

• Data Conceptualization

– infer grouping relationships, e.g. extract from a database the names of those most likely to buy a particular product

• Data Filtering

– e.g. take the noise out of a telephone signal, signal smoothing

• Planning

– unknown environments

– noisy sensor data

– fairly new approach to planning

Introduction to ANNs (Mathematical representation of Artificial Neuron )

The neuron calculates a weighted sum of its inputs and passes it through an activation function:

y = f(w1·x1 + w2·x2 + … + wn·xn) = f(Σi wi·xi)

[Figure: an artificial neuron with inputs x1 … xn, weights w1 … wn, a summation unit Σ, and an activation function f() producing the output y]
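The computation above can be sketched in C++ (the paper's own simulations were written in C++). The function name and the choice of tanh as the activation are illustrative, not taken from the slides:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// One artificial neuron: a weighted sum of the inputs followed by a
// tanh activation, matching the diagram above.
double neuron_output(const std::vector<double>& x,
                     const std::vector<double>& w) {
    double sum = 0.0;                      // SUM: s = sum_i w_i * x_i
    for (std::size_t i = 0; i < x.size(); ++i)
        sum += w[i] * x[i];
    return std::tanh(sum);                 // activation f() = tanh
}
```

With all-zero weights the neuron outputs tanh(0) = 0; large positive or negative sums drive the output toward ±1.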

Introduction to ANNs (cont’d)

[Figure: a layered neural network mapping inputs to outputs]

An artificial neural network is composed of many artificial neurons that are linked together according to a specific network architecture. The objective of the neural network is to transform the inputs into meaningful outputs.
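As a sketch of this idea, the C++ below chains fully connected tanh layers into the 2-3-1 topology used later for the XOR experiment. The types, function names, and any weights supplied to it are illustrative assumptions:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

using Vec = std::vector<double>;
using Mat = std::vector<Vec>;   // Mat[k][j] = weight from input j to neuron k

// One fully connected layer: out_k = tanh(sum_j W[k][j] * in[j]).
Vec layer_forward(const Mat& W, const Vec& in) {
    Vec out(W.size(), 0.0);
    for (std::size_t k = 0; k < W.size(); ++k) {
        double s = 0.0;
        for (std::size_t j = 0; j < in.size(); ++j)
            s += W[k][j] * in[j];
        out[k] = std::tanh(s);
    }
    return out;
}

// A 2-3-1 network: two inputs flow through a three-neuron hidden
// layer into a single output neuron.
double forward_2_3_1(const Mat& Wh, const Mat& Wo, const Vec& x) {
    return layer_forward(Wo, layer_forward(Wh, x))[0];
}
```

Because every layer ends in tanh, the network output always lies strictly between -1 and 1.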

Outline

• Introduction to ANNs

• Faults in ANNs

• Conventional Fault Models for ANNs and their Limitations

• Proposed Fault Model and Simulation Results

Faults in ANNs

The unit under consideration is the functional model of a neuron: its inputs and weights, its summation unit, and its activation function.

Defects can occur in any of these three components.

Faults in ANNs (cont’d)

• ANNs are inherently fault-tolerant due to:

– the non-linearity of the network

– the distributed manner of information storage

– the large number of neurons in a network

– the difference between training and operational error margins

Outline

• Introduction to ANNs

• Faults in ANNs

• Conventional Fault Models for ANNs and their Limitations

• Proposed Fault Model and Simulation Results

ANN’s Fault Model

• In [3], a fault model for neural networks is presented. The authors assume that defects in the three components of a neural network can be covered by a broken-link defect.

• In [4], single error bits and clusters of error bits are used to model faults in the weights, and the error model for adders and multipliers is represented by erroneous computation results.

ANN’s Fault Model (cont’d)

• Conventional model

– Stuck-at-0 and stuck-at-1 faults for inputs and weights

• Disadvantage

– Every fault in the inputs or weights is detected as a fault in the ANN, even when the network’s inherent fault tolerance would mask it.

Outline

• Introduction to ANNs

• Faults in ANNs

• Conventional Fault Models for ANNs and their Limitations

• Proposed Fault Model and Simulation Results

Proposed Fault Model

S_k^i = Σ_j W_(k,j)^(i,i−1) · X_j^(i−1)

X_k^i = f(S_k^i)

[Figure: an artificial neuron k in layer i, with inputs X_1^(i−1) … X_m^(i−1) and weights W_(k,1)^(i,i−1), W_(k,2)^(i,i−1), … W_(k,m)^(i,i−1)]

An Artificial Neuron

Proposed Fault Model (cont’d)

Y = Tanh(x) = (e^x − e^−x) / (e^x + e^−x)

– Transition region: −3 < x < 3

– Saturation region: 3 < x or x < −3

Proposed Fault Model (cont’d)

• All faults activate the error signal.

• Some faults can be masked.

• Weighted sum in the saturation region:

– the fault can be masked if the faulty weighted sum remains in the saturation region.

• Weighted sum in the transition region:

– the fault can be masked if abs(WS − FWS) < μ.
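A minimal sketch of these two masking rules in C++, assuming the tanh saturation boundary |x| ≥ 3 from the earlier slide. Requiring the faulty sum to saturate on the same side as the fault-free sum is my added safeguard, not stated on the slides:

```cpp
#include <cmath>

// WS  = fault-free weighted sum, FWS = faulty weighted sum,
// mu  = reliability margin extracted by simulation.
bool in_saturation(double x) { return x >= 3.0 || x <= -3.0; }

bool fault_is_masked(double ws, double fws, double mu) {
    if (in_saturation(ws))
        // Saturation region: masked if the faulty sum stays saturated
        // (same-sign check added as a safeguard; tanh(+3) != tanh(-3)).
        return in_saturation(fws) && (ws > 0) == (fws > 0);
    // Transition region: masked if the deviation stays within mu.
    return std::fabs(ws - fws) < mu;
}
```

A fault that is masked never reaches the neuron's output, so it need not be flagged.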

Proposed Fault Model (cont’d)

• Define a reliable margin μ for the weighted sum in the two regions:

f(FWS) = f(WS ± μ) ≈ f(WS)

• Extract μ by simulation:

– inject single faults into the inputs and weights and calculate the MSE
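The extraction loop might be sketched as follows for a single neuron; the function names and the stuck-at value injected are illustrative assumptions, not the paper's code:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Fault-free neuron: tanh of the weighted sum.
double neuron_out(const std::vector<double>& x,
                  const std::vector<double>& w) {
    double s = 0.0;
    for (std::size_t i = 0; i < x.size(); ++i) s += w[i] * x[i];
    return std::tanh(s);
}

// Inject a single fault into each weight in turn, forcing it to
// `stuck_value` (e.g. 0.0 for stuck-at-0), and return the mean
// squared deviation of the output from the fault-free output.
double mse_single_weight_faults(const std::vector<double>& x,
                                const std::vector<double>& w,
                                double stuck_value) {
    double good = neuron_out(x, w), sum_sq = 0.0;
    for (std::size_t i = 0; i < w.size(); ++i) {
        std::vector<double> wf = w;        // copy, then inject one fault
        wf[i] = stuck_value;
        double e = neuron_out(x, wf) - good;
        sum_sq += e * e;
    }
    return sum_sq / static_cast<double>(w.size());
}
```

Sweeping such single-fault deviations over the training set gives the error distribution from which a margin μ can be chosen for each region.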

Proposed Fault Model (cont’d)

s_k = Σ_(j≥0) w_(j,k) · z_j

e(s_k, s̃_k) = Tanh(s_k) − Tanh(s̃_k)

For a single faulty weight w̃_(k,j)^(i,i−1), the faulty weighted sum of neuron k in layer i is

s̃_k^i = Σ_h w_(k,h)^(i,i−1) · z_h^(i−1) − w_(k,j)^(i,i−1) · z_j^(i−1) + w̃_(k,j)^(i,i−1) · z_j^(i−1) = s_k^i + Δw_(k,j)^(i,i−1) · z_j^(i−1)

[Figure: output error e versus weighted sum in the Saturation Region]

Proposed Fault Model (cont’d)

[Figure: output error e versus weighted sum in the Transition Region]

[Figure: Simulation to extract μ for the saturation region]

[Figure: Simulation to extract μ for the transition region]

Simulation Results

• An ANN was modeled in C++ and faults were injected. With the extracted μ, many faults can be masked.

• Results for the XOR problem (2-3-1):

           #injected   #masked   #recognized    #recognized      #appear
            faults      faults   (our model)   (conventional)   in output
Weights      144*        136          8             144              8
Inputs        32          20         12              32             12
Total        176         156         20             176             20

* 144 = (2·3 + 3·1) · 8 · 2

Simulation Results

Results for the character recognition problem (65-15-10):

           #injected   #masked   #recognized    #recognized      #appear
            faults      faults   (our model)   (conventional)   in output
Weights     27000       25000        2000          27000           2000
Inputs        520         165         355            520            165
Total       27520       25165        2355          27520           2165

Questions about this Fault Model

• Why is this model important?

– ANNs have an inherent tolerance to faults.

• How can it be used?

– All fault-tolerant methods compare two outputs in the fault-detection phase; using this model, the comparison becomes:

abs(output1 − output2) > μ

– So it can be used in test pattern generation (TPG) and fault-tolerance (FT) methods.
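The comparison rule translates directly into code; `fault_detected` is an illustrative name for the check a duplication-based detection scheme would run on its two replica outputs:

```cpp
#include <cmath>

// Margin-aware fault detection: two outputs are considered consistent
// when they differ by less than mu, so masked faults do not raise a
// false alarm the way an exact-match comparison would.
bool fault_detected(double output1, double output2, double mu) {
    return std::fabs(output1 - output2) > mu;
}
```

With μ = 0, this degenerates to the conventional exact comparison, which flags every deviation as a fault.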

References

[1] A. S. Pandya, Pattern Recognition with Neural Networks using C++, IEEE Press, New York, 2nd ed., vol. 3.

[2] J. L. Holt and J. N. Hwang, “Finite precision error analysis of neural network hardware implementations,” IEEE Transactions on Computers, vol. 42, no. 3, March 1993, pp. 1380–1389.

[3] K. Takahashi et al., “Comparison with defect compensation methods for feed-forward neural networks,” Pacific Rim International Symposium on Dependable Computing (PRDC’02), April 2002.

[4] D. Uwemedimo, “A fault tolerant technique for feed-forward neural networks,” Ph.D. thesis, University of Saskatchewan, Fall 1997.

[5] L. Breveglieri and V. Piuri, “Error detection in digital neural networks: an algorithm-based approach for inner product protection,” Advanced Signal Processing, San Diego, CA, July 1994, pp. 809–820.

[6] M. Stevenson, R. Winter, and B. Widrow, “Sensitivity of feedforward neural networks to weight errors,” IEEE Trans. Neural Networks, vol. 1, March 1990, pp. 71–80.

[7] C. Lehmann and F. Blayo, “A VLSI implementation of a generic systolic synaptic building block for neural networks,” Workshop on VLSI for Artificial Intelligence and Neural Networks, Oxford, UK, 1990.

[8] D. S. Phatak and I. Koren, “Complete and partial fault tolerance of feedforward neural nets,” IEEE Trans. Neural Networks, 1995, pp. 446–456.

[9] T. Horita et al., “Learning algorithms which make multilayer neural networks multiple-weight-and-neuron-fault tolerant,” IEICE Trans. Inf. & Syst., vol. E91-D, no. 4, April 2008.

Thanks for your attention

Questions?

