
Neural Networks Lecture 12: Backpropagation Examples
October 14, 2010

Example I: Predicting the Weather

We decide (or experimentally determine) to use a hidden layer with 42 sigmoidal neurons.

In summary, our network has

• 207 input neurons
• 42 hidden neurons
• 4 output neurons

Because of the small output vectors, 42 hidden units may suffice for this application.
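To make the architecture concrete, here is a minimal NumPy sketch (not part of the original lecture) of a forward pass through such a 207-42-4 network; the random weight initialization and the omission of bias terms are simplifying assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_in, n_hidden, n_out = 207, 42, 4
W1 = rng.normal(scale=0.1, size=(n_hidden, n_in))   # input-to-hidden weights
W2 = rng.normal(scale=0.1, size=(n_out, n_hidden))  # hidden-to-output weights

def forward(x):
    h = sigmoid(W1 @ x)     # 42 sigmoidal hidden activations
    return sigmoid(W2 @ h)  # 4 output activations

x = rng.random(n_in)        # placeholder 207-dimensional input pattern
print(forward(x))           # 4-dimensional output vector
```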


The next thing we need to do is collect the training exemplars.

First we have to specify what our network is supposed to do:

In production mode, the network is fed with the current weather conditions, and its output will be interpreted as the weather forecast for tomorrow.

Therefore, in training mode, we have to present the network with exemplars that associate known past weather conditions at a time t with the conditions 24 hours later, at t + 24 hrs.

So we have to collect a set of historical exemplars with known correct output for every input.
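As an illustration, such exemplar pairs could be assembled along these lines (a hypothetical helper, not from the lecture, assuming `inputs[t]` holds the 207-dimensional input encoding and `targets[t]` the 4-dimensional output encoding for hour t, oldest first):

```python
def make_exemplars(inputs, targets, horizon_hours=24):
    # Pair the conditions at each hour t with the conditions 24 hours later.
    return [(inputs[t], targets[t + horizon_hours])
            for t in range(len(inputs) - horizon_hours)]
```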


Obviously, if such data is unavailable, we have to start collecting it.

The selection of exemplars that we need depends, among other factors, on the amount of change in the weather at our location.

For example, in Honolulu, Hawaii, our exemplars may not have to cover all seasons, because there is little variation in the weather.

In Boston, however, we would need to include data from every calendar month because of dramatic changes in weather across seasons.

As we know, some winters in Boston are much harder than others, so it might be a good idea to collect data for several years.


And how about the granularity of our exemplar data, i.e., the frequency of measurement?

Using one sample per day would be a natural choice, but it would neglect rapid changes in weather.

If we use hourly instantaneous samples, however, we increase the likelihood of conflicts, i.e., near-identical inputs paired with different target outputs.

Therefore, we decide to do the following:

We will collect input data every hour, but the corresponding output pattern will be the average of the instantaneous patterns over a 12-hour period.

This way we reduce the possibility of errors while increasing the amount of training data.
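A sketch of that averaging step, assuming `targets` is an array of instantaneous hourly output encodings; exactly which 12-hour window to average (here: the 12 hours starting at the forecast time) is an assumption, since the slides do not pin it down:

```python
import numpy as np

def averaged_target(targets, t, window=12):
    # Mean of the instantaneous output patterns over a 12-hour period.
    return np.mean(targets[t:t + window], axis=0)
```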


Now we have to train our network.

If we use samples in one-hour intervals for one year, we have 8,760 exemplars.

Our network has 207 × 42 + 42 × 4 = 8,862 weights, which means that data from ten years, i.e., 87,600 exemplars, would be desirable (rule of thumb: roughly ten exemplars per weight).
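The arithmetic, spelled out:

```python
n_in, n_hidden, n_out = 207, 42, 4
n_weights = n_in * n_hidden + n_hidden * n_out  # 8694 + 168 = 8862
print(n_weights)       # 8862 weights
print(10 * n_weights)  # 88620 -- close to the 87600 hourly samples
                       # contained in ten years (10 * 8760)
```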


Since with a large number of samples the hold-one-out training method is very time-consuming, we decide to use partial-set training instead.

The best way to do this would be to acquire a test set (control set), that is, another set of input-output pairs measured on random days and at random times.

After training the network with the 87,600 exemplars, we could then use the test set to evaluate the performance of our network.
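A sketch of that evaluation step; `forward` is the trained network's forward pass and `test_pairs` a list of (input, target) arrays (both hypothetical names, not from the lecture):

```python
import numpy as np

def mean_squared_error(pairs, forward):
    # Average squared output error over all test exemplars.
    return float(np.mean([np.mean((forward(x) - y) ** 2)
                          for x, y in pairs]))

# After training: print("test MSE:", mean_squared_error(test_pairs, forward))
```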


Neural network troubleshooting:

• Plot the global error as a function of the training epoch. The error should decrease after every epoch. If it oscillates, do the following tests (a minimal monitoring sketch follows this list).

• Try reducing the size of the training set. If the network then converges, a conflict may exist in the exemplars.

• If the network still does not converge, continue pruning the training set until it does converge. Then add exemplars back gradually, thereby detecting the ones that cause conflicts.


• If this still does not work, look for saturated neurons (extreme weights) in the hidden layer (a detection sketch follows this list). If you find those, add more hidden-layer neurons, possibly an extra 20%.

• If there are no saturated units and the problems still exist, try lowering the learning parameter η and training longer.

• If the network converges but does not accurately learn the desired function, evaluate the coverage of the training set.

• If the coverage is adequate and the network still does not learn the function precisely, you could refine the pattern representation. For example, you could add a season indicator to the input, helping the network to discriminate between similar inputs that produce very different outputs.
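To make the first test concrete, here is a minimal monitoring sketch; `train_one_epoch` stands in for the actual backpropagation loop and is assumed to return the global error over the current training set:

```python
def monitor_training(train_one_epoch, n_epochs=100):
    # Record the global error after every epoch and report oscillation.
    errors = []
    for epoch in range(n_epochs):
        errors.append(train_one_epoch())
        if epoch > 0 and errors[-1] > errors[-2]:
            print(f"epoch {epoch}: error rose from "
                  f"{errors[-2]:.4f} to {errors[-1]:.4f}")
    return errors
```

And a sketch of the saturation check: hidden units whose incoming weights have grown extreme drive the sigmoid into its flat regions, where gradients vanish. The threshold of 10 is an arbitrary assumption:

```python
import numpy as np

def saturated_hidden_units(W1, threshold=10.0):
    # W1: input-to-hidden weight matrix, one row per hidden unit.
    # Return the indices of units with at least one extreme weight.
    return np.flatnonzero(np.abs(W1).max(axis=1) > threshold)
```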

Then you can start predicting the weather!


Example II: Face Recognition

Now let us assume that we want to build a network for a computer vision application.

More specifically, our network is supposed to recognize faces and face poses.

This is an example that has actually been implemented.

All information, program code, data, etc. can be found at: http://www-2.cs.cmu.edu/afs/cs.cmu.edu/user/mitchell/ftp/faces.html


The goal is to classify camera images of faces of various people in various poses.

Images of 20 different people were collected, with up to 32 images per person.

The following variables were introduced:

• expression (happy, sad, angry, neutral)

• direction of looking (left, right, straight ahead, up)

• sunglasses (yes or no)

In total, 624 grayscale images were collected, each with a resolution of 30 by 32 pixels and intensity values between 0 and 255.


The network presented here only has the task of determining the face pose (left, right, up, straight) shown in an input image.

It uses

• 960 input units (one for each pixel in the image),
• 3 hidden units,
• 4 output neurons (one for each pose).

Each output unit receives an additional input, which is always 1.

By varying the weight for this input, the backpropagation algorithm can adjust an offset for the net input signal.
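A sketch of this constant-1 trick for the output layer of the 960-3-4 network above (shapes only; the weight values themselves would come from training):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def output_layer(h, W_out):
    # h: the 3 hidden activations.
    # W_out: 4x4 matrix -- per output unit, 3 weights for the hidden
    # signals plus 1 trainable bias weight for the constant input.
    h_ext = np.append(h, 1.0)      # the additional input that is always 1
    return sigmoid(W_out @ h_ext)  # the bias weight offsets the net input
```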


The following diagram visualizes all network weights after 1 epoch and after 100 epochs.

Their values are indicated by brightness (ranging from black = -1 to white = 1).

Each 30 by 32 matrix represents the weights of one of the three hidden-layer units.

Each row of four squares represents the weights of one output neuron (three weights for the signals from the hidden units, and one for the constant signal 1).

After training, the network is able to classify 90% of new (non-trained) face images correctly.
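Such a weight display can be reproduced along these lines, using matplotlib (an assumption; the original figures were produced elsewhere):

```python
import numpy as np
import matplotlib.pyplot as plt

def show_hidden_weights(W1):
    # W1: 3x960 matrix of input-to-hidden weights. Each unit's weights
    # are reshaped to the 30-by-32 image grid and drawn as brightness,
    # clipped to the black = -1, white = +1 range described above.
    fig, axes = plt.subplots(1, 3)
    for unit, ax in enumerate(axes):
        ax.imshow(W1[unit].reshape(30, 32), cmap="gray", vmin=-1, vmax=1)
        ax.set_title(f"hidden unit {unit}")
        ax.axis("off")
    plt.show()
```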


Example III: Character Recognition

http://sund.de/netze/applets/BPN/bpn2/ochre.html

