Neural Network by Danny Manongga
Page 1:

Neural Network

by

Danny Manongga

Page 2:

Overview

Basics of Neural Network

Advanced Features of Neural Network

Applications I-II

Summary

Page 3:

Basics of Neural Network

What is a Neural Network

Neural Network Classifier

Data Normalization

Neuron and bias of a neuron

Single Layer Feed Forward

Limitation

Multi Layer Feed Forward

Back propagation

Page 4:

Neural Networks

What is a Neural Network?

Similarity with biological network

The fundamental processing element of a neural network is the neuron, which:

1. Receives inputs from other sources

2. Combines them in some way

3. Performs a generally nonlinear operation on the result

4. Outputs the final result

• A biologically motivated approach to machine learning

Page 5:

Similarity with Biological Network

• The fundamental processing element of a neural network is the neuron

• A human brain has about 100 billion neurons

• An ant brain has about 250,000 neurons


Page 8:

Synapses, the basis of learning and memory

Page 9:

Real Neural Networks

The business end of this (the brain) is made of lots of these (neurons), joined together in networks

Our own computations are performed in/by this network

This type of computer is fabulous at pattern recognition

Page 13:

Neural Network

A Neural Network is a set of connected INPUT/OUTPUT UNITS, where each connection has a WEIGHT associated with it.

Neural Network learning is also called CONNECTIONIST learning due to the connections between units.

It is a case of SUPERVISED, INDUCTIVE or CLASSIFICATION learning.

Page 14:

Neural Network

A neural network learns by adjusting the weights so as to be able to correctly classify the training data and hence, after the testing phase, to classify unknown data.

Neural networks need a long time for training.

Neural networks have a high tolerance to noisy and incomplete data.

Page 16:

Artificial Neural Networks

An artificial neuron (node)

An ANN (neural network)

Nodes do very simple number crunching

Numbers flow from left to right: the numbers arriving at the input layer get "transformed" to a new set of numbers at the output layer.

There are many kinds of nodes, and many ways of combining them into a network, but we need only be concerned with the types described here, which turn out to be sufficient for any (consistent) pattern classification task.

Page 20:

Different types of node: A Threshold Linear Unit

[Figure: a TLU with threshold T = 1. Inputs 2.6, 0.1 and -1.0 arrive on connections with weights -0.5, 2.0 and 1.0; the unit outputs 0.]

Output: if the weighted sum >= threshold, output 1, else 0.
Here, the weighted sum is: 2.6 x -0.5 + 0.1 x 2.0 + (-1.0) x 1.0 = -2.1, which is below the threshold, so the output is 0.

TLUs are:
• useful in teaching the basics of NNs
• networks of them can do anything, so TLUs have all the required power
• but they are difficult to use in practice (hard to design the right network of TLUs for the job in hand), which is why we normally use ... (next slide)
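A minimal Python sketch of such a unit, reproducing the example above (the function name and structure are mine, not from the slides):

```python
def tlu(inputs, weights, threshold=1.0):
    """A Threshold Linear Unit: output 1 if the weighted sum reaches the threshold, else 0."""
    weighted_sum = sum(x * w for x, w in zip(inputs, weights))
    return 1 if weighted_sum >= threshold else 0

# The slide's example: the weighted sum is -2.1, below the threshold of 1, so the output is 0.
print(tlu([2.6, 0.1, -1.0], [-0.5, 2.0, 1.0], threshold=1.0))  # -> 0
```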

Page 21:

A Logistic Unit

[Figure: the same inputs 2.6, 0.1 and -1.0, with weights -0.5, 2.0 and 1.0, feed a unit that computes f(weighted sum) and outputs 0.109.]

It calculates the weighted sum of inputs, as before, and then squashes that sum between 0 and 1 using this function:

f(weighted_sum) = 1 / (1 + e^(-weighted_sum))

Here the weighted sum is again -2.1, and f(-2.1) is about 0.109.

Page 22:

The Logistic Map

[Plot: f(weighted_sum) plotted against weighted_sum, showing the S-shaped logistic curve.]

Doing this makes it easier to train NNs to do what we want them to. (Sometimes f is slightly different, e.g. so that the output is between -1 and 1.)

One such variant: f(weighted_sum) = 2 / (1 + e^(-weighted_sum)) - 1
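A quick sketch of the squashing function itself, plus one common variant that maps to the range -1 to 1, checking the output value from the previous slide:

```python
import math

def logistic(weighted_sum):
    """The squashing function: maps any weighted sum into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-weighted_sum))

def logistic_signed(weighted_sum):
    """A common variant whose output lies between -1 and 1."""
    return 2.0 / (1.0 + math.exp(-weighted_sum)) - 1.0

print(round(logistic(-2.1), 3))  # the earlier example: f(-2.1) is about 0.109
```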

Page 23:

Bias Units

[Figure: the same logistic unit, with inputs 2.6, 0.1 and -1.0 and weights -0.5, 2.0 and 1.0, plus an extra input fixed at 1 whose connection carries a weight of -3.1.]

In practice, ANNs tend to have logistic units, with one of the inputs fixed to 1 (this is treated as input arriving from a fixed 'bias' unit).

The connection from the bias unit has a weight, which can be varied, and this amounts to a kind of threshold which is gradually learned.
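A sketch of a logistic unit with a bias input, using the numbers from the figure (the bias input is fixed at 1 and its weight is the -3.1 shown on the slide):

```python
import math

def logistic_unit(inputs, weights, bias_weight):
    """Weighted sum plus a bias term (a fixed input of 1 times its own weight), then squash."""
    weighted_sum = sum(x * w for x, w in zip(inputs, weights)) + 1.0 * bias_weight
    return 1.0 / (1.0 + math.exp(-weighted_sum))

# Same inputs and weights as before, plus the bias weight of -3.1.
# The bias pushes the weighted sum well below zero, so the output is a small value.
print(logistic_unit([2.6, 0.1, -1.0], [-0.5, 2.0, 1.0], bias_weight=-3.1))
```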

Page 24:

Simple ANN Example (with TLUs)

[Figure: inputs A and B feed two hidden units, one through weights 1 and 1, the other through weights 0.5 and 0.5; the two hidden units feed the output unit with weights 1 and -1, the -1 being on the link from the 0.5/0.5 unit (the only assignment consistent with computing XOR).]

This one calculates XOR of the inputs A and B.

Each non-input node is a TLU (threshold linear unit) with a threshold of 1. Which means: if the weighted sum of its inputs is >= 1, it fires out a 1; otherwise it fires out a zero.
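A sketch of this network in code, using the topology as reconstructed above:

```python
def tlu(inputs, weights, threshold=1.0):
    return 1 if sum(x * w for x, w in zip(inputs, weights)) >= threshold else 0

def xor_net(a, b):
    h_or = tlu([a, b], [1.0, 1.0])          # fires if A or B (or both) is 1
    h_and = tlu([a, b], [0.5, 0.5])         # fires only if both A and B are 1
    return tlu([h_or, h_and], [1.0, -1.0])  # "or but not and", i.e. XOR

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_net(a, b))  # 0 0 -> 0, 0 1 -> 1, 1 0 -> 1, 1 1 -> 0
```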

Page 25:

Computing AND with an NN

[Figure: inputs A and B connect to a single output node, each with weight 0.5.]

The blue node is the output node. It adds the weighted inputs, and outputs 1 if the result is >= 1, otherwise 0. With these weights, the sum reaches 1 only when both A and B are 1.

Page 26:

Computing OR with a NN

[Figure: inputs A and B connect to a single output node, each with weight 1.]

The green node is the output node. It adds the weighted inputs, and outputs 1 if the result is >= 1, otherwise 0.

With these weights, only one of the inputs needs to be a 1 for the output to be 1. The output will be 0 only if both inputs are zero.

Page 27:

Computing NOT with a NN

[Figure: input A connects to the output node with weight -1; a bias unit, which always sends a fixed signal of 1, connects to it with weight 1.]

This NN computes the NOT of input A.

The blue unit is a threshold unit with a threshold of 1, as before.

So if A is 1, the weighted sum at the output unit is 0, hence the output is 0; if A is 0, the weighted sum is 1, so the output is 1.
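The three gates from these slides, each written as a single TLU with threshold 1 (the NOT gate's second input is the bias unit's fixed signal of 1):

```python
def tlu(inputs, weights, threshold=1.0):
    return 1 if sum(x * w for x, w in zip(inputs, weights)) >= threshold else 0

def and_gate(a, b):
    return tlu([a, b], [0.5, 0.5])   # both inputs are needed to reach the threshold

def or_gate(a, b):
    return tlu([a, b], [1.0, 1.0])   # either input alone reaches the threshold

def not_gate(a):
    return tlu([a, 1], [-1.0, 1.0])  # the fixed 1 is the bias unit's signal

print(and_gate(1, 1), or_gate(0, 1), not_gate(1))  # -> 1 1 0
```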

Page 28:

So, an NN can compute AND, OR and NOT – so what?

It is straightforward to combine ANNs together, with outputs from some becoming the inputs of others, etc. That is, we can combine them just like logic gates on a microchip.

Since we can compute AND, OR and NOT, we can in fact therefore compute any logical function at all with a network of TLUs.

This is a powerful result, because it means a network of TLUs can do anything.

The catch is that, for most interesting problems (precisely the ones where we want to use ANNs) we don't know what the logical function is.

E.g. given gene expression patterns (or blood test results, etc…) from the cells of each of a number of patients, we have no idea what the function is which would indicate which of those patients is in the early stages of cancer.

But, given some data where we know the required outputs, there are algorithms which can find appropriate sets of weights for an ANN, so that:

– The ANN learns an appropriate function by itself
– The ANN can therefore provide the correct outputs for each of the patterns it was trained with
– The ANN can therefore make predictions of the correct output for patterns it has never seen before

Page 29:

Imagine this: an image of a handwritten character is converted into an array of grey levels (the inputs), and there are 26 outputs, one for each character (a, b, c, d, e, f, ...).

[Figure: grey-level values such as 72, 3, 0, 0, 0, ... feed the input layer; the output layer has one node per letter.]

Weights on the links are chosen such that the output corresponding to the correct letter emits a 1, and all the others emit a 0.

This sort of thing is not only possible, but routine: medical diagnosis, wine-tasting, lift-control, sales prediction, ...

To put it another way

Page 30:

Getting the Right Weights

Clearly, an application will only be accurate if the weights are right.

An ANN starts with randomised weights, and with a database of known examples for training.

[Figure: a training pattern (grey levels 72, 3, 0, 0, ...) is sent into the network; if this pattern corresponds to a "c", we want the outputs to be 0, 0, 1, 0, 0, 0, ...]

The training loop: send a training pattern in, crunch it to outputs, and compare with the outputs we want. If all are correct, STOP. If some are wrong, adjust the weights and carry on with the next pattern.

If wrong, weights are adjusted in a simple way which makes it more likely that the ANN will be correct for this input next time.
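The slides do not give the weight-update rule, so here is a minimal sketch of the loop above using the classic perceptron rule for a single TLU, purely as an illustration (training a real multi-layer ANN uses backpropagation, named on the next slide):

```python
def train_tlu(examples, n_inputs, rate=0.1, max_epochs=100):
    """Train-until-correct loop: send each pattern in, crunch to an output,
    adjust the weights whenever the output is wrong, stop when all are correct."""
    weights = [0.0] * n_inputs   # the slides say "randomised"; zeros keep the sketch simple
    for _ in range(max_epochs):
        all_correct = True
        for inputs, target in examples:
            output = 1 if sum(x * w for x, w in zip(inputs, weights)) >= 1 else 0
            if output != target:
                all_correct = False
                weights = [w + rate * (target - output) * x
                           for w, x in zip(weights, inputs)]
        if all_correct:
            break
    return weights

# Learning AND from its truth table recovers weights close to the 0.5, 0.5 seen earlier.
and_examples = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
print(train_tlu(and_examples, n_inputs=2))
```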

Page 31:

Training Algorithms

Classic: Backpropagation (BP)
Others: variants of BP, evolutionary algorithms, etc ...
Details: not needed for this specific course

You will learn BP in the Connectionism module, if you take that.

Here are some good tutorials (there are plenty more):
http://www.dacs.dtic.mil/techs/neural/neural_ToC.html
http://www.statsoft.com/textbook/stneunet.html

Here's some excellent free software:
http://www-ra.informatik.uni-tuebingen.de/SNNS/

Page 32:

Generalisation

The ANN is learning during its training phase. When it is in use, providing decisions/classifications for live cases it hasn't seen before, we expect a reasonable decision from it. I.e. we want it to generalise well.

[Figure: three panels, each showing training points labelled A and B with a learned decision boundary, captioned "Well learned", "Poor generalisation" and "Diabolical". The network was trained with the black As and Bs; the white A is an unseen test case. In the third panel, the network thinks it is a B.]

Coverage and extent of the training data help to avoid poor generalisation.

Main point: when an NN generalises well, its results seem sensible, intuitive, and generally more accurate than people.

Page 33:

ANN points

One main issue is deciding how to present the data as input patterns to the network, and how to encode the output (next slide)

Another issue is the topology. The network needs three layers of nodes (i.e. one layer between the inputs and outputs, called the hidden layer) in order to be capable of learning any function, but how many hidden nodes? Often this is trial and error; a rule of thumb is to use 40% more nodes than in the input layer.

Commonly employed in control applications (lift, environment, chemical process)

Also credit rating: NNs and similar are used by banks to make decisions about accepting or declining loans.

Also prediction (sales, sunspots, stocks, horse races)

Can be slow (hours, days) to train, but once trained, decisions are essentially instant.

Very useful in bioinformatics: currently, standard NNs provide better accuracy than anything else in predicting secondary structure (regions of helix, sheet, loop) from protein primary sequence. Also great at classifying expression patterns.

Page 34:

Encoding

Suppose we have gene expression patterns from each of 10 cancerous skin cells, one from each patient:

1: 3.2, 4.6, 0.2, 1.7, 1.1 ... (20 numbers)
2: 3.4, 2.9, 1.2, 1.3, 1.8 ...
...
10: 6.4, 1.1, 1.0, 1.7, 1.0 ... etc.

And we also have gene expression patterns from each of 10 healthy skin cells, from different patients:

1: 3.2, 4.7, 3.1, 0.8, 1.2 ... (20 numbers)
2: 3.4, 3.9, 4.2, 0.3, 1.8 ...
...
10: 5.4, 2.1, 2.0, 1.9, 1.0 ... etc.

Page 35:

We might set up the ANN like this:

[Figure: 20 input nodes taking the expression values (3.2, 4.6, 0.2, ...), a layer of 30 hidden nodes, and one output node. All of the input nodes are connected to all of the hidden nodes, and all of the hidden nodes to the output node.]

With the rule that: if the output is > 0.5, the prediction is cancerous, else the prediction is healthy.
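A sketch of this setup in code. The weights here are random and untrained, purely to show the 20-30-1 shape and the output > 0.5 decision rule (a real network would be trained on the labelled patterns first):

```python
import math, random

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weight_rows):
    """One fully connected layer of logistic units; the last weight in each row is the bias weight."""
    return [logistic(sum(x * w for x, w in zip(inputs, row[:-1])) + row[-1])
            for row in weight_rows]

random.seed(0)
w_hidden = [[random.uniform(-1, 1) for _ in range(21)] for _ in range(30)]  # 20 inputs + bias
w_output = [[random.uniform(-1, 1) for _ in range(31)] for _ in range(1)]   # 30 hidden + bias

pattern = [3.2, 4.6, 0.2] + [1.0] * 17   # a 20-number expression pattern (values made up)
output = layer(layer(pattern, w_hidden), w_output)[0]
print("cancerous" if output > 0.5 else "healthy")
```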

Page 36:

Or like this ...

[Figure: the same fully connected layout, but the inputs are scaled to lie between -1 and 1 (e.g. 0.5, 0.9, -0.04, ...) and there are two output nodes, O1 and O2.]

With the rule that: if O1's output is higher than O2's, then the prediction is cancer; if O2's output is higher than O1's, the prediction is healthy.
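A sketch of the two pieces that differ in this setup: scaling a feature's values to lie between -1 and 1, and the two-output decision rule (function names are illustrative):

```python
def scale_to_plus_minus_one(values):
    """Rescale a list of values (e.g. one feature across the training set) into [-1, 1]."""
    lo, hi = min(values), max(values)
    return [2.0 * (v - lo) / (hi - lo) - 1.0 for v in values]

def predict(o1, o2):
    """Two-output rule: whichever output node is higher wins."""
    return "cancer" if o1 > o2 else "healthy"

print(scale_to_plus_minus_one([3.2, 4.6, 0.2, 1.7, 1.1]))  # all values now lie in [-1, 1]
print(predict(0.83, 0.12))  # -> cancer
```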

Page 37:

About encoding

There are various other ways (you will see how things are usually set up for protein secondary structure prediction in a later lecture).

The choice depends on things we won't go into here (out of scope); the reasons are essentially common sense, based on what would seem to give the ANN the best chance of learning the required function.

Page 38:

A Final Note

• NNs can provide very accurate predictions
• But it is difficult to figure out how they work out their predictions
• However, we can inspect the weights of a trained NN, to gain some insight into what it is about the inputs that leads to certain outputs.

[Figure: a network whose inputs are data on a racehorse's past form and current conditions, and whose output is a prediction of whether it will win the race.]

If we train this network and find that the weights on the links from the “number of previous wins” unit are all very low, we can infer that this contributes little or nothing to the prediction. The “length of race” input may have high weights, indicating this is a key factor, etc …
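A rough sketch of that kind of inspection: for each input unit, sum the absolute weights on its outgoing links; inputs whose weights are all very low contribute little to the prediction. The weights below are invented for illustration, and the slides do not prescribe a particular formula:

```python
def input_importance(input_names, hidden_weights):
    """hidden_weights[j][i] is the weight from input i to hidden node j."""
    return {name: sum(abs(row[i]) for row in hidden_weights)
            for i, name in enumerate(input_names)}

names = ["number of previous wins", "length of race", "going", "weight carried"]
hidden_weights = [[0.02, 1.9, -0.4, 0.3],    # invented trained weights, one row per hidden node
                  [-0.01, -2.2, 0.6, 0.1]]
print(input_importance(names, hidden_weights))
# "length of race" dominates; "number of previous wins" contributes almost nothing
```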
