
Recognition using Boosting

Page 1: Recognition using Boosting

Recognition using Boosting

Modified from various sources including http://people.csail.mit.edu/torralba/courses/6.869/6.869.computervision.htm

Page 2: Recognition using Boosting

Object Recognition

Object detection and recognition is formulated as a classification problem: the image is partitioned into a set of overlapping windows, and a decision is made at each window about whether it contains a target object (here, a computer screen) or not.

[Figure: "Where are the screens?" Each image patch becomes a point in some feature space (a bag of image patches), where a decision boundary separates the "computer screen" class from the "background" class.]

Page 3: Recognition using Boosting

Recognition Techniques… a few

• Nearest neighbor (10^6 examples)
Shakhnarovich, Viola, Darrell 2003; Berg, Berg, Malik 2005; …

• Neural networks
LeCun, Bottou, Bengio, Haffner 1998; Rowley, Baluja, Kanade 1998; …

• Support Vector Machines and Kernels
Guyon, Vapnik; Heisele, Serre, Poggio 2001; …

• Conditional Random Fields
McCallum, Freitag, Pereira 2000; Kumar, Hebert 2003; …

Page 4: Recognition using Boosting

• Formulation: binary classification

Features: x = x_1, x_2, …, x_N, x_{N+1}, x_{N+2}, …, x_{N+M}
Labels:    y = -1, +1, …, -1,  ?,  ?, …,  ?

Training data: each image patch x_1 … x_N is labeled (+1 or -1) as containing the object or background.
Test data: the labels of x_{N+1} … x_{N+M} are unknown.

• Classification function F(x), where F belongs to some family of functions.

• Minimize the misclassification error. (Not that simple: we need some guarantees that there will be generalization.)
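In array terms (a minimal sketch with made-up sizes, not the course code), the setup looks like this in Matlab:

% N labeled training patches followed by M unlabeled test patches.
N = 100; M = 20; D = 8;                % counts and feature dimension (invented)
x = rand(N + M, D);                    % rows are feature vectors x_1 ... x_{N+M}
y = [sign(randn(N, 1)); NaN(M, 1)];    % training labels are +-1; test labels unknown
% A classifier F maps a feature vector to a score; the decision is sign(F(x)).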

Page 5: Recognition using Boosting

Let's look at one technique called Boosting.

• Boosting
– Gentle boosting
– Weak detectors
– Object model
– Object detection

Page 6: Recognition using Boosting

A simple object detector with Boosting

Download code and dataset from http://people.csail.mit.edu/torralba/iccv2005/

• Toolbox for manipulating the dataset: Matlab code (an API with many useful functions)
• Gentle boosting code
• Object detector using a part-based model
• Dataset with cars and computer monitors

Page 7: Recognition using Boosting

Why boosting?

• A simple algorithm for learning robust classifiers
– Freund & Schapire, 1995
– Friedman, Hastie, Tibshirani, 1998

• Provides an efficient algorithm for sparse visual feature selection
– Tieu & Viola, 2000
– Viola & Jones, 2003

• Easy to implement; does not require external optimization tools.

Page 8: Recognition using Boosting

Boosting

• Defines a classifier using an additive model:

F(x) = α_1 f_1(x) + α_2 f_2(x) + α_3 f_3(x) + …

where F(x) is the strong classifier, each f_m(x) is a weak classifier applied to the features vector x, and each α_m is a weight. A boosting classifier is a sum of weighted "weaker" classifiers.

IDEA: the stronger classifier is a sum of weaker classifiers.

Example of ONE weaker classifier: say you are trying to find computer screens and your examples all have black screens. You might use color as a feature: if there is black in the window content, then there might be a computer screen there.
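As a concrete illustration, here is a minimal sketch of evaluating such an additive model with regression stumps as the weak classifiers (the function name and stump fields are assumptions for this sketch, not the course API):

function F = evalStrongClassifier(stumps, x)
% x: Nsamples x Ndims feature matrix.
% stumps: struct array with fields dim, thresh, a, b, alpha (hypothetical layout).
F = zeros(size(x, 1), 1);
for m = 1:numel(stumps)
    s = stumps(m);
    % weak classifier output: one value per sample
    fm = s.a .* (x(:, s.dim) >= s.thresh) + s.b .* (x(:, s.dim) < s.thresh);
    F = F + s.alpha * fm;   % add one weighted weak classifier
end
% The predicted class is sign(F).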

Page 9: Recognition using Boosting

Boosting

• Defines a classifier using an additive model:

F(x) = α_1 f_1(x) + α_2 f_2(x) + …

• We need to define a family of weak classifiers: each weak classifier f_m(x) is drawn from that family.

Page 10: Recognition using Boosting

Boosting

• It is a sequential procedure. Each data point x_t has:
– a class label: y_t = +1 or -1
– a weight: w_t = 1

How do we find weak classifiers? First we need some sample data in our feature space X where we have a class label for each sample. In this simple example we have 2 classes, +1 (red) and -1 (blue), abstractly representing two classes.

In our example: blue = computer screen, red = not a computer screen.

Page 11: Recognition using Boosting

Toy example

Weak learners come from the family of lines. Each data point has a class label (y_t = +1 or -1) and a weight (w_t = 1).

A line h for which p(error) = 0.5 is at chance.

IDEA: find the single line that best separates the two classes from each other.

Page 12: Recognition using Boosting

Toy example

Each data point has a class label (y_t = +1 or -1) and a weight (w_t = 1). This line seems to be the best.

This is a 'weak classifier': it performs slightly better than chance.

Page 13: Recognition using Boosting

Toy example

We set a new problem for which the previous weak classifier performs at chance again. Each data point keeps its class label (y_t = +1 or -1), and we update the weights:

w_t ← w_t exp{-y_t H_t}

where H_t is the output of the current strong classifier on x_t.
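Numerically, one reweighting step looks like this (a tiny made-up example; H holds the current strong classifier's outputs on four samples):

y = [+1; +1; -1; -1];           % class labels
H = [+0.8; -0.2; -0.5; +0.4];   % current outputs H_t (invented values)
w = ones(4, 1);                 % initial weights w_t = 1
w = w .* exp(-y .* H);          % correct points (y.*H > 0) shrink, mistakes grow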

Page 14: Recognition using Boosting

Toy example

Second iteration: we update the weights again, w_t ← w_t exp{-y_t H_t}, and set a new problem for which the previous weak classifier performs at chance.

Page 15: Recognition using Boosting

Toy example

Third iteration: the weights are updated once more and a new weak classifier is fit on the reweighted data.

Page 16: Recognition using Boosting

Toy example

Fourth iteration: the same update and refit repeat.

Page 17: Recognition using Boosting

Toy example

The strong (non-linear) classifier is built as the combination of all the weak (linear) classifiers:

F = f_1 + f_2 + f_3 + f_4

Page 18: Recognition using Boosting

Boosting

• Different cost functions and minimization algorithms result in various flavors of Boosting

• In this demo, I will use gentleBoosting: it is simple to implement and numerically stable.

Page 19: Recognition using Boosting

Overview of section

• Boosting
– Gentle boosting
– Weak detectors
– Object model
– Object detection

Page 20: Recognition using Boosting

Boosting

Boosting fits the additive model

F(x) = α_1 f_1(x) + α_2 f_2(x) + …

by minimizing the exponential loss over the training samples:

J = Σ_t exp(-y_t F(x_t))

The exponential loss is a differentiable upper bound on the misclassification error.

Page 21: Recognition using Boosting

Exponential loss

[Plot: loss as a function of the margin yF(x) over the range -1.5 to 2, comparing the misclassification (0-1) error, the squared error, and the exponential loss. The exponential loss upper-bounds the misclassification error everywhere and is differentiable.]
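The three curves are easy to reproduce (a sketch; for labels y = ±1 the squared error (y - F)² equals (1 - yF)²):

margin  = linspace(-1.5, 2, 200);   % yF(x)
loss01  = double(margin < 0);       % misclassification error
lossSq  = (1 - margin).^2;          % squared error
lossExp = exp(-margin);             % exponential loss
plot(margin, loss01, margin, lossSq, margin, lossExp);
xlabel('yF(x) = margin'); ylabel('loss');
legend('misclassification', 'squared error', 'exponential');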

Page 22: Recognition using Boosting

Boosting

Sequential procedure: at each step we add one weak classifier f_m(x_t; θ_m) to the model,

F(x_t) ← F(x_t) + f_m(x_t; θ_m)

choosing its parameters θ_m to minimize the residual loss

θ_m = argmin_θ Σ_t L(y_t, F(x_t) + f(x_t; θ))

(x_t: input; y_t: desired output; θ_m: parameters of the m-th weak classifier).

For more details: Friedman, Hastie, Tibshirani. “Additive Logistic Regression: a Statistical View of Boosting” (1998)

Page 23: Recognition using Boosting

gentleBoosting

• At each iteration, instead of doing exact optimization, gentle boosting minimizes a Taylor approximation of the error and chooses the f_m(x) that minimizes the cost

J = Σ_t w_t (y_t - f_m(x_t))²

where w_t = exp(-y_t F(x_t)) are the weights at this iteration. So at each iteration we just need to solve a weighted least-squares problem.

For more details: Friedman, Hastie, Tibshirani. “Additive Logistic Regression: a Statistical View of Boosting” (1998)

Page 24: Recognition using Boosting

Weak classifiers

• The input is a set of weighted training samples (x, y, w).

• Regression stumps: simple but commonly used in object detection. A stump thresholds a single feature dimension x_k and has four parameters (k, θ, a, b):

f_m(x) = a·[x_k < θ] + b·[x_k > θ]

with a = E_w(y·[x < θ]) and b = E_w(y·[x > θ]), the weighted means of y on each side of the threshold θ.

fitRegressionStump.m
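A simplified stand-in for fitRegressionStump.m might look like this (a sketch, not the course implementation: it tries every unique feature value as a threshold and keeps the stump with the lowest weighted squared error):

function [k, theta, a, b] = fitStump(x, y, w)
% x: N x D features, y: N x 1 labels (+-1), w: N x 1 nonnegative weights.
w = w / sum(w);
bestErr = inf;
for d = 1:size(x, 2)
    for t = unique(x(:, d))'
        lo = x(:, d) < t;                                 % samples with x_k < theta
        hi = ~lo;                                         % samples with x_k >= theta
        a0 = sum(w(lo) .* y(lo)) / max(sum(w(lo)), eps);  % weighted mean of y on the left
        b0 = sum(w(hi) .* y(hi)) / max(sum(w(hi)), eps);  % weighted mean of y on the right
        f = a0 .* lo + b0 .* hi;                          % stump outputs on the training set
        err = sum(w .* (y - f).^2);                       % weighted least-squares cost J
        if err < bestErr
            bestErr = err; k = d; theta = t; a = a0; b = b0;
        end
    end
end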

Page 25: Recognition using Boosting

gentleBoosting.m

function classifier = gentleBoost(x, y, Nrounds)
% Initialize weights w = 1
w = ones(size(y));
for m = 1:Nrounds
    % Solve weighted least-squares to find the best weak classifier;
    % fm holds its outputs on the training samples
    fm = selectBestWeakClassifier(x, y, w);
    % Re-weight training samples
    w = w .* exp(-y .* fm);
    % store parameters of fm in classifier
    …
end
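Hypothetical usage on toy 2D data (how the stored classifier is represented is elided above, so this only shows the training call; in the course code, demoGentleBoost.m plays this role):

x = [randn(50, 2) + 1.5; randn(50, 2) - 1.5];   % two Gaussian blobs in 2D
y = [ones(50, 1); -ones(50, 1)];                % their labels
classifier = gentleBoost(x, y, 20);             % run 20 boosting rounds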

Page 26: Recognition using Boosting

Demo gentleBoosting

Demo using gentle boost and stumps with hand-selected 2D data:

> demoGentleBoost.m

Page 27: Recognition using Boosting

Flavors of boosting

• AdaBoost (Freund and Schapire, 1995)

• Real AdaBoost (Friedman et al, 1998)

• LogitBoost (Friedman et al, 1998)

• Gentle AdaBoost (Friedman et al, 1998)

• BrownBoost (Freund, 2000)

• FloatBoost (Li et al, 2002)

• …

Page 28: Recognition using Boosting

Overview of section

• Boosting
– Gentle boosting
– Weak detectors
– Object model
– Object detection

Page 29: Recognition using Boosting

From images to features: Weak detectors

We will now define a family of visual features that can be used as weak classifiers (“weak detectors”). Each weak detector takes an image as input and outputs a binary response.

Page 30: Recognition using Boosting

Weak detectors: textures of textures
Tieu and Viola, CVPR 2000

Every combination of three filters generates a different feature. This gives thousands of features. Boosting selects a sparse subset, so computations at test time are very efficient. Boosting also avoids overfitting, to some extent.

Page 31: Recognition using Boosting

Weak detectors: Haar filters and integral image
Viola and Jones, ICCV 2001

With the integral image, the average intensity in a block is computed with four sums, independently of the block size.
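The trick is worth seeing in code. A minimal sketch in plain Matlab (stand-in image, invented block coordinates):

img = rand(240, 320);                          % stand-in grayscale image
ii = zeros(size(img) + 1);                     % integral image, padded with zeros
ii(2:end, 2:end) = cumsum(cumsum(img, 1), 2);  % ii(r+1,c+1) = sum(img(1:r,1:c))
% Sum over the block img(r1:r2, c1:c2) from just four lookups:
r1 = 50; r2 = 90; c1 = 100; c2 = 160;
s = ii(r2+1, c2+1) - ii(r1, c2+1) - ii(r2+1, c1) + ii(r1, c1);
% Dividing s by the block area gives the average intensity, at the same
% cost for any block size.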

Page 32: Recognition using Boosting

Weak detectors: edge fragments
Opelt, Pinz, Zisserman, ECCV 2006

Weak detector = k edge fragments and a threshold. The chamfer distance uses 8 orientation planes.

Page 33: Recognition using Boosting

Weak detectors

Other weak detectors:
• Carmichael, Hebert 2004
• Yuille, Snow, Nitzberg, 1998
• Amit, Geman 1998
• Papageorgiou, Poggio, 2000
• Heisele, Serre, Poggio, 2001
• Agarwal, Awan, Roth, 2004
• Schneiderman, Kanade 2004
• …

Page 34: Recognition using Boosting

Weak detectors: part-based

Similar to part-based generative models: we create weak detectors by using parts and voting for the object center location (one model per class, e.g. a car model and a screen model). These features are used for the detector on the course web site.

Page 35: Recognition using Boosting

Weak detectors

First we collect a set of part templates from a set of training objects.
Vidal-Naquet, Ullman (2003)

Page 36: Recognition using Boosting

Weak detectors

We now define a family of “weak detectors”: correlate a part template with the image (the * operation in the figure) and threshold the response. Each such detector performs better than chance.
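A sketch of the idea in plain Matlab (image, template, and threshold rule are all assumptions, not the course code):

img = rand(120, 160);                    % stand-in grayscale image
P = rand(9, 9);                          % stand-in part template
P = P - mean(P(:));                      % zero-mean template
response = filter2(P, img, 'same');      % correlate template with image (the *)
h = response > 0.8 * max(response(:));   % threshold -> binary weak-detector map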

Page 37: Recognition using Boosting

Weak detectors

We can do a better job using filtered images: correlate the part template with a filtered version of the image rather than with raw intensities. The result is still a weak detector, but better than before.

Page 38: Recognition using Boosting

Training

First we evaluate all the N features on all the training images. Then we sample the feature outputs at the object center and at random locations in the background:
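In code, the sampling step might look like this (a sketch with made-up sizes; featureMaps stands in for the precomputed feature outputs):

[h, wd, nFeat] = deal(120, 160, 10);       % image size and number of features (invented)
featureMaps = rand(h, wd, nFeat);          % stand-in: each feature's output map
cy = 60; cx = 80;                          % annotated object center
pos = reshape(featureMaps(cy, cx, :), 1, nFeat);   % one positive sample
nNeg = 5;
idx = randi(h * wd, nNeg, 1);              % random background locations
[ry, rx] = ind2sub([h, wd], idx);
neg = zeros(nNeg, nFeat);
for i = 1:nNeg
    neg(i, :) = reshape(featureMaps(ry(i), rx(i), :), 1, nFeat);
end
x = [pos; neg];                            % boosting training features
y = [+1; -ones(nNeg, 1)];                  % +1 at object center, -1 in background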

Page 39: Recognition using Boosting

Representation and object model

[Figure: features 1, 2, 3, 4, 10, and 100 selected for the screen detector; the accumulated part templates sketch the object like a “lousy painter”.]

Page 40: Recognition using Boosting

Representation and object model

[Figure: features 1, 2, 3, 4, 10, and 100 selected for the car detector.]

Page 41: Recognition using Boosting

Overview of section

• Boosting
– Gentle boosting
– Weak detectors
– Object model
– Object detection

Page 42: Recognition using Boosting

Example: screen detection

[Figure: feature output of the first weak detector.]

Page 43: Recognition using Boosting

Example: screen detection

[Figure: feature output and thresholded output. The first weak ‘detector’ produces many false alarms.]

Page 44: Recognition using Boosting

Example: screen detection

[Figure: feature output, thresholded output, and the strong classifier at iteration 1.]

Page 45: Recognition using Boosting

Example: screen detection

[Figure: feature output, thresholded output, and the current strong classifier. The second weak ‘detector’ produces a different set of false alarms.]

Page 46: Recognition using Boosting

Example: screen detection

[Figure: the thresholded outputs are added into the strong classifier; strong classifier at iteration 2.]

Page 47: Recognition using Boosting

Example: screen detection

[Figure: feature output, thresholded output, and the strong classifier at iteration 10.]

Page 48: Recognition using Boosting

Example: screen detection

[Figure: adding features, the strong classifier at iteration 200 gives the final classification.]

Page 49: Recognition using Boosting

Demo

Demo of screen and car detectors using parts, gentle boost, and stumps:

> runDetector.m

