Setting up your ML application - Stanford University

Date posted: 09-Dec-2021

Transcript

Setting up your ML application

Train/dev/test sets (deeplearning.ai)

Andrew Ng

Applied ML is a highly iterative process

Idea

Experiment Code

# layers

# hidden units

learning rates

activation functions

Train/dev/test sets

Mismatched train/test distribution

Training set: cat pictures from web pages

Dev/test sets: cat pictures from users using your app

Not having a test set might be okay. (Only dev set.)
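A split along these lines can be sketched in NumPy. The `split_dataset` helper and the 98/1/1 ratio (often used for very large datasets, in place of the traditional 60/20/20) are illustrative, not from the slides:

```python
import numpy as np

def split_dataset(X, Y, dev_frac=0.01, test_frac=0.01, seed=0):
    """Shuffle examples (stored as columns) and split into train/dev/test."""
    m = X.shape[1]
    perm = np.random.default_rng(seed).permutation(m)
    X, Y = X[:, perm], Y[:, perm]
    n_dev, n_test = int(m * dev_frac), int(m * test_frac)
    n_train = m - n_dev - n_test
    train = (X[:, :n_train], Y[:, :n_train])
    dev = (X[:, n_train:n_train + n_dev], Y[:, n_train:n_train + n_dev])
    test = (X[:, n_train + n_dev:], Y[:, n_train + n_dev:])
    return train, dev, test

# 1,000 examples with a 98% / 1% / 1% split
X = np.random.randn(3, 1000)
Y = np.random.randint(0, 2, (1, 1000))
train, dev, test = split_dataset(X, Y)
```

The key point from the lecture survives in the code: the dev and test sets only need to be big enough to compare models, so their fraction shrinks as the dataset grows.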

Setting up your ML application

Bias/Variance (deeplearning.ai)

Bias and Variance

high bias “just right” high variance

Bias and Variance

Cat classification

Train set error:

Dev set error:
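Reading bias and variance off the train/dev error pair, as this slide does, can be written as a small rule. The 5-point gap threshold and the assumption of roughly 0% Bayes error (reasonable for a task humans do near-perfectly, like recognizing cats) are illustrative:

```python
def diagnose(train_err, dev_err, bayes_err=0.0, gap=0.05):
    """Label a model from its train/dev error rates (fractions, not percents)."""
    high_bias = (train_err - bayes_err) > gap      # can't even fit the training set
    high_variance = (dev_err - train_err) > gap    # fits train but not dev
    if high_bias and high_variance:
        return "high bias and high variance"
    if high_bias:
        return "high bias"
    if high_variance:
        return "high variance"
    return "just right"

# Illustrative readings: 1%/11% -> high variance; 15%/16% -> high bias;
# 15%/30% -> both; 0.5%/1% -> just right.
```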

High bias and high variance

[Sketch over features x₁, x₂: a mostly linear decision boundary with a few overfit kinks, i.e. high bias and high variance at once]

Setting up your ML application

Basic “recipe” for machine learning (deeplearning.ai)

Basic recipe for machine learning
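The recipe (fix high bias first, then high variance) can be sketched as a decision rule. The error targets are illustrative placeholders; the suggested remedies are the lecture's:

```python
def basic_recipe(train_err, dev_err, bias_target=0.02, var_target=0.02):
    """Suggest the next step: address high bias before high variance."""
    if train_err > bias_target:
        # High bias: the model doesn't even fit the training data well.
        return "bigger network / train longer / different architecture"
    if dev_err - train_err > var_target:
        # High variance: the model doesn't generalize to the dev set.
        return "more data / regularization / different architecture"
    return "done"
```

Re-run the loop after each change: fixing bias can introduce variance, and vice versa.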

Regularizing your neural network

Regularization (deeplearning.ai)

Logistic regression

min over w, b of J(w, b), with L2 regularization:

J(w, b) = (1/m) Σᵢ ℒ(ŷ⁽ⁱ⁾, y⁽ⁱ⁾) + (λ/2m) ‖w‖₂²
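The L2-regularized logistic regression cost can be sketched in NumPy (the `cost_l2` helper name is illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost_l2(w, b, X, Y, lambd):
    """Cross-entropy cost J(w, b) plus the L2 penalty (lambda / 2m) * ||w||^2."""
    m = X.shape[1]
    A = sigmoid(w.T @ X + b)                 # predictions y-hat, shape (1, m)
    cross_entropy = -np.sum(Y * np.log(A) + (1 - Y) * np.log(1 - A)) / m
    l2_penalty = (lambd / (2 * m)) * np.sum(w ** 2)
    return cross_entropy + l2_penalty
```

Note that only w is penalized; the lecture points out that b is a single parameter and omitting it makes little difference.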

Neural network

Regularizing your neural network

Why regularization reduces overfitting (deeplearning.ai)

How does regularization prevent overfitting?

[Diagram: a network with inputs x₁, x₂, x₃ and output ŷ; sketched fits labeled high bias, “just right”, and high variance]

How does regularization prevent overfitting?

Regularizing your neural network

Dropout regularization (deeplearning.ai)

Dropout regularization

[Diagram: the same network shown twice, with inputs x₁, x₂, … and output ŷ, before and after dropout randomly eliminates units]

Implementing dropout (“Inverted dropout”)
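The inverted-dropout steps for a single layer (layer 3, with keep_prob = 0.8 as in the lecture) can be sketched as:

```python
import numpy as np
np.random.seed(1)

keep_prob = 0.8
a3 = np.random.randn(4, 5)                 # activations of layer 3 (illustrative shape)

d3 = np.random.rand(*a3.shape) < keep_prob # mask: keep each unit with probability 0.8
a3 = a3 * d3                               # zero out the dropped units
a3 = a3 / keep_prob                        # "inverted" step: rescale so the expected
                                           # value of the activations is unchanged
```

The division by keep_prob is what makes this *inverted* dropout: because the expected activations match training, no extra scaling is needed at test time.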

Making predictions at test time

Regularizing your neural network

Understanding dropout (deeplearning.ai)

Why does drop-out work?

Intuition: Can’t rely on any one feature, so have to spread out weights.

[Diagram: a single unit whose inputs can each be randomly eliminated, so it cannot put too much weight on any one of them]

Regularizing your neural network

Other regularization methods (deeplearning.ai)

Data augmentation
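Mirroring and random cropping, the cheap augmentations mentioned here, can be sketched with plain array operations (the 32×32 image shape and 4-pixel crop are illustrative):

```python
import numpy as np

def augment(img, rng):
    """Return a horizontally mirrored copy and a random crop of an image (H, W, C)."""
    flipped = img[:, ::-1, :]               # mirror left-right
    h, w, _ = img.shape
    ch, cw = h - 4, w - 4                   # crop 4 pixels off each dimension
    top = rng.integers(0, h - ch + 1)
    left = rng.integers(0, w - cw + 1)
    cropped = img[top:top + ch, left:left + cw, :]
    return flipped, cropped

rng = np.random.default_rng(0)
img = np.arange(32 * 32 * 3, dtype=float).reshape(32, 32, 3)
flipped, cropped = augment(img, rng)
```

A mirrored cat is still a cat, so this is an inexpensive way to enlarge the training set, though the new examples carry less information than genuinely new data.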

[Images: mirrored and randomly cropped cat photos; a digit 4 with random distortions]

Early stopping

[Plot: training error and dev set error vs. # iterations; stop where the dev error starts rising]
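Early stopping amounts to watching the dev error during training. A minimal sketch; the `patience` parameter is a common implementation detail, not from the lecture, which simply stops at the minimum dev error:

```python
def early_stopping(dev_errors, patience=3):
    """Return the iteration to stop at: the last improvement in dev error,
    once it has failed to improve for `patience` consecutive checks."""
    best, best_i = float("inf"), 0
    for i, err in enumerate(dev_errors):
        if err < best:
            best, best_i = err, i
        elif i - best_i >= patience:
            break
    return best_i

# Dev error falls, then rises: stop at its minimum.
stop_at = early_stopping([0.5, 0.4, 0.35, 0.3, 0.32, 0.36, 0.4])
```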

Setting up your optimization problem

Normalizing inputs (deeplearning.ai)

Normalizing training sets

Subtract the mean:  μ = (1/m) Σᵢ x⁽ⁱ⁾,  x := x − μ

Normalize the variance:  σ² = (1/m) Σᵢ (x⁽ⁱ⁾)²  (element-wise, after mean subtraction),  x := x / σ

[Scatter plots over x₁, x₂: the original data, after subtracting the mean, and after normalizing the variance]
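The two steps can be sketched as follows (the `normalize` helper name is illustrative); note that the same μ and σ² computed on the training set must also be applied to the test set:

```python
import numpy as np

def normalize(X_train, X_test):
    """Zero-mean, unit-variance normalization of features (rows), examples in columns."""
    mu = X_train.mean(axis=1, keepdims=True)              # mu = (1/m) * sum of x^(i)
    sigma2 = ((X_train - mu) ** 2).mean(axis=1, keepdims=True)
    X_train = (X_train - mu) / np.sqrt(sigma2)
    X_test = (X_test - mu) / np.sqrt(sigma2)              # same mu, sigma2 at test time
    return X_train, X_test
```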

Why normalize inputs?

J(w, b) = (1/m) Σᵢ ℒ(ŷ⁽ⁱ⁾, y⁽ⁱ⁾)

[Contour plots of J over w and b. Unnormalized: an elongated bowl, forcing gradient descent to take small steps. Normalized: a more symmetric bowl, so gradient descent can use a larger learning rate and converge in fewer steps.]

Vanishing/exploding gradients (deeplearning.ai)

Setting up your optimization problem

Vanishing/exploding gradients

ŷ = W^[L] W^[L−1] ⋯ W^[2] W^[1] x   (assuming linear activations g(z) = z and b^[l] = 0)
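The multiplicative effect is easy to check numerically: with identical weight matrices slightly above or below the identity, activations grow or shrink exponentially with depth. The 1.5/0.5 scales and the helper are illustrative:

```python
import numpy as np

def deep_linear_output_scale(w_scale, depth, width=4):
    """Largest activation magnitude after `depth` identical linear layers W = w_scale * I."""
    W = w_scale * np.eye(width)
    a = np.ones((width, 1))         # input x of all ones
    for _ in range(depth):
        a = W @ a                   # linear activation: a^[l] = W a^[l-1]
    return float(np.abs(a).max())

grows = deep_linear_output_scale(1.5, depth=50)    # weights a bit above 1: explodes
shrinks = deep_linear_output_scale(0.5, depth=50)  # weights a bit below 1: vanishes
```

The same exponential argument applies to the gradients, which is what makes very deep networks hard to train without careful initialization.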

Single neuron example: inputs x₁, x₂, x₃, x₄

z = w₁x₁ + w₂x₂ + ⋯ + wₙxₙ,  a = g(z)
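The larger n is, the smaller each wᵢ should be to keep z at a reasonable scale. A common remedy the course builds on is to set the initial weight variance from the fan-in; this sketch assumes He initialization (variance 2/n, appropriate for ReLU):

```python
import numpy as np

def init_layer(n_out, n_in, rng):
    """He initialization: Var(w_i) = 2 / n_in keeps z = w.x well-scaled for ReLU layers."""
    W = rng.standard_normal((n_out, n_in)) * np.sqrt(2.0 / n_in)
    b = np.zeros((n_out, 1))
    return W, b

rng = np.random.default_rng(0)
W, b = init_layer(100, 1000, rng)
```

For tanh layers, a variance of 1/n (Xavier initialization) is the usual alternative.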

Numerical approximation of gradients (deeplearning.ai)

Setting up your optimization problem

Checking your derivative computation

Checking your derivative computation

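The point of these two slides is that the two-sided difference is much more accurate than the one-sided one. A sketch using f(θ) = θ³ at θ = 1 (true derivative 3) with a deliberately large ε = 0.01:

```python
def two_sided_diff(f, theta, eps=1e-7):
    """f'(theta) ~ (f(theta + eps) - f(theta - eps)) / (2 * eps); error is O(eps^2)."""
    return (f(theta + eps) - f(theta - eps)) / (2 * eps)

def one_sided_diff(f, theta, eps=1e-7):
    """f'(theta) ~ (f(theta + eps) - f(theta)) / eps; error is only O(eps)."""
    return (f(theta + eps) - f(theta)) / eps

approx2 = two_sided_diff(lambda t: t ** 3, 1.0, eps=0.01)  # ~3.0001
approx1 = one_sided_diff(lambda t: t ** 3, 1.0, eps=0.01)  # ~3.0301
```

The two-sided estimate is roughly twice as expensive per check but its error shrinks quadratically in ε, which is why grad check uses it.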
Gradient Checking (deeplearning.ai)

Setting up your optimization problem

Gradient check for a neural network

Take W^[1], b^[1], …, W^[L], b^[L] and reshape into a big vector θ.

Take dW^[1], db^[1], …, dW^[L], db^[L] and reshape into a big vector dθ.

Gradient checking (Grad check)
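Grad check builds a numerical gradient component by component and compares it to dθ via the relative difference ‖dθ_approx − dθ‖₂ / (‖dθ_approx‖₂ + ‖dθ‖₂), where roughly 1e-7 is great and anything near 1e-3 signals a bug. A sketch with an illustrative helper and toy cost:

```python
import numpy as np

def grad_check(J, theta, dtheta, eps=1e-7):
    """Relative difference between dtheta and a two-sided numerical gradient of J."""
    dtheta_approx = np.zeros_like(theta)
    for i in range(theta.size):
        plus, minus = theta.copy(), theta.copy()
        plus[i] += eps
        minus[i] -= eps
        dtheta_approx[i] = (J(plus) - J(minus)) / (2 * eps)
    num = np.linalg.norm(dtheta_approx - dtheta)
    return num / (np.linalg.norm(dtheta_approx) + np.linalg.norm(dtheta))

# J(theta) = sum(theta^2) has gradient 2*theta; a "buggy" 3*theta should fail the check.
theta = np.array([1.0, -2.0, 3.0])
good = grad_check(lambda t: np.sum(t ** 2), theta, 2 * theta)
bad = grad_check(lambda t: np.sum(t ** 2), theta, 3 * theta)
```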

Gradient Checking implementation notes (deeplearning.ai)

Setting up your optimization problem

Gradient checking implementation notes

- Don’t use in training – only to debug

- If algorithm fails grad check, look at components to try to identify bug.

- Remember regularization: if you use L2, the regularization term must be included in both J and dθ.

- Doesn’t work with dropout: set keep_prob = 1.0 while grad checking, then turn dropout back on.

- Run at random initialization; perhaps again after some training.

