Lecture 15: Basics of Regression Analysis, by Aziza Munir (posted 24-Dec-2015)
Page 1: Lecture 15 Basics of Regression Analysis By Aziza Munir.

Lecture 15
Basics of Regression Analysis

By Aziza Munir

Page 2

What we will be covering

• Concept of regression
• Linear regression
• Methods of calculating regression
• Coefficient of regression
• Pearson's r
• R square
• ANOVA

Page 3

Regression Introduction

• In many statistical investigations the main objective is to determine whether a relationship exists between two or more variables. When such a relationship exists, we use mathematical formulas to make predictions. The reliability of any prediction depends on the strength of the relationship between the variables under study.

Page 4

Linear Regression

• A mathematical equation that allows us to predict values of one dependent variable from known values of one or more independent variables is called a regression equation.

y=a+bx

Page 5

Linear Regression

A statistical technique that uses a single independent variable (X) to estimate a single dependent variable (Y).

Based on the equation for a line: Y = b + mX
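The line equation above can be fitted by least squares in a few lines. This is an illustrative sketch with invented data, not an example from the slides; the slope and intercept formulas are the standard ones introduced later in this lecture (b1 = SSxy/SSx, b0 = ȳ − b1x̄).

```python
# Sketch (invented data): fitting Y = b + mX by least squares,
# using the textbook formulas rather than a library.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.0, 9.8]

n = len(xs)
x_bar = sum(xs) / n
y_bar = sum(ys) / n

# slope m = SSxy / SSx, intercept b = y_bar - m * x_bar
ss_xy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
ss_x = sum((x - x_bar) ** 2 for x in xs)
m = ss_xy / ss_x
b = y_bar - m * x_bar

print(round(m, 3), round(b, 3))  # slope and intercept of the fitted line
```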

Page 6

Linear Regression - Model

[Figure: scatter of data points (Xi, Yi) around the fitted line Y = b0 + b1X; the error εi is the vertical distance between the actual value of Yi and the line.]

Page 7

Linear Regression - Model

Regression coefficients for a . . .

Population: Yi = β0 + β1Xi + εi

Sample: Yi = b0 + b1Xi + ei, with fitted values Ŷi = b0 + b1Xi

Page 8

Simple Linear Regression

[Figure: dependent variable (y) plotted against independent variable (x), with fitted regression line.]

The output of a regression is a function that predicts the dependent variable based upon values of the independent variables.

Simple regression fits a straight line to the data.

y′ = b0 + b1X ± ε

where b0 is the y-intercept, b1 = Δy/Δx is the slope, and ε is the error term.

Page 9

Simple Linear Regression

[Figure: dependent variable (y) plotted against independent variable (x), with predictions marked on the fitted line.]

The function will make a prediction for each observed data point.

The observation is denoted by y and the prediction is denoted by ŷ.

Page 10

Simple Linear Regression

For each observation, the variation can be described as:

y = ŷ + ε

Actual = Explained + Error

where the prediction error ε is the gap between the observation y and the prediction ŷ.

Page 11

Regression

[Figure: dependent variable (y) plotted against independent variable (x), with the least squares line through the points.]

A least squares regression selects the line with the lowest total sum of squared prediction errors.

This value is called the Sum of Squares of Error, or SSE.

Page 12

Calculating SSR

[Figure: scatter plot with the fitted line and the mean of y (ȳ) drawn as a horizontal line.]

The Sum of Squares Regression (SSR) is the sum of the squared differences between the prediction for each observation and the mean of the observed y values (ȳ).

Page 13

Regression Formulas

The Total Sum of Squares (SST) is equal to SSR + SSE.

Mathematically,

SSR = Σ(ŷ − ȳ)²  (measure of explained variation)

SSE = Σ(y − ŷ)²  (measure of unexplained variation)

SST = SSR + SSE = Σ(y − ȳ)²  (measure of total variation in y)
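The decomposition above can be checked numerically. This sketch (invented data, not from the slides) fits a least-squares line and verifies SST = SSR + SSE, which holds exactly for a least-squares fit, then computes R² = SSR/SST:

```python
# Sketch (invented data): verifying the variation decomposition
# SST = SSR + SSE and computing R^2 for a least-squares line.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.0, 9.8]

n = len(xs)
x_bar = sum(xs) / n
y_bar = sum(ys) / n
b1 = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) \
     / sum((x - x_bar) ** 2 for x in xs)
b0 = y_bar - b1 * x_bar
y_hat = [b0 + b1 * x for x in xs]          # predictions

sse = sum((y - yh) ** 2 for y, yh in zip(ys, y_hat))   # unexplained
ssr = sum((yh - y_bar) ** 2 for yh in y_hat)           # explained
sst = sum((y - y_bar) ** 2 for y in ys)                # total
r_squared = ssr / sst
print(round(sst, 6), round(ssr + sse, 6), round(r_squared, 4))
```

The two printed sums of squares agree, confirming the identity.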

Page 14

The Coefficient of Determination

The proportion of total variation (SST) that is explained by the regression (SSR) is known as the Coefficient of Determination, and is often referred to as R².

R² = SSR/SST = SSR/(SSR + SSE)

The value of R² can range between 0 and 1; the higher its value, the more accurate the regression model is. It is often reported as a percentage.

Page 15

Scatter plots

• Regression analysis requires interval and ratio-level data.

• To see if your data fits the models of regression, it is wise to conduct a scatter plot analysis.

• The reason? Regression analysis assumes a linear relationship. If you have a curvilinear relationship or no relationship, regression analysis is of little use.

Page 16

Types of Lines

Page 17

Scatter plot

[Scatter plot: "Percent of Population with Bachelor's Degree by Personal Income Per Capita" — x-axis: Percent of Population 25 Years and Over with Bachelor's Degree or More, March 2000 estimates (15.0–35.0); y-axis: Personal Income Per Capita, current dollars, 1999 (20000–40000).]

• This is a linear relationship.
• It is a positive relationship.
• As the population with BA's increases, so does the personal income per capita.

Page 18

Regression Line

[Scatter plot with fitted regression line: Percent of Population 25 Years and Over with Bachelor's Degree or More, March 2000 estimates (15.0–35.0) vs Personal Income Per Capita, current dollars, 1999 (20000–40000); R Sq Linear = 0.542.]

• The regression line is the best straight-line description of the plotted points, and you can use it to describe the association between the variables.
• If all the points fall exactly on the line, the error is 0 and you have a perfect relationship.

Page 19

Things to remember

• Regression still focuses on association, not causation.
• Association is a necessary prerequisite for inferring causation, but also:
1. The independent variable must precede the dependent variable in time.
2. The two variables must be plausibly linked by a theory.
3. Competing independent variables must be eliminated.

Page 20

Regression Table

•The regression coefficient is not a good indicator for the strength of the relationship.•Two scatter plots with very different dispersions could produce the same regression line.

[Scatter plot 1: Percent of Population 25 Years and Over with Bachelor's Degree or More, March 2000 estimates (15.0–35.0) vs Personal Income Per Capita, current dollars, 1999 (20000–40000); R Sq Linear = 0.542.]

[Scatter plot 2: Population Per Square Mile (0–1200) vs Personal Income Per Capita, current dollars, 1999 (20000–40000); R Sq Linear = 0.463.]

Page 21

Regression coefficient

• The regression coefficient is the slope of the regression line and tells you the nature of the relationship between the variables.

• It tells you how much change in the independent variable is associated with how much change in the dependent variable.

• The larger the regression coefficient, the more change.

Page 22

Pearson’s r

• To determine strength you look at how closely the dots are clustered around the line. The more tightly the cases are clustered, the stronger the relationship, while the more distant, the weaker.

• Pearson's r ranges from -1 to +1, with 0 meaning no linear relationship at all.
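Pearson's r can be computed directly from its definition. This sketch (invented data, not from the slides) also shows the connection to the previous slides: for simple linear regression, r² equals R²:

```python
# Sketch (invented data): Pearson's r from its definition, and the
# relation r^2 = R^2 that holds for simple linear regression.
from math import sqrt

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.0, 9.8]
n = len(xs)
x_bar, y_bar = sum(xs) / n, sum(ys) / n

ss_xy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
ss_x = sum((x - x_bar) ** 2 for x in xs)
ss_y = sum((y - y_bar) ** 2 for y in ys)
r = ss_xy / sqrt(ss_x * ss_y)   # always between -1 and +1
print(round(r, 4), round(r * r, 4))
```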

Page 23

Reading the tables

• When you run a regression analysis in SPSS you get three tables. Each tells you something about the relationship.
• The first is the model summary.
• The R is the Pearson Product Moment Correlation Coefficient. In this case R is .736.
• R is the square root of R-Squared and is the correlation between the observed and predicted values of the dependent variable.

Model Summary
Model 1: R = .736(a), R Square = .542, Adjusted R Square = .532, Std. Error of the Estimate = 2760.003
a. Predictors: (Constant), Percent of Population 25 years and Over with Bachelor's Degree or More, March 2000 estimates

Page 24

R-Square

• R-Square is the proportion of variance in the dependent variable (income per capita) which can be predicted from the independent variable (level of education).
• This value indicates that 54.2% of the variance in income can be predicted from the variable education. Note that this is an overall measure of the strength of association, and does not reflect the extent to which any particular independent variable is associated with the dependent variable.
• R-Square is also called the coefficient of determination.

(Model Summary table repeated from the previous slide.)

Page 25

Adjusted R-square

• As predictors are added to the model, each predictor will explain some of the variance in the dependent variable simply due to chance.
• One could continue to add predictors to the model which would continue to improve the ability of the predictors to explain the dependent variable, although some of this increase in R-square would be simply due to chance variation in that particular sample.
• The adjusted R-square attempts to yield a more honest value to estimate the R-squared for the population. The value of R-square was .542, while the value of Adjusted R-square was .532. There isn't much difference because we are dealing with only one variable.
• When the number of observations is small and the number of predictors is large, there will be a much greater difference between R-square and adjusted R-square.
• By contrast, when the number of observations is very large compared to the number of predictors, the value of R-square and adjusted R-square will be much closer.

(Model Summary table repeated from the previous slide.)

Page 26

Determining the Regression Line/Model

Use Excel (or any other popular statistical software):
1. Select Tools, Data Analysis, Regression
2. Provide the X range
3. Provide the Y range
4. Output the analysis to a new sheet

Manual Calculations

X (Temperature)   Y (Sales)
63                1.52
70                1.68
73                1.8
75                2.05
80                2.36
82                2.25
85                2.68
88                2.9
90                3.14
91                3.06
92                3.24
75                1.92
98                3.4
100               3.28
92                3.17
87                2.83
84                2.58
88                2.86
80                2.26
82                2.14
76                1.98
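Assuming the temperature/sales pairs have been read correctly from the flattened slide table, a short pure-Python least-squares fit reproduces the Excel output shown on the next slide (intercept ≈ -2.535, slope ≈ 0.0607, R² ≈ 0.94):

```python
# Fitting the slide's temperature/sales data with the least-squares
# formulas by hand; results should match the Excel SUMMARY OUTPUT.
temps = [63, 70, 73, 75, 80, 82, 85, 88, 90, 91, 92, 75, 98,
         100, 92, 87, 84, 88, 80, 82, 76]
sales = [1.52, 1.68, 1.8, 2.05, 2.36, 2.25, 2.68, 2.9, 3.14, 3.06,
         3.24, 1.92, 3.4, 3.28, 3.17, 2.83, 2.58, 2.86, 2.26, 2.14, 1.98]

n = len(temps)
x_bar = sum(temps) / n
y_bar = sum(sales) / n
b1 = sum((x - x_bar) * (y - y_bar) for x, y in zip(temps, sales)) \
     / sum((x - x_bar) ** 2 for x in temps)
b0 = y_bar - b1 * x_bar

sse = sum((y - (b0 + b1 * x)) ** 2 for x, y in zip(temps, sales))
sst = sum((y - y_bar) ** 2 for y in sales)
r_squared = 1 - sse / sst
print(round(b0, 3), round(b1, 4), round(r_squared, 3))
```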

Page 27

Determining the Regression Line/Model using Excel

SUMMARY OUTPUT

Regression Statistics
Multiple R:         0.969534312
R Square:           0.939996782
Adjusted R Square:  0.936838718
Standard Error:     0.1461076
Observations:       21

ANOVA
            df   SS           MS        F
Regression   1   6.35405596   6.354056  297.6496823
Residual    19   0.405601183  0.021347
Total       20   6.759657143

              Coefficients   Standard Error  t Stat    P-value
Intercept     -2.534985905   0.295223266     -8.58667  5.7673E-08
X Variable 1   0.060727986   0.003519947     17.25253  4.58812E-13

Page 28

Determining the Regression Line/Model Manual Calculations

SST = Σ(Yi − Ȳ)²
SSE = Σ(Yi − Ŷi)²
SSR = Σ(Ŷi − Ȳ)²

SSx = Σ(Xi − X̄)²
SSy = Σ(Yi − Ȳ)²
SSxy = Σ(Xi − X̄)(Yi − Ȳ)

b1 = SSxy / SSx
b0 = Ȳ − b1X̄

MSE = SSE / df
MSR = SSR / df
R² = SSR / SST

Se = √( SSE / (n − 2) )   (standard error of the estimate)

t-test = b1 / Sb1

Page 29

Measures of Model Goodness

1. R² – Coefficient of Determination
2. F-test > F-crit, or p-value less than alpha
3. Standard Error
4. t-test

Page 30

Hypothesis testing for β1

• Testing to see if the linear relationship between X and Y is significant at the population level.

• t-test
• Follow the 5-step process

H0: β1 = 0
HA: β1 ≠ 0
t-crit at alpha (or alpha/2 for a two-tailed test), n − 2 df
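The test can be sketched with the temperature/sales Excel output from the earlier slide. The critical value 2.093 is t(0.025, df = 19) from a standard t-table (an assumed lookup, not from the slides):

```python
# Sketch: two-tailed t-test of H0: beta1 = 0 vs HA: beta1 != 0,
# using the slope and its standard error from the Excel output.
b1 = 0.060728          # estimated slope (from the Excel output)
sb1 = 0.003520         # standard error of the slope
n = 21                 # observations, so df = n - 2 = 19

t_stat = b1 / sb1
t_crit = 2.093         # t(0.025, 19), two-tailed, alpha = 0.05
reject_h0 = abs(t_stat) > t_crit
print(round(t_stat, 2), reject_h0)
```

Since the t statistic far exceeds the critical value, H0 is rejected: the linear relationship is significant.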

Page 31

Standard Error Terms in Linear Regression

• Se (standard error of the estimate): the standard deviation of the errors; a measure of variation around the regression line. If Se is small, the observations cluster tightly around the line and predictions are more precise.

• Sb1 (standard error of the sampling distribution of b1): the standard deviation of the slope estimates b1 obtained from different samples; a measure of how much the slope varies from sample to sample. If Sb1 is small, our b1 estimate is probably very accurate.

Page 32

Standard Error of Regression

The Standard Error of a regression is a measure of its variability. It can be used in a similar manner to standard deviation, allowing for prediction intervals.

ŷ ± 2 standard errors gives approximately a 95% prediction interval, and ± 3 standard errors approximately 99%.

Standard Error is calculated by taking the square root of the average squared prediction error:

Standard Error = √( SSE / (n − k) )

Where n is the number of observations in the sample and k is the total number of variables in the model

Page 33

The output of a simple regression is the coefficient β and the constant A. The equation is then:

y = A + β * x + ε

where ε is the residual error.

β is the per unit change in the dependent variable for each unit change in the independent variable. Mathematically:

β = Δy / Δx

Page 34

Multiple Linear Regression

More than one independent variable can be used to explain variance in the dependent variable, as long as they are not linearly related.

A multiple regression takes the form:

y = A + β1X1 + β2X2 + … + βkXk + ε

where k is the number of variables, or parameters.
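A multiple regression of this form can be estimated without any libraries by solving the normal equations (XᵀX)β = Xᵀy. This is an illustrative sketch with invented, noise-free data (the coefficient values 1, 2, -0.5 are hypothetical), so the fit recovers the coefficients exactly:

```python
# Sketch (invented, noise-free data): multiple regression
# y = A + b1*x1 + b2*x2 via the normal equations (X'X) beta = X'y,
# solved with simple Gauss-Jordan elimination.
def solve(a, rhs):
    """Solve the linear system a @ x = rhs by Gauss-Jordan elimination."""
    n = len(a)
    m = [row[:] + [rhs[i]] for i, row in enumerate(a)]  # augmented matrix
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]
        for r in range(n):
            if r != col:
                f = m[r][col] / m[col][col]
                m[r] = [v - f * w for v, w in zip(m[r], m[col])]
    return [m[i][n] / m[i][i] for i in range(n)]

# data generated exactly from y = 1 + 2*x1 - 0.5*x2 (hypothetical model)
rows = [(1.0, 2.0), (2.0, 1.0), (3.0, 4.0), (4.0, 2.0), (5.0, 5.0), (6.0, 3.0)]
y = [1 + 2 * x1 - 0.5 * x2 for x1, x2 in rows]

X = [[1.0, x1, x2] for x1, x2 in rows]  # design matrix with intercept column
XtX = [[sum(X[i][p] * X[i][q] for i in range(len(X))) for q in range(3)]
       for p in range(3)]
Xty = [sum(X[i][p] * y[i] for i in range(len(X))) for p in range(3)]
beta = solve(XtX, Xty)
print([round(v, 6) for v in beta])  # [A, beta1, beta2]
```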

Page 35

Linear Regression Example

• Petfood: estimate Sales based on Shelf Space
• Two sets of samples, 12 observations each
• Perform a regression analysis on both sets of data

Sample 1 (Space, Sales): (5, 1.6), (5, 2.2), (5, 1.4), (10, 1.9), (10, 2.4), (10, 2.6), (15, 2.3), (15, 2.7), (15, 2.8), (20, 2.6), (20, 2.9), (20, 3.1)

Sample 2 (Space, Sales): (5, 1.6), (5, 1.9), (5, 1.4), (10, 2), (10, 2.7), (10, 2.4), (15, 3.5), (15, 3.2), (15, 3.3), (20, 4.2), (20, 4.6), (20, 4.5)

Page 36

ANOVA

• The p-value associated with this F value is very small (0.0000).
• These values are used to answer the question "Do the independent variables reliably predict the dependent variable?"
• The p-value is compared to your alpha level (typically 0.05) and, if smaller, you can conclude "Yes, the independent variables reliably predict the dependent variable".
• If the p-value were greater than 0.05, you would say that the group of independent variables does not show a statistically significant relationship with the dependent variable, or that the group of independent variables does not reliably predict the dependent variable.

ANOVA(b)
Model 1      Sum of Squares  df  Mean Square   F       Sig.
Regression   4.32E+08        1   432493775.8   56.775  .000(a)
Residual     3.66E+08        48  7617618.586
Total        7.98E+08        49
a. Predictors: (Constant), Percent of Population 25 years and Over with Bachelor's Degree or More, March 2000 estimates
b. Dependent Variable: Personal Income Per Capita, current dollars, 1999

Page 37

Coefficients

• B – These are the values for the regression equation for predicting the dependent variable from the independent variable.
• These are called unstandardized coefficients because they are measured in their natural units. As such, the coefficients cannot be compared with one another to determine which one is more influential in the model, because they can be measured on different scales.

Coefficients(a)
Model 1                                                              B          Std. Error  Beta   t      Sig.
(Constant)                                                           10078.565  2312.771           4.358  .000
Percent of Population 25 years and Over with Bachelor's Degree
or More, March 2000 estimates                                        688.939    91.433      .736   7.535  .000
(B and Std. Error are unstandardized coefficients; Beta is the standardized coefficient.)
a. Dependent Variable: Personal Income Per Capita, current dollars, 1999

Page 38

Coefficients

• This chart looks at two variables and shows how their different scales affect the B values. That is why you need to look at the standardized Beta to see the differences.

Coefficients(a)
Model 1                                                              B          Std. Error  Beta   t      Sig.
(Constant)                                                           13032.847  1902.700           6.850  .000
Percent of Population 25 years and Over with Bachelor's Degree
or More, March 2000 estimates                                        517.628    78.613      .553   6.584  .000
Population Per Square Mile                                           7.953      1.450       .461   5.486  .000
a. Dependent Variable: Personal Income Per Capita, current dollars, 1999

Page 39

Coefficients

• Beta – These are the standardized coefficients.
• These are the coefficients that you would obtain if you standardized all of the variables in the regression, including the dependent and all of the independent variables, and ran the regression.
• By standardizing the variables before running the regression, you have put all of the variables on the same scale, and you can compare the magnitude of the coefficients to see which one has more of an effect.
• You will also notice that the larger betas are associated with the larger t-values.

(Coefficients table repeated from the earlier slide.)

Page 40

How to translate a typical table

Regression Analysis: Level of Education by Income per Capita

Dependent variable: Income per capita
Independent variable           b        Beta
Percent population with BA     688.939  .736
R2 = .542
Number of Cases: 49

Page 41

Part of the Regression Equation

• b represents the slope of the line.
– It is calculated by dividing the change in the dependent variable by the change in the independent variable.
– The difference between the actual value of Y and the calculated amount is called the residual.
– The residual represents how much error there is in the prediction of the regression equation for the y value of any individual case as a function of X.

Page 42

Comparing two variables

• Regression analysis is useful for comparing two variables to see whether controlling for another independent variable affects your model.

• For the first independent variable, education, the argument is that a more educated populace will have higher-paying jobs, producing a higher level of per capita income in the state.

• The second independent variable is included because we expect to find better-paying jobs, and therefore more opportunity for state residents to obtain them, in urban rather than rural areas.

Page 43

Single Regression

Model Summary
Model 1: R = .736(a), R Square = .542, Adjusted R Square = .532, Std. Error of the Estimate = 2760.003
a. Predictors: (Constant), Percent of Population 25 years and Over with Bachelor's Degree or More, March 2000 estimates

ANOVA(b)
Regression: Sum of Squares 4.32E+08, df 1, Mean Square 432493775.8, F 56.775, Sig. .000(a)
Residual: Sum of Squares 3.66E+08, df 48, Mean Square 7617618.586
Total: Sum of Squares 7.98E+08, df 49
a. Predictors: (Constant), Percent of Population 25 years and Over with Bachelor's Degree or More, March 2000 estimates
b. Dependent Variable: Personal Income Per Capita, current dollars, 1999

Coefficients(a)
(Constant): B 10078.565, Std. Error 2312.771, t 4.358, Sig. .000
Percent of Population 25 years and Over with Bachelor's Degree or More, March 2000 estimates: B 688.939, Std. Error 91.433, Beta .736, t 7.535, Sig. .000
a. Dependent Variable: Personal Income Per Capita, current dollars, 1999

Multiple Regression

Model Summary
Model 1: R = .849(a), R Square = .721, Adjusted R Square = .709, Std. Error of the Estimate = 2177.791
a. Predictors: (Constant), Population Per Square Mile, Percent of Population 25 years and Over with Bachelor's Degree or More, March 2000 estimates

ANOVA(b)
Regression: Sum of Squares 5.75E+08, df 2, Mean Square 287614518.2, F 60.643, Sig. .000(a)
Residual: Sum of Squares 2.23E+08, df 47, Mean Square 4742775.141
Total: Sum of Squares 7.98E+08, df 49
a. Predictors: (Constant), Population Per Square Mile, Percent of Population 25 years and Over with Bachelor's Degree or More, March 2000 estimates
b. Dependent Variable: Personal Income Per Capita, current dollars, 1999

Coefficients(a)
(Constant): B 13032.847, Std. Error 1902.700, t 6.850, Sig. .000
Percent of Population 25 years and Over with Bachelor's Degree or More, March 2000 estimates: B 517.628, Std. Error 78.613, Beta .553, t 6.584, Sig. .000
Population Per Square Mile: B 7.953, Std. Error 1.450, Beta .461, t 5.486, Sig. .000
a. Dependent Variable: Personal Income Per Capita, current dollars, 1999

Page 44

Single Regression

Dependent variable: Income per capita
Independent variable: Percent population with BA — b = 688.939, Beta = .736
R2 = .542
Number of Cases: 49

Multiple Regression

Dependent variable: Income per capita
Independent variables:
Percent population with BA — b = 517.628, Beta = .553
Population Density — b = 7.953, Beta = .461
R2 = .721, Adjusted R2 = .709
Number of Cases: 49

