Diagnostic Test and Testing for Econometrics Problems
& Econometric Problem Remedy

KULKUNYA PRAYARACH, PH.D.
fin.bus.ku.ac.th/01135534 Financial...

Multiple Regression Analysis
I. Basic Concepts   II. Multicollinearity   III. Autocorrelation   IV. Heteroscedasticity   V. Research & Group Work


OUTLINE

Basic Concept: Multiple Regression

MULTICOLLINEARITY

AUTOCORRELATION

HETEROSCEDASTICITY

RESEARCH IN FINANCE


BASIC CONCEPTS: Multiple Regression

𝑌𝑖 = 𝛽1 + 𝛽2𝑋1𝑖 + 𝛽3𝑋2𝑖 + 𝛽4𝑋3𝑖 + 𝑢𝑖
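Below is a minimal sketch of estimating a model of this form by OLS in Python with statsmodels; the data file and the column names (Y, X1, X2, X3) are hypothetical placeholders.

```python
# Minimal sketch: fitting Y_i = b1 + b2*X1 + b3*X2 + b4*X3 + u_i by OLS.
# The CSV file and column names are hypothetical placeholders.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("dataset.csv")              # hypothetical data file
y = df["Y"]
X = sm.add_constant(df[["X1", "X2", "X3"]])  # adds the intercept term b1

ols_results = sm.OLS(y, X).fit()
print(ols_results.summary())                 # coefficients, t ratios, R-squared, D.W.
```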


BASIC CONCEPTS: Normality Assumption for 𝑢𝑖

• CLRM assumes that each 𝑢𝑖 is distributed normally, i.e. 𝑢𝑖 ~ N(0, σ²)

𝑌𝑖 = 𝛽1 + 𝛽2𝑋1𝑖 + 𝛽3𝑋2𝑖 + 𝛽4𝑋3𝑖 + 𝑢𝑖


BASIC CONCEPTS: Why we need the Normality Assumption of 𝑢𝑖

β̂1 ~ Normal,  β̂2 ~ Normal


BASIC CONCEPTS: Why we need the Normality Assumption of 𝑢𝑖

1. The influence of the omitted or neglected variables is small and at best random (Central Limit Theorem, CLT).

2. Even if the number of variables is not very large, or if these variables are not strictly independent, their sum may still be normally distributed.

3. 𝑢𝑖 must be normally distributed so that the OLS estimators β̂1 and β̂2 are themselves normally distributed.

4. The normal distribution is a comparatively simple distribution involving only two parameters (mean and variance).

5. In small samples (say, n < 100) the normality assumption plays a critical role; if the sample size is reasonably large, the assumption can be relaxed.

6. In large samples, the t and F statistics are approximately valid.

>>> TEST the 'BLUE' condition
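As a quick check of this assumption, the residuals of the fitted model can be tested for normality with, for example, the Jarque-Bera test (one common choice; the slides do not prescribe a specific test). The sketch below reuses `ols_results` from the earlier example.

```python
# Sketch: testing normality of the residuals (a proxy for u_i) with Jarque-Bera.
from statsmodels.stats.stattools import jarque_bera

jb_stat, jb_pvalue, skew, kurtosis = jarque_bera(ols_results.resid)
print(f"JB = {jb_stat:.3f}, p-value = {jb_pvalue:.3f}")
# p-value > 0.05: do not reject normality of the error term (at the 5% level)
```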


DATA PREPARATION: Seasonally Adjusted

• Seasonal adjustment is a statistical method of removing the seasonal component of a time series; it is used when analyzing non-seasonal trends.
• Many economic phenomena have seasonal cycles.

Seasonally Adjusted: Census X-12 Method

[Figures: Dubai Crude Oil Price by month, 2009–2012; oil price series OIL vs. seasonally adjusted OIL_SA, Jan-09 to Apr-12]
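The slides use the Census X-12 method (typically run in EViews). As a rough illustration in Python, a simple moving-average decomposition can be used instead; the series and file names below are hypothetical.

```python
# Rough illustration of seasonal adjustment of a monthly series (e.g. an oil price).
# Note: this is a simple moving-average decomposition, not the Census X-12 method
# named on the slide; statsmodels.tsa.x13.x13_arima_analysis wraps the official
# X-13ARIMA-SEATS program if its external binary is installed.
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

oil = pd.read_csv("oil.csv", index_col="date", parse_dates=True)["OIL"]  # hypothetical
decomp = seasonal_decompose(oil, model="multiplicative", period=12)
oil_sa = oil / decomp.seasonal            # seasonally adjusted series (OIL_SA)
```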



ECONOMETRIC PROBLEMS (1: Multiple Regression)
William H. Greene, Dr. Kulkunya Prayarach

• Stationarity (Unit Root Test: ADF)
  H0: nonstationary (unit root)
  Stationary: I(0), reject H0, p ≤ 0.05
  Nonstationary: I(1), fail to reject H0, p > 0.05 → take the first difference, D(data)
  → Use data that are stationary at I(0) or I(1)

• Multicollinearity
  Run the auxiliary regression Xi = f(X1, X2, ..., Xk) and compute VIF(βi) = 1 / (1 - R²)
  Rule of thumb: VIF ≤ 10 → no multicollinearity
  If multicollinearity (VIF > 10) → drop the variable

• Autocorrelation
  Test: Durbin-Watson (D.W.) ≈ 2 → no autocorrelation
  If autocorrelation (D.W. far from 2) → model the errors as AR(1)

• Heteroscedasticity
  Test: White test, H0: homoscedasticity, p > 0.05
  If heteroscedasticity (p ≤ 0.05) → transform the regression, e.g.
    Yi/Xi = b0/Xi + b1
    Yi/Xi² = b0/Xi² + b1/Xi
    Yi/σi = b0/σi + b1·Xi/σi

• When the econometric problems are cleaned up → GO AHEAD!!! RUN OLS

• ALTERNATIVE MODELS: VAR/VECM, Granger Causality Test, ARCH/GARCH


DATA PREPARATION: Stationarity

• A stationary process is a stochastic process whose joint probability distribution does not change when shifted in time or space.
>>> The parameters (mean, variance) do not change over time or position.

I(0): stationary at level


DATA PREPARATION: Random Walk (Unit Root Process)

[Figures: Random Walk without Drift | Random Walk with Drift]
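For intuition, the two processes can be simulated in a few lines; the drift of 0.5 and the sample size are arbitrary choices for illustration.

```python
# Sketch: simulating the two unit-root processes named on the slide.
import numpy as np

rng = np.random.default_rng(0)
u = rng.normal(size=200)              # white noise error term
rw = np.cumsum(u)                     # random walk without drift: Y_t = Y_{t-1} + u_t
rw_drift = np.cumsum(0.5 + u)         # random walk with drift:    Y_t = 0.5 + Y_{t-1} + u_t
```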


DATA PREPARATION: Unit Root Test

• The Augmented Dickey-Fuller (ADF) test is a test of stationarity (or nonstationarity).

Yt = ρYt-1 + ut,  where -1 ≤ ρ ≤ 1 and ut is a white noise error term.

• Test H0: ρ = 1, i.e. a UNIT ROOT (nonstationary), a random walk without drift.
>>> We CANNOT simply regress Yt on its lagged value Yt-1.
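A sketch of the ADF test with statsmodels, applied here to the hypothetical series `oil_sa` from the seasonal-adjustment sketch above (any series used in class would do).

```python
# Sketch: ADF unit root test on a time series.
from statsmodels.tsa.stattools import adfuller

adf_stat, pvalue, usedlag, nobs, crit_values, icbest = adfuller(oil_sa, autolag="AIC")
print(f"ADF statistic = {adf_stat:.3f}, p-value = {pvalue:.3f}")
# p <= 0.05: reject H0 (unit root) -> the series is stationary, I(0)
# p >  0.05: fail to reject H0     -> nonstationary, take differences
```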


DATA PREPARATION: How to Solve the Unit Root Problem

STEP 1: Take the first difference of the series.

STEP 2: Test for a unit root again: H0: δ = 0 (i.e. ρ = 1) >>> unit root (fail to reject H0).

STEP 3: If the unit root remains, take the second difference and test H0: ϑ = 0; if H0 is rejected, there is NO unit root.
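Continuing the previous sketch, the same ADF test can simply be re-run on the differenced series.

```python
# Sketch: first-difference the series and re-run the ADF test (STEPS 1-2);
# repeat with the second difference if the unit root remains (STEP 3).
first_diff = oil_sa.diff().dropna()       # D(data)
print("ADF p-value, first difference:", adfuller(first_diff, autolag="AIC")[1])

second_diff = first_diff.diff().dropna()
print("ADF p-value, second difference:", adfuller(second_diff, autolag="AIC")[1])
```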


[Figures: Exchange Rate, 1/1/2009–1/1/2012; Oil Price (WTI), 2006–2012]


DATA PREPARATION: Gaussian, Standard, or Classical Linear Regression Model (CLRM)


Assumption 1:

[Figure: abnormal profit (%) vs. number of stocks]


Assumption 2:

Nonlinear Regression methods: Taylor Series Expansion, Gauss-Newton iterative, Newton-Raphson iterative


Assumption 3:


Assumption 4:


Assumption 5:



Assumption 6: There must be sufficient variability in the values taken by the regressors.


Assumption 7: The X variables should vary.


Assumption 8:

MULTICOLLINEARITY: Is Multicollinearity a Serious Problem?

• What is the nature of multicollinearity?
• Is multicollinearity really a problem?
• What are its practical consequences?
• How does one detect it?
• What remedial measures can be taken to alleviate the problem of multicollinearity?


MULTICOLLINEARITY: Is Multicollinearity a Serious Problem?

• The nature of multicollinearity is the existence of a "perfect" or exact linear relationship among some or all explanatory variables of a regression model.


MULTICOLLINEARITY: Consequences of Multicollinearity

• BLUE: Best Linear Unbiased Estimator
• Collinearity does not destroy the BLUE property of the OLS estimators.


MULTICOLLINEARITY: Detecting Multicollinearity

1. High R² but few significant t ratios.
   Example: R² = 0.8, yet the individual t tests show that none or few of the partial slope coefficients are statistically different from zero.
2. High pair-wise correlations among regressors.
3. Examination of partial correlations.


MULTICOLLINEARITY: Detecting Multicollinearity (continued)

4. Auxiliary regressions.
5. Eigenvalues and condition index k:
   if 100 < k < 1000: moderate multicollinearity
   if k > 1000: severe multicollinearity
6. Tolerance and variance inflation factors:
   TOL close to 0 or VIF > 10 signals multicollinearity.
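A sketch of computing VIF and tolerance with statsmodels, reusing the regressor matrix `X` (with constant) from the first example.

```python
# Sketch: tolerance and VIF for each regressor
# (rule of thumb: VIF > 10 or TOL close to 0 signals multicollinearity).
from statsmodels.stats.outliers_influence import variance_inflation_factor

for i, name in enumerate(X.columns):
    if name == "const":
        continue
    vif = variance_inflation_factor(X.values, i)   # 1 / (1 - R_i^2)
    print(f"{name}: VIF = {vif:.2f}, TOL = {1.0 / vif:.3f}")
```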


MULTICOLLINEARITY: Remedial Measures

1. Do nothing
   "Multicollinearity is God's will, not a problem with OLS or statistical technique in general" (Blanchard)

2. Rule-of-thumb procedures
   (1) A priori information
   (2) Combining cross-sectional and time series data
   (3) Dropping variable(s) and specification bias
   (4) Transformation of variables
   (5) Additional or new data (increase the sample size)
   (6) Polynomial regression
   (7) Factor analysis


Assumption 9:

Autocorrelation: Nature of Autocorrelation

1. What is the nature of autocorrelation?
2. What are the theoretical and practical consequences of autocorrelation?
3. How does one remedy the problem of autocorrelation?


Autocorrelation: Nature of Autocorrelation

[Figures: Positive serial correlation | Negative serial correlation | Zero correlation]


Autocorrelation: Types of Autocorrelation

1. Specification bias: excluded-variables case
2. Nonstationarity
3. Spurious regression problem


Autocorrelation: Consequences of Using OLS in the Presence of Autocorrelation

• Autocorrelation destroys the BLUE (Best Linear Unbiased Estimator) property: the OLS estimators no longer have minimum variance.
• The residual variance is likely to underestimate the true variance.
• The usual t and F tests of significance are no longer valid and, if applied, are likely to give seriously misleading conclusions about the statistical significance of the estimated regression coefficients.


Autocorrelation: Detecting Autocorrelation

1. Graphical residual plot
2. Runs test
3. Durbin-Watson test
4. Breusch-Godfrey (BG) test, an LM test (nonstochastic regressors, higher-order autoregressive schemes: AR(1), AR(2), ...)
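A sketch of the Durbin-Watson and Breusch-Godfrey tests with statsmodels, reusing `ols_results` from the first example; the choice of two lags is arbitrary.

```python
# Sketch: Durbin-Watson statistic (approx. 2 means no first-order autocorrelation)
# and the Breusch-Godfrey LM test for higher-order autocorrelation.
from statsmodels.stats.stattools import durbin_watson
from statsmodels.stats.diagnostic import acorr_breusch_godfrey

print("Durbin-Watson:", durbin_watson(ols_results.resid))
lm_stat, lm_pvalue, f_stat, f_pvalue = acorr_breusch_godfrey(ols_results, nlags=2)
print("Breusch-Godfrey LM p-value:", lm_pvalue)   # p <= 0.05: autocorrelation present
```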


Autocorrelation: Remedial Measures

1. Transform the original model:
   o Generalized Least Squares (GLS) method
   o Feasible Generalized Least Squares (FGLS) method
2. First-difference method
3. When ρ is not known, estimate ρ from the residuals of an AR(1) scheme
4. Change the model to an ARCH or GARCH model
5. Change the model to an ARMA or ARIMA model
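As one possible FGLS implementation (not necessarily the exact procedure used in class), statsmodels' GLSAR estimates ρ from the residuals of an AR(1) scheme and iterates, in the spirit of a Cochrane-Orcutt procedure; `y` and `X` are reused from the first example.

```python
# Sketch: feasible GLS with AR(1) errors; rho is estimated from the residuals
# and the fit is iterated.
import statsmodels.api as sm

glsar_model = sm.GLSAR(y, X, rho=1)                 # AR(1) error structure
glsar_results = glsar_model.iterative_fit(maxiter=10)
print("estimated rho:", glsar_model.rho)
print(glsar_results.summary())
```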


Assumption 10:

Heteroscedasticity: Nature of Heteroscedasticity


Heteroscedasticity: Nature of Heteroscedasticity

• What is the nature of heteroscedasticity?
• What are its consequences?
• How does one detect it?
• What are the remedial measures?


Heteroscedasticity: Nature of Heteroscedasticity

Why may the variances of ui be variable?

1. Following error-learning models, as people learn, their errors of behavior become smaller over time.
2. Growth-oriented companies
3. As data-collecting techniques improve, σi² is likely to decrease.
4. The presence of outliers
5. Skewness


Heteroscedasticity: Consequences of Using OLS in the Presence of Heteroscedasticity

• Heteroscedasticity destroys the BLUE (Best Linear Unbiased Estimator) property.
• "If we persist in using the usual testing procedure despite heteroscedasticity, whatever conclusions we draw or inferences we make may be very misleading."


Heteroscedasticity: Detecting Heteroscedasticity

1. Graphical residual plots against Ŷ and X
2. Park test
3. Glejser test
4. Spearman's rank correlation test
5. Goldfeld-Quandt test
6. Breusch-Pagan-Godfrey (BPG) test
7. White's general heteroscedasticity test
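A sketch of the White and Breusch-Pagan-Godfrey tests with statsmodels, again reusing `ols_results` and the regressor matrix `X` from the first example.

```python
# Sketch: White's general test and the Breusch-Pagan-Godfrey test on the OLS
# residuals. H0: homoscedasticity; p <= 0.05 indicates heteroscedasticity.
from statsmodels.stats.diagnostic import het_white, het_breuschpagan

lm_w, lm_w_p, f_w, f_w_p = het_white(ols_results.resid, X)
lm_bpg, lm_bpg_p, f_bpg, f_bpg_p = het_breuschpagan(ols_results.resid, X)
print("White test p-value:", lm_w_p)
print("Breusch-Pagan-Godfrey p-value:", lm_bpg_p)
```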


Heteroscedasticity: Remedial Measures

1. Weighted Least Squares (WLS)
   o Weights based on Y, 1/X, or other variables
   o Weights based on the error term
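A sketch of WLS with statsmodels, continuing the earlier examples; weighting by 1/X1² (i.e. assuming the error variance is proportional to X1²) is an illustrative assumption, not a prescription from the slides.

```python
# Sketch: weighted least squares with weights 1/X1^2, equivalent to the
# transformed regression Y_i/X1_i = b0/X1_i + b1 + ...
# The choice of X1 as the weighting variable is an assumption for illustration.
wls_results = sm.WLS(y, X, weights=1.0 / df["X1"] ** 2).fit()
print(wls_results.summary())
```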


Assumption 11: Omitting Variables


Heteroscedasticity


Variable Definitions


WORKSHOP #2


WORK ORDERS: Multiple Regression

(1) Run the multiple regression
    Take care of seasonal effects and smooth the data (by taking logs)
(2) Test for multicollinearity and remedy it if it occurs
(3) Test for autocorrelation and remedy it if it occurs
(4) Test for heteroscedasticity and remedy it if it occurs
(5) Analyze your results
