Serial Correlation and Heteroscedasticity in Time Series Regressions. Econometrics (EC3090) - Week 11. Agustín Bénétrix


Properties of OLS with serially correlated errors OLS is still unbiased and consistent if the errors are serially correlated. The validity of R-squared as a goodness-of-fit measure also does not depend on serial correlation. However, the usual OLS standard errors and test statistics are invalid if there is serial correlation, and OLS is no longer efficient.
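These properties can be illustrated with a small simulation. The following is a minimal Monte Carlo sketch in Python (statsmodels), not part of the lecture; all parameter values and names are illustrative.

```python
import numpy as np
import statsmodels.api as sm

# Minimal Monte Carlo sketch: with AR(1) errors and an autocorrelated regressor,
# the OLS slope stays centred on the true value (unbiased/consistent), but the
# usual OLS standard error misstates the true sampling variability.
rng = np.random.default_rng(0)
n, reps, rho, beta1 = 200, 1000, 0.8, 1.0

slopes, usual_ses = [], []
for _ in range(reps):
    x = np.zeros(n)
    u = np.zeros(n)
    for t in range(1, n):
        x[t] = 0.8 * x[t - 1] + rng.normal()   # autocorrelated regressor
        u[t] = rho * u[t - 1] + rng.normal()   # AR(1) error: u_t = rho*u_{t-1} + e_t
    y = beta1 * x + u
    res = sm.OLS(y, sm.add_constant(x)).fit()
    slopes.append(res.params[1])
    usual_ses.append(res.bse[1])

print("mean slope estimate:", np.mean(slopes))      # close to beta1 = 1.0
print("sd of slope estimates:", np.std(slopes))     # the 'true' standard error
print("mean usual OLS s.e.:", np.mean(usual_ses))   # noticeably smaller
```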

Serial correlation and the presence of lagged dependent variables Is OLS inconsistent if the errors are serially correlated and the regressors include lagged dependent variables? No: including enough lags so that weak dependence holds guarantees consistency. Including too few lags, however, causes an omitted variable problem and serial correlation, because some lagged dependent variables end up in the error term.

Efficiency and Inference In the presence of serial correlation the usual standard errors and test statistics are not valid (even asymptotically).

Efficiency and Inference Let's compute the variance of the OLS estimator in the presence of serial correlation. Assume that the error term follows an AR(1) process: u_t = ρ u_{t-1} + e_t, |ρ| < 1, where the e_t are uncorrelated random variables with mean zero and variance σ_e².

Efficiency and Inference Consider the variance of the OLS slope estimator in the simple regression model y_t = β_0 + β_1 x_t + u_t. The OLS estimator of β_1 can be written as β̂_1 = β_1 + SST_x⁻¹ Σ_{t=1}^{n} x_t u_t, where the x_t are expressed as deviations from their sample mean and SST_x = Σ_{t=1}^{n} x_t².

Efficiency and Inference The variance of β̂_1 conditional on X is given by Var(β̂_1) = σ²/SST_x + 2(σ²/SST_x²) Σ_{t=1}^{n-1} Σ_{j=1}^{n-t} ρ^j x_t x_{t+j}. The first term, σ²/SST_x, is the variance of β̂_1 under the Gauss-Markov assumptions, i.e. when ρ = 0.

Serial correlation in the presence of lagged dependent variables Suppose we have E(y_t | y_{t-1}) = β_0 + β_1 y_{t-1} and assume that |β_1| < 1 for stability. The expression above can always be written with an error term as y_t = β_0 + β_1 y_{t-1} + u_t, E(u_t | y_{t-1}) = 0. This satisfies the key assumptions (including no perfect collinearity), so the OLS estimators of β_0 and β_1 are consistent.

Serial correlation in the presence of lagged dependent variables However, the error term can be serially correlated. The condition E(u_t | y_{t-1}) = 0 ensures that u_t is uncorrelated with y_{t-1}, but it does not ensure that u_t and u_{t-1} are uncorrelated. The covariance between u_t and u_{t-1} is Cov(u_t, u_{t-1}) = Cov(u_t, y_{t-1} - β_0 - β_1 y_{t-2}) = -β_1 Cov(u_t, y_{t-2}), which is not necessarily zero.

Serial correlation in the presence of lagged dependent variables Thus, the errors exhibit serial correlation and the model contains a lagged dependent variable, but OLS consistently estimates β_0 and β_1 because these are the parameters in the conditional expectation E(y_t | y_{t-1}). The serial correlation in the errors will cause the usual OLS statistics to be invalid for testing purposes, but it does not affect consistency.

Testing for serial correlation Testing for AR(1) serial correlation with strictly exogenous regressors. AR(1) model for serial correlation: u_t = ρ u_{t-1} + e_t (with an i.i.d. series e_t). Test H_0: ρ = 0 in this model. Replace the true unobserved errors by the estimated OLS residuals: regress û_t on û_{t-1} and use the usual t statistic on the coefficient of û_{t-1}.
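A minimal sketch of this procedure in Python with statsmodels; the function name and the arguments `y` and `X` are placeholders, not the lecture's data.

```python
import numpy as np
import statsmodels.api as sm

# Sketch of the AR(1) serial-correlation t-test with strictly exogenous
# regressors: estimate the model by OLS, regress the residuals on their
# first lag, and use the t statistic on the lag coefficient to test rho = 0.
def ar1_serial_correlation_ttest(y, X):
    uhat = np.asarray(sm.OLS(y, X).fit().resid)              # OLS residuals
    aux = sm.OLS(uhat[1:], sm.add_constant(uhat[:-1])).fit() # u_t on u_{t-1}
    return aux.params[1], aux.tvalues[1], aux.pvalues[1]     # rho_hat, t, p-value
```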

Example: Static Phillips curve Applying this test to the residuals of the static Phillips curve regression, we reject the null hypothesis of no serial correlation.

Durbin-Watson test under classical assumptions Under the full set of classical time series assumptions (TS.1-TS.6) the Durbin-Watson test is an exact test (the previous t-test is only valid asymptotically). The statistic is DW = Σ_{t=2}^{n} (û_t - û_{t-1})² / Σ_{t=1}^{n} û_t² ≈ 2(1 - ρ̂), with H_0: ρ = 0 vs. H_1: ρ > 0. Reject if DW < d_L, "accept" if DW > d_U. Unfortunately, the Durbin-Watson test works with a lower and an upper bound for the critical value; in the area between the bounds the test result is inconclusive.
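For reference, a minimal sketch using statsmodels' Durbin-Watson statistic, assuming `ols_results` is a fitted OLS results object for the model being tested (a placeholder name).

```python
from statsmodels.stats.stattools import durbin_watson

# DW is roughly 2*(1 - rho_hat): values well below 2 point to positive serial
# correlation. Compare the statistic with the tabulated bounds d_L and d_U.
dw = durbin_watson(ols_results.resid)
print(dw)
```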

Testing for serial correlation Testing for AR(1) serial correlation with general regressors. The t-test for autocorrelation can easily be generalized to allow for the possibility that the explanatory variables are not strictly exogenous: regress û_t on x_{t1}, ..., x_{tk} and û_{t-1}, and test H_0: ρ = 0 using the t statistic on û_{t-1}. The test now allows for the possibility that the strict exogeneity assumption is violated, and it may be carried out in a heteroscedasticity-robust way.

Testing for serial correlation Breusch-Godfrey test for AR(q) serial correlation: regress û_t on x_{t1}, ..., x_{tk} and û_{t-1}, ..., û_{t-q}, and test the joint null hypothesis H_0: ρ_1 = ... = ρ_q = 0 with an F test or with the LM statistic LM = (n - q)R²_û, which is asymptotically χ²_q.
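A minimal sketch of the Breusch-Godfrey test using statsmodels; `ols_results` is again a placeholder for the fitted OLS results object, and the lag order 4 is an illustrative choice of q.

```python
from statsmodels.stats.diagnostic import acorr_breusch_godfrey

# LM and F versions of the Breusch-Godfrey AR(q) test; small p-values reject
# the null hypothesis of no serial correlation up to order q.
lm_stat, lm_pval, f_stat, f_pval = acorr_breusch_godfrey(ols_results, nlags=4)
print(lm_pval, f_pval)
```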

Correcting for serial correlation with strictly exogenous regressors Under the assumption of AR(1) errors, one can transform the model so that it satisfies all Gauss-Markov assumptions; for the transformed model, OLS is BLUE. Take the simple case of a regression with only one explanatory variable (the general case works analogously): y_t = β_0 + β_1 x_t + u_t. Lag the equation and multiply by ρ, ρ y_{t-1} = ρ β_0 + ρ β_1 x_{t-1} + ρ u_{t-1}, and subtract it from the original equation to obtain y_t - ρ y_{t-1} = (1 - ρ)β_0 + β_1 (x_t - ρ x_{t-1}) + e_t.
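A minimal sketch of this quasi-differencing transformation for a known ρ (purely illustrative; in practice ρ must be estimated, as discussed below). The function name and arguments are placeholders.

```python
import numpy as np
import statsmodels.api as sm

# Quasi-difference the data for a known rho and run OLS on the transformed
# model; dropping the first observation corresponds to Cochrane-Orcutt.
def quasi_difference_ols(y, x, rho):
    y_star = y[1:] - rho * y[:-1]                # y_t - rho*y_{t-1}
    x_star = x[1:] - rho * x[:-1]                # x_t - rho*x_{t-1}
    intercept = np.full(len(y_star), 1.0 - rho)  # intercept column becomes (1-rho)
    X_star = np.column_stack([intercept, x_star])
    return sm.OLS(y_star, X_star).fit()          # coefficients estimate beta_0, beta_1
```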

Correcting for serial correlation with strictly exogenous regressors The transformed error e_t satisfies the Gauss-Markov assumptions. Problem: the AR(1) coefficient ρ is not known and has to be estimated.

Correcting for serial correlation with strictly exogenous regressors Replacing the unknown ρ by an estimate ρ̂ leads to an FGLS estimator. There are two variants: Cochrane-Orcutt estimation omits the first observation, while Prais-Winsten estimation adds a transformed first observation. In smaller samples, Prais-Winsten estimation should be more efficient.
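A minimal FGLS sketch using statsmodels' GLSAR, which iterates between estimating ρ from the residuals and re-estimating the regression coefficients on quasi-differenced data (a Cochrane-Orcutt-type procedure); `y` and `X` are assumed to be defined as in the sketches above.

```python
import statsmodels.api as sm

# rho=1 requests one autoregressive lag; iterative_fit alternates between
# estimating the AR(1) coefficient and the regression coefficients.
glsar_model = sm.GLSAR(y, X, rho=1)
glsar_results = glsar_model.iterative_fit(maxiter=10)
print(glsar_model.rho)          # estimated AR(1) coefficient
print(glsar_results.params)     # FGLS coefficient estimates
```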

Comparing OLS and FGLS with autocorrelation For consistency of FGLS, more than contemporaneous exogeneity is needed: strict exogeneity is required because the transformed regressors include variables from different periods. If OLS and FGLS differ dramatically, this might indicate a violation of strict exogeneity.

Serial correlation-robust inference after OLS In the presence of serial correlation, the usual OLS standard errors overstate statistical significance because there is less independent variation in the data than they assume. One can instead compute serial correlation-robust standard errors after OLS. This is useful because FGLS requires strict exogeneity and assumes a very specific form of serial correlation (AR(1) or, more generally, AR(q)).

Serial correlation-robust inference after OLS Serial correlation-robust standard errors: the usual OLS standard errors are normalized and then inflated by a correction factor. Serial correlation-robust F- and t-tests are also available.

Correction factor for serial correlation (Newey-West formula) Define â_t = r̂_t û_t: this term is the product of the OLS residuals û_t and the residuals r̂_t of a regression of x_{tj} on all other explanatory variables. The correction factor is ν̂ = Σ_{t=1}^{n} â_t² + 2 Σ_{h=1}^{g} [1 - h/(g+1)] Σ_{t=h+1}^{n} â_t â_{t-h}, and the robust standard error is se(β̂_j) = ["se(β̂_j)"/σ̂]² √ν̂, where "se(β̂_j)" is the usual OLS standard error.

Correction factor for serial correlation (Newey-West formula) The integer g controls how much serial correlation is allowed. For g = 2 the weights 1 - h/(g+1) are 2/3 and 1/3; for g = 3 they are 3/4, 1/2 and 1/4. The weight on higher-order autocorrelations is declining.

Discussion of serial correlation-robust standard errors The formulas are also robust to heteroscedasticity; they are therefore called "heteroscedasticity and autocorrelation consistent" (HAC) standard errors. For the integer g, values such as g=2 or g=3 are normally sufficient (there are more involved rules of thumb for how to choose g). Serial correlation-robust standard errors are only valid asymptotically; they may be severely biased if the sample size is not large enough.
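A minimal sketch of computing HAC (Newey-West) standard errors after OLS in statsmodels; `y` and `X` are placeholders, and `maxlags` plays the role of the integer g.

```python
import statsmodels.api as sm

# Same OLS point estimates as before, but standard errors that are robust to
# heteroscedasticity and to serial correlation up to the chosen lag length.
hac_results = sm.OLS(y, X).fit(cov_type="HAC", cov_kwds={"maxlags": 2})
print(hac_results.bse)      # HAC standard errors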

Discussion of serial correlation-robust standard errors The more autocorrelation there is, the larger the bias; if the series are highly correlated, it might be a good idea to difference them first. Serial correlation-robust standard errors should be used if there is serial correlation and strict exogeneity fails (e.g. in the presence of lagged dependent variables).

Heteroscedasticity in time series regressions Heteroscedasticity usually receives less attention than serial correlation in time series settings. Heteroscedasticity-robust standard errors also work for time series. Heteroscedasticity is automatically corrected for if one uses the serial correlation-robust (HAC) formulas for standard errors and test statistics.

Testing for heteroscedasticity The usual heteroscedasticity tests assume the absence of serial correlation. Before testing for heteroscedasticity one should therefore test for serial correlation first, using a heteroscedasticity-robust test if necessary. After serial correlation has been corrected for, test for heteroscedasticity.
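A minimal sketch of one standard heteroscedasticity test (Breusch-Pagan) in statsmodels, to be used only after serial correlation has been dealt with; `ols_results` is again a placeholder for the fitted model.

```python
from statsmodels.stats.diagnostic import het_breuschpagan

# Regresses the squared residuals on the explanatory variables; small p-values
# reject the null hypothesis of homoscedasticity.
lm_stat, lm_pval, f_stat, f_pval = het_breuschpagan(ols_results.resid,
                                                    ols_results.model.exog)
print(lm_pval, f_pval)
```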

Example: Serial correlation and homoscedasticity in the EMH Test equation for the efficient markets hypothesis (EMH): return_t = β_0 + β_1 return_{t-1} + u_t. Test for serial correlation: there is no evidence of serial correlation.

Example: Serial correlation and homoscedasticity in the EMH Test for heteroscedasticity: there is strong evidence of heteroscedasticity. Note: volatility is higher when returns are low.

Autoregressive Conditional Heteroscedasticity (ARCH) Even if there is no heteroscedasticity in the usual sense (the error variance depending on the explanatory variables), there may be heteroscedasticity in the sense that the variance depends on how volatile the time series was in previous periods. ARCH(1) model: E(u_t² | u_{t-1}, u_{t-2}, ...) = α_0 + α_1 u_{t-1}².
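A minimal simulation sketch of an ARCH(1) error process (illustrative parameter values only), showing that the unconditional variance is constant while the squared errors are autocorrelated, i.e. volatility clusters.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha0, alpha1, n = 0.5, 0.6, 5000   # alpha1 < 1 keeps the process stationary

u = np.zeros(n)
for t in range(1, n):
    sigma2_t = alpha0 + alpha1 * u[t - 1] ** 2   # conditional variance
    u[t] = np.sqrt(sigma2_t) * rng.normal()      # u_t = sigma_t * z_t, z_t ~ N(0,1)

print(u.var())                                    # close to alpha0/(1 - alpha1)
print(np.corrcoef(u[1:] ** 2, u[:-1] ** 2)[0, 1]) # positive: volatility clustering
```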

Consequences of ARCH in static and distributed lag models If there are no lagged dependent variables among the regressors, i.e. in static or distributed lag models, OLS remains BLUE under TS.1-TS.5. OLS is also consistent, etc., in this case under assumptions TS.1'-TS.5'. In this case, the homoscedasticity assumption still holds under ARCH.

Consequences of ARCH in dynamic models In dynamic models, i.e. models including lagged dependent variables, the homoscedasticity assumption will necessarily be violated: Var(u_t | y_{t-1}, y_{t-2}, ...) = α_0 + α_1 u_{t-1}² = α_0 + α_1 (y_{t-1} - β_0 - β_1 y_{t-2})², because u_{t-1} is a function of the lagged dependent variables.

Consequences of ARCH in dynamic models This means that the error variance indirectly depends on the explanatory variables. In this case, heteroscedasticity-robust standard errors and test statistics should be computed, or an FGLS/WLS procedure should be applied. Using an FGLS/WLS procedure will also increase efficiency.

Example: Testing for ARCH effects in stock returns Consider the residuals from the EMH test equation above: are there ARCH effects in these errors?

Example: Testing for ARCH effects in stock returns Estimating equation for the ARCH(1) model: û_t² = α_0 + α_1 û_{t-1}² + error_t. There are statistically significant ARCH effects: if returns were particularly high or low (squared returns were high), they tend to be particularly high or low again, i.e. high volatility is followed by high volatility.
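A minimal sketch of this ARCH(1) test, both by hand and with statsmodels' ARCH LM test (assuming a recent statsmodels version); `ols_results` is a placeholder for the fitted return equation.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_arch

u2 = np.asarray(ols_results.resid) ** 2
aux = sm.OLS(u2[1:], sm.add_constant(u2[:-1])).fit()   # u_t^2 on u_{t-1}^2
print(aux.tvalues[1], aux.pvalues[1])                  # significant => ARCH(1) effects

lm_stat, lm_pval, f_stat, f_pval = het_arch(ols_results.resid, nlags=1)
print(lm_pval)
```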

An FGLS procedure for serial correlation and heteroscedasticity Given or estimated model for heteroscedasticity: Var(u_t | x_t) = σ² h_t, i.e. u_t = √(h_t) ν_t. Model for serial correlation in the standardized errors: ν_t = ρ ν_{t-1} + e_t, |ρ| < 1.

A FGLS procedure for serial correlation and heteroscedasticity Estimate transformed model by Cochrane- Orcutt or Prais-Winsten techniques (because of serial correlation in transformed error term) 37