Econometric Analysis of Financial and Economic Time Series: Volume 20 Part 2

Table of contents (20 chapters)

Volume 20 of Advances in Econometrics is dedicated to Rob Engle and Sir Clive Granger, winners of the 2003 Nobel Prize in Economics, for their many valuable contributions to the econometrics profession. The Royal Swedish Academy of Sciences cited Rob “for methods of analyzing economic time series with time-varying volatility (ARCH)” while Clive was cited “for methods of analyzing economic time series with common trends (cointegration).” Of course, these citations are meant for public consumption, but we specialists in time series analysis know that their contributions go far beyond these brief citations. Consider some of Rob's other contributions to our literature: Aggregation of Time Series, Band Spectrum Regression, Dynamic Factor Models, Exogeneity, Forecasting in the Presence of Cointegration, Seasonal Cointegration, Common Features, ARCH-M, Multivariate GARCH, Analysis of High Frequency Data, and CAViaR. Some of Sir Clive's additional contributions include Spectral Analysis of Economic Time Series, Bilinear Time Series Models, Combination Forecasting, Spurious Regression, Forecasting Transformed Time Series, Causality, Aggregation of Time Series, Long Memory, Extreme Bounds, Multi-Cointegration, and Non-linear Cointegration. No doubt, their Nobel Prizes are richly deserved. And the 48 authors of the two parts of this volume think likewise. They have authored some very fine papers that contribute nicely to the same literature that Rob's and Clive's research helped build.

The editors are pleased to offer the following papers to the reader in recognition and appreciation of the contributions to our literature made by Robert Engle and Sir Clive Granger, winners of the 2003 Nobel Prize in Economics. Please see the dedication page of this volume. The basic themes of this part of Volume 20 of Advances in Econometrics are time-varying betas of the capital asset pricing model, analysis of predictive densities of nonlinear models of stock returns, modeling multivariate dynamic correlations, flexible seasonal time series models, estimation of long-memory time series models, the application of the technique of boosting in volatility forecasting, the use of different time scales in Generalized Auto-Regressive Conditional Heteroskedasticity (GARCH) modeling, out-of-sample evaluation of the ‘Fed Model’ in stock price valuation, structural change as an alternative to long memory, the use of smooth transition autoregressions in stochastic volatility modeling, the “balancedness” of regressions used to estimate Taylor-type rules for the Fed Funds rate, a mixture-of-experts approach for the estimation of stochastic volatility, a modern assessment of Clive's first published paper on sunspot activity, and a new class of models of tail dependence in time series subject to jumps. Of course, we are also pleased to include Rob's and Clive's remarks on their careers and their views on innovation in econometric theory and practice, given at the Third Annual Advances in Econometrics Conference held at Louisiana State University, Baton Rouge, on November 5–7, 2004.

The Nobel Prize is given for good ideas – very good ideas. These ideas often shape the direction of research for an academic discipline. These ideas are often accompanied by a great deal of work by many researchers.

In 1956, I was searching for a Ph.D. topic and I selected time series analysis as being an area that was not very developed and was potentially interesting. I have never regretted that choice. Occasionally, I have tried to develop other interests but after a couple of years away I would always return to time series topics where I am more comfortable.

A large literature over several decades reveals both extensive concern with the question of time-varying betas and an emerging consensus that betas are in fact time-varying, leading to the prominence of the conditional CAPM. Set against that background, we assess the dynamics in realized betas, vis-à-vis the dynamics in the underlying realized market variance and individual equity covariances with the market. Working in the recently popularized framework of realized volatility, we are led to a framework of nonlinear fractional cointegration: although realized variances and covariances are very highly persistent and well approximated as fractionally integrated, realized betas, which are simple nonlinear functions of those realized variances and covariances, are less persistent and arguably best modeled as stationary I(0) processes. We conclude by drawing implications for asset pricing and portfolio management.
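
As a purely illustrative aside (not the authors' construction), a realized beta for a given period is simply the realized covariance of the asset with the market divided by the realized market variance, both obtained by summing products of within-period returns. A minimal sketch in Python, with hypothetical inputs:

```python
import numpy as np

def realized_beta(asset_returns, market_returns):
    """Realized beta for one period: the realized covariance of the asset with
    the market divided by the realized market variance, each obtained by
    summing products of within-period (e.g., daily or intraday) returns."""
    asset_returns = np.asarray(asset_returns, dtype=float)
    market_returns = np.asarray(market_returns, dtype=float)
    realized_cov = np.sum(asset_returns * market_returns)  # realized covariance with the market
    realized_var = np.sum(market_returns ** 2)             # realized market variance
    return realized_cov / realized_var

# Toy usage: one month (22 trading days) of simulated daily returns.
rng = np.random.default_rng(0)
mkt = rng.normal(0.0, 0.01, 22)
stock = 0.8 * mkt + rng.normal(0.0, 0.005, 22)
print(realized_beta(stock, mkt))  # close to the true beta of 0.8
```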

We investigate the predictive abilities of nonlinear models for stock returns when density forecasts are evaluated and compared instead of conditional mean point forecasts. The aim of this paper is to show whether the in-sample evidence of strong nonlinearity in mean may be exploited for out-of-sample prediction and whether a nonlinear model may beat the martingale model in out-of-sample prediction. We use the Kullback–Leibler Information Criterion (KLIC) divergence measure to characterize the extent of misspecification of a forecast model. The reality check test of White (2000), using the KLIC as a loss function, is conducted to compare the out-of-sample performance of competing conditional mean models. In this framework, the KLIC measures not only model specification error but also parameter estimation error, and thus we treat both types of error as loss. The conditional mean models we use for the daily closing S&P 500 index returns include the martingale difference, ARMA, STAR, SETAR, artificial neural network, and polynomial models. Our empirical findings suggest that the out-of-sample predictive abilities of nonlinear models for stock returns are asymmetric, in the sense that the right tails of the return series are predictable via many of the nonlinear models, while we find no such evidence for the left tails or the entire distribution.
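
To make the loss function concrete (a generic sketch, not the authors' implementation): the KLIC from the true density f to a forecast density g is E[log f(y) − log g(y)], so when two forecast models are compared the unknown log f term cancels and the KLIC gap can be estimated from the difference of their average out-of-sample log predictive densities. Assuming Gaussian density forecasts purely for illustration:

```python
import numpy as np
from scipy.stats import norm

def avg_log_score(y, means, stds):
    """Average out-of-sample log predictive density of the observations y
    under Gaussian one-step-ahead density forecasts."""
    return np.mean(norm.logpdf(y, loc=means, scale=stds))

# Toy comparison on a simulated AR(1) series (illustrative only, not the paper's data):
# a martingale-difference benchmark versus a correctly specified AR(1) forecaster.
rng = np.random.default_rng(1)
y = np.zeros(501)
for t in range(1, 501):
    y[t] = 0.3 * y[t - 1] + rng.normal()
bench = avg_log_score(y[1:], means=0.0, stds=y.std())      # martingale difference model
ar1 = avg_log_score(y[1:], means=0.3 * y[:-1], stds=1.0)   # AR(1) density forecasts
print(ar1 - bench)  # estimates KLIC(benchmark) - KLIC(AR1); positive favors the AR(1) densities
```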

In this article, we propose a new class of flexible seasonal time series models to characterize the trend and seasonal variations. The proposed model consists of a common trend function over periods and additive individual trend (seasonal effect) functions that are specific to each season within periods. A local linear approach is developed to estimate the trend and seasonal effect functions. The consistency and asymptotic normality of the proposed estimators, together with a consistent estimator of the asymptotic variance, are obtained under the α-mixing conditions and without specifying the error distribution. The proposed methodologies are illustrated with a simulated example and two economic and financial time series, which exhibit nonlinear and nonstationary behavior.
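
For readers unfamiliar with the estimator, the basic building block is a kernel-weighted local linear fit; the sketch below shows only that building block (the paper's actual procedure estimates the common trend and the season-specific effect functions jointly, which is not reproduced here):

```python
import numpy as np

def local_linear(x, y, x0, bandwidth):
    """Local linear estimate of a smooth function at the point x0: weighted
    least squares of y on (1, x - x0) with Epanechnikov kernel weights."""
    u = (x - x0) / bandwidth
    w = np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u ** 2), 0.0)
    X = np.column_stack([np.ones_like(x), x - x0])
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    return beta[0]  # the intercept is the fitted value of the function at x0

# Toy usage: a linear trend plus a monthly seasonal component plus noise;
# the wide bandwidth averages out most of the seasonality, leaving the trend.
t = np.arange(240, dtype=float)
y = 0.01 * t + np.sin(2 * np.pi * t / 12) + np.random.default_rng(5).normal(0.0, 0.2, 240)
trend_hat = np.array([local_linear(t, y, t0, bandwidth=24.0) for t0 in t])
```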

Since the seminal works by Granger and Joyeux (1980) and Hosking (1981), the estimation of long-memory time series models has received considerable attention and a number of parameter estimation procedures have been proposed. This paper gives an overview of this plethora of methodologies with special focus on likelihood-based techniques. Broadly speaking, likelihood-based techniques can be classified into the following categories: exact maximum likelihood (ML) estimation (Sowell, 1992; Dahlhaus, 1989), ML estimates based on autoregressive approximations (Granger & Joyeux, 1980; Li & McLeod, 1986), Whittle estimates (Fox & Taqqu, 1986; Giraitis & Surgailis, 1990), Whittle estimates with autoregressive truncation (Beran, 1994a), approximate estimates based on the Durbin–Levinson algorithm (Haslett & Raftery, 1989), state-space-based maximum likelihood estimates for ARFIMA models (Chan & Palma, 1998), and estimation of stochastic volatility models (Ghysels, Harvey, & Renault, 1996; Breidt, Crato, & de Lima, 1998; Chan & Petris, 2000), among others. Given the diversified applications of these techniques in different areas, this review aims at providing a succinct survey of these methodologies as well as an overview of important related problems, such as ML estimation with missing data (Palma & Chan, 1997), the influence of subsets of observations on estimates, and the estimation of seasonal long-memory models (Palma & Chan, 2005). The asymptotic properties of these techniques are compared and examined, and their interconnections and finite-sample performance are studied. Finally, applications of these methodologies to financial time series are discussed.
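
As a small, self-contained illustration of one of the surveyed techniques, the sketch below implements the Whittle estimator for the simplest ARFIMA(0, d, 0) case, with the innovation variance profiled out of the objective; the bounded search interval and function names are illustrative choices, not part of any of the cited procedures:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def whittle_d(x):
    """Whittle estimate of the memory parameter d for an ARFIMA(0, d, 0)
    series, with the innovation variance profiled out of the objective."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    periodogram = np.abs(np.fft.fft(x)) ** 2 / (2.0 * np.pi * n)
    j = np.arange(1, (n - 1) // 2 + 1)          # positive Fourier frequencies
    lam = 2.0 * np.pi * j / n
    I = periodogram[j]

    def objective(d):
        # Spectral shape of ARFIMA(0, d, 0): f(lam) proportional to |2 sin(lam/2)|^(-2d)
        g = np.abs(2.0 * np.sin(lam / 2.0)) ** (-2.0 * d)
        return np.log(np.mean(I / g)) + np.mean(np.log(g))

    return minimize_scalar(objective, bounds=(-0.49, 0.49), method="bounded").x
```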

The increasing availability of financial data has opened new opportunities for quantitative modeling. It has also exposed limitations of the existing frameworks, such as the low accuracy of simplified analytical models and the insufficient interpretability and stability of adaptive data-driven algorithms. I make the case that boosting (a novel ensemble learning technique) can serve as a simple and robust framework for combining the best features of the analytical and data-driven models. Boosting-based frameworks for typical financial and econometric applications are outlined. The implementation of a standard boosting procedure is illustrated in the context of the problem of symbolic volatility forecasting for the IBM stock time series. It is shown that the boosted collection of generalized autoregressive conditional heteroskedasticity (GARCH)-type models is systematically more accurate than both the best single model in the collection and the widely used GARCH(1,1) model.
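
The following sketch is only meant to convey the flavor of the idea and is not the author's procedure: a small pool of fixed-parameter GARCH(1,1) forecasters serves as base learners, and a greedy stagewise (boosting-style) loop repeatedly adds the forecaster that best explains the current residual of squared returns, with shrinkage:

```python
import numpy as np

def garch11_vol(returns, omega, alpha, beta):
    """Conditional variance path from a GARCH(1,1) recursion with fixed
    parameters; each such path acts as one base forecaster in the pool."""
    h = np.empty(len(returns))
    h[0] = np.var(returns)
    for t in range(1, len(returns)):
        h[t] = omega + alpha * returns[t - 1] ** 2 + beta * h[t - 1]
    return h

def stagewise_boost(returns, base_forecasts, n_rounds=50, shrinkage=0.1):
    """Greedy stagewise (boosting-style) combination of base volatility
    forecasts, fit to squared returns as a noisy proxy for the variance."""
    target = returns ** 2
    fit = np.full_like(target, target.mean())
    for _ in range(n_rounds):
        residual = target - fit
        # pick the base forecast most correlated with the current residual
        scores = [abs(np.corrcoef(residual, f)[0, 1]) for f in base_forecasts]
        f = base_forecasts[int(np.argmax(scores))]
        coef = np.dot(residual, f) / np.dot(f, f)  # least-squares step on the residual
        fit = fit + shrinkage * coef * f
    return fit

# Toy usage with a small pool of fixed-parameter GARCH(1,1) forecasters.
rng = np.random.default_rng(2)
r = 0.01 * rng.standard_t(df=5, size=1000)
pool = [garch11_vol(r, 1e-6, a, b) for a, b in [(0.05, 0.90), (0.10, 0.85), (0.02, 0.95)]]
combined_variance_fit = stagewise_boost(r, pool)
```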

Apart from the well-known high persistence of daily financial volatility data, there is also a short correlation structure that reverts to the mean in less than a month. We find this short correlation time scale in six different daily financial time series and use it to improve the short-term forecasts from generalized auto-regressive conditional heteroskedasticity (GARCH) models. We study different generalizations of GARCH that allow for several time scales. On our hold-out sample, none of the considered models can fully exploit the information contained in the short scale. Wavelet analysis shows a correlation between fluctuations on long and on short scales. Models accounting for this correlation, as well as long-memory models for absolute returns, appear to be promising.

The “Fed Model” postulates a cointegrating relationship between the equity yield on the S&P 500 and the bond yield. We evaluate the Fed Model as a vector error correction forecasting model for stock prices and for bond yields. We compare out-of-sample forecasts of each of these two variables from a univariate model and various versions of the Fed Model including both linear and nonlinear vector error correction models. We find that for stock prices the Fed Model improves on the univariate model for longer-horizon forecasts, and the nonlinear vector error correction model performs even better than its linear version.
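
As a hedged illustration of the forecasting mechanics (a textbook two-step Engle–Granger sketch, not the authors' specification), an error-correction forecast uses the lagged deviation from the estimated long-run relation between the equity yield and the bond yield to predict the next change:

```python
import numpy as np

def ecm_forecast(equity_yield, bond_yield):
    """Two-step Engle-Granger sketch: (1) estimate the long-run (cointegrating)
    relation equity_yield = c + b * bond_yield; (2) regress changes in the
    equity yield on the lagged disequilibrium (the error-correction term).
    Returns a one-step-ahead forecast of the equity yield."""
    X = np.column_stack([np.ones_like(bond_yield), bond_yield])
    c, b = np.linalg.lstsq(X, equity_yield, rcond=None)[0]   # cointegrating regression
    ect = equity_yield - c - b * bond_yield                  # error-correction term
    dy = np.diff(equity_yield)
    Z = np.column_stack([np.ones(len(dy)), ect[:-1]])
    a0, a1 = np.linalg.lstsq(Z, dy, rcond=None)[0]           # adjustment regression
    return equity_yield[-1] + a0 + a1 * ect[-1]

# Toy usage on a simulated cointegrated pair (illustrative only).
rng = np.random.default_rng(3)
bond = 0.05 + np.cumsum(rng.normal(0.0, 0.001, 300))
equity = 0.01 + bond + rng.normal(0.0, 0.002, 300)
print(ecm_forecast(equity, bond))
```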

This paper shows that volatility persistence in GARCH models and spurious long memory in autoregressive models may arise if the possibility of structural changes is not incorporated in the time series model. It also describes a tractable hidden Markov model (HMM) in which the regression parameters and error variances may undergo abrupt changes at unknown time points, while staying constant between adjacent change-points. Applications to real and simulated financial time series are given to illustrate the issues and methods.

In this paper, we propose a Bayesian approach to model the level and the variance of (financial) time series by the special class of nonlinear time series models known as logistic smooth transition autoregressive (LSTAR) models. We first propose a Markov Chain Monte Carlo (MCMC) algorithm for the levels of the time series and then adapt it to model the stochastic volatilities. The LSTAR order is selected by three information criteria: the well-known AIC and BIC, and the deviance information criterion (DIC). We apply our algorithm to synthetic data and to two real time series, namely the Canadian lynx data and the S&P 500 returns.
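
For concreteness, a first-order LSTAR process lets the autoregressive dynamics move smoothly between two regimes according to a logistic function of a lagged value; the simulation sketch below uses illustrative parameter values and is not tied to the authors' MCMC algorithm:

```python
import numpy as np

def lstar_simulate(n, phi=(0.1, 0.5), theta=(-0.2, 0.8), gamma=10.0, c=0.0, sigma=1.0, seed=4):
    """Simulate a first-order LSTAR process: the autoregression moves smoothly
    between two regimes according to a logistic transition in the lagged level."""
    rng = np.random.default_rng(seed)
    y = np.zeros(n)
    for t in range(1, n):
        G = 1.0 / (1.0 + np.exp(-gamma * (y[t - 1] - c)))  # logistic transition, 0 <= G <= 1
        regime1 = phi[0] + phi[1] * y[t - 1]                # dynamics when G is near 0
        regime2 = theta[0] + theta[1] * y[t - 1]            # dynamics when G is near 1
        y[t] = (1.0 - G) * regime1 + G * regime2 + sigma * rng.normal()
    return y

y = lstar_simulate(500)
```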

Relying on Clive Granger's many and varied contributions to econometric analysis, this paper considers some of the key econometric considerations involved in estimating Taylor-type rules for US data. We focus on the roles of unit roots, cointegration, structural breaks, and non-linearities to make the case that most existing estimates are based on an unbalanced regression. A variety of estimates reveal that neglected cointegration results in the omission of a necessary error correction term and that Federal Reserve (Fed) reactions during the Greenspan era appear to have been asymmetric. We argue that error correction and non-linearities may be one way to estimate Taylor rules over long samples when the underlying policy regime may have changed significantly.
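
For readers who have not seen one written out, a Taylor-type rule prescribes the Fed Funds rate as a function of inflation and the output gap; a minimal version with Taylor's (1993) original coefficients is sketched below. The estimated rules discussed in the paper relax these fixed weights, and "balancedness" concerns whether the orders of integration of the left- and right-hand-side variables match.

```python
def taylor_rate(inflation, output_gap, r_star=2.0, pi_star=2.0, a_pi=0.5, a_y=0.5):
    """Prescribed nominal Fed Funds rate under a Taylor-type rule (in percent):
    equilibrium real rate plus inflation, plus responses to the inflation gap
    and the output gap; the 0.5 weights are Taylor's (1993) original values."""
    return r_star + inflation + a_pi * (inflation - pi_star) + a_y * output_gap

print(taylor_rate(inflation=3.0, output_gap=1.0))  # 6.0 percent
```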

The problem of model mixing in time series, for which the interest lies in the estimation of stochastic volatility, is addressed using the approach known as Mixture-of-Experts (ME). Specifically, this work proposes a ME model where the experts are defined through ARCH, GARCH and EGARCH structures. Estimates of the predictive distribution of volatilities are obtained using a full Bayesian approach. The methodology is illustrated with an analysis of a section of US dollar/German mark exchange rates and a study of the Mexican stock market index using the Dow Jones Industrial index as a covariate.

In a brilliant career spanning almost five decades, Sir Clive Granger has made numerous contributions to time series econometrics. This paper reappraises his very first paper, on sunspot numbers, published in 1957.

There is a type of art that is known as “naive,” which can be very simplistic and have a certain amount of charm. Reading about my own work on sunspots published in 1957 also brings to mind the description “naive.” I was certainly naive in thinking that I could undertake a piece of simple numerical analysis and then send it off to a major journal. The world of empirical research was then very simplistic as we had no computer at the University of Nottingham where I was employed, and all calculations had to be done on an electronic calculator and all plots by hand. It was clear that if monthly data for sunspots had been available, I would have been overwhelmed by the quantity of data! Trying to plot by hand nearly 200 years of monthly data is a lengthy task! The computing restrictions also limited the types of model that could be considered.

In this paper, the gamma test is used to determine the order of lag-k tail dependence existing in financial time series. Using standardized return series, statistical evidence based on the test results shows that jumps in returns are not transient. New time series models that combine a specific class of max-stable processes, Markov processes, and GARCH processes are proposed and used to model tail dependencies within asset returns. Estimators for parameters in the models are developed and proved to be consistent and asymptotically jointly normal. These new models are tested on simulated examples and on real data, the S&P 500 index.

DOI: 10.1016/S0731-9053(2006)20_Part_2
Book series: Advances in Econometrics
Series copyright holder: Emerald Publishing Limited
ISBN: 978-0-76231-273-3
eISBN: 978-1-84950-388-4
Book series ISSN: 0731-9053