ECONOMETRIC FORECASTING AND HIGH-FREQUENCY DATA ANALYSIS
(5 Apr - 22 May 2004)

Jointly organized by the Institute for Mathematical Sciences, National University of Singapore, and
the School of Economics and Social Sciences, Singapore Management University

~ Abstracts ~

Econometric analysis of high-frequency financial data
Jeffrey Russell, University of Chicago

A new form of financial data has been developed and distributed over the last decade. These new data sets sometimes contain tens of thousands of prices for a given asset in a single day. With these new data sets come new challenges for the empirical investigator. For example, the price data need not occur at regular intervals, and price changes often take only a handful of values. Additionally, intraday prices may exhibit strong dependence, perhaps even long memory in the volatility. Diurnal and day-of-week patterns are commonplace. These challenges, however, are worth confronting. The continually developing field of theoretical market microstructure provides a rich theory from which to view the data and test hypotheses about the market structure. Frequently, tests of market microstructure theory require analysis of intraday price data. These lectures examine the detailed features of high-frequency data and modeling strategies designed to capture them. In doing so, an eye will be kept toward the applications of such models in order to learn about the economics of price setting and market structure.


Forecasting seasonal time series
Philip Hans Franses, Erasmus University Rotterdam, The Netherlands

There will be four lectures:

  1. Seasonality properties of macro, financial and marketing data
  2. Basic models for stable and non-stable seasonality
  3. Advanced models for non-stable seasonality
  4. Further topics: Multivariate models, panels and nonlinearity

Many economic time series variables, in particular in macroeconomics, finance and marketing, display some form of seasonality. This concerns seasonal variation in means, variances and also in correlation structures. Sometimes, this variation is not constant over time.

There is overwhelming evidence that out-of-sample forecasts improve for models that incorporate seasonal variation properly.

This tutorial starts off with an illustration of seasonality properties of macro, financial and marketing data. Then, we review basic models for stable and non-stable seasonality. Sometimes more advanced models for non-stable seasonality are needed, like periodic models. Finally, we address various topics like multivariate models, panels of seasonal time series and models for nonlinear data.


An unbiased measure of realized variance
Asger Lunde, Aarhus School of Business

The realized variance (RV) is known to be biased because intraday returns are contaminated with market microstructure noise, in particular if intraday returns are sampled at high frequencies. In this paper, we characterize the bias under a general specification for the market microstructure noise, where the noise may be autocorrelated and need not be independent of the latent price process. Within this framework, we propose a simple Newey-West type correction of the RV that yields an unbiased measure of volatility, and we characterize the optimal unbiased RV in terms of the mean squared error criterion. Our empirical analysis of the 30 stocks of the Dow Jones Industrial Average index shows the necessity of our general assumptions about the noise process. Further, the empirical results show that the modified RV is unbiased even if intraday returns are sampled every second.
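
A minimal sketch of the kind of kernel-based correction discussed here (not the paper's exact estimator): realized variance plus Bartlett-weighted autocovariance terms, which offsets the bias from autocorrelated noise. The lag choice q is an assumption of the illustration.

    import numpy as np

    def rv_newey_west(returns, q):
        # Plain realized variance: sum of squared intraday returns.
        r = np.asarray(returns, dtype=float)
        rv = np.sum(r**2)
        # Newey-West (Bartlett) correction: add weighted autocovariances of
        # returns, which are nonzero when microstructure noise is present.
        for h in range(1, q + 1):
            w = 1.0 - h / (q + 1.0)
            rv += 2.0 * w * np.sum(r[h:] * r[:-h])
        return rv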


Forecast uncertainty, its representation and evaluation
Kenneth Wallis, University of Warwick

Sources of error in econometric model-based forecasts. Calculation of forecast error variances, one-step and multi-step forecasts. Representation and reporting of uncertainty: interval forecasts (central intervals, shortest intervals), density forecasts (histograms, fan charts), event probability forecasts, forecast scenarios. Decision theory considerations. Survey forecasts, disagreement and uncertainty. Evaluation of forecasts; economic value, statistical performance. Goodness-of-fit tests; Pearson’s chi-squared test, likelihood ratio tests, probability integral transform, Kolmogorov-Smirnov test. Applications: Bank of England inflation forecasts, Survey of Professional Forecasters.
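
A minimal sketch of the probability integral transform (PIT) evaluation listed above, under hypothetical Gaussian density forecasts: a correct forecast makes the PITs i.i.d. uniform, which a Kolmogorov-Smirnov test can check.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    mu, sigma = np.zeros(200), np.ones(200)       # hypothetical forecast means/sds
    y = rng.normal(mu, sigma)                     # outcomes, here drawn from the forecasts
    pit = stats.norm.cdf(y, loc=mu, scale=sigma)  # probability integral transforms
    ks_stat, p_value = stats.kstest(pit, "uniform")
    print(ks_stat, p_value)                       # large p-value: no evidence against the forecasts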


Microstructure noise, realized volatility, and optimal sampling
Federico Bandi, University of Chicago

Recorded prices are known to diverge from their “efficient” values due to the presence of market microstructure contaminations. The microstructure noise creates a dichotomy in the model-free estimation of integrated volatility. While it is theoretically necessary to sum squared returns computed over very small intervals to better identify the underlying quadratic variation over a period, summing numerous contaminated return data entails substantial accumulation of noise.

Using asymptotic arguments as in the extant theoretical literature on the subject, we argue that the realized volatility estimator diverges to infinity almost surely when noise plays a role. While realized volatility cannot be a consistent estimate of the quadratic variation of the log price process, we show that a standardized version of the realized volatility estimator can be employed to uncover the second moment of the (unobserved) noise process. More generally, we show that straightforward sample moments of the noisy return data provide consistent estimates of the moments of the noise process.

Finally, we quantify the finite sample bias/variance trade-off that is induced by the accumulation of noisy observations and provide clear and easily implementable directions for optimally sampling contaminated high frequency return data for the purpose of volatility estimation.
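
A sketch of the moment-based idea, under the simplifying assumption of i.i.d. noise: at the highest frequencies the efficient-price variation is of smaller order than the noise, so a simple sample moment of observed returns recovers a noise moment.

    import numpy as np

    def noise_second_moment(tick_returns):
        # Average squared return at the highest frequency. As the number of
        # observations grows, this converges to the second moment of the noise
        # return, i.e. 2*E[eps^2] for i.i.d. noise eps, since the efficient
        # return contributes only a term of smaller order per observation.
        r = np.asarray(tick_returns, dtype=float)
        return np.mean(r**2)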


Full-information transaction costs
Jeffrey Russell, University of Chicago

In a world with private information and learning on the part of market participants, transaction costs should be defined as the (positive) differences between transaction prices and full-information prices, i.e., the prices that reflect all information, private and public, about the asset of interest. While current approaches to the estimation of execution costs largely focus on measuring the differences between transaction prices and efficient prices, i.e., the prices that embed all publicly available information about the asset, this work provides a simple and robust methodology to identify full-information transaction costs based on high-frequency transaction price data. Our estimator is defined in terms of sample moments and is model-free in nature. Specifically, our measure of transaction costs is robust to unrestricted temporary and permanent market microstructure frictions as induced by operating costs (order-processing and inventory-keeping, among others) and adverse selection. Using a sample of S&P 100 stocks, we provide support for both the operating cost and the asymmetric information theories of transaction cost determination, but show that conventional measures of execution costs can considerably understate the true cost of trade.


On portfolio optimization: how do we benefit from high-frequency data
Qianqiu Liu, University of Hawaii

In this paper, I consider the problem faced by a professional investment manager who wants to track the return of the S&P 500 index with 30 DJIA stocks. The manager constructs many covariance matrix estimators, based on daily returns and high-frequency returns, to form his optimal portfolio. Although prior research has documented that realized volatility based on intraday returns is more precise than volatility constructed from daily returns, the manager will not switch from daily to intraday returns to estimate the conditional covariance matrix if he rebalances his portfolio monthly and has the past 12 months of data to use. He will switch to intraday returns only when his estimation horizon is shorter than 6 months or he rebalances his portfolio daily.
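
For concreteness, a sketch of the two covariance inputs being compared, with hypothetical array shapes: a daily realized covariance built from intraday returns versus the usual sample covariance of daily returns.

    import numpy as np

    def realized_covariance(intraday_returns):
        # (m x k) array of m intraday return vectors on k assets:
        # realized covariance is the sum of outer products r_i r_i'.
        r = np.asarray(intraday_returns, dtype=float)
        return r.T @ r

    def sample_covariance(daily_returns):
        # Conventional estimator from a (T x k) array of daily returns.
        return np.cov(np.asarray(daily_returns, dtype=float), rowvar=False)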


Intraday periodicity, long memory volatility, and macroeconomic announcement effects on China Treasury bond market
Wu Jie, National University of Singapore

In this paper, we provide a detailed characterization of the volatility in the China Treasury bond market using a sample of 5-minute excess returns from January 4, 2000 to February 28, 2002. We use a two-step regression procedure and a multivariate GARCH model to show that macroeconomic announcements are an important source of volatility in the China Treasury bond market. Among the various announcements, we identify GDP, the consumer price index (CPI), the retail price index (RPI), the People's Bank of China benchmark interest rate, and the Shanghai Stock Exchange (SSE) A-share index as having the greatest effects, which explain the observed high degree of volatility persistence in the China Treasury bond market. Our analysis also uncovers striking long-memory volatility dependencies in the China Treasury bond market, consistent with findings in developed Treasury bond markets.


Statistical analysis of high frequency financial data and modeling of financial time series
Baosheng Yuan, National University of Singapore

We present our results on the statistical analysis of high frequency financial data (HFFD) and our model of trend and trend reversals for describing financial time series. We introduce new statistical approaches to investigate the distributions of conditional returns, which show excess return volatility. By examining the return distribution conditioned on the return of the previous period, we find clear evidence of volatility clustering, not only for HFFD but also for daily data. This volatility clustering is represented by two observations: 1) a large return volatility (measured by the standard deviation) in the previous (return valuation) period is followed by a large return volatility, and vice versa; 2) the standard deviation of the conditional return distribution is approximately proportional to the absolute return of the previous period. We also find that the conditional distributions, when rescaled by their respective standard deviations, collapse onto a universal curve.

We also introduce a threshold-based trended return (THBTR) to study the statistics of trend persistence and of the returns of up- and down-trends for HFFD, and compare them with the corresponding quantities generated by a Gaussian random process. We find that the THBTR distinguishes real financial data from a random Gaussian time series very well. Based on the empirical facts observed in the real financial data, we construct a simple model based on short-time trends and trend reversals. We show that the model can generate simulation data that possess all the critical statistics we observed in the real data. The model can potentially be used for option pricing.
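
A sketch of the first conditional-distribution diagnostic, assuming a plain one-dimensional return series: bin observations by the previous period's absolute return and measure the dispersion of the current return within each bin; observation 2) corresponds to a roughly linear relation between the two.

    import numpy as np

    def conditional_std_by_prev_return(returns, n_bins=10):
        r = np.asarray(returns, dtype=float)
        prev_abs, curr = np.abs(r[:-1]), r[1:]
        # Quantile bins of the previous period's absolute return.
        edges = np.quantile(prev_abs, np.linspace(0, 1, n_bins + 1))
        idx = np.clip(np.digitize(prev_abs, edges) - 1, 0, n_bins - 1)
        centers = np.array([prev_abs[idx == b].mean() for b in range(n_bins)])
        stds = np.array([curr[idx == b].std() for b in range(n_bins)])
        return centers, stds    # plot stds against centers to see the relation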


Properties of realized variance for pure jump processes: calendar time sampling versus business time sampling
Roel Oomen, University of Warwick

In this paper we use a pure jump process for high frequency returns to analyze the impact of market microstructure effects on the properties of the realized variance measure. We provide closed form expressions for the bias and mean squared error of realized variance under alternative sampling schemes. Importantly, we show that business time sampling is superior to the common practice of calendar time sampling. Using IBM transaction data we estimate the model and determine the optimal sampling frequency for each day in the data set. We also provide new insights into the relation between the optimal sampling frequency and market liquidity.
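
A sketch of the two sampling schemes being compared, for arrays of transaction times and prices (business-time sampling here simply means every k-th trade):

    import numpy as np

    def sample_calendar_time(times, prices, interval):
        # Last observed price at or before each point of a regular calendar grid.
        grid = np.arange(times[0], times[-1], interval)
        idx = np.searchsorted(times, grid, side="right") - 1
        return prices[idx]

    def sample_business_time(prices, k):
        # Every k-th transaction price: sampling on the transaction clock.
        return prices[::k]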


How volatile are East Asian stocks during high volatility periods?
Carlos C. Bautista, University of the Philippines

This study reports estimates of the magnitude of volatility during abnormal times relative to normal periods for seven East Asian economies, using a rudimentary univariate Markov switching ARCH method. The results show that global events like the 1990 Gulf War, the opening up of country borders in the mid-1990s and the 1997 Asian currency crisis led to high volatility episodes whose magnitude relative to normal times differs from country to country. Country-specific events are also observed to lead to high volatility periods. Additional insights are obtained when volatility is assumed to evolve according to a three-state Markov regime switching process.


Wishart quadratic term structure models
Christian Gourieroux, University of Toronto

This paper reveals that the class of affine term structure models introduced by Duffie and Kan (1996) is much larger than is considered in the literature. We study “fundamental” risk factors, which represent the multivariate risk aversion of the consumer or the volatility matrix of technological activity returns, and argue that they can be defined as symmetric positive matrices. For such matrices we introduce a dynamic affine process called the Wishart autoregressive (WAR) process; this process is used to derive the associated term structure. In this framework: i) we derive very simple restrictions on the parameters to ensure positive yields at all maturities; ii) we observe that the usual constraint that the volatility matrix of an affine process be diagonal, up to a path-independent invertible linear transformation, can be considerably relaxed.

The Wishart Quadratic Term Structure Model is the natural extension of the one-dimensional Cox-Ingersoll-Ross model and of the quadratic models introduced in the literature.
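
A simulation sketch of a WAR process for an integer degree of freedom K, using the standard construction as a sum of outer products of K independent Gaussian VAR(1) processes (the parameter names M, Sigma, K belong to the illustration):

    import numpy as np

    def simulate_war(T, M, Sigma, K, seed=0):
        # X_t = sum_k x_{k,t} x_{k,t}', with x_{k,t} = M x_{k,t-1} + eps_{k,t},
        # eps ~ N(0, Sigma). Each X_t is symmetric positive semi-definite by
        # construction, as required of a matrix risk factor.
        rng = np.random.default_rng(seed)
        n = M.shape[0]
        x = rng.standard_normal((K, n))          # initial draws (no burn-in)
        X = np.empty((T, n, n))
        for t in range(T):
            x = x @ M.T + rng.multivariate_normal(np.zeros(n), Sigma, size=K)
            X[t] = x.T @ x
        return X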


Using out-of-sample mean squared prediction errors to test the martingale difference hypothesis
Kenneth D. West, University of Wisconsin

We consider using out-of-sample mean squared prediction errors (MSPEs) to evaluate the null that a given series follows a zero mean martingale difference against the alternative that it is linearly predictable. Under the null of zero predictability, the population MSPE of the null “no change” model equals that of the linear alternative. We show analytically and via simulations that despite this equality, the alternative model’s sample MSPE is expected to be greater than the null’s. We propose and evaluate an asymptotically normal test that properly accounts for the upward shift of the sample MSPE of the alternative model. Our simulations indicate that our proposed procedure works well.
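
A sketch of the adjusted comparison for the simplest case, where the null "no change" forecast of a zero-mean series is identically zero and yhat holds the alternative model's out-of-sample forecasts (the paper's exact statistic and asymptotics are more general):

    import numpy as np

    def mspe_adjusted_stat(y, yhat):
        # Loss differential with the adjustment that removes the upward shift
        # in the alternative's sample MSPE caused by parameter estimation:
        # f_t = y_t^2 - ((y_t - yhat_t)^2 - yhat_t^2).
        f = y**2 - ((y - yhat)**2 - yhat**2)
        n = len(f)
        return f.mean() / (f.std(ddof=1) / np.sqrt(n))  # approx. N(0,1) under the null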


Forecasting jumps in conditional volatility: The GARCH-IE model
Philip Hans Franses, Erasmus University Rotterdam, The Netherlands

Conditional volatility models, like ARCH models, are often used in empirical finance. These models are useful for forecasting next period's increased volatility, given that the current observation marks the start of such a period. This property follows from the autoregressive nature of these models.

In this paper we put forward a GARCH-type model with an additional component (an innovation effect), introduced in an attempt to forecast sudden jumps in volatility. That is, we aim to forecast not only the second volatile observation but also the first.

We discuss representation, parameter estimation and inference for this so-called GARCH-IE model. We illustrate the model for seven stock markets, and document that certain technical trading rules can have predictive value for such jumps.
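
The exact GARCH-IE specification is given in the paper; purely as an illustration of the idea, here is a GARCH(1,1) variance recursion augmented with an extra term driven by a 0/1 signal (for example from a technical trading rule) meant to anticipate a jump in volatility:

    import numpy as np

    def garch_with_innovation_effect(eps, signal, omega, alpha, beta, gamma):
        # h_t = omega + alpha*eps_{t-1}^2 + beta*h_{t-1} + gamma*signal_{t-1}.
        # The gamma term is the illustrative "innovation effect" component.
        n = len(eps)
        h = np.empty(n)
        h[0] = np.var(eps)
        for t in range(1, n):
            h[t] = omega + alpha * eps[t-1]**2 + beta * h[t-1] + gamma * signal[t-1]
        return h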


An integrated smoothed maximum score estimator for a generalized censored Quantile regression model
Songnian Chen, Hong Kong University of Science and Technology

Quantile regression models have received a great deal of attention in both theoretical and applied econometrics since the influential work of Koenker and Bassett (1978). While the linear setup is the most common approach, nonlinearity is certainly a common phenomenon in empirical applications. In this paper we consider semiparametric estimation of a general quantile regression model with censored data. Based on the observation that the censored regression model corresponds to a sequence of binary choice models with different cutoff points, we propose an integrated smoothed maximum score estimator by combining different cutoff points, following the insight of Horowitz (1992) and Manski (1985) for the binary choice model under quantile regression. Similar to Horowitz (1992), we only use one-dimensional kernels.


Wavelet transform for log periodogram regression in long memory stochastic volatility model
Jin Lee, National University of Singapore

We consider semiparametric log periodogram (LP) regression estimation of the memory parameter of the latent process in long memory stochastic volatility models. It is known that, though widely used among researchers, the Geweke and Porter-Hudak (1983; GPH) LP estimator violates the Gaussian or martingale assumption, which results in significant negative bias due to the spectrum of the non-Gaussian noise. Through a wavelet transform of the squared process, we effectively remove the noise spectrum around zero frequency and obtain a Gaussian-approximate spectral representation at zero frequency. We propose a wavelet-based regression estimator, and derive its asymptotic mean squared error and consistency in line with the asymptotic theory in the long memory literature. Simulation studies show that wavelet-based regression estimation is effective in reducing the bias, compared with the GPH estimator.
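
For reference, a minimal sketch of the benchmark GPH log periodogram estimator that the wavelet approach is compared against, applied to a series x (for an LMSV model, e.g. the log squared returns); the bandwidth m is an assumption of the illustration.

    import numpy as np

    def gph_estimate(x, m):
        # Regress the log periodogram at the first m Fourier frequencies on
        # -2*log(lambda_j); the slope estimates the memory parameter d.
        x = np.asarray(x, dtype=float)
        n = len(x)
        lam = 2 * np.pi * np.arange(1, m + 1) / n
        I = np.abs(np.fft.fft(x - x.mean())[1:m + 1])**2 / (2 * np.pi * n)
        X = np.column_stack([np.ones(m), -2 * np.log(lam)])
        beta, *_ = np.linalg.lstsq(X, np.log(I), rcond=None)
        return beta[1]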

Keywords: long memory stochastic volatility, wavelet transform, log periodogram regression.


Constructing a coincident index of business cycles without assuming a one-factor model
Yasutomo Murasawa, Osaka Prefecture University

The Stock--Watson coincident index and its subsequent extensions assume a static linear one-factor model for the component indicators. Such an assumption is restrictive in practice, however, with as few as four indicators. In fact, such an assumption is unnecessary if one poses the index construction problem as optimal prediction of latent monthly real GDP. This paper estimates a VAR model for latent monthly real GDP and other indicators using the observable mixed-frequency series. The EM algorithm is useful for overcoming the computational difficulty, especially in model selection. The smoothed estimate of latent monthly real GDP is the proposed index.


The BSP’s structural long-term inflation forecasting model for the Philippines
Francis Dakila, Bangko Sentral ng Pilipinas

The objective of the project is to construct a structural long-term annual macroeconometric model of the Philippine economy that will serve as a quantitative tool of the BSP to forecast headline and core inflation rates one to two years into the future; to analyze the impact on headline and core inflation of key factors such as the exchange rate, world oil price, interest rates, wages, government borrowing and other relevant variables; to determine the effectiveness of different channels and instruments of monetary policy, with special attention to the impact of changes in the BSP’s short-term borrowing and lending rates, which are the BSP’s policy levers; and to guide the monetary authorities in their decision making process pertaining to the appropriate policies for the attainment of the BSP’s primary mandate of promoting price stability conducive to balanced and sustainable economic growth.


Are the directions of stock price changes predictable? Statistical theory and evidence
Yongmiao Hong, Cornell University, USA

We propose a model-free omnibus statistical procedure to check whether the direction of changes in an economic variable is predictable using the history of its past changes. A class of separate inference procedures is also given to gauge possible sources of directional predictability. In particular, they can reveal whether the direction of future changes is predictable using the direction, level, volatility, skewness, and kurtosis of past changes. An important feature of the proposed procedures is that they check many lags simultaneously, which is particularly suitable for detecting alternatives where directional dependence is small at each lag but carries over a long distributional lag. At the same time, the tests naturally discount higher order lags, which is consistent with the conventional wisdom that financial markets are more influenced by recent events than by remote past events.

We apply the proposed procedures to four daily stock price indices and twenty daily individual stock prices in the U.S. We find overwhelming evidence that the direction of excess stock returns, for both indices and individual stocks, is predictable using past excess stock returns. The evidence is stronger for the directional predictability of large excess stock returns, both positive and negative. In particular, the direction and level of past excess stock returns can be used to predict the direction of future excess stock returns with any threshold, and the volatility, skewness and kurtosis of past excess stock returns can be used to predict the direction of future excess stock returns with nonzero thresholds. The well-known strong volatility clustering, together with weak serial dependence in mean, can explain much but not all of the documented directional predictability of stock returns: the direction of standardized estimated return residuals can still be predicted by the level and direction of past standardized return residuals. To exploit the economic significance of the documented directional predictability, we consider a class of autologit models for directional forecasts and find that they have significant out-of-sample directional predictive power. We further investigate whether a dynamic trading strategy based on the out-of-sample directional forecasts of these models can earn a significant risk-adjusted extra profit over the buy-and-hold trading strategy.
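
A minimal autologit-style sketch (the paper's model class is richer): a logit for the direction of the next return, using the signs and levels of recent returns as regressors.

    import numpy as np
    import statsmodels.api as sm

    def autologit_fit(returns, lags=5):
        r = np.asarray(returns, dtype=float)
        T = len(r)
        y = (r[lags:] > 0).astype(int)          # direction indicator to predict
        X = np.column_stack(
            [(r[lags - l:T - l] > 0).astype(int) for l in range(1, lags + 1)]  # past signs
            + [r[lags - l:T - l] for l in range(1, lags + 1)]                  # past levels
        )
        return sm.Logit(y, sm.add_constant(X)).fit(disp=0)

    # model = autologit_fit(returns); model.predict(...) then gives directional forecasts.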

Key words: Characteristic function, Directional predictability, Efficient market hypothesis, Generalized spectrum, Market timing, Sign Dependence, Stock prices, Threshold GARCH, Volatility clustering.


A smooth test for density forecast evaluation
Anil K. Bera, University of Illinois at Urbana-Champaign

Recently financial econometricians have shifted their attention from point and interval forecasts to density forecasts mainly to address the issue of the huge loss of information that results from depicting portfolio risk by a measure of dispersion alone. One of the major problems in this area has been the evaluation of the quality of different density forecasts. In this paper we propose an analytical test for density forecast evaluation using the Smooth Test procedure for both independent and serially dependent data. Apart from indicating the acceptance or rejection of the hypothesized model, this approach provides specific sources (such as the mean, variance, skewness and kurtosis or the location, scale and shape of the distribution or types of dependence) of departure, thereby helping in deciding possible modifications of the assumed forecast model.

We also address the issue of where to split the sample into in-sample (estimation) and out-of-sample (testing) observations in order to evaluate the “goodness-of-fit” of the forecasting model, both analytically and through simulation exercises. Monte Carlo studies reveal that the proposed test has good size and power properties. We further investigate an application to value-weighted S&P 500 returns, which indicates that introducing a conditional heteroscedasticity model significantly improves on a model with constant conditional variance.
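
A minimal sketch of a Neyman-type smooth test on the PITs for the i.i.d. case (the paper also covers serially dependent data and estimated parameters, which change the reference distribution):

    import numpy as np
    from numpy.polynomial import legendre
    from scipy import stats

    def smooth_test(pit, k=4):
        # Under a correct forecast the PITs are i.i.d. U(0,1), so the first k
        # orthonormal Legendre components have mean zero; the sum of squared
        # normalized component means is asymptotically chi-squared(k). Each
        # component points to a direction of departure (roughly mean, variance,
        # skewness, kurtosis for j = 1..4).
        u = 2 * np.asarray(pit, dtype=float) - 1      # map (0,1) to (-1,1)
        n = len(u)
        psi = 0.0
        for j in range(1, k + 1):
            c = np.zeros(j + 1); c[j] = 1
            phi = legendre.legval(u, c) * np.sqrt(2 * j + 1)  # orthonormal component
            psi += n * phi.mean()**2
        return psi, stats.chi2.sf(psi, k)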

Keywords: Smooth test, score test, locally most powerful unbiased test, density forecast evaluation, probability integral transform, sample selection method, t-GARCH model, simulation based method, sample split


Stock market volatility: examining North America, Europe and Asia
Gamini Premaratne, National University of Singapore

An understanding of volatility in stock markets is important for determining the cost of capital and for assessing investment and leverage decisions, as volatility is synonymous with risk. Substantial changes in the volatility of financial markets can have significant negative effects on risk-averse investors. Using daily returns from 1992 to 2002, we investigate volatility co-movement between the Singapore stock market and the markets of the US, UK, Hong Kong and Japan. In order to gauge volatility co-movement, we employ (i) univariate GARCH models, (ii) vector autoregressions, and (iii) multivariate and asymmetric multivariate GARCH models with GJR extensions. The empirical results indicate a high degree of volatility co-movement between the Singapore stock market and those of Hong Kong, the US, Japan and the UK (in that order). The results support small but significant volatility spillovers from Singapore into the Hong Kong, Japan and US markets, despite the latter three being dominant markets. Most previous research concludes that spillover effects are significant only from the dominant market to the smaller market and that volatility spillover effects are unidirectional. Our study shows that it is plausible for volatility to spill over from the smaller market to the dominant market. At a substantive level, studies on volatility co-movement and spillover provide useful information for risk analysis.


Univariate and multivariate analysis and modeling of high-frequency financial data
Wolfgang Breymann, Institut für Physik, Switzerland

Intraday financial data require special methods for data analysis and modeling. The reason is that due to the high observation frequency of the order of minutes or hours, new phenomena can be observed and the amount of data is about two orders of magnitude higher than for daily data.

In this series of lectures the following subjects will be covered:

  • Univariate stylized facts of high-frequency financial data. These include scaling behavior, volatility clustering, heavy tails, and seasonalities, as well as the change of these characteristics with the time horizon.
  • A universal method for univariate and multivariate deseasonalization of financial time series, an indispensable prerequisite for further analysis (a minimal sketch of the basic idea follows this list).
  • Dependence structure analysis and modeling by means of copula techniques, tests of ellipticity, spectral measure estimation, and modeling of multivariate excesses.
  • Modeling financial time series by means of a hierarchical model containing a volatility cascade from long to short time horizons. This model can also be used for volatility forecasting.
  • Practical demonstration of the analysis of high-frequency financial data using R/S-PLUS.
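
As promised above, a minimal sketch of the basic volatility-weighting idea behind deseasonalization (the lectures' universal method is more elaborate): scale each return by the average absolute return of its intraday time bin.

    import numpy as np

    def deseasonalize(returns, time_of_day_bin):
        # Divide each return by the mean absolute return of observations that
        # share its time-of-day bin, removing the deterministic intraday
        # volatility pattern.
        r = np.asarray(returns, dtype=float)
        tod = np.asarray(time_of_day_bin)
        out = np.empty_like(r)
        for b in np.unique(tod):
            mask = tod == b
            out[mask] = r[mask] / np.mean(np.abs(r[mask]))
        return out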


Intraday diversified world stock indices: dynamics, return distributions, dependence structure
Wolfgang Breymann, Institut für Physik, Switzerland

This paper proposes an approach to the intraday analysis of diversified world stock accumulation indices. The growth optimal portfolio (GOP) is used as reference unit or benchmark in a continuous financial market model. Diversified portfolios, covering the world stock market, are constructed and shown to approximate the GOP, providing the basis for a range of financial applications. The normalized GOP is modeled as a time transformed square root process of dimension four. Its dynamics are empirically verified for several world stock indices. Furthermore, the evolution of the transformed time is modeled as the integral over a rapidly evolving mean-reverting market activity process with deterministic volatility. The empirical findings suggest a rather simple and robust model for a world stock index that reflects the historical evolution, by using only a few readily observable parameters.
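
A simulation sketch of the building block named here, a square-root (squared Bessel) process of dimension four, via a simple Euler scheme; in the paper, calendar time would additionally be transformed through the market activity process.

    import numpy as np

    def simulate_squared_bessel(horizon, n_steps, delta=4.0, x0=1.0, seed=0):
        # Euler discretization of dX = delta*dt + 2*sqrt(X) dW, the squared
        # Bessel process of dimension delta.
        rng = np.random.default_rng(seed)
        dt = horizon / n_steps
        x = np.empty(n_steps + 1)
        x[0] = x0
        for i in range(n_steps):
            dw = rng.normal(0.0, np.sqrt(dt))
            x[i + 1] = max(x[i] + delta * dt + 2.0 * np.sqrt(max(x[i], 0.0)) * dw, 0.0)
        return x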


A comparison of U.S. and Hong Kong cap-floor volatility dynamics
Paul D. McNelis, Georgetown University

In this paper we investigate the dynamics of Hong Kong cap/floor volatilities and compare them with US cap/floor volatilities. We use linear and non-linear factor models and VARs. The results show that the first principal components, both linear and nonlinear, do a very good job of explaining the dynamics of the volatility curve, but there is not much to be gained by moving to nonlinear models in the case of the Hong Kong data. Secondly, we see that Hong Kong cap-floor volatilities cannot be obtained from the USD cap-floor volatilities by simply adding a volatility spread; the two sets of volatilities are non-trivially related to each other.
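
A sketch of the linear first principal component of a (T x m) panel of cap-floor volatilities (columns indexed by maturity), with the share of variance it explains:

    import numpy as np

    def first_principal_component(vols):
        V = np.asarray(vols, dtype=float)
        Vc = V - V.mean(axis=0)                  # center each maturity
        U, s, Wt = np.linalg.svd(Vc, full_matrices=False)
        factor = U[:, 0] * s[0]                  # first PC time series
        loadings = Wt[0]                         # loadings across maturities
        share = s[0]**2 / np.sum(s**2)           # variance share explained
        return factor, loadings, share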

Key words: Cap-floor volatilities, linear and non-linear principal components


An assessment of Bank of England and National Institute inflation forecast uncertainties
Kenneth Wallis, University of Warwick

This paper evaluates the density forecasts of inflation published by the Bank of England and the National Institute of Economic and Social Research. It extends the analysis of the Bank of England’s fan charts in an earlier article by considering data up to 2003, quarter 4, and by correcting some technical details in the light of information published on the Bank’s website in summer 2003. National Institute forecasts are also considered, although there are fewer comparable observations. Both groups’ central point forecasts are found to be unbiased, but their density forecasts substantially overstate forecast uncertainty.


Multivariate time series analysis and forecasting
Manfred Deistler, Technische Universität Wien

Lecture 1:

  • Stationary processes and covariance functions. Examples of stationary processes: white noise, MA and MA(infinity) processes, harmonic processes.
  • Covariance functions and spectral densities. Linear transformations of stationary processes.
  • Forecasting, the Wold decomposition.
  • Linear systems. Linear vector difference equations and their solutions.
  • Integrated processes.

Lecture 2:

  • Identifiability and estimation of (vector) AR(X), ARMA(X) and state space systems.
  • Least squares and maximum likelihood estimators. The curse of dimensionality.
  • Model selection by information criteria.
  • Cointegration.

Lecture 3:

  • Reducing the dimension of the parameter space.
  • Static and dynamic factor models: principal components and idiosyncratic noise models; identifiability and estimation.
  • Inputs: Forecasting factors and noise by ARX models.
  • Dynamic (ARX) reduced rank models: Estimation via singular value decomposition.
  • Model selection (input selection and dynamic specification) procedures.

Lecture 4:

  • Forecasting returns, a case study for the European banking sector.
  • An algorithm for input selection and dynamic specification. Problems in evaluating the forecasting quality.
  • A comparison of principal components, idiosyncratic noise and reduced rank models. The influence of design parameters.


Temporal aggregation, causality distortions, and a sign rule
Tilak Abeysinghe, National University of Singapore

Temporally aggregated data are a bane for Granger causality tests. The same set of variables may lead to contradictory causality inferences at different levels of temporal aggregation, and obtaining temporally disaggregated data series is impractical in many situations. Since cointegration is invariant to temporal aggregation and implies Granger causality, this paper proposes a sign rule to establish the direction of causality. Temporal aggregation distorts the sign of the adjustment coefficients of an error correction model, and the sign rule works better with highly temporally aggregated data. Practitioners may therefore revert to using annual data for Granger causality testing instead of looking for quarterly, monthly or weekly data. The method is illustrated through three applications.


Extreme value analysis of Taiwan stock market
Jin-Lung (Henry) Lin, Academia Sinica, National Taiwan University

Extreme quantiles assess the probability of occurrence of a very large or small value and are essential components of risk management. Conventional extreme value analysis focuses on the asymptotic behavior of the maximum (or minimum) of an independent and identically distributed random sample. While the maximum and minimum are important quantities, other large or small observations also matter, as they too have a large impact on risk management. The modern extreme value theory takes these observations into consideration and focuses on the analysis of exceedances over high thresholds. Exceedance times and the excesses are modeled simultaneously; see Davison and Smith (1990) and the references therein.

This paper employs the new extreme value approach to analyze selected returns from the Taiwan Stock Exchange. We study two return series, Taiwan Stock Exchange Weighted Index (TAIEX) and the return series of Taiwan Semiconductor Manufacturing (TSM) stock. In this study, our goals are twofold. First, we study the behavior of extreme value theory when the daily price limit is ignored. Second, we identify variables that can explain the extreme movements of Taiwan Stock Exchange as described by parameters of the intensity function of the extreme value theory. In particular, we examine the impact of U.S. Stock Markets on the extreme value of Taiwan Stock Market.

The empirical analysis confirms the effect of extreme values in the U.S. market on the Taiwan market. There appears to be an extreme-value spillover effect from the U.S. market to the Taiwan market, especially from the high-tech dominated NASDAQ market. We also find that the daily price limit makes estimation of extreme value parameters difficult, even with explanatory variables. Domestic explanatory variables such as the duration since the prior extreme event, a time trend, a volatility indicator, and the trading behavior of the previous trading day all have some effect on the intensity of exceedance for positive returns. The effect for negative returns does not show any clear pattern. Finally, we find that the discreteness of the price of TSM stock, due to the tick size constraint, substantially increases the complexity of our study.
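
A minimal peaks-over-threshold sketch of the exceedance modeling described above: fit a generalized Pareto distribution to the excesses of (positive) returns over a high threshold.

    import numpy as np
    from scipy import stats

    def fit_exceedances(returns, threshold):
        x = np.asarray(returns, dtype=float)
        excesses = x[x > threshold] - threshold
        # Fit a generalized Pareto to the excesses, location fixed at zero.
        shape, loc, scale = stats.genpareto.fit(excesses, floc=0)
        return shape, scale, len(excesses) / len(x)  # tail shape, scale, exceedance rate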


System identification: general aspects and structure
Manfred Deistler, Technische Universität Wien

The art of identification is to find a good model from noisy data: data-driven modeling. This is an important problem in many fields of application. Systematic approaches include statistics, system theory, econometrics, and inverse problems.


A multivariate threshold GARCH model with time-varying correlations
Wai Keung Li, The University of Hong Kong

In this article, a multivariate threshold generalized autoregressive conditional heteroscedasticity model with time-varying correlation (VC-MTGARCH) is proposed. The model extends the ideas of Engle (2002) and Tse & Tsui (2002) to a threshold framework. This model retains the interpretation of the univariate threshold GARCH model and allows for dynamic conditional correlations. An extension of Bollerslev, Engle and Wooldridge (1988) to a threshold framework is also proposed as a by-product. Techniques of model identification, estimation and model checking are developed. Some simulation results are reported on the finite sample distribution of the maximum likelihood estimator of the VC-MTGARCH model. Real examples demonstrate the asymmetric behavior of the mean and the variance in financial time series and show that the VC-MTGARCH model can capture these phenomena.
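
For intuition, a sketch of a time-varying correlation recursion in the spirit of Engle (2002) and Tse & Tsui (2002); in the VC-MTGARCH, the standardized residuals z would come from threshold GARCH variance equations whose parameters switch across regimes.

    import numpy as np

    def time_varying_correlations(z, a, b):
        # Q_{t+1} = (1-a-b)*S + a*z_t z_t' + b*Q_t, with R_t the correlation
        # matrix obtained by rescaling Q_t. z is a (T x k) array of
        # standardized residuals; a, b >= 0 with a + b < 1.
        T, k = z.shape
        S = np.corrcoef(z, rowvar=False)         # unconditional correlation target
        Q = S.copy()
        R = np.empty((T, k, k))
        for t in range(T):
            d = 1.0 / np.sqrt(np.diag(Q))
            R[t] = Q * np.outer(d, d)
            Q = (1 - a - b) * S + a * np.outer(z[t], z[t]) + b * Q
        return R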


Affine processes with financial applications
Christian Gourieroux, University of Toronto

Lecture 1:

  • Definition of affine processes: conditional Laplace transform, prediction formulas, examples.
  • Reference: Darolles, Gourieroux and Jasiak, Compound Laplace Transform and Compound Autoregressive Models.

Lecture 2:

  • Autoregressive gamma process and liquidity analysis: definition of the ARG process, the CIR process, prediction formulas, the long memory feature, and an application to intertrade duration models (a simulation sketch follows this lecture’s reference).
  • Reference: Gourieroux and Jasiak, Autoregressive Gamma Process.
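
The simulation sketch referred to above: an ARG process generated through its Poisson-gamma mixture representation, the discrete-time counterpart of the CIR process (parameter names follow the usual delta, beta, c convention).

    import numpy as np

    def simulate_arg(T, delta, beta, c, x0=1.0, seed=0):
        # Given X_{t-1}, draw Z ~ Poisson(beta * X_{t-1}) and then
        # X_t = c * Gamma(delta + Z): a noncentral gamma transition.
        rng = np.random.default_rng(seed)
        x = np.empty(T)
        x[0] = x0
        for t in range(1, T):
            z = rng.poisson(beta * x[t - 1])
            x[t] = c * rng.gamma(delta + z)
        return x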

Lecture 3:

  • Analysis of realized volatility: the Wishart autoregressive process, prediction at various horizons, factor representation, estimation, and an application to realized volatility.
  • Reference: Gourieroux, Jasiak and Sufana, Wishart Autoregressive Model for Stochastic Volatility.

Lecture 4:

  • Affine term structure models: the affine models will be used to discuss term structures, both for T-bonds and corporate bonds.
  • References: Gourieroux, Monfort and Polimenis, Affine Term Structure Models; Gourieroux, Monfort and Polimenis, Affine Model for Credit Risk.


Dynamic leverage and threshold effects in stochastic volatility models
Michael McAleer, University of Western Australia

In this paper we examine two methods for modelling asymmetries, namely dynamic leverage and threshold effects, in Stochastic Volatility (SV) models, one based on the threshold effects (TE) indicator function of Glosten, Jagannathan and Runkle (1992), and the other on dynamic leverage (DL), or the negative correlation between the innovations in returns and volatility. A general dynamic leverage threshold effects (DLTE) SV model is also used to enable non-nested tests of the two asymmetric SV models against each other to be calculated. The three SV models are estimated by the Monte Carlo likelihood method proposed by Sandmann and Koopman (1998), and the finite sample properties of the estimator are investigated using numerical simulations. Four financial time series are used to estimate the SV models, with empirical asymmetric effects found to be statistically significant in each case. The empirical results for S&P 500, TOPIX and Yen/USD returns indicate that dynamic leverage dominates the threshold effects model for capturing asymmetric behaviour, while the results for USD/AUD returns show that both the non-nested dynamic leverage and threshold effects models are rejected against each other. For the four data series considered, the dynamic leverage model dominates the threshold effects model in capturing asymmetric effects. In all cases, there is significant evidence of asymmetries in the general DLTE model.


A continuous-time measurement of the buy-sell pressure in a limit order book market
Nikolaus Hautsch, Institute of Economics, University of Copenhagen

In this paper, we investigate the buy and sell arrival processes in a limit order book market. An intensity framework allows us to estimate the simultaneous buy and sell intensities and to derive a continuous-time measure of the buy-sell pressure in the market. Based on limit order book data from the Australian Stock Exchange (ASX), we show that the buy-sell pressure is particularly influenced by recent market and limit orders and by the current depth in the ask and bid queues. We find evidence for the hypothesis that traders use order book information to draw inferences from the price-setting behavior of other market participants. Furthermore, our results indicate that the buy-sell pressure is clearly predictable and is a significant determinant of trade-to-trade returns and volatility.


A composite leading economic indicator for the Philippines
Lisa Grace S. Bersales, University of the Philippines

In 2002, the National Statistical Coordination Board of the Philippines called for the enhancement of its Leading Economic Indicator System. As a result, the following procedure was developed:

  1. Seasonally adjust each series using TRAMO-SEATS to obtain the TREND-CYCLE of each of the leading indicators and of non-agriculture GVA.
  2. For each series mentioned in step 1, remove the trend component from the TREND-CYCLE using the Hodrick-Prescott filter. This produces the CYCLE.
  3. Standardize the CYCLEs using:
    Standardized CYCLE = (CYCLE - mean(CYCLE)) / standard deviation(CYCLE)
  4. Construct the cross-correlogram of the CYCLE of each indicator with the cycle of non-agriculture GVA to obtain the lead period. The lead period determines the lagged indicator CYCLE series to be used in the construction of the composite indicator.
  5. Forecast missing values using appropriate exponential smoothing procedures.
  6. Compute the index as the linear combination of the CYCLEs of the indicators, with weights given by the regression coefficients of the past values of the indicator CYCLEs on the CYCLE of non-agriculture GVA (a sketch of steps 3 and 6 follows this list).
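
The sketch referred to in step 6, covering steps 3 and 6 under the assumption that `cycles` is a (T x k) array of indicator CYCLEs, `reference_cycle` is the non-agriculture GVA cycle, and `leads` holds the lead periods from step 4:

    import numpy as np

    def composite_index(cycles, reference_cycle, leads):
        C = np.asarray(cycles, dtype=float)
        z = (C - C.mean(axis=0)) / C.std(axis=0)       # step 3: standardize
        # Shift each indicator by its lead so past indicator values line up
        # with the current reference cycle; drop the wrapped-around rows.
        X = np.column_stack([np.roll(z[:, j], leads[j]) for j in range(z.shape[1])])
        X, y = X[max(leads):], np.asarray(reference_cycle, dtype=float)[max(leads):]
        w, *_ = np.linalg.lstsq(X, y, rcond=None)      # step 6: regression weights
        return X @ w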

The differences between this procedure and the previous one are: TRAMO-SEATS is used for seasonal adjustment instead of X11-ARIMA; the Hodrick-Prescott filter is used for detrending instead of trend models; the cross-correlogram is used instead of the matrix of simple correlations between current and lagged series values; standardization is introduced; unavailability of timely data is addressed by forecasting unavailable data with exponential smoothing procedures; and the weights used in constructing the composite indicator are based on partial correlations instead of simple correlations between the reference series and the respective lagged indicator series. The reference series is non-agriculture Gross Value Added, while the indicator series used are: Consumer Price Index, Electric Energy Consumption, Exchange Rate, Hotel Occupancy Rate, Money Supply, Number of New Business Incorporations, Stock Price Index, Terms of Trade Index, Total Imports, Tourist/Visitor Arrivals, and Wholesale Price Index.

Empirical analysis was done using quarterly data from the first quarter of 1981 to the first quarter of 2003. Analysis includes the comparison of the composite indicator used in the procedure with two other composite indicators:

  1. a composite indicator that is the sum of all indicator series cycles; and
  2. a composite indicator that is the weighted sum of the indicator series cycles, with weights equal to the simple correlations between the cycle of the reference series and the cycle of the appropriate lagged indicator.

The latter composite indicator and the composite indicator in the suggested procedure may be viewed as following the optimal combination of forecasts/estimators and are thus expected to perform better than the former composite indicator in forecasting the turning points of the cycle of non-agriculture Gross Value Added.
